This exercise uses spreadsheets and pivot tables to show how aggregation is necessary for finding insights in granular data.
To get started: Download this exercise's data as CSV. Then import it into a spreadsheet.
The data comes from the Sun Sentinel's online database, "Speeding Cops in South Florida". Each row represents a trip associated with a police vehicle's transponder, as recorded by toll booths. That toll data was enough for the Sun Sentinel to carry out its Pulitzer Prize-winning analysis and investigation.
From the page:
A Sun Sentinel investigation of officers’ SunPass toll records found nearly 800 cops from a dozen agencies drove 90 to 130 mph on highways in Broward, Palm Beach and Miami-Dade counties from October 2010 through November 2011. This database shows transponders that exceeded an average of 90 mph at least three times during that time period.
Note that the data published by the Sun Sentinel simplifies the work they had to do, including measuring the distance between toll stations and filtering out irrelevant data. And of course, the massive legal fight to get the data.
The data for this exercise includes all published records from the Sentinel database that relate to the Miami Police – roughly 27,000 rows. It's been modified so that you have to recreate the average speed value yourself.
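Recreating the average speed is just distance divided by elapsed time between toll readings. As a minimal sketch of that arithmetic (the column names and timestamp format here are hypothetical, not the actual fields in the exercise file):

```python
from datetime import datetime

def average_speed_mph(entry_time, exit_time, distance_miles):
    """Average speed between two toll readings, in miles per hour.

    Assumes timestamps are strings like "2011-02-01 08:00:00";
    the real data's format may differ.
    """
    fmt = "%Y-%m-%d %H:%M:%S"
    elapsed_hours = (datetime.strptime(exit_time, fmt)
                     - datetime.strptime(entry_time, fmt)).total_seconds() / 3600
    return distance_miles / elapsed_hours

# A 7-mile stretch covered in 4 minutes works out to 105 mph:
print(round(average_speed_mph("2011-02-01 08:00:00",
                              "2011-02-01 08:04:00", 7.0)))  # 105
```

In the spreadsheet you'll do the same calculation with a formula, e.g. dividing a distance column by the difference of two time columns (multiplied by 24 to convert a day fraction to hours).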
After importing the data into a spreadsheet, we want to reproduce the finding the Sun Sentinel described:
The worst speeder in the bunch? The same Miami cop caught going 120 mph by the state trooper. Our analysis showed he commuted at speeds above 100 mph on 114 days and only slowed down after his traffic stop hit the news.
First: Download the data as CSV
Second: Import it into a spreadsheet
For your convenience, I've posted an example of the spreadsheet with the columns needed to calculate average speed per trip.
You can refer to it but please build your own spreadsheet from scratch by importing the exercise's data file.
Pivot tables are an effective tool for quickly summarizing the facts from a mass of records.