Downloading and Visualizing Denver Bike and Ped Data with Python and Kepler.gl
It’s been a while since my last article. I had a bit of spare time between jobs, so I decided to write a new one that explains how to download publicly available bike data for the city of Denver and visualize it using the KeplerGl library for Python.
By the way, I started working as a Lead Traffic Modeler at Ulteig. Please check out the company, and if you are in the transportation business, hit me up!
This article has actually become the first part of a series. The reason I started all this was to check whether there is a difference in bike ridership and pedestrian counts before and after COVID-19. However, CDOT makes bike and pedestrian counts available only up to two months before today’s date, so the data currently ends on February 29th, 2020. I will definitely revisit this in the future to do that comparison! Let’s start.
First, let’s import the libraries we will use:
The next thing we will do is automate the bike and ped data downloads from CDOT’s website. To achieve that, we first need a list of the stations we are interested in. I selected pedestrian and bicycle sensor stations with continuous counts. If you take a look at that stationUrl, you can see that it is easily modifiable. We need to send a POST request to that URL to get the list of available ped and bike stations.
There was a lesson for me to learn here. After a couple of failed attempts, I realized that dtdapps.coloradodot does not present a valid security certificate, so the request fails certificate validation. The simplest workaround is to skip certificate validation entirely by adding verify=False. Once you run the code block above, you should see something like the image below.
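Since the article’s request code is not reproduced here, the sketch below shows the shape of that call. The function name, payload argument, and timeout are my additions; the actual stationUrl is the one you build from CDOT’s site and is not shown in this copy:

```python
import requests
import urllib3

# requests emits an InsecureRequestWarning every time certificate checks are
# skipped; silence it once here, since we skip them deliberately.
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

def fetch_stations(station_url, payload=None):
    """Send the POST request for the station list, skipping SSL certificate
    validation (verify=False) as the article describes. station_url is the
    stationUrl you built; it is not reproduced in this sketch."""
    return requests.post(station_url, data=payload, verify=False, timeout=30)

# Usage, with your real stationUrl:
# response = fetch_stations(stationUrl)
# stations = response.json()
```

Note that verify=False disables a real security check; use it only for hosts you trust.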
Yay! We have a station list now! Let’s get data for each station over time.
The code block above goes through each station ID, downloads stationData, and appends everything to an initially empty data frame. The final data frame should look like this:
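The download loop itself might look like the sketch below. The fetch step is injected as a callable so the loop stays testable offline; the column name stationId is an assumption, and pd.concat is used instead of the repeated DataFrame.append the article’s era of pandas allowed (append was removed in pandas 2.0):

```python
import pandas as pd

def download_all_stations(station_ids, fetch_one):
    """Loop over station IDs, download each station's counts, and combine
    them into one DataFrame.

    fetch_one is any callable returning a DataFrame for one station; in the
    article it would wrap the CDOT request shown earlier."""
    frames = []
    for station_id in station_ids:
        station_data = fetch_one(station_id)
        station_data["stationId"] = station_id   # remember the source station
        frames.append(station_data)
    # One concat at the end is both faster and still supported in pandas 2.x.
    return pd.concat(frames, ignore_index=True)
```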
We need to add location information to each row by joining sf with sdf, so that once we visualize everything, each point will carry the corresponding latitude, longitude, and mode information.
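A miniature version of that join, with hypothetical column names standing in for the real ones (sdf holds per-station metadata, sf holds the downloaded counts):

```python
import pandas as pd

# Station metadata: one row per sensor, with coordinates and mode (assumed names).
sdf = pd.DataFrame({
    "stationId": ["S1", "S2"],
    "latitude": [39.74, 39.75],
    "longitude": [-104.99, -104.98],
    "mode": ["Bike", "Ped"],
})

# Downloaded counts: many rows per station.
sf = pd.DataFrame({
    "stationId": ["S1", "S1", "S2"],
    "date": ["2020-02-01", "2020-02-02", "2020-02-01"],
    "count": [120, 95, 80],
})

# Left join so every count row picks up its station's location and mode.
export = sf.merge(sdf, on="stationId", how="left")
```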
Okay, this is everything we need! Using this data frame (export), we can visualize the total number of counts for each location over time. Let’s first create an empty map to make sure Kepler works in our notebook.
In this view, you can actually use every feature of Kepler.gl. If you want to try Kepler.gl out with a sample dataset, please check out this link. You can even upload datasets there, but I will do everything from the notebook because I already have a configuration for my map. To learn more about Kepler.gl for Jupyter, take a look at this link. Here is my config file:
Let’s create our map.
Alright! Our map is ready. If you want to see a live version, please click the link below:
Large-scale WebGL-powered Geospatial Data Visualization Tool
Kepler.gl is a powerful web-based geospatial data analysis tool. Built on a high performance rendering engine and…
Thanks for reading!