Big data is more than big. Big data is ginormous times humungous times colossal.

Big data is like an exponentially expanding recycle bin for all the bits and bytes we leave behind while living in a digital world. From posts to purchases to playlists, almost every click creates a data point that our computers, tablets and cell phones deliver to the cloud to form massive data sets. The same goes for the internet of things — everyday devices ranging from cars to watches to appliances that also capture and upload data.

How much total data are we generating? The most commonly quoted number on the internet is 1.7 megabytes per person. Per second. That’s roughly equivalent to “Moby Dick.” The novel. Not the whale.

Big data has been called the “new oil” because of the way it drives today’s economy, spinning out new goods and services, streamlining business practices and speeding the rate of innovation.

But in the same way that oil must be refined, big data is only useful when meaning can be extracted from all of those bits and bytes gushing from a myriad of sources in a multitude of formats. This is what defines a big data set — the volume of data, but also the variety of its structure, the velocity of its growth and the necessity of powerful processing tools to store, integrate and analyze disparate files measured in petabytes (a 1 followed by 15 zeroes, in bytes).

Much of this data never existed before or was only available in small samples collected tediously by hand. Now we generate it as naturally as we breathe. Combined with traditional sources such as the census, local government and academic research, the exploding volume of data from everyday life — available in real time and rich detail — makes it possible to draw broader conclusions and produce more intelligent decisions than ever before.

All of that makes big data a game changer for community planning. By giving planners the resources to think bigger, dig deeper and work smarter, big data helps them better understand and address community needs ranging from housing costs to congestion to climate resiliency, said Petra Hurtado, research director at the American Planning Association (APA).

Table: benefits of big data for planning and managing cities

Take traffic studies, for example. “In the good old days, a student would go out and count traffic for us,” Hurtado said. But a kid with a clicker can only collect so much data. Now sensors powered by artificial intelligence can capture and categorize traffic data on a continuous basis.

The dynamic nature of big data makes planning more agile. Instead of relying on static data sets from the past, constant flows of new data empower planners to spot changes and make adjustments more quickly and "not be nailed down to a five-year or a 10-year plan," Hurtado said.

Much of this fresh-off-the-vine data originates from a source that almost never leaves our side — our cell phones. Phones and apps are like sponges that soak up data every second of the day about who we are, where we go and what we do — much of which is eventually shared by cell services, app makers and data brokers.

While this raises legitimate privacy concerns if the data is not securely anonymized and aggregated, most people surrender data voluntarily — if only because they don’t always read the terms and conditions when activating their phone or downloading an app.
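Anonymization and aggregation typically mean publishing only group-level counts, never individual identifiers. A minimal sketch of one common safeguard — suppressing any group smaller than a threshold, in the spirit of k-anonymity — might look like this (all names and the threshold are invented for illustration):

```python
# Hypothetical illustration: roll raw location "pings" up into per-zone
# device counts, and suppress zones with too few devices to publish safely.
K_THRESHOLD = 10  # minimum number of distinct devices per reported zone

def aggregate_pings(pings):
    """pings: iterable of (device_id, zone_id) pairs."""
    devices_per_zone = {}
    for device_id, zone_id in pings:
        devices_per_zone.setdefault(zone_id, set()).add(device_id)
    # Publish only counts, never device IDs, and drop sparse zones.
    return {zone: len(devices)
            for zone, devices in devices_per_zone.items()
            if len(devices) >= K_THRESHOLD}
```

A zone visited by only a couple of devices is dropped entirely, since a small count could make individuals re-identifiable.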

“Cell phone data is a gold mine because you can access it easily and almost everyone has a cell phone,” Hurtado said. “From a planning context, the movement of people and the behaviors of people is valuable data.”

Despite the ability to cast an enormous net, big data comes with a caveat, cautioned Hurtado. Some populations — the unsheltered, the very old, the very young — might not be represented in the data because they lack a digital presence. Depending on what they want to learn, planners may have to collect their data “in person, the way we used to do it,” Hurtado said.

Whatever its pitfalls, big data more than makes up for them with possibilities. Every two years the amount of data in the digital universe doubles. All across the country an ecosystem of big data sources, portals and platforms is playing an expanding role in how communities plan for their future.

One of the latest applications for big data involves creating digital twins. A digital twin is a virtual representation of a system, object or location that is presented in real time. These three-dimensional computer models enable users to monitor performance, run simulations and make predictions.

“In the case of city digital twins, being able to quickly test ideas and make necessary adjustments before finalizing planning and policy decisions can allow planners to keep up with the accelerated pace of change that characterizes our world,” Hurtado stated in an article she co-authored for the APA.

The digital twin of a city is built on layers and layers of data. The terrain, buildings and infrastructure form the foundation. Real-time data about the movement of people and goods through the city — captured by sensors, cell phones and the rest of the internet of things — then brings the twin to life.
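In data-model terms, the layering described above amounts to relatively static base layers plus a stream of timestamped sensor readings. A toy sketch, with every class and field name invented for illustration:

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the layered structure a city digital twin rests on:
# static base layers (terrain, buildings, infrastructure) plus a stream of
# real-time readings that "brings the twin to life".

@dataclass
class DigitalTwin:
    static_layers: dict = field(default_factory=dict)   # e.g. terrain, buildings
    live_readings: list = field(default_factory=list)   # (sensor, time, value)

    def add_layer(self, name, features):
        self.static_layers[name] = features

    def ingest(self, sensor_id, timestamp, value):
        # A real system would update a 3-D model and run simulations here.
        self.live_readings.append((sensor_id, timestamp, value))

    def latest(self, sensor_id):
        readings = [r for r in self.live_readings if r[0] == sensor_id]
        return readings[-1] if readings else None
```

The point of the split is that the base layers change rarely, while the live stream is what makes monitoring, simulation and prediction possible.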

Phoenix is among a growing number of cities building out digital twins to meet specific goals such as reducing carbon emissions or improving transportation systems, but the poster city is Orlando, where the Orlando Economic Partnership (OEP) is working with a company called Unity to create a digital twin spanning 800 square miles across three counties.

Animated by a holographic display, the Orlando region's digital twin will have "nearly unlimited uses and set a new standard for the future of urban planning," according to the OEP. "By aggregating public and private sources of information, the digital twin will serve as a critical resource for all decision makers in the area."

The Oregon Department of Transportation (ODOT) tapped big data a number of years ago when it sought to improve the state's bicycle infrastructure. Using data from an app for cyclists created by a company called Strava, the project led ODOT to begin using big data in bigger and better ways.

“The Strava project definitely showed the value of [big data],” said Alex Bettinardi, senior integrated analysis engineer. “We’re using Strava and these types of information sources continually now.”

The original project tracked the routes taken by 35,000 riders throughout the state — a "night and day" improvement over the volume of data measured by a handful of fixed counting strips, Bettinardi said.
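The underlying aggregation is simple: each recorded ride is a sequence of road segments, and the agency wants to know how many distinct rides touched each segment. A sketch, with invented segment IDs:

```python
from collections import Counter

# Illustrative only: turn app-recorded rides into per-segment ride counts,
# the kind of tally that shows which stretches of road cyclists use most.

def segment_counts(rides):
    """rides: list of routes, each a list of road-segment IDs."""
    counts = Counter()
    for route in rides:
        # Count each segment once per ride, even if the route revisits it.
        counts.update(set(route))
    return counts
```

Compared with a fixed counting strip, which only sees one location, this yields a count for every segment any rider ever crossed.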

Knowing which stretches of road are most heavily traveled by cyclists allowed ODOT to place rumble strips closer to the fog line in certain spots rather than in the shoulder where they are hazardous to cyclists. ODOT also was able to schedule shoulder maintenance during periods of slack use. “In hindsight, this seems like easy stuff, but without data, it’s hard to make the call,” Bettinardi said.

Big data isn’t the next big thing anymore. It’s the norm. “There is a demand and expectation in our profession that we would make decisions based on data,” said ODOT principal engineer Chi Mai.

But that’s not as simple as it sounds given the analytics required to crunch the data and display the results in easily understood graphics. ODOT relies on the Regional Integrated Transportation Information System (RITIS) based at the University of Maryland to do the heavy lifting.

With more than 8,000 users across the country, RITIS is a leading platform for transportation analytics based on data supplied by Inrix — a company that collects traffic data from connected devices and vehicles — and users like ODOT. All state and local agencies in Oregon as well as universities can access RITIS under ODOT’s contract.

“We specify all the data we want them to ingest into their system and they produce the analysis and displays based on our queries,” Mai said. Back in the day it could take days or weeks for an old school IT team to respond to a query. “With RITIS, I can do everything within minutes and look at as many scenarios as I want,” she said.

An example of big data in action is a statewide freeway congestion overview ODOT prepared in 2020 and updated this spring. By using data to define the relationship between the economy and congestion, the overview is a guide to developing transportation policies and making transportation investments that support a sustainable economy.

Currently underway is an analysis of congestion caused by raising a freeway drawbridge spanning the Columbia River. While the bridge must remain down during peak travel times, those peaks last longer than they did when the Coast Guard established the windows for raising the bridge. By showing the current costs of congestion from raising the bridge while traffic is still at its peak, the analysis aims to make the case for extending the time when the bridge must remain down, Mai said.

Big data has underpinned the fight against blight in Detroit for the better part of a decade starting with the Motor City Mapping project — a database that cataloged all 380,000 land parcels in the city. Collected in 2013 and 2014 under the auspices of the Detroit Blight Removal Task Force, the data was gleaned by volunteers who texted photos and descriptions of every parcel to the database using a custom cell phone app. Any Detroit resident could download the app and "blext" (a mashup of blight and text) updates to the database.

The survey data was integrated with other data about each parcel such as ownership, vacancy status and code violations to provide a complete picture of every blighted building and vacant lot in the city, pinpointing patterns and helping establish priorities.
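Integration of this kind is essentially a join keyed on the parcel ID. A sketch with hypothetical field names and sample records:

```python
# Hypothetical illustration: merge field-survey observations with
# ownership and code-violation records, keyed by parcel ID.

survey = {
    "P-001": {"condition": "blighted", "photo": "p001.jpg"},
    "P-002": {"condition": "occupied", "photo": "p002.jpg"},
}
records = {
    "P-001": {"owner": "unknown", "violations": 4},
    "P-002": {"owner": "resident", "violations": 0},
}

def integrate(survey, records):
    """Combine both sources into one record per parcel."""
    merged = {}
    for parcel_id, observation in survey.items():
        merged[parcel_id] = {**observation, **records.get(parcel_id, {})}
    return merged
```

The merged records are what make it possible to rank parcels and set demolition priorities rather than treating each data source in isolation.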

The task force found 84,641 blighted structures and vacant lots. About 40,000 needed to be demolished and cleaned up immediately because of their condition. Since that time, Detroit has demolished more than 23,000 abandoned homes. Mayor Mike Duggan has vowed to complete the job over the next four years.

Motor City Mapping is no longer in an active phase, but it remains an example of using big data to make the community a partner in planning efforts. Data Driven Detroit, a data platform that assisted with the Motor City Mapping project, continues to engage the community by providing free public access to ready-made data sets, maps and graphics as well as delivering contracted services.

“We are a data intermediary for the Detroit region and serve in a translating role between anyone with a need for data and the massive amount of data that is out there,” said Noah Urban, co-executive director.

This approach is gaining ground across the country. Data Driven Detroit belongs to the National Neighborhood Indicators Partnership, which connects more than 30 organizations that work to ensure communities have access to data and the skills to use the information. Members range from Neighborhood Nexus in Atlanta to Data You Can Use in Milwaukee to the Urban Strategies Council in Oakland.

Infographic: 20% of Detroit residents have lived in the same household since the 1980s, and 28.4% of Wayne County households include at least one resident 65 or older.

Access to big data gives groups and individuals a greater voice in planning and development in their neighborhood. Data Driven Detroit is developing a Neighborhood Vitality Index that will present how residents perceive their neighborhood side-by-side with conventional statistics such as income, employment, education, etc.

Crime rates are a good example of conventional statistics that don’t tell the whole story. Crime rates only measure what’s being reported. They don’t say anything about how people perceive crime in their neighborhood, even though “how someone feels about crime is probably more important to them than what the crime rate actually is,” Urban said.

Integrating community perceptions “makes the data much more useful both for people who are trying to make policy but also for people who are trying to get their thoughts and opinions represented,” he said.

The Benefits of Big Data for Planning and Managing Cities

  • Enhances decisions by providing more detailed information.
  • Streamlines managing resources and evaluating existing policies and programs.
  • Enables real-time analysis that shows what’s happening moment by moment.
  • Provides evidence-based support for planning decisions in the form of statistics, maps and simulations.
  • Generates fresh insights and reveals new relationships by cross-analyzing and visualizing data sets.
  • Builds stronger public-private partnerships to obtain, aggregate, analyze and apply data in ways that benefit all parties involved.

Source: American Planning Association
