Stories from the World of Municipal Analytics

Our cities are getting smarter

Alexander Shropshire | Jun 2

The field of analytics continues to gain momentum in municipalities across the United States as local governments begin to take Big Data seriously as a means to better understand their citizens, increase the effectiveness of their projects and policies, and save money.

More cities are adopting analytics practices than ever before, which yields an increased need for analytics professionals to oversee the adoption, deployment, and assessment of these new technologies at a local level.

Many of the successfully implemented programs pushed through multiple failures and iterations to fulfill their goals for improving governance.

With sophisticated technology like machine learning and artificial intelligence comes the need to understand more complex assumptions and biases that may work against an objective that serves both comfortable and vulnerable populations.

Civic problems that analytics could help address may reach the desks of those in power through internal, funded problem-sourcing efforts, but in an engaged community, the problems that need prioritization often present themselves.

To address a civic issue in tandem with analytics professionals, there must exist not only a clear need for the application of analytics but, more crucially, buy-in from stakeholders with decision-making authority, the right talent and technology resources, and mature, vetted data.

Efforts to innovate in the public sector will often be met with bureaucracy and a low tolerance for risk, which is why smaller scale pilot iterations of analytics projects have an important place in demonstrating initial value.

Despite the obstacles inherent in the domain, I, for one, am extremely excited about the potential for municipal analytics and data science.

In fact, we have already seen plenty of successful examples throughout the country.

To spark your interest in the growing field, and in the hope that you’ll engage with and understand your own municipality’s progress in applying Big Data to policy, I’ll summarize some of my favorite success stories, inspired by leading national resources on the successful implementation of civic analytics, to give you a sense of what’s already possible.

Smoke Detectors + New Orleans

Source: NOLA Office of Performance and Accountability

Armed with the insight that the risk of death from fire in a home is cut in half when a home has smoke alarms, the New Orleans Fire Department decided to give out free, life-saving smoke alarms to residents in need.

The city built out its data capabilities through its Office of Performance and Accountability (OPA), which assembled a data team and decided to use a predictive model to determine which attributes of a home indicate both its need for a new smoke alarm and its overall fire-fatality risk.

No data of this sort had been collected by the office, but the project was able to take off when a researcher uncovered a question about smoke alarms on the US Census Bureau’s American Housing Survey and joined it with data from the Bureau’s American Community Survey.

The team was able to identify key features that helped to predict the need for a new alarm and highly susceptible houses.

They were able to use this to arrive at a block-by-block smoke detector outreach and distribution plan.
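To make the approach concrete, here is a minimal sketch of how a team might train such a model and rank census block groups for outreach. The feature names, data, and scikit-learn classifier below are hypothetical stand-ins for illustration, not the OPA team’s actual pipeline.

# Hypothetical sketch: train on survey-style records that include a smoke alarm
# question, then score block groups for door-to-door outreach.
import pandas as pd
from sklearn.linear_model import LogisticRegression

# Stand-in training data modeled loosely on AHS-style household attributes.
survey = pd.DataFrame({
    "pct_pre_1980_housing": [0.9, 0.2, 0.7, 0.1, 0.8, 0.3],
    "pct_owner_occupied":   [0.3, 0.8, 0.4, 0.9, 0.2, 0.7],
    "median_income_k":      [22, 75, 31, 90, 25, 60],
    "has_smoke_alarm":      [0, 1, 0, 1, 0, 1],
})
features = ["pct_pre_1980_housing", "pct_owner_occupied", "median_income_k"]
model = LogisticRegression().fit(survey[features], survey["has_smoke_alarm"])

# Stand-in ACS-style block groups, ranked by the predicted chance of missing an alarm.
blocks = pd.DataFrame({
    "block_group": ["A", "B", "C"],
    "pct_pre_1980_housing": [0.85, 0.15, 0.6],
    "pct_owner_occupied":   [0.25, 0.85, 0.5],
    "median_income_k":      [24, 80, 40],
})
blocks["p_no_alarm"] = 1 - model.predict_proba(blocks[features])[:, 1]
print(blocks.sort_values("p_no_alarm", ascending=False))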

Two years and 8,000 new, risk-targeted smoke alarms later, the department was able to celebrate a job well done, affirm the power of analytics, and inspire the American Red Cross and DataKind to replicate the project across the nation.

See “Predicting Fire Risk: From New Orleans to a Nationwide Tool” (Ash Center, Harvard)

Early Police Intervention + Nashville/Charlotte

Early police intervention systems, developed by the University of Chicago’s Center for Data Science and Public Policy, enable police departments to identify officers likely to have an adverse incident in order to provide them with training, counseling, and similar resources.

The group tracks “adverse incidents,” which may arrive in the form of a citizen or colleague complaint, the use of force, accidents, or injuries.

The system is used to find predictors of adverse incidents within the department’s data.

This includes tracking various instances of officer conduct that potentially reveal underlying tendencies, which may or may not bear on the occurrence of adverse incidents.

The system learns how supervisors prioritize or deprioritize the system’s findings as well.

Traditional systems have a feasibility problem: they might capture 80% of the officers who go on to have an adverse incident, but only by flagging almost 70% of the entire police force.

Recommending that a department continuously retrain 70% of the force isn’t very useful given the already limited resources of many police departments.

The newer system designed at UChicago still captured roughly 80% of the officers who go on to have an adverse incident, but it homed in on the 30% of the force most in need of intervention and gave each officer an aggregate risk score based on historical performance.
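As a rough illustration of that ranking idea, the sketch below trains a classifier on invented officer-history features, turns predicted probabilities into an aggregate risk score, and flags only the highest-risk slice of the force. Every feature name, dataset, and threshold here is an assumption for demonstration, not UChicago’s actual system.

# Hypothetical sketch: score each officer's adverse-incident risk and flag the top ~30%.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 200
officers = pd.DataFrame({
    "complaints_last_2yr": rng.poisson(1.0, n),
    "use_of_force_last_2yr": rng.poisson(0.5, n),
    "secondary_employment_hours": rng.integers(0, 30, n),
    "had_adverse_incident": rng.integers(0, 2, n),  # label drawn from historical records
})
features = ["complaints_last_2yr", "use_of_force_last_2yr", "secondary_employment_hours"]
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(officers[features], officers["had_adverse_incident"])

officers["risk_score"] = clf.predict_proba(officers[features])[:, 1]
cutoff = officers["risk_score"].quantile(0.70)  # keep only the top ~30% for review
flagged = officers[officers["risk_score"] >= cutoff]
print(f"{len(flagged)} of {n} officers flagged for supervisor review")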

A system like this is clearly not meant to replace a police supervisor’s responsibility to make a judgment call but can help inform it, and the long-term impact of implementation is still being measured.

See “Police Project Update: Expanding and Implementing the Early Intervention System” (UChicago’s Center for Data Science and Public Policy)

Landlord Discrimination + New York City

New York City prohibits discrimination against prospective tenants based on their race, gender, and source of income…on paper.

Despite Title 8 of the Administrative Code of the City of New York, income-based discrimination is still among the most reported housing-related issues.

The New York City Mayor’s Office of Data Analytics and the Commission on Human Rights decided to build a model to deploy investigative resources more efficiently by identifying areas in which landlords were most likely to illegally turn away potential tenants with housing vouchers.

The researchers hypothesized that targeting the larger property management firms guilty of such conduct would have trickle-down benefits, encouraging smaller potential offenders to follow the law.

By identifying the larger offenders, the model has worked to reveal illegal housing practices more effectively, allowing the Commission on Human Rights to better prioritize their workflow and improve NYC’s ability to curb housing discrimination.
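One way to picture that prioritization logic is a simple expected-impact ranking: weight each portfolio’s predicted likelihood of turning away voucher holders by its size, so larger potential offenders surface first. The sketch below is purely illustrative, with invented landlord names, unit counts, and probabilities rather than MODA’s actual model outputs.

# Hypothetical sketch: order the investigation queue by expected number of affected voucher holders.
import pandas as pd

portfolios = pd.DataFrame({
    "landlord": ["Mgmt Co A", "Mgmt Co B", "Small Owner C"],
    "units_managed": [4200, 1800, 12],
    "p_voucher_denial": [0.35, 0.10, 0.60],  # stand-in output of an upstream predictive model
})

# Larger portfolios with high predicted denial rates rise to the top of the queue.
portfolios["priority"] = portfolios["units_managed"] * portfolios["p_voucher_denial"]
queue = portfolios.sort_values("priority", ascending=False)
print(queue[["landlord", "priority"]])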

See the project “The NYC Commission on Human Rights partnered with MODA to assist in identifying sites of potential income discrimination” (MODA NYC GitHub)

Urban Damage + New Orleans

Many areas of New Orleans damaged by Hurricane Katrina in 2005 were left untouched by the city for years after the storm, and when Mayor Mitch Landrieu took office in 2010, whether to renovate or demolish damaged properties around the city was a key topic of discussion.

The city’s data hub, OPA, was tasked with developing a performance management tool using analytics to help approach the issue.

They developed BlightSTAT and a Blight Scorecard in response to the backlog of 1,500 properties awaiting a decision from housing code enforcement to either sell the house or move forward with demolition.

The system developed allowed mid-level supervisors to go out and score properties, increasing the speed and consistency of the evaluation process.
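A scorecard like this can be as simple as a fixed set of weighted questions that an inspector answers in the field, with the total mapped to a recommendation a supervisor can accept or override. The criteria, weights, and threshold below are hypothetical; New Orleans’ actual Blight Scorecard is not reproduced here.

# Hypothetical weighted checklist standing in for a field scorecard.
WEIGHTS = {
    "structurally_unsound": 40,
    "fire_damage": 25,
    "open_to_trespass": 20,
    "historic_district": -15,  # preservation concerns weigh against demolition
}

def blight_score(inspection: dict) -> int:
    """Sum the weights of every condition the inspector marked True."""
    return sum(w for key, w in WEIGHTS.items() if inspection.get(key))

def recommendation(score: int) -> str:
    return "demolish" if score >= 50 else "sell / rehabilitate"

prop = {"structurally_unsound": True, "open_to_trespass": True, "historic_district": False}
score = blight_score(prop)
print(score, recommendation(score))  # 60 demolish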

In this case, like many cases, human judgment is an irreplaceable factor in the civic analytics process.

The decision to demolish a building was informed by, not replaced by, an advanced data system.

The technology essentially eliminated paper communication from the Department of Code Enforcement’s process.

In turn, the backlog of properties awaiting decisions was essentially eliminated as well.

See “Code Enforcement Abatement Tool” (City of New Orleans)

See “Early detection of properties at risk of blight using spatiotemporal data” (UChicago’s Data Science for Social Good)

Food Inspections + Chicago

Chicago Restaurants Failing to Pass Inspections: 2015–2016 (https://chicago.github.io/food-inspections-evaluation/)

Food preparation, storage, and service are deeply regulated, but monitoring food safety in a city is really challenging; analytics models can help city officials efficiently allocate expertise and financial resources to the task.

Chicago has more than 7,300 restaurants, thousands of grocery stores, and other food vendors, all collectively responsible for the overall health of the city.

That said, there are only about 35 inspectors tasked with covering over 15,000 total food establishments.

Instead of simply hiring more inspectors, the city decided to improve its inspection performance with analytics.

Chicago’s Department of Innovation and Tech collaborated with the Department of Public Health to forecast a restaurant’s risk of failing an inspection.

The eventual model empowered the city to identify violations about seven days earlier than the manual process could, and along the way the team created a transparent, community-accessible food inspection tracker updated with real-time inspection results.
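In spirit, the forecasting step is a standard supervised-learning problem: estimate each establishment’s risk of failing and send inspectors to the riskiest places first. The features, data, and model below are hypothetical placeholders, not the variables Chicago’s published model actually uses.

# Hypothetical sketch: rank next week's inspection queue by predicted failure risk.
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(1)
n = 500
hist = pd.DataFrame({
    "prior_critical_violations": rng.poisson(0.8, n),
    "days_since_last_inspection": rng.integers(30, 720, n),
    "nearby_sanitation_complaints": rng.poisson(2.0, n),
    "failed_inspection": rng.integers(0, 2, n),  # stand-in historical outcome
})
features = ["prior_critical_violations", "days_since_last_inspection", "nearby_sanitation_complaints"]
model = GradientBoostingClassifier(random_state=0).fit(hist[features], hist["failed_inspection"])

# Re-order a sample of establishments so likely failures are visited first.
queue = hist.sample(25, random_state=0).copy()
queue["risk"] = model.predict_proba(queue[features])[:, 1]
print(queue.sort_values("risk", ascending=False).head())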

See “Delivering Faster Results with Food Inspection Forecasting” (Harvard’s Ash Center)

See “Food Inspections” (Chicago Data Portal)

Traffic Fatalities + New York City

New technologies alongside rising urban populations have made way for a revolution in how residents traverse cities, and with such change, cities must adapt their systems and regulations to ensure the safety of citizens.

Ample transportation data and new analytics models provide cities a low-cost avenue to assess and improve systems of transportation.

As part of the global Vision Zero movement to address transportation issues and inefficiencies with data analytics, New York City collaborated with DataKind in 2015 to assess the risk and likelihood of certain outcomes of all new transportation improvement projects.

Projects that aim to improve traffic patterns and minimize delays may either correlate with improved safety or come at a tradeoff with safety.

After getting over the speed bump of limited useful data, the team produced a model that maps exposure in order to judge the effectiveness of particular street designs more accurately.

The model lets the NYC Department of Transportation forecast active car counts in given areas at given times to help optimize traffic patterns and improve road safety.
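Assuming that “mapping exposure” amounts to predicting traffic volume for a street segment at a given time, so that crash counts can be judged per vehicle rather than in absolute terms, a minimal sketch might look like the following. The features, synthetic data, and model choice are invented for illustration and are not the DOT’s actual model.

# Hypothetical sketch: predict vehicle counts per segment and hour as a measure of exposure.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(2)
n = 1000
segments = pd.DataFrame({
    "hour_of_day": rng.integers(0, 24, n),
    "num_lanes": rng.integers(1, 5, n),
    "is_arterial": rng.integers(0, 2, n),
    "taxi_pickups_nearby": rng.poisson(5.0, n),
})
# Synthetic target standing in for observed vehicle counts.
segments["vehicle_count"] = (
    200 * segments["num_lanes"]
    + 50 * segments["is_arterial"]
    + 10 * segments["taxi_pickups_nearby"]
    + rng.normal(0, 30, n)
)

features = ["hour_of_day", "num_lanes", "is_arterial", "taxi_pickups_nearby"]
reg = RandomForestRegressor(n_estimators=100, random_state=0)
reg.fit(segments[features], segments["vehicle_count"])

# Forecast for a hypothetical three-lane arterial segment at 5 p.m.
new_segment = pd.DataFrame([{"hour_of_day": 17, "num_lanes": 3, "is_arterial": 1, "taxi_pickups_nearby": 8}])
print(reg.predict(new_segment[features]))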

See “Can Better Data Make Zero Traffic Deaths a Reality?” (Harvard’s Ash Center)

Other Forward-Thinking Cities Recognized by Bloomberg Philanthropies’ What Works Cities

A Bloomberg Philanthropies Map of Certified Cities

The mayor of Boston, Marty Walsh, has installed a large screen in his office that resembles Fenway Park’s scoreboard, aggregating a selection of key city performance indicators, such as average time to answer 311 calls, ambulance response time, and crime statistics, along with an overall success score compared against benchmarks and goals.

Mayor Sly James of Kansas City has implemented a quarterly citizen satisfaction survey to guide policy initiatives; the first funded examples driven by citizens’ concerns have arrived in the form of bonds to finance the demolition of blighted buildings and the renovation of roads, bridges, and sidewalks.

Los Angeles Mayor Eric Garcetti launched the Clean Streets Index, which mapped out blocks in need of cleaning and has resulted in 82% fewer unclean streets just a year after the measurement initiative launched.

Mayor Greg Fischer of Louisville launched a civic analytics innovation hub and co-working space, Louielab, to encourage collaboration, and also prioritized open access to all public records as a city default.

Mayor Kevin Faulconer of San Diego helped implement an app called Get It Done, which makes it easy for citizens to report city nuisances and sends ‘after’ photos of the fixes back to the people who reported them.

Mayor Muriel Bowser launched The Lab @ DC, an in-house data science research team created to apply scientific thinking to the city’s daily operations; the team made its name by releasing a landmark study on the limited effect of body-worn cameras on police use of force.

The Future: Smart Cities Need Smart Governments

The brightest future for all will only arrive through collaboration between different types of stakeholders, and the actors from different sectors have both valuable lessons to teach each other and new things to learn from each other.

Smart city technologies, regardless of the originating sector, must be combined with sound policy initiatives.

For example, infrastructure investments, prone to long, expensive project life cycles, can become more nimble in response to changing environments.

It may serve a crowded city better to reallocate existing lanes to pedestrians and cyclists, or support standing up a privately-operated commuter bus service, rather than to craft a 10-year subway line infrastructure development project.

Projects based on such long-term visions are prone to circumstances and problems that change mid-project, and the bulk of the solution’s value may be lost over time.

City governments shouldn’t be operating in a silo from private sector actors.

The public sector, for good reasons, will want to own many smart technology apps and programs, but there is definitely a case to allow space for private sector investment and financial return on government projects.

It’s up to cities to identify areas where it makes sense to step back and make room for corporations, universities, foundations, and nonprofits to play a role.

The “master planning” approach that many cities have followed hasn’t seen as much cost-effective success as projects deployed by cities that set themselves up as resource centers and ecosystem hubs for potential public-private partnerships.

There’s also so much potential to improve civic engagement with smart, data-driven systems.

A true two-way dialogue between citizens and public officials can be sustained via interactive apps and social media.

Decision makers can open themselves up to a real-time stream of public concern, and use such insights as the basis for new policies.

Because of this, new technologies and analytics efforts need to be transparent to bolster community participation and trust.

A greater share of the population should be brought online for increased access to relevant information, and so that cities can understand both niche and majority opinion across demographics.

Better information systems and communication channels also have the potential to inspire more active community participation in the form of volunteering, mentoring, and other positive community activities.

To conclude, I think there are two areas that need to be brought into focus in order to really empower data-driven smart cities.

The public sector must find a way to compete with the private sector for tech talent through unique, creative incentive planning.

Cities must also prioritize cybersecurity in order to protect new systems from the risk that comes with tech-savvy bad actors, especially in the case of crucial systems like water supply and the power grid.

It seems that both areas need to be accounted for from the first day of a municipal technology undertaking in order to stand the test of time.

Despite the risks, I’m optimistic about the growing municipal adoption of data-driven systems and new technologies and can’t wait to continue reading about how different government bodies decide to step up to the opportunities at hand.

Thanks for reading,

Alex

Feel free to reach out with your thoughts and questions, or simply follow my data science journey on LinkedIn, GitHub, and Medium!
