The ripple effect of big data and analytics is hitting economic development. There has been a resurgence of new tools that package economic data to make it accessible to a wider audience. Many of these tools rely on aggregated data that is useful but often not granular enough to tell an individual EDO or city how to improve its economy or what is working.
To do that, we need better, more granular data with details about specific projects and specific companies. Big data relies on, and pushes for, this kind of transactional data. Much of this economic data does exist, but it sits behind bureaucratic walls. We are a long way from incorporating big data into economic development, and there are real risks in a purely data-analytics approach to understanding economies and creating development strategies.
Chris Anderson, former editor-in-chief of Wired, argued in an essay, “The End of Theory: The Data Deluge Makes the Scientific Method Obsolete,” that with enough data and computing power, analytics can find patterns where the traditional approach of theory and science cannot. To be clear, he thought that was a good thing. We already have too much magical thinking when it comes to economics, and too often correlation is confused with causation, which leads to bad investments. It is even worse when much of the data we need is missing or bad.
I want to offer four reasons why I think the field of economic development is making progress toward better data, metrics, and evaluation.
- The Governmental Accounting Standards Board (GASB) has implemented new guidelines for disclosing tax abatements with the requirements taking effect for financial statements for periods beginning after December 15, 2015.
- EDA awarded a project to SRI International to develop and test measures, metrics, indicators, and methodologies to help EDA more effectively assess, evaluate, and report on the full impact of its economic development investments on regional economies across the nation. The models must include testing on EDA grantees, an impact analysis design, and a valid statistical methodology to corroborate that the proposed metric model works. Unfortunately, we have to wait until Q1 2017 to see the results.
- The National Science Board, producer of the biennial report Science and Engineering Indicators (SEI), convened a group of data producers, scientists, analysts, policymakers, and consumers to get feedback on the current report and identify ways to improve it in the future, including the indicators themselves as well as how the information can best be made available to support its variety of users. Approximately 65 hard-core data geeks, from yours truly to Vint Cerf (Google him), spent a day and a half brainstorming on data and indicators. Beyond the interaction with a group of really smart people, what inspired me about this event was that it showcased the depth of commitment to collecting, interpreting, analyzing, and disseminating information. If you believe that “data is the new oil,” then the U.S. is well positioned to be the top producer and refiner.
- States are stepping up their game on tracking and evaluation. Oklahoma recently passed legislation and issued an RFP for a statewide evaluation of business incentives, committing up to $250,000 per year for five years. The Oklahoma project is part of the Business Incentives Initiative by the Pew Charitable Trusts and the Center for Regional Economic Competitiveness, in which seven states are sharing best practices for collecting, managing, and analyzing data on economic development incentives. Indiana, Maryland, Michigan, Oklahoma, Tennessee, and Virginia have signed on, and it is clear that Oklahoma is taking it very seriously.