Last Friday should have been Jobs Day, with the monthly employment report from the Bureau of Labor Statistics, but the federal statistical agencies are still working through delays caused by the 43-day government shutdown that ended last month.
Instead, last week offered another raft of private-sector and administrative data on the labor market. Guy Berger provides an excellent rundown of the latest information.
The disruption to government statistics this year heightened interest in alternative data, but it’s a bigger issue than just bridging a temporary hiatus. It’s about the future of economic data. The IMF’s Finance & Development Magazine’s December issue is very timely and lays out the breadth of the debate.
In this post, I highlight some key points from the articles in the F&D Magazine, including my own. I encourage you to explore the entire issue.
Alternative Data Complements Traditional Statistics
My contribution to the issue is “Alternative Data and Monetary Policy,” with examples of how nontraditional data have been useful for the Fed:
To keep a finger on the pulse of the economy in real time, the Fed relies on a wide array of statistics generated by government agencies such as the Bureau of Labor Statistics (BLS) and the Department of Commerce. These statistics, typically based on representative surveys, are considered the gold standard by policymakers, investors, business leaders, and the public. Increasingly, though, the Fed has supplemented them with nontraditional sources of data, often supplied by private companies. The defining feature of these nontraditional sources is that the data were not created for the purpose of making economic statistics; rather, they originated in the process of running a business or a government program and were then repurposed for economic statistics.
This nontraditional data is often timelier or more granular and, as a result, can fill in some gaps in government statistics. It can also provide an added perspective on critical economic outcomes, such as employment. Finally, it can be used to improve the quality of traditional data sources. Nevertheless, nontraditional sources should be viewed as a complement to traditional data in informing policy, not as a substitute.
The recent government shutdown affirmed for me that alternative data are not comprehensive or reliable enough to serve as the core of economic data. But the current core must continue to evolve by drawing on the alternative data.
Alternative Data May Bring Us Closer to the ‘Ground Truth’
Kenneth Cukier, an editor at The Economist, paints a provocative picture of how new, more granular data could refine our metrics of economic well-being in the future:
That same year [1974] marked a brutal recession in the US, which inspired a Yale University economist and former White House advisor, Arthur Okun, to create a new indicator to account for its toll on individuals, not the abstract unit of the economy as a whole.
His Economic Discomfort Index—later dubbed the “misery index”—became a staple of US politics. Ronald Reagan used it to defeat President Jimmy Carter for the presidency in 1980. But it is simply the sum of the unemployment and inflation rates. A modern metric for the AI era is easy to imagine.
It would gather all the ways people might express their misery, from shifting spending patterns—not buying fewer things (a blunt number) but actually switching from eating steak to ramen. Likewise, missed utility bills and overdue car payments. Then, incidents of road rage, erratic driving, and fender benders—not in aggregate but tracked down to the person. Apple watches can track the quality of sleep and stress during the day. Closed-circuit TV cameras in streets, shops, and offices that have facial-recognition capability can record individuals’ emotions. Toilets with biosensors can track users’ levels of hormones such as cortisol and epinephrine that spike in moments of anxiety.
This comes as close to ground truth as it gets.
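As an aside, the misery index Cukier mentions is trivial to compute, since it is just the sum of two rates. A minimal sketch in Python, using made-up illustrative rates rather than actual historical data:

```python
def misery_index(unemployment_rate: float, inflation_rate: float) -> float:
    """Okun's misery index: the unemployment rate plus the inflation rate,
    both expressed in percent."""
    return unemployment_rate + inflation_rate

# Illustrative values only (not actual data):
# 7.5% unemployment and 11.0% inflation give a misery index of 18.5.
print(misery_index(7.5, 11.0))  # 18.5
```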
Cukier acknowledges the potential downsides of that degree of monitoring, and even if it were possible, adoption would take time. Still, I agree that traditional statistics have gaps. As an economist struggling to capture “affordability” concerns with aggregate price and income time series, I can see the appeal of more textured metrics.
Putting a Price on the Data Behind Alternative Statistics
Laura Veldkamp, an economics professor, in her piece “The Hidden Price of Data,” describes how we might measure the price of the data we are creating:
Every digital purchase, every app download, every click is a dual transaction: Consumers buy a good or service, and at the same time, they sell their data. The observable price—the amount of money that changes hands—is really the net price of these two exchanges. Firms get revenue and data; consumers get products and convenience.
… Together, these five approaches [to measuring the price of that data] describe an invisible asset class. Each captures an aspect of data value: labor devoted, revenue earned, the precision of actions, a market price, or implicit cost. None is infallible, feasible in all cases, or holistic in its measurement. Measurement is always imperfect. However, to make informed choices and craft sound policy, we must move data from the realm of intuition to the realm of quantification. Until then, the economy runs on a resource whose price we can only guess at, and whose value Silicon Valley can freely exploit.
Making the value of these data more transparent could strengthen that data source. Traditional statistics are suffering from declining survey response rates. People are less willing to share their information with surveyors. There may come a day when customers are unwilling to freely share their purchase information for the alternative statistics, too. Putting a price on the data we create in transactions could help avoid that outcome by enabling compensation for data creators.
The Future of Data Will Require Changing Institutions
Rebecca Riley, a professor of economics, in her piece “It’s Time to Modernize Measures of Growth,” discusses several ways traditional statistics must evolve to accommodate the digital economy and alternative data sources:
It is time to strengthen investment in our economic statistics infrastructure. We may be losing our ability to monitor the economy and make informed decisions because trillions of dollars of economic activity may be unmeasured or measured in insufficient detail. The importance of addressing this issue should not be understated, and neither should the challenges.
The obstacles include overcoming bureaucratic inertia, paying for the overhaul of economic accounting systems, and carrying out coordinated actions. If we don’t make headway on trusted statistics produced by national agencies with statistical rigor in an accountable, transparent manner—with impartiality and equal access—there will be plenty of noise to fill the gap in today’s data-rich world.
Modernizing government economic statistics will require additional resources, even if, in the long run, the use of alternative data might reduce costs. The challenges to “coordinated actions” between private-sector sources of the alternative data and government statistical agencies are formidable and extend beyond financial costs.
In Closing
Data help us understand the economy, and, increasingly, data drive the economy. The future of economic statistics is critical for the future of economic policy.