
Data Lessons From the Shutdown Blues

One of the lesser-reported consequences of the recent partial US government shutdown was its effect on the reporting of key economic and financial markets data. Many regulatory agencies closed up shop during the impasse (one that could, but hopefully won't, repeat itself). It turns out the government is a crucial source of information, and because the shutdown lasted more than a month, the compilation and publication of critical performance figures were delayed across several sectors.

A note from Deutsche Bank’s asset management unit (DWS) pointed out in mid-January how this data really runs the gamut. Statistics around “retail sales, business inventories, housing starts, and personal consumption expenditure (PCE) prices” were delayed, wrote DWS chief economist Josh Feinman. Other numbers like the employment report could be distorted even if released on time, he added. And that is to say nothing of prudential regulatory reporting that monitors bank balance sheet health and trading activity on listed markets.

Missing these inputs for one go-around probably didn't do much practical damage, at least for those who readied for it. Filling in the gaps and projecting the missing results can be done, even from unexpectedly incomplete information. Perhaps most importantly, the markets were still open to provide their own signals, even if the government wasn't. Still, there are a couple of reasons why portfolio managers and traders may have been asking questions of their data and operations teams afterwards: how they coped, and how to do it better next time.


Event Aftermath

When new data goes absent, there are at least two perspectives to consider with respect to investment decision making. One is an exercise in portfolio analysis in a traditional sense. The other is girding for market events and reacting appropriately in the short term; after all, the shutdown didn’t happen in a vacuum.

Start with the latter issue. The shutdown's data deficiency proved problematic early on, when a peculiar market event showed up halfway around the world. Shortly after the New Year, a currency flash crash was triggered during the trading "witching hour", the thinly traded stretch between New York's close and Tokyo's open, when liquidity is shallow. In this case, despite a bank holiday that day in Japan, the Japanese yen surged wildly, gaining nearly four percent against the dollar and other rival currencies in only a few minutes.

Bloomberg reported several potential causes and explanations: algorithms run amok, with others piling on simply by doing their job; a flight to safety after somewhat shocking news from Apple about its iPhone production; or a squeeze as investors unwound substantial loss-making short positions all at once (which, somewhat bizarrely, turned out also to involve the Turkish lira).

For traders exposed to the yen who weren't already on the move, picking among these explanations could imply different actions, or counsel caution. But the shutdown made it hard to count the shorts, at least those put on through US-based exchanges: the shuttered US Commodity Futures Trading Commission (CFTC) had not published speculative-positioning figures for the Japanese currency since December 21. It's fair to say that absent guidance was a significant missing piece of the puzzle for anyone investigating the event.

Furthermore, this was just a minor, if well-publicized, crunch for regulatory data. A similar jump or dive could just as easily have occurred on an American equities market while the Securities and Exchange Commission (SEC) was also closed. As the 2015 Flash Crash proved, that would have been a far larger mess.


Shock to the System

The other angle revolves around those missing consumer spending and industry numbers, and assessing whether (or by how much) portfolio holdings should be repriced or rebalanced in early 2019. Those questions were surely already on portfolio managers' minds given the direction markets took at the end of 2018: greater volatility and a potential economic slowdown.

On a temporary basis, most portfolio management teams and those managing bank balance sheets could probably work around the problem, using a combination of substitute data factors, historical data, and analytical tooling. In fact, if they struggled or proved incapable of doing so, their investors and clients should be worried. Predicting these outcomes, rather than waiting on Department of Commerce or Treasury figures to confirm them, is their job, after all.
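To make that concrete, one simple (and purely illustrative) way to project a delayed release is to regress past official figures on a correlated substitute series, then apply the fit to the latest substitute reading. The sketch below is a minimal Python example; the "official" and "proxy" numbers are invented, and a real desk would lean on far richer models.

```python
from statistics import mean, stdev

def impute_missing_release(history, proxy_history, proxy_latest):
    """Project a delayed official figure from a correlated proxy series.

    Fits an ordinary least-squares line between past official releases and
    a proxy (e.g., a private-sector survey), then applies it to the latest
    proxy reading. Returns the estimate plus the residual standard
    deviation as a rough reliability gauge.
    """
    mx, my = mean(proxy_history), mean(history)
    cov = sum((x - mx) * (y - my) for x, y in zip(proxy_history, history))
    var = sum((x - mx) ** 2 for x in proxy_history)
    slope = cov / var
    intercept = my - slope * mx
    residuals = [y - (slope * x + intercept)
                 for x, y in zip(proxy_history, history)]
    return slope * proxy_latest + intercept, stdev(residuals)

# Illustrative numbers only: monthly official growth vs. a proxy survey
official = [0.3, 0.5, 0.2, 0.4, 0.6]
proxy = [0.28, 0.52, 0.18, 0.41, 0.63]
estimate, error_band = impute_missing_release(official, proxy, 0.45)
```

The error band matters as much as the estimate: it is what tells a portfolio manager how much weight the projection deserves until the official series resumes.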

That said, it was likely also a challenge for many valuation ops and data teams to meet this new demand. At the very least, it was a shock to the system and a test of their data management practices. Pulling in new sources of data; reweighting or prioritizing data points as they run through analytics; and gauging the output's reliability as it is reported: these are all potentially tricky problems to solve.
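A toy sketch of that reweighting step: blend readings from several sources, letting each source's weight decay as its reading goes stale, and report how much of the final weight came from fresh data. The half-life, source names, and figures below are all illustrative assumptions, not any firm's actual method.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class SourceReading:
    name: str
    value: float
    as_of: date
    base_weight: float  # analyst-assigned priority

def blend_sources(readings, valuation_date, half_life_days=30):
    """Blend substitute data points, downweighting stale ones.

    Each source's weight halves for every `half_life_days` of staleness,
    so a delayed official series gradually yields to fresher substitutes.
    Returns the blended value and the share of weight still fresh, as a
    crude reliability gauge on the output.
    """
    weights = [r.base_weight * 0.5 ** ((valuation_date - r.as_of).days / half_life_days)
               for r in readings]
    total = sum(weights)
    blended = sum(r.value * w for r, w in zip(readings, weights)) / total
    fresh = sum(w for r, w in zip(readings, weights)
                if (valuation_date - r.as_of).days < half_life_days)
    return blended, fresh / total

# Illustrative only: a stale official figure vs. a fresher private proxy
readings = [
    SourceReading("official_series", 0.2, date(2018, 12, 21), 1.0),
    SourceReading("proxy_survey", 0.3, date(2019, 1, 20), 0.5),
]
value, fresh_share = blend_sources(readings, date(2019, 1, 25))
```

The design point is that staleness is handled explicitly rather than by ad-hoc judgment, so the same logic keeps working whether an outage lasts a week or a month.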

They are trickier still when the data may be distorted, and the economy is losing billions of dollars in productivity because of the shutdown itself.


Being Prepared

Reacting to all of this requires a data infrastructure that can adjust on the fly, identify and alleviate data accuracy issues, and tie trading activity back to changes in the bigger picture. In moments like these, stronger data management practices can lead directly to institutional precision. Without these capabilities, one suspects some firms waited out those few weeks by merely surviving, feeling around in the dark and essentially "fudging" it.
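One small piece of "identifying data accuracy issues" can be sketched as a robust outlier screen: flag any point that sits far from the series median in median-absolute-deviation (MAD) units, a check that keeps working even when the bad prints themselves would distort an ordinary standard deviation. The 3.5 cutoff is a conventional choice, not a mandated standard, and the series below is invented.

```python
from statistics import median

def flag_suspect_points(series, threshold=3.5):
    """Return indices of values far from the median, in MAD units.

    A first-pass accuracy screen for data arriving from unfamiliar
    substitute sources; 0.6745 rescales MAD so the score is comparable
    to a z-score on normally distributed data.
    """
    med = median(series)
    mad = median(abs(v - med) for v in series)
    if mad == 0:
        return []  # degenerate case: no spread to measure against
    return [i for i, v in enumerate(series)
            if 0.6745 * abs(v - med) / mad > threshold]

# A quiet series with one wild print at the end gets flagged
flags = flag_suspect_points(
    [0.9, 1.0, 1.1, 0.95, 1.05, 1.0, 0.98, 1.02, 1.0, 8.0])
```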

That works, but only until the hard questions are asked. Next time, events may dictate that they come much faster. So the lesson is clear: test your preparedness for them now.


Comments: (1)

A Finextra member | 17 May 2019, 15:57

Excellent post. It is always tricky to work with data that contains idiosyncratic breaks in the time series. Another challenge is working with new data sets for which no historical precedent exists.

This post is from a series of posts in the group:

Data Management 101

A community blog about data and how to manage it
