In this article, we draw directly on insights shared by David Sacks, Brad Gerstner, and David Friedberg on Episode 260 of the All-In Podcast. Their discussion explores how real-time data and AI could modernize government institutions and shift the fundamental paradigm of public policy.
Government institutions are currently operating on a massive data delay, and that delay has real costs. From interest rate decisions to infrastructure planning, the gap between what policymakers see and what is actually happening in the economy can be months wide.
By integrating real-time AI and private sector data streams, public agencies could shift from reactive bodies making sudden, jarring adjustments to proactive systems capable of managing with precision. This shift is not just about efficiency. It is about preventing the avoidable economic friction that occurs when the government is forced to fly blind.
Why Is Public Sector Data Outdated?
The core problem facing large government institutions, such as the Federal Reserve, is their reliance on outdated, survey-based data collection. While the private sector operates in near real time, the public sector often relies on manual data aggregation methods that can lag actual conditions by months.
David Sacks and Brad Gerstner highlighted that the way we measure the economy is fundamentally broken. For example, to measure housing and rental costs, the government currently surveys approximately 8,000 households. In some markets during 2021, rents were rising at close to 40%, while the Fed’s lagging survey data showed nothing close to that figure. Meanwhile, large real estate companies like Starwood or Zillow had real-time data on millions of units.
This lag creates blind spots with serious consequences. A telling example occurred in June 2021, when the cost of shipping a container from China jumped from $1,500 to $15,000. Because government data was slow to reflect this, policymakers were still reading a relatively calm picture while inflation was already building. The delay eventually forced aggressive, late adjustments that caused unnecessary volatility, market crashes, and bank failures.
What Did That Delay Actually Cost?
Gerstner was direct: the misallocation caused by the Fed’s failure to act in June 2021, he argued, cost the country trillions of dollars. Everything that followed (the market crash of late 2022, the wave of layoffs, the bank failures) was, in his view, avoidable. Had the Fed been working with accurate, timely data, it could have made smaller, earlier adjustments rather than the aggressive rate increases that ultimately caused so much collateral damage.
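To see why earlier action with better data matters, consider a standard Taylor-type policy rule (our illustration, not something discussed on the podcast). It sets the policy rate i from the neutral rate r*, measured inflation π, the inflation target π*, and the output gap y:

i = r* + π + 0.5(π − π*) + 0.5·y

With r* = 2%, π* = 2%, and a closed output gap, a survey that still reads 2% inflation prescribes a 4% policy rate, while the same rule fed an accurate 6% reading prescribes roughly 10%. The longer the measurement lag persists, the larger and more abrupt the eventual catch-up move has to be, which is exactly the case for smaller, earlier adjustments.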
Can AI Help Balance the Scales of the Economy?
A significant part of this conversation involves the reported nomination of Kevin Warsh to lead the Federal Reserve. Warsh, a Stanford undergraduate and Harvard Law graduate who served as the youngest Fed governor in history at 35, is seen by some economists as a leader who understands that technology is a structural force capable of reshaping the economy’s rules.
The podcast hosts drew a parallel to the Alan Greenspan era of the 1990s, when internet-driven productivity gains allowed the economy to expand rapidly without triggering the inflation that standard models would have predicted.
AI, Gerstner argued, could be an even more powerful deflationary force, one that allows companies to produce more with fewer resources across nearly every sector.
If AI lets companies do more with less, a modern central bank equipped with real-time data could, in theory, allow the economy to sustain higher growth rates without the same overheating risk. Inflation could remain anchored not through restrictive policy but through genuine productivity gains, helping ordinary people borrow, buy homes, and build financial stability.
What Are the Risks of Sticking with Old Systems?
The 2021 inflation episode illustrated the cost of lagging data. The risks of inaction only grow from here.
- The next crisis will move faster: AI-driven markets mean conditions can shift more quickly than manual surveys can track. The next inflation spike may develop and resolve before the current data infrastructure can even detect it.
- Eroding public trust: If institutions fail to modernize, the perception that they cannot see what ordinary people are experiencing hardens into something permanent. Better data is, in part, a trust restoration project.
- Competitive disadvantage: Other governments are investing in real-time economic intelligence. A central bank that relies on household surveys while its peers use live data feeds is operating in a fundamentally different information environment, with consequences for currency stability and long-term credibility.
What Would a Digital-First Government Look Like?
To fix this institutional blindness, the podcast hosts propose a sweeping technological overhaul, which they call a “Manhattan Project” for data. The goal is to move the public sector into the digital age by automating intelligence gathering using the same tools that drive the private sector.
- Aggregating Trillions of Points: Instead of relying on manual surveys or occasional calls with a few CEOs, agencies should use AI to collect and synthesize trillions of digital data points continuously.
- Real-Time Benchmarks: Platforms like Truflation, an independent inflation index that tracks price changes across millions of items daily, already offer a model for what 21st-century public benchmarks could look like (a simplified sketch of such an index follows this list).
- Precision Management: With better data, policymakers could make smaller, more targeted adjustments rather than the broad, disruptive moves that become necessary when action is delayed. Much of the resource misallocation seen in recent cycles can be traced back to decisions made on information that was already six months old.
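To make the benchmark idea concrete, here is a minimal sketch of how a continuously updated price index could be computed from item-level observations. It assumes a hypothetical data feed, illustrative base prices, and illustrative category weights; it is not Truflation’s actual methodology.

```python
from collections import defaultdict

# Hypothetical, simplified real-time price index.
# Each observation is (category, item_id, price). We average item-level
# price relatives per category, then combine categories with fixed weights.

BASE_PRICES = {("food", "eggs"): 3.00, ("food", "milk"): 4.00,
               ("housing", "rent_2br"): 1800.0, ("energy", "gasoline"): 3.20}
WEIGHTS = {"food": 0.15, "housing": 0.45, "energy": 0.40}  # illustrative weights

def price_index(observations):
    """Return an index (base period = 100) from the latest observations."""
    relatives = defaultdict(list)
    for category, item, price in observations:
        base = BASE_PRICES.get((category, item))
        if base:
            relatives[category].append(price / base)
    index = 0.0
    for category, weight in WEIGHTS.items():
        if relatives[category]:
            avg_relative = sum(relatives[category]) / len(relatives[category])
        else:
            avg_relative = 1.0  # no fresh data for this category: assume unchanged
        index += weight * avg_relative
    return round(index * 100, 1)

# Simulated "today" feed: rents up sharply, other prices roughly flat.
today = [("food", "eggs", 3.10), ("food", "milk", 4.05),
         ("housing", "rent_2br", 2340.0), ("energy", "gasoline", 3.25)]
print(price_index(today))  # ~114.5 with these illustrative numbers
```

The point is latency: an index built this way reflects a rent jump the day it shows up in the feed, rather than months later when a survey finally catches up.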
How Should Governments Build Real-Time Data Infrastructure Securely?
Real-time data infrastructure is only useful if it is also trustworthy. Organizations like Carbon60 are working with public institutions to bridge the gap between outdated, manually aggregated data and secure, real-time intelligence without trading security for speed.
- Sovereign Cloud Infrastructure: To maintain public trust, data must remain under strict jurisdictional control. Carbon60 enables this through sovereign cloud solutions, ensuring that sensitive information is stored and processed within national borders and protected by local regulations (see the sketch after this list).
- Performance at Scale: Carbon60 provides the high-performance, managed infrastructure required to ingest the trillions of data points mentioned by the All-In hosts. This allows agencies to move away from laggy surveys without sacrificing performance.
- Privacy and Compliance: By providing a walled garden for public data, sovereign clouds ensure that, as AI agents analyze information, the data remains private and secure against external threats and unauthorized access.
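As a simplified illustration of the jurisdictional-control point above (our sketch, not Carbon60’s API or configuration), an ingestion pipeline can refuse to persist any record whose target storage region falls outside an approved list of jurisdictions:

```python
# Illustrative data-residency guard, not tied to any specific provider's API:
# refuse to persist records unless the target region is an approved jurisdiction.

APPROVED_REGIONS = {"ca-central", "ca-east"}  # hypothetical in-country regions

class ResidencyError(Exception):
    pass

def store_record(record: dict, target_region: str, writer) -> None:
    """Write `record` via `writer` only if `target_region` is approved."""
    if target_region not in APPROVED_REGIONS:
        raise ResidencyError(f"Region {target_region!r} is outside the approved jurisdictions")
    writer(record, target_region)

# Example usage with an in-memory writer standing in for real storage.
stored = []
store_record({"series": "rent_index", "value": 114.5},
             "ca-central",
             lambda rec, region: stored.append((region, rec)))
print(stored)
```

The design choice is simply that the residency rule is enforced in the pipeline itself, before any data leaves the approved environment, rather than audited after the fact.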
The transition to a data-first public sector represents the same evolution we are seeing in the corporate world. Just as companies are building centralized intelligence to manage their operations, government institutions must do the same for public policy.
The gap between the public and private sectors is closeable. Modernization at this scale requires infrastructure capable of handling the volume, maintaining sovereignty, and keeping sensitive data secure throughout. That is exactly the challenge Carbon60 solves.
Ready to move from legacy data to real-time intelligence? Get in touch with the Carbon60 team.

