The Fed Paves the Way for Running a “High-Pressure Economy” (Along with Higher Inflation)

Since the beginning of last year (see my February 4, 2015 commentary “U.S. Inflationary Pressures Remain Muted” and my March 1, 2016 Forbes commentary “Why Federal Reserve Tightening Is Still A Distant Event“), I have consistently asserted that the Fed’s ultimate tightening schedule would be slower than expected–relative to both the Fed’s original intentions and the expectations embedded in the fed funds futures market. Indeed, the most consistent theme since the beginning of the 2008-09 global financial crisis has been this: The tepid recovery in global financial conditions and global economic growth has consistently forced the Fed to ease more than expected; and since the “tapering” of the Fed’s quantitative easing policy at the end of 2013, to tighten less than expected. For example, the October 2008 Blue Chip Economic Indicators survey of America’s top economists predicted that the fed funds rate would rebound to 4.0% by late 2010; subsequent forecasts were similarly premature.

According to the CME FedWatch, the probability of a 25 bps Fed rate hike on December 14 is now over 70%. I expect the December 14 hike to occur, as the Fed has been prepping the market for one 25 bps hike for months; however–similar to what I asserted last year–I do not believe this rate hike will signal the beginning of a new rate hike cycle. Rather, the timing of the Fed’s third rate hike will again be data-dependent (more on that below). Fed funds futures currently suggest the Fed’s third rate hike will not occur until more than a year from now, i.e. at the December 13, 2017 FOMC meeting. This is the most likely timing for the third rate hike, for the following reasons:
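For readers curious about where hike probabilities like the 70% figure come from, the arithmetic is straightforward: fed funds futures settle at 100 minus the expected average fed funds rate, so the market-implied rate can be read directly off the futures price and compared to the current rate. Below is a minimal sketch of that calculation; the contract price and effective rate used are illustrative assumptions, not actual CME quotes, and the real FedWatch methodology includes refinements (e.g. averaging across the meeting month) not shown here.

```python
# Sketch of a FedWatch-style hike probability backed out of fed funds futures.
# Futures prices and rates below are illustrative, not actual market quotes.

def implied_hike_probability(futures_price: float,
                             current_rate: float,
                             hike_size: float = 0.25) -> float:
    """Probability of one hike implied by a fed funds futures price.

    Fed funds futures settle at 100 minus the average effective fed funds
    rate for the contract month, so the implied rate is 100 - price.
    """
    implied_rate = 100.0 - futures_price
    return (implied_rate - current_rate) / hike_size

# Illustrative: effective rate near 0.41%, futures pricing an average of 0.59%.
prob = implied_hike_probability(futures_price=99.41, current_rate=0.41)
print(f"Implied probability of a 25 bps hike: {prob:.0%}")  # prints "72%"
```

Intuitively, the market is pricing a rate part-way between “no hike” and “one full hike”; the fraction of the 25 bps step already priced in is read as the hike probability.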

1. U.S. households remain in “deleveraging” mode. Haunted by the 2008-09 global financial crisis, record amounts of student loans outstanding (currently at $1.3 trillion), a shorter runway to retirement age, and lower income prospects, U.S. consumer spending growth since the bottom of the 2008-09 global financial crisis has been relatively tepid (see Figure 1 below), despite ongoing improvements in the U.S. labor market;


2. The developed world & China are still mired in deflationary pressures. While the Fed had not been shy about hiking rates ahead of other central banks in previous tightening cycles, the fact that all of the world’s major central banks–with the exception of the Fed–are still in major easing cycles means the Fed has no choice but to halt after its December 14, 2016 hike. Even the Bank of England–which was expected to be the first major central bank to hike rates–was forced to reverse its stance and renew its quantitative easing policy as UK policymakers succumbed to the rise of populism. In a world still mired in deflationary pressures, the U.S. could easily succumb to another deflationary cycle if the Fed prematurely adopts a hawkish stance;

3. The Fed is no longer in denial and finally recognizes the uniqueness of the 2008-09 deleveraging cycle that is still with us today. In a June 3, 2016 speech (titled “Reflections on the Current Monetary Policy Environment“), Chicago Fed President Charles Evans explained why this isn’t a normal recovery cycle and, because of that, argued that the Fed should foster a “high-pressure” economy (characterized by a tight labor market and sustained inflation above 2%) in order to ward off downside risks to both economic growth and inflation. Quoting President Evans: “I view risk-management issues to be of great importance today. As I noted earlier, I still see the risks as weighted to the downside for both my growth and inflation outlooks … So I still judge that risk-management arguments continue to favor providing more accommodation than usual to deliver an extra boost to aggregate demand … One can advance risk-management arguments further and come up with a reasonable case for holding off increasing the funds rate for much longer, namely, until core inflation actually gets to 2 percent on a sustainable basis.”

President Evans’ speech was followed by similar dovish sentiment expressed by Fed Governor Daniel Tarullo in a September 9, 2016 CNBC interview, Fed Governor Lael Brainard in a September 12, 2016 speech at the Chicago Council on Global Affairs, as well as the September 2016 FOMC minutes. Finally, Fed Chair Janet Yellen explored the potential benefits of running a “high-pressure economy” after a deep recession in her October 14, 2016 speech at a conference sponsored by the Boston Fed. Quoting Chair Yellen:

If we assume that hysteresis is in fact present to some degree after deep recessions, the natural next question is to ask whether it might be possible to reverse these adverse supply-side effects by temporarily running a “high-pressure economy,” with robust aggregate demand and a tight labor market. One can certainly identify plausible ways in which this might occur. Increased business sales would almost certainly raise the productive capacity of the economy by encouraging additional capital spending, especially if accompanied by reduced uncertainty about future prospects. In addition, a tight labor market might draw in potential workers who would otherwise sit on the sidelines and encourage job-to-job transitions that could also lead to more-efficient–and, hence, more-productive–job matches. Finally, albeit more speculatively, strong demand could potentially yield significant productivity gains by, among other things, prompting higher levels of research and development spending and increasing the incentives to start new, innovative businesses.

Bottom line: The Fed continues to back off from committing to an official tightening schedule. After the December 14, 2016 rate hike, futures-implied probabilities suggest the next rate hike will not occur until the December 13, 2017 FOMC meeting. Until the year-over-year core PCE rate rises to 2.0% and sustains that level, the Fed will not recommit to a new rate hike cycle. This also paves the way for higher U.S. inflation; as such, clients should continue to underweight U.S. long-duration Treasuries and overweight gold.

Leading Indicators Suggest Further Upside in Global Risk Asset Prices

Note: I know many of you reading this are either overweight cash or net short U.S. equities. Please don’t shoot the messenger: I am not personally biased to the upside – I am merely channeling what my models are telling me, and they are telling me to stay bullish.

In my January 31, 2016 newsletter, I switched from a generally neutral to a bullish position on global risk assets. Specifically:

  • For U.S. equities, I switched from a “slightly bullish” to a “bullish” position (after switching from a “neutral” to a “slightly bullish” stance on the evening of January 7th);
  • For international developed equities, a shift from “neutral” to “bullish”;
  • For emerging market equities, a shift from “neutral” to “slightly bullish”; and
  • For global REITs, a shift from “neutral” to “bullish.”

My bullish tilt on global risk assets at the time was primarily based on the following reasons:

  1. A severely oversold condition in U.S. equities, with several of my technical indicators hitting oversold levels similar to where they were during the September 1981, October 1987, October 1990, and September 1998 bottoms;
  2. Significant support coming from both my primary and secondary domestic liquidity indicators, such as the relative steepness of the U.S. yield curve, the Fed’s renewed easing bias in the aftermath of the December 16, 2015 rate hike, and sustained +7.5% to +8.0% growth in U.S. commercial bank lending;
  3. Tremendous bearish sentiment among second-tier and retail investors (which is bullish from a contrarian standpoint), including a spike in NYSE short interest, a spike in the AUM of Rydex’s bear funds, and several (second-tier) bank analysts making absurd price level predictions on oil and global risk assets (e.g. Standard Chartered’s call for $10 oil and RBS’ “advice” to clients to “sell everything”).

In a subsequent blog post on February 10, 2016 (“Leading Indicators Suggest a Stabilization in Global Risk Asset Prices“), I followed up on my bullish January 31st prognostications with one more bullish indicator: the strengthening readings of our proprietary CBGDI (“CB Capital Global Diffusion Index”) indicator, which “suggests–at the very least–a stabilization, if not an immediate rally, in both global equity and oil prices.”

I have previously discussed the construction and implication of the CBGDI’s readings in many of our weekly newsletters and blog entries. The last two times I discussed the CBGDI in this blog was on May 15, 2015 (“Leading Indicators Suggest Lower U.S. Treasury Rates“) and on February 10, 2016 (“Leading Indicators Suggest a Stabilization in Global Risk Asset Prices“).

To recap, the CBGDI is a global leading indicator which we construct by aggregating and equal-weighting the OECD-constructed leading indicators for 29 major countries, including non-OECD members such as China, Brazil, Turkey, India, Indonesia, and Russia. The CBGDI has historically led the MSCI All-Country World Index and WTI crude oil prices since November 1989, when the Berlin Wall fell. Historically, the rate of change (i.e. the 2nd derivative) of the CBGDI has led WTI crude oil prices by three months with an R-squared of 30%; and has led or correlated with the MSCI All-Country World Index, with an R-squared of over 40% (which is expected, as local stock prices are typically a component of the OECD leading indicators).
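The construction described above can be sketched in a few lines: equal-weight the country-level leading indicators into one index, then take first and second discrete differences as stand-ins for the “1st and 2nd derivatives” discussed in this blog. The country readings below are made-up illustrative numbers, not actual OECD data, and a real CBGDI would span all 29 countries.

```python
# Minimal sketch of a CBGDI-style diffusion index, assuming equal weights
# across countries. Input series are illustrative, not actual OECD readings.

def diffusion_index(country_series: dict) -> list:
    """Equal-weighted average of each country's leading-indicator series."""
    n = len(next(iter(country_series.values())))
    return [sum(s[t] for s in country_series.values()) / len(country_series)
            for t in range(n)]

def diff(series: list) -> list:
    """First discrete difference (a simple proxy for the derivative)."""
    return [b - a for a, b in zip(series, series[1:])]

# Toy monthly readings for three countries (a real CBGDI would use 29).
indicators = {
    "US":    [99.8, 100.0, 100.3, 100.7],
    "China": [99.5, 99.6, 99.9, 100.4],
    "India": [100.1, 100.3, 100.6, 101.0],
}

cbgdi = diffusion_index(indicators)
first_deriv = diff(cbgdi)          # momentum of the index
second_deriv = diff(first_deriv)   # acceleration: the series said to lead WTI
```

With these toy inputs, both differences are positive throughout–the “climbing 1st and 2nd derivatives” configuration the commentary reads as bullish for global risk assets.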

The latest reading of the CBGDI has continued to improve upon the readings we discussed several months ago (see Figure 1 below)–just 10 days after we turned bullish on global risk assets. Both the 1st and 2nd derivatives of the CBGDI have continued to climb and remain in (slight) uptrends, suggesting a stabilization–and in some cases (e.g. the economies of South Korea, New Zealand, Spain, and India) a re-acceleration–in global economic activity. So don’t shoot the messenger–but it appears that the rally in global risk assets coming out of the late-January-to-early-February bottom still has more room to run.


The U.S. Needs to Rejuvenate the Global Supercomputing Race

Technology, along with increasing access to cheap energy, is the lifeblood of a growing, modern economy. As we discussed in our December 2, 2012 article (“The Global Productivity Riddle and the Supercomputing Race“), fully 85% of productivity growth in the 20th century could be attributed to technological progress, as well as increasing accessibility/sharing of cheap energy sources due to innovations in oil and natural gas hydraulic fracturing, ultra-deep water drilling, solar panel productivity, and the commercialization of Generation III+ nuclear power plants and deployment of smart power grids.

Perhaps the most cited example where the combined effects of technological and human capital investments have had the most economic impact is the extreme decline in computing and communication costs. Moore’s Law–the observation that the amount of computing power in a given space doubles roughly every two years–has held since the invention of the transistor in the late 1940s. Parallel to this has been the rise of the supercomputing industry. Started by Seymour Cray at Control Data Corporation in the 1960s, the supercomputing industry has played a paramount role in advancing the sciences, most recently in computationally intensive fields such as weather forecasting, oil and gas exploration, human genome sequencing, molecular modeling, and physical simulations aimed at designing more aerodynamic aircraft or better-conducting materials. No doubt, breakthroughs in more efficient supercomputing technologies and processes are integral to the ongoing growth in our living standards in the 21st century.
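The compounding implied by that doubling schedule is worth making concrete. A quick back-of-the-envelope calculation, assuming a clean two-year doubling period:

```python
# Worked example of the Moore's Law arithmetic cited above: doubling
# computing power every 2 years compounds enormously over decades.

def moores_law_factor(years: float, doubling_period: float = 2.0) -> float:
    """Growth factor after `years`, doubling every `doubling_period` years."""
    return 2.0 ** (years / doubling_period)

# The ~70 years since the late-1940s transistor amount to 35 doublings:
factor = moores_law_factor(70)
print(f"Growth factor over 70 years: {factor:,.0f}")  # ~34 billion-fold
```

A 35-doubling run works out to roughly a 34-billion-fold increase–which is why even a modest slowdown in the doubling period, as discussed below for supercomputing, compounds into large shortfalls over a decade.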

Unfortunately, advances in both the U.S. and global supercomputing industry have lagged in the last several years. Every six months, a list of the world’s top 500 most powerful supercomputers is published. The latest list was compiled in June 2015; aside from providing the most up-to-date supercomputing statistics, the semi-annual list also publishes the historical progress of global supercomputing power, each country’s share of global supercomputing power, as well as a reasonably accurate projection of what lies ahead. Figure 1 below is a log chart summarizing the progression of the top 500 list from its inception in 1993.

Figure 1: Historical Performance of the World’s Top 500 Supercomputers

As shown in Figure 1 above, both the summed computing power of the world’s top 500 systems and that of the #1-ranked supercomputer have remained relatively stagnant over the last several years. Just three years ago, there was serious discussion of the commercialization of an “exaflop” supercomputer (i.e. a supercomputer capable of 1 x 10^18 calculations per second) by the 2018-2019 time frame. Today, the world’s top computer scientists are targeting a more distant time frame of 2023.
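The slipping exascale timeline falls out of simple compound-growth arithmetic: given a starting performance and an assumed annual growth rate, one can solve for the year 1 exaflop is reached. The sketch below uses roughly the mid-2015 #1 system’s performance as a starting point; the 50%/yr growth rate is purely an illustrative assumption, not a Top500 statistic, and slower assumed growth pushes the date out further.

```python
# Sketch of the projection logic behind the exascale timeline. The growth
# rate is an illustrative assumption, not an actual Top500 statistic.
import math

def years_to_target(current_flops: float, target_flops: float,
                    annual_growth: float) -> float:
    """Years needed to grow from current to target at a compound annual rate."""
    return math.log(target_flops / current_flops) / math.log(1 + annual_growth)

# Starting from ~3.4e16 FLOP/s (roughly the mid-2015 #1 system) toward
# 1 exaflop (1e18 FLOP/s), assuming 50%/yr growth:
years = years_to_target(3.4e16, 1e18, annual_growth=0.50)
print(f"Exascale reached around {2015 + years:.0f}")
```

At the faster growth rates of earlier decades the same arithmetic lands in the 2018-2019 window once discussed; at recent, slower rates it stretches toward 2023 and beyond–which is exactly the slippage described above.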

From the U.S. perspective, the slowdown in supercomputing innovation is even more worrying. Not only has innovation slowed down at the global level, but the U.S. share of global supercomputing power has been declining as well. Three years ago, the U.S. housed 55% of the world’s top 500 supercomputing power; Japan was second, with 12% of the world’s supercomputing power. Rounding out the top five were China (8%), Germany (6%), and France (5%). Today, the U.S. houses only 46% of the world’s supercomputing power, with countries such as the UK, India, Korea, and Russia gaining ground.

Figure 2: Supercomputing Power Distributed by Country


Bottom line: Since the invention of the transistor in the late 1940s and the advent of the supercomputing industry in the 1960s, the U.S. has always led the supercomputing industry in terms of innovation and sheer computing power. With countries such as China and India further industrializing and developing their computer science/engineering expertise (mostly with government funding), U.S. policymakers must encourage and provide more resources to stay ahead in the supercomputing race. To that end, President Obama’s recent executive order calling for the creation of a National Strategic Computing Initiative–with the goal of building an “exascale” supercomputer–is a step in the right direction. At this point, however, whether the industry can deploy an energy-efficient exascale supercomputer by the less ambitious 2023 time frame is still an open question.