The Coming Breakthroughs in the Global Supercomputing Race

In our August 31, 2015 article (“The U.S. Needs to Rejuvenate the Global Supercomputing Race“), we expressed our concerns regarding the state of the global supercomputing industry; specifically, from a U.S. perspective, the sustainability of Moore’s Law, as well as increasing competition from the Chinese supercomputing industry. Below is a summary of those concerns:

  • Technological innovation, along with increasing access to cheap, abundant energy, is the lifeblood of a growing, modern economy. As chronicled by Professor Robert Gordon in “The Rise and Fall of American Growth,” U.S. productivity growth (see Figure 1 below; sources: Professor Gordon & American Enterprise Institute)–with the exception of a brief spurt from 1997-2004–peaked during the period from the late 1920s to the early 1950s; by 1970, most of today’s everyday household conveniences, along with the most important innovations in transportation & medicine, had already been invented and diffused across the U.S. Since 1970, almost all of the U.S. productivity growth could be attributed to the adoption of and advances in the PC, investments in our fiber optic and wireless networks, along with the accompanying growth of the U.S. software industry (other impactful technologies since the 1970s include: the advent of hydraulic fracturing in oil & gas shale, ultra deepwater drilling in the Gulf of Mexico, as well as the commercialization of alternative energy and more efficient battery storage systems, as we first discussed in our July 27, 2014 article “How Fracking Saved the U.S. Economy“). This means that a stagnation in the U.S. computing or communications industries would invariably result in a slowdown in U.S./global productivity growth;

Figure 1: U.S. Productivity Growth (sources: Professor Gordon & American Enterprise Institute)

  • The progress of the U.S. supercomputing industry, as measured by the traditional FLOPS (floating-point operations per second) benchmark, had experienced a relative stagnation when we last wrote about the topic in August 2015. For example, in 2011 both Intel and SGI seriously discussed the commercialization of an “exascale” supercomputer (i.e. a system capable of performing 1 x 10^18 calculations per second) by the 2019-2020 time frame. As of today, the U.S. supercomputing community has pushed back its target for building an exascale supercomputer to 2023;
  • At the country-specific level, the U.S. share of global supercomputing systems has been declining. As recently as 2012, the U.S. housed 55% of the world’s top 500 supercomputing systems; Japan was second, with 12% of the world’s supercomputing systems, with China (8%) in third place. By the summer of 2015, the U.S. share of the world’s top 500 supercomputing systems had shrunk to 46%, while Japan and China were tied for a distant second at 8% each. Today, the Chinese supercomputing industry has led an unprecedented surge to claim parity with the U.S., as shown in Figure 2 below.

Figure 2: China – Reaching Parity with the U.S. in the # of Top 500 Supercomputers

Since the invention of the transistor in the late 1940s and the advent of the supercomputing industry in the 1960s, the U.S. has always been the leader in the supercomputing industry in terms of innovation, sheer computing power, and building the customized software needed to take advantage of said supercomputing power (e.g. software designed for precision weather forecasting, gene sequencing, airplane and automobile design, protein folding, and now, artificial intelligence, etc.). With U.S. economic growth increasingly dependent on innovations in the U.S. computing industry and communications network–and with China now threatening to surpass the U.S. in terms of supercomputing power (caveat: China’s HPC software industry is probably still a decade behind)–it is imperative for both U.S. policymakers and corporations to encourage and provide more resources for the U.S. to stay ahead in the supercomputing race.

Unlike the tone of our August 31, 2015 article, however, we have grown more hopeful, primarily because of the following developments:

  • Moore’s Law is still alive and well: At CES 2017 in Las Vegas, Intel declared that Moore’s Law remains relevant, with a second-half target release date for its 10-nanometer microprocessor chips. At a subsequent nationally-televised meeting with President Trump earlier this month, Intel CEO Brian Krzanich announced the construction of its $7 billion Fab 42 in Arizona, a pilot plant for its new 7-nanometer chips. Commercial production of the 7nm chips is scheduled to occur in the 2020-2022 time frame, with most analysts expecting the new plant to incorporate more exotic technologies, such as gallium-nitride as a semiconductor material. The next iteration is 5nm chips; beyond 5 nanometers, however, a more fundamental solution to extend Moore’s Law will need to occur, e.g. commercializing a graphene-based transistor;
  • GPU integration into supercomputing systems: The modern-day era of the GPU (graphics processing unit) began in May 1995, when Nvidia commercialized its first graphics chip, the NV1, the first commercially-available GPU capable of 3D rendering and video acceleration. Unlike a CPU, a GPU contains many parallel processing cores, allowing it to perform many times more simultaneous calculations than a CPU (a simple data-parallelism sketch follows this list). Historically, the supercomputing industry had been unable to take advantage of the sheer processing power of the GPU, given the lack of suitable programming languages specifically designed for GPUs. When the 1.75 petaflop Jaguar supercomputer was unveiled by Oak Ridge National Laboratory in 2009, it was notable as one of the first supercomputers to be outfitted with Nvidia GPUs. Its direct successor, the 17.59 petaflop Titan, was unveiled in 2012 with over 18,000 GPUs. At the time, this was a concern for two reasons: 1) hosting over 18,000 GPUs within a single system was unprecedented and, it was feared, would doom the project to endless failures and outages, and 2) there were too few application codes written to take advantage of the sheer processing power of the 18,000 GPUs. These concerns have proven to be unfounded; today, GPUs are turning home PCs into supercomputing systems, while Google just rolled out a GPU cloud service focused on serving AI customers;
  • AI, machine-learning software commercialization: Perhaps one of the most surprising developments in recent years has been the advent of AI, machine-learning software, yielding results that were unthinkable just five years ago. These include: 1) Google DeepMind’s AlphaGo, which defeated three-time European Go champion Fan Hui by 5-0 in 2015, and finally, the world Go champion Ke Jie earlier this year, 2) Carnegie Mellon’s Libratus, which defeated four of the world’s top poker players over 20 days of play, and 3) the inevitable commercialization of Level 5 autonomous vehicles on the streets of the U.S., likely in the 2021-2025 time frame. Most recently, Microsoft and the University of Cambridge teamed up to develop a machine learning system capable of writing its own code. The advent of AI in the early 21st century is likely to be a seminal event in the history of supercomputing;
  • Ongoing research into quantum computing: The development of a viable, commercial quantum computer is gaining traction and is probably 10-20 years away from realization. A quantum computer is necessary for the processing of tasks that are regarded as computationally intractable on a classical computer. These include: 1) drug discovery and the ability to customize medical treatments based on the simulation of proteins and how they interact with certain drug combinations, 2) invention of new materials through simulations at the atomic level. This will allow us to build better conductors and denser battery systems, thus transforming the U.S. energy infrastructure almost overnight, and 3) the ability to run simulations of complex societal and economic systems. This will allow us to more efficiently forecast economic growth and design better public policies and urban planning tools.
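
To make the data-parallelism point in the GPU bullet above concrete, here is a minimal, CPU-side sketch in Python: the same multiply-add is expressed once as an element-by-element loop and once as a single vectorized operation over the whole array. This is only an analogy for the GPU execution model, not actual GPU code, and the array size and workload are arbitrary illustrative choices.

```python
# Minimal sketch (CPU-side analogy): data-parallel operations apply the same
# arithmetic across many elements at once, which is the execution model GPUs
# exploit at a far larger scale. Array size and workload are illustrative only.
import time
import numpy as np

N = 1_000_000
a = np.random.rand(N)
b = np.random.rand(N)

# Serial approach: one multiply-add at a time, like a single scalar thread.
start = time.perf_counter()
out_serial = [a[i] * b[i] + 1.0 for i in range(N)]
serial_time = time.perf_counter() - start

# Data-parallel approach: the same multiply-add expressed over the whole array,
# dispatched to optimized kernels that operate on many elements at once.
start = time.perf_counter()
out_vector = a * b + 1.0
vector_time = time.perf_counter() - start

print(f"serial loop: {serial_time:.4f} s")
print(f"vectorized:  {vector_time:.4f} s")
print(f"speed-up:    {serial_time / vector_time:.0f}x")
```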

The U.S. Needs to Rejuvenate the Global Supercomputing Race

Technology, along with increasing access to cheap energy, is the lifeblood of a growing, modern economy. As we discussed in our December 2, 2012 article (“The Global Productivity Riddle and the Supercomputing Race“), fully 85% of productivity growth in the 20th century could be attributed to technological progress, as well as increasing accessibility/sharing of cheap energy sources due to innovations in oil and natural gas hydraulic fracturing, ultra-deep water drilling, solar panel productivity, and the commercialization of Generation III+ nuclear power plants and deployment of smart power grids.

Perhaps the most cited example where the combined effects of technological and human capital investments have had the most economic impact is the extreme decline in computing and communication costs. Moore’s Law, the ability of computer engineers to double the amount of computing power in any given space every 2 years, has been in effect since the invention of the transistor in the late 1940s. Parallel to this has been the rise of the supercomputing industry. Started by Seymour Cray at Control Data Corporation in the 1960s, the supercomputing industry has played a paramount role in advancing the sciences, most recently in computationally intensive fields such as weather forecasting, oil and gas exploration, human genome sequencing, molecular modeling, and physical simulations with the purpose of designing more aerodynamic aircraft or better conducting materials. No doubt, breakthroughs in more efficient supercomputing technologies and processes are integral to the ongoing growth in our living standards in the 21st century.
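
Because the doubling claim above is easy to misread, a quick back-of-the-envelope calculation shows how a 2-year doubling period compounds. The starting transistor count and the 50-year window below are illustrative assumptions, not figures from this commentary.

```python
# Minimal sketch: compound doubling under Moore's Law. The 2-year doubling
# period comes from the text; the starting count and the time window are
# illustrative assumptions.
def moores_law(start_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project a count forward assuming it doubles every `doubling_period` years."""
    return start_count * 2 ** (years / doubling_period)

# Example: a chip with ~2,300 transistors projected 50 years forward.
projected = moores_law(2_300, years=50)
print(f"Projected count after 50 years: {projected:,.0f}")   # ~77 billion
print(f"Growth factor: {projected / 2_300:,.0f}x (i.e. 2**25)")
```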

Unfortunately, advances in both the U.S. and global supercomputing industry have lagged in the last several years. Every six months, a list of the world’s top 500 most powerful supercomputers is published. The latest list was compiled in June 2015; aside from providing the most up-to-date supercomputing statistics, the semi-annual list also publishes the historical progress of global supercomputing power, each country’s share of global supercomputing power, as well as a reasonably accurate projection of what lies ahead. Figure 1 below is a log chart summarizing the progression of the top 500 list from its inception in 1993.

Figure 1: Historical Performance of the World’s Top 500 Supercomputers

As shown in Figure 1 above, both the combined computing power of the world’s top 500 systems and the performance of the #1 ranked supercomputer have remained relatively stagnant over the last several years. Just three years ago, there was serious discussion of the commercialization of an “exaflop” supercomputer (i.e. a supercomputer capable of 1 x 10^18 calculations per second) by the 2018-2019 time frame. Today, the world’s top computer scientists are targeting a more distant time frame of 2023.

From the U.S. perspective, the slowdown in the advance of the supercomputing industry is even more worrying. Not only has innovation slowed down at the global level, but the U.S. share of global supercomputing power has been declining as well. Three years ago, the U.S. housed 55% of the world’s top 500 supercomputing power; Japan was second, with 12% of the world’s supercomputing power. Rounding out the top five were China (8%), Germany (6%), and France (5%). Today, the U.S. houses only 46% of the world’s supercomputing power, with countries such as the UK, India, Korea, and Russia gaining ground.

Figure 2: Supercomputing Power Distributed by Country


Bottom line: Since the invention of the transistor in the late 1940s and the advent of the supercomputing industry in the 1960s, the U.S. has always led the supercomputing industry in terms of innovation and sheer computing power. With countries such as China and India further industrializing and developing their computer science/engineering expertise (mostly with government funding), U.S. policymakers must encourage and provide more resources to stay ahead in the supercomputing race. To that end, President Obama’s most recent executive order calling for the creation of a National Strategic Computing Initiative–with the goal of building an “exascale” supercomputer–is a step in the right direction. At this point, however, whether the industry can deploy an energy-efficient exascale supercomputer by the less ambitious 2023 time frame is still an open question.

The CB Capital Global Diffusion Index Says Higher Oil Prices in 2015

We first introduced our CB Capital Global Diffusion Index (“CBGDI”) in our March 17, 2013 commentary (“The Message of the CB Capital Global Diffusion Index: A Bottom in WTI Crude Oil Prices“), when WTI crude oil traded at $93 a barrel. Based on the strength in the CBGDI at the time, we asserted that WTI crude oil prices had bottomed, and that WTI crude oil was a “buy” on any further price weakness. Over the next six months, the WTI crude oil spot price would rise to over $106 a barrel.

To recap, we have constructed a “Global Diffusion Index” by aggregating and equal-weighting (on a 3-month moving average basis) the leading indicators data for 30 major countries in the Organisation for Economic Co-operation and Development (OECD), along with China, Brazil, Turkey, India, Indonesia, and Russia. Termed the CBGDI, this indicator has historically led or tracked the MSCI All-Country World Index and WTI crude oil prices since the fall of the Berlin Wall. Historically, the rate of change (i.e. the 2nd derivative) of the CBGDI has led WTI crude oil prices by about three months with an R-squared of 30%, while tracking or leading the MSCI All-Country World Index slightly, with an R-squared of over 40% (naturally, as stock prices actually make up one component of the OECD leading indicators).
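
The mechanics described above (equal-weighting the country leading indicators, smoothing on a 3-month moving average, then taking the year-over-year change and its rate of change) can be sketched in a few lines. The sketch below runs on synthetic data; the actual OECD leading-indicator series, country weights, and the CBGDI itself are not reproduced here.

```python
# Minimal sketch of a diffusion-index construction along the lines described
# in the text, using synthetic monthly data in place of the actual OECD
# leading-indicator series.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
months = pd.date_range("1990-03-01", periods=300, freq="MS")
countries = [f"country_{i}" for i in range(36)]  # 30 OECD members + 6 others

# Synthetic leading-indicator levels (index ~100) for each country.
levels = pd.DataFrame(
    100 + rng.normal(0, 1, size=(len(months), len(countries))).cumsum(axis=0),
    index=months, columns=countries,
)

# Equal-weight across countries, then smooth on a 3-month moving average.
gdi = levels.mean(axis=1).rolling(3).mean()

# Year-over-year % change, and its rate of change (the "2nd derivative"),
# each smoothed on a 3-month moving average as in the text.
gdi_yoy = gdi.pct_change(12).rolling(3).mean()
gdi_roc = gdi_yoy.diff().rolling(3).mean()

print(gdi_roc.dropna().tail())
```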

Our logic rests on the fact that the vast majority of global economic growth in the 20th century was only possible because of an exponential increase in energy consumption and sources of supply. Since 1980, real global GDP has increased by approximately 180%, with global energy consumption almost doubling from 300 quadrillion Btu to 550 quadrillion Btu today. That is–for all the talk about energy efficiencies–the majority of our economic growth was predicated on the discovery and harnessing of new sources of energy (e.g. oil & gas shale fracking). Until we commercialize alternative and cheaper sources of energy, global economic growth is still dependent on the consumption of fossil fuels, with crude oil being our main transportation fuel. As such, it is reasonable to conclude that–despite the ongoing increase in U.S. oil production–a rising global economy will lead to higher crude oil prices.
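
A quick check on the arithmetic above: a 180% rise in real GDP alongside a move from roughly 300 to 550 quadrillion Btu implies that energy used per unit of output fell by only about a third, so a near-doubling of total energy consumption was still required to support the growth.

```python
# Back-of-the-envelope check using the approximate figures quoted in the text.
gdp_growth = 1.80              # real global GDP up ~180% since 1980
energy_1980 = 300.0            # quadrillion Btu
energy_today = 550.0           # quadrillion Btu

gdp_factor = 1 + gdp_growth                    # ~2.8x
energy_factor = energy_today / energy_1980     # ~1.8x
intensity_change = energy_factor / gdp_factor - 1

print(f"GDP factor:          {gdp_factor:.2f}x")
print(f"Energy factor:       {energy_factor:.2f}x")
print(f"Energy per unit GDP: {intensity_change:+.0%}")   # roughly -35%
```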

This is what the CBGDI is still showing today, i.e. WTI crude oil prices should rise from the current $74 spot as the CBGDI still suggests significant global economic growth in 2015. The following monthly chart shows the year-over-year % change in the CBGDI and the rate of change (the 2nd derivative) of the CBGDI, versus the year-over-year % change in WTI crude oil prices and the MSCI All-Country World Index from March 1990 to November 2014. All four indicators are smoothed on a three-month moving average basis:

Chart: CBGDI vs. WTI crude oil and the MSCI All-Country World Index (September 2014)

As noted, the rate of change (2nd derivative) in the CBGDI (red line) has historically led the YoY% change in WTI crude oil prices by about three months. The major exceptions have been: 1) the relentless rise in WTI crude oil prices earlier last decade (as supply issues and Chinese demand came to the forefront), and 2) the explosion of WTI crude oil prices during the summer of 2008, as commodity index funds became very popular and as balance sheet/funding constraints prevented many producers from hedging their production.

The second derivative of the CBGDI bottomed at the end of 2011, and is still very much in positive territory, implying strong global oil demand growth in 2015. Most recently, of course, WTI crude oil prices have diverged from the CBGDI, and are now down 20% on a year-over-year basis. While we recognize there are still short-term headwinds (e.g. U.S. domestic oil production is still projected to rise from 9 million barrels/day today to 9.5 million barrels/day next year), we believe the current price decline is overblown. We project WTI crude oil prices to average $80 a barrel next year. In addition to our latest CBGDI readings, we believe the following will also affect WTI crude oil prices in 2015:

  1. An imminent, 1-trillion euro, quantitative easing policy by the ECB: The ECB has no choice. With the euro still arguably overvalued (especially against the US$ and the Japanese yen), many countries in the Euro Zone remain uncompetitive, including France. On a more immediate basis, inflation in the Euro Zone has continued to undershoot the ECB’s target. A quantitative easing policy by the ECB that involves purchasing sovereign and corporate bonds will lower funding costs for 330 million Europeans and generate more end-user demand, ranging from heavy machinery to consumer goods. While such a policy will strengthen the value of the U.S. dollar, we believe the resultant increase in oil demand will drive up oil prices on a net basis.
  2. The growth in shale oil drilling by the independent producers is inherently unpredictable. Over the last several years, the U.S. EIA has consistently underestimated the growth in oil production from fracking. With WTI crude oil prices having declined by nearly 30% over the last four months, we would be surprised if there is no significant cutback in shale oil drilling next year. Again, the EIA has consistently underestimated production growth on the upside, so we would not be surprised if the agency overestimates production growth (or lack thereof) on the downside as well.
  3. Consensus suggests that OPEC will refrain from cutting production at the November 27 meeting in Vienna. With U.S. shale oil drilling activity still near record highs (the current oil rig count at 1,578 is only 31 rigs away from the all-time high set last month), any meaningful production cut (500,000 barrels/day or higher) by OPEC will only encourage more U.S. shale oil drilling activity. More importantly, Saudi Arabia has tried this before in the early 1980s (when it cut its production from 10 million barrels/day in 1980 to just 2.5 million barrels/day in 1985 in order to prop up prices), ultimately failing when other OPEC members did not follow suit, while encouraging the growth in North Sea oil production. Moreover, OPEC countries such as Venezuela and Iran cannot cut any production as their budgets are based on oil prices at $120 and $140 a barrel, respectively. As a result, it is highly unlikely that OPEC will implement any meaningful policy change at the November 27 meeting.

With U.S. shale oil drilling activity still near record highs, we believe WTI crude oil prices are still biased towards the downside in the short run. But we believe the recent decline in WTI crude oil prices is overblown. Beginning next year, we expect U.S. shale oil drilling activity to slow down as capex budgets are cut and financing for drilling budgets becomes less readily available. Combined with the strength in our latest CBGDI readings, as well as imminent easing by the ECB, we believe WTI crude oil prices will recover in 2015, averaging around $80 a barrel.

The New 21st Century Capitalism: The Curious Case of Veronica Mars

The class struggle between the bourgeois capitalist and the proletariat—classified by Karl Marx as two distinct but interdependent social classes—is arguably Capitalism’s most famous “internal contradiction.” Marx argued 150 years ago that the constant competition for “surplus value” (whose definition is based on Marx’s misguided “Labor Theory of Value”) between capitalists and workers would result in ever-larger cycles and crises—ultimately bringing about the demise of the capitalist system.

Along with the constantly diminishing number of the magnates of capital, who usurp and monopolize all advantages of this process of transformation, grows the mass of misery, oppression, slavery, degradation, exploitation; but with this too grows the revolt of the working-class, a class always increasing in numbers, and disciplined, united, organized by the very mechanism of the process of capitalist production itself … Centralization of the means of production and socialization of labour at last reach a point where they become incompatible with their capitalist integument. This integument bursts asunder. The knell of capitalist private property sounds. The expropriators are expropriated.

Karl Marx gave a fine gift to those seeking only to dissect and disprove his arguments, and better understand the essence of capitalism. Political leaders who adopted Marxist thought in any practical manner would destroy their own countries and the lives of millions of people (Hugo Chavez also gave us a fine gift: the basket case that is Venezuela, despite $100 oil, provides ready ammunition for any dinner debate on the virtues of capitalism). In fact, Joseph Schumpeter would devote over 50 pages to discrediting Marxian thought in his seminal work “Capitalism, Socialism and Democracy.”

Counter-arguments to silly notions such as the Labor Theory of Value, distinct social classes such as the bourgeois capitalist and the proletariat, the proletariat’s overrated organizational skills, and the inevitable result of revolution are common everywhere in the 21st century global economy. The quintessential example is Apple’s iPhone (Disclosure: I went long AAPL at $401.07 on April 23rd). The iPhone’s design—along with the organizational/marketing skills required to manufacture/sell such a product on a global scale—reigns supreme. Virtually all the “surplus value” accrues to the design, marketing, and organizational team at Apple—and rightly so—while the Chinese manufacturers and their laborers accrue little, if any. Furthermore, the sense of high drama required for any revolution—one sufficient to overthrow the bourgeois—is missing. The poetic sense of Revolution as conveyed by the likes of Thomas Jefferson and James Madison is simply not attractive in a society where the Forbes 400 list turns over once every 20 years, and where kids of all ages are addicted to video games. Capitalism, over time, naturally overthrows each and every bourgeois capitalist through the process of “creative destruction” and the lack of hereditary titles. In addition, the PC and internet revolutions—accompanied by the rise of global networks/corporations such as Amazon, eBay, and Google—have allowed anyone who dares to become a capitalist and entrepreneur. In fact, recent trends suggest that the global capitalist/entrepreneurial spirit is growing stronger than ever. For example, employee compensation as a percentage of U.S. GDP has declined to a new 58-year low (as shown on the following chart). Yes, U.S. corporations and businesses are squeezing more out of their workers, even as economic growth recovers. At the same time, this also suggests that more Americans are becoming entrepreneurs and starting their own businesses.

Chart: U.S. employee compensation as a percentage of GDP

With the further advent of automation, 3-D printing, and globalization (including the democratization of education, communications, and labor mobility), U.S. labor will continue to be squeezed in the long run. Consider this: I know folks being paid six figures for pressing buttons to optimize an investment portfolio and to make/reconcile trades. They have little understanding of the quantitative investment/optimization process, and certainly cannot replicate or design a new system on their own. These folks are what Ayn Rand would call “the second-handers,” who are simply feeding off the scraps of the creative class—and thus do not deserve six figures. The market will recognize this over time.

Make no mistake: The capitalists will continue to rise and become ever more important as the 21st century progresses. French economist Charles Gave (of GaveKal) coined the term “Platform Company” to describe high-ROC (return on capital) companies focused on design, marketing, and organization, while spinning off their manufacturing responsibilities (again, the quintessential example is Apple Computer). Of course, such a business model only works in a world that abides by the rules of free trade. In the first decade of the 21st century, the platform company model was the Holy Grail for capitalists: Design and market; outsource manufacturing and produce everywhere. A platform company’s capital (the denominator in “ROC”) is mainly the people it hires. In most cases, the company did not have to invest in such capital; its employees’ parents and alma maters did most of the work. The success of the platform company model is exemplified by Apple’s extremely high margins (making up just 2.0% of total S&P 500 sales, but 5.6% of all profits), along with its high growth rates in recent years (following table courtesy of Goldman Sachs).

Table: S&P 500 profits and margins (courtesy of Goldman Sachs)
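
The sales-versus-profits shares quoted above imply a simple relative-margin figure: 5.6% of index profits on 2.0% of index sales means a net margin roughly 2.8 times the S&P 500 average. The quick check below uses only the two shares from the text; the index-wide margin in the second step is an illustrative assumption, not the Goldman Sachs figure.

```python
# Implied relative margin from the shares quoted in the text.
share_of_sales = 0.020     # Apple's share of S&P 500 sales
share_of_profits = 0.056   # Apple's share of S&P 500 profits

relative_margin = share_of_profits / share_of_sales
print(f"Apple's net margin vs. the index average: {relative_margin:.1f}x")

# With an assumed (illustrative) index-wide net margin of ~9%, that multiple
# would put Apple's net margin in the mid-20s percent range.
assumed_index_margin = 0.09
print(f"Implied Apple net margin: {relative_margin * assumed_index_margin:.0%}")
```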

In the second decade of the 21st century, the platform company model itself is being turned on its head. The first inkling came with the creation of Wikipedia in 2001—giving volunteers from all over the world the ability to create value and compete with private interests through mass collaboration. Termed “Wikinomics,” this trend of creating value through mass voluntary collaboration would result in efforts such as the Amazon Review System, the YouTube video sharing platform, and social networking. Corporations would seek to monetize the value of such mass voluntary collaboration. As long as the volunteers are happy (many of them are—either because they are only seeking to find their voice, or an audience for their future work), companies like Amazon, eBay, and Facebook could keep their competitive advantages by maintaining their strong network effects. The ROC of such business models is extremely high—as, unlike say, Apple, YouTube has no need to pay salaries to content creators.

In other words, Marx’s prediction that—one day—the proletariat will overthrow the bourgeois capitalist has been turned on its head. Far from overthrowing the capitalists, the proletariat (I use this term very loosely) is happy to contribute to the capitalists’ success on a voluntary basis. Writing an Amazon review or producing a YouTube video is far from the backbreaking work of the 19th century; more important, such activities allow each of us to express our individuality—or serve as a first step to monetize our creativity. The success of the platform company and the Wikinomics models is now leading into something else—which I call the “Veronica Mars” model. The Veronica Mars Movie Project was the first Kickstarter effort to revive a cult classic through its fan base. To the surprise of many (including its creator Rob Thomas, and the franchise owner, Warner Brothers), the Kickstarter effort reached its fund-raising goal of $2 million in less than 10 hours. By the end of the 30-day fund-raising period, it had raised over $5.7 million from 91,000 fans. The Veronica Mars Movie Project set a precedent (good or bad—you decide); subsequently, actor/director Zach Braff jumped on the Kickstarter bandwagon. His project “Wish I Was Here” reached its fund-raising goal of $2 million after only three days.

Yes, both the movie studios and the creators have spent enormous amounts of capital on such franchises. Nevertheless, on a stand-alone basis, the ROCs (which mostly accrue to Warner Brothers and other publicly owned studios) of these fan-funded projects are effectively infinite. So yes, Mr. Byron Wien, while there are practical limits, the ROCs of U.S. corporations can continue to rise. Capitalists—through strong creative, marketing, organizational, and outreach efforts to customers/fans—can now have their cake and eat it too. Business school textbooks and HBS cases alike are being revised as we speak. To paraphrase former President Richard Nixon (with a slight twist)—like it or not—we are all capitalists now. Either become one, or die.

The Message of the CB Capital Global Diffusion Index: A Bottom in WTI Crude Oil Prices

Neoclassical economics cannot explain the spike in real global wealth per capita (nearly 10x) in the 20th century. The classic Cobb-Douglas Model attempts to explain global GDP growth through three major inputs: 1) “Total factor productivity,” 2) labor (L), and 3) capital (K). Both L and K can be quantified and explained. However, studies have shown that fully 70% to 80% of the increase in economic output during the 20th century came from “total factor productivity,” i.e. an exogenous factor that resembles technological growth and adoption—leading to increased overall productivity. Economists have a hard time explaining the origin of “total factor productivity.” We know that education (investment in human capital), venture capital, the capitalist system, and the sharing of ideas all play a role, but we do not truly understand why their benefits were unique to the 20th century (and to a lesser extent, the 19th century, when real global wealth per capita grew by 3x—please request a copy of our January 2013 newsletter for a more detailed analysis).
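
For readers who want the decomposition spelled out, the growth-accounting arithmetic behind the “total factor productivity” residual can be sketched as follows, in per-worker terms under a Cobb-Douglas production function. The output elasticity and growth rates below are illustrative assumptions, chosen so the residual accounts for roughly 85% of per-worker output growth, in line with the figures discussed in these commentaries; they are not estimates from actual data.

```python
# Minimal sketch of Solow growth accounting in per-worker terms under a
# Cobb-Douglas production function, y = A * k**alpha (y and k per worker).
# Alpha and the growth rates below are illustrative assumptions.
def tfp_growth(g_output_pw: float, g_capital_pw: float, alpha: float = 0.3) -> float:
    """Solow residual: output-per-worker growth not explained by capital deepening."""
    return g_output_pw - alpha * g_capital_pw

g_y = 0.020   # assumed annual growth of output per worker
g_k = 0.010   # assumed annual growth of capital per worker
residual = tfp_growth(g_y, g_k)

print(f"TFP (residual) growth: {residual:.2%} per year")
print(f"Share of growth unexplained by capital deepening: {residual / g_y:.0%}")  # ~85%
```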

What we do know is that the vast majority of global economic growth in the 20th century was predicated on an exponential increase in energy consumption. In other words, productivity growth—the 20th century’s main economic driver—was mostly a result of increasing energy consumption. Every technological breakthrough, such as modern-day jets, computers, fiber optics, automobiles, etc., required the consumption of increasing amounts of energy. In some areas, we have made efficiency breakthroughs (e.g. the shrinkage of CPUs), but in other areas, not so much (e.g. the internal combustion engine). Such growth is especially amazing given the mass human failures of the 20th century, such as World Wars I & II, the rise of communism and Nazism, as well as the Korean and Vietnam Wars. More important: If the 21st century global economy is to grow on the same trajectory as that of the 20th century, global leaders will need to find cheaper and alternative sources of energy—horizontal drilling and fracking notwithstanding.

In the meantime, global economic growth is still dependent on the consumption of fossil fuels, with crude oil being our main transportation fuel. As such, it is reasonable to conclude that—despite the increase in U.S. oil production—a rising global economy will lead to higher crude oil prices (China just surpassed the U.S. as the number one oil importer). A prediction for future oil prices thus requires an analysis of the performance of the global economy, given the globalized nature of oil. Using the Leading Indicators data for 30 major countries in the Organization for Economic Co-operation and Development (OECD), along with China, Brazil, Turkey, India, Indonesia, and Russia, we have constructed a “Global Diffusion Index” which has historically led or tracked the MSCI All-Country World Index and WTI crude oil prices since the fall of the Berlin Wall. We label it the “CB Capital Global Diffusion Index” (“CBGDI”), which is essentially an advance/decline line of the OECD leading indicators—smoothed on a three-month moving average basis. Historically, the rate of change of the CBGDI has led WTI crude oil prices by about three months, with an R-squared of 30%, while tracking or leading the MSCI All-Country World Index slightly, with an R-squared of over 40% (not surprising, since stock prices are one component of the OECD Leading Indicators). Following is a monthly chart showing the year-over-year % change in the CBGDI, and the rate of change (the 2nd derivative) of the CBGDI, versus the year-over-year % change in the MSCI All-Country World Index and the year-over-year % change in WTI crude oil prices from March 1990 to February 2013. All four of these indicators have been smoothed on a three-month moving average basis:

Chart: CBGDI vs. WTI crude oil and the MSCI All-Country World Index (February 2013)

As noted on the above chart, the rate of change (second derivative) in the CBGDI (red line) has historically led the YoY% change in WTI crude oil prices by about three months. The major exceptions have been 1) the relentless rise in WTI crude oil prices earlier last decade (as supply issues came to the forefront), and 2) the explosion of WTI crude oil prices during the summer of 2008, as commodity index funds became very popular and as balance sheet/funding constraints prevented producers from hedging their production. The second derivative of the CBGDI troughed at the end of 2011, and has continued to rise—implying higher global stock and energy prices. While we recognize that U.S. crude oil production is set to rise by 1.4 million bbl/day over the next two years (not a trivial amount, as 1.4 million bbl/day is equivalent to total Indonesian oil consumption), recent strength in the CBGDI suggests that WTI crude oil prices have at least bottomed, for now. More important–unless cellulosic ethanol or second-generation biofuels are commercialized in the next several years (or unless room-temperature superconductors are discovered tomorrow)–this suggests that the secular bull market in oil prices is not over.
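
The three-month lead and the roughly 30% R-squared cited in this commentary can be checked mechanically by regressing the year-over-year change in WTI on the CBGDI's rate of change shifted forward by three months. The sketch below uses synthetic series built to contain such a lead, purely to demonstrate the mechanics; the actual CBGDI and WTI data are not reproduced here.

```python
# Minimal sketch of the lead-lag R-squared check described in the text, using
# synthetic monthly series in place of the actual CBGDI and WTI data.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
months = pd.date_range("1990-03-01", periods=276, freq="MS")

# Synthetic "CBGDI rate of change", and a WTI YoY% series that follows it
# with a 3-month lag plus noise (so the regression has something to find).
cbgdi_roc = pd.Series(rng.normal(0, 1, len(months)), index=months).rolling(3).mean()
wti_yoy = 0.6 * cbgdi_roc.shift(3) + rng.normal(0, 0.8, len(months))

df = pd.concat({"wti_yoy": wti_yoy, "cbgdi_roc_lag3": cbgdi_roc.shift(3)}, axis=1).dropna()

# Ordinary least squares via NumPy; R-squared computed from the residuals.
x = np.column_stack([np.ones(len(df)), df["cbgdi_roc_lag3"].to_numpy()])
y = df["wti_yoy"].to_numpy()
beta, *_ = np.linalg.lstsq(x, y, rcond=None)
r_squared = 1 - ((y - x @ beta) ** 2).sum() / ((y - y.mean()) ** 2).sum()
print(f"R-squared of WTI YoY% on the 3-month-lagged CBGDI rate of change: {r_squared:.2f}")
```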

As we are finalizing this commentary, we understand that Cyprus may be experiencing a bank run, as the EU had suggested taxing bank deposits to pay for the country’s bailout. If implemented, this will set a very bad precedent and will have long-term (adverse) repercussions in the European banking system. Such a Black Swan scenario may weaken commodity and energy prices in the short-run, but we believe WTI crude oil is a “buy” on any further price weakness.

The Global Productivity Riddle and the Supercomputing Race

Neoclassical (mainstream) economists define productivity as economic output (usually GDP) per unit of input, with the latter typically being capital and labor. An analysis of economic growth in the 20th century, however, suggests that physical capital per worker accounted for at most 15% of this increase. Fully 85% of productivity growth in the 20th century cannot be explained by mainstream economists.

Nobel Laureate Robert Solow was the first to propose that fully 85% of productivity growth in the 20th century was simply “a measure of our ignorance,” which he labeled technological progress. Robert Ayres, a renowned economist and physicist, asserts that the increasing consumption of energy explains nearly all of the productivity growth in the 20th century. We will explore this issue in a future newsletter, but not surprisingly, the future/survival of the global economy is intrinsically linked with the availability of (cheap) energy sources. With the exception of a few industries (computers, communications, healthcare) and advances over the last decade (solar PV, hybrid vehicles, carbon composites on the Boeing 787, the smart grid, etc.), productivity growth in industrial countries since the end of WWII has relied mostly on the increasing consumption of fossil fuels. While natural gas will act as a legitimate “bridge fuel” for at least the next decade, it is imperative to make ongoing investments in alternative energy technologies, as the “externalities” of burning fossil fuels remain high, especially in parts of China and other densely populated areas.

Speaking of advances in the computing industry, the world of supercomputing and supercomputing research remains an area of U.S. domination. Last month, the 40th semi-annual edition of the top 500 list of the world’s most powerful supercomputers was published at the SC2012 supercomputing conference in Salt Lake City. Last year’s biggest surprise (at least to those outside the supercomputing community) was the ascendance of the Japanese in the rankings. At the time, the number one supercomputer was the “K Computer” built by Fujitsu, using its own proprietary SPARC64 VIIIfx CPUs. Powered by 88,128 CPUs (each with eight cores, for a total of 705,024 cores), the “K Computer” is capable of a peak performance of 10.51 petaflops.
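
The K Computer figures above can be cross-checked with simple arithmetic: 88,128 CPUs at eight cores each gives the 705,024 cores cited, and the 10.51 petaflop peak works out to roughly 15 gigaflops per core. A quick verification:

```python
# Quick cross-check of the K Computer figures quoted in the text.
cpus = 88_128
cores_per_cpu = 8
peak_petaflops = 10.51

total_cores = cpus * cores_per_cpu
flops_per_core = peak_petaflops * 1e15 / total_cores

print(f"Total cores:   {total_cores:,}")                       # 705,024
print(f"Peak per core: {flops_per_core / 1e9:.1f} gigaflops")   # ~14.9
```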

Earlier this year, however, the U.S. regained the supercomputing crown, when the IBM Blue Gene-powered “Sequoia” at Lawrence Livermore National Laboratory came online with a staggering peak performance of 16.33 petaflops. The latest November 2012 list ushered in a new number one ranked supercomputer–the AMD CPU/Nvidia GPU powered monster, “Titan”–coming in at 17.59 petaflops. More important, Titan is slated for civilian use. One of its first projects is to run simulations designed to improve the efficiency of diesel and biofuel engines.

On the other hand, China, which captured the supercomputing crown in October 2010 with its Tianhe-1A supercomputer at the National Supercomputing Center in Tianjin (rated at 2.57 petaflops), has now sunk to 8th place.

From a geopolitical standpoint, the United States has re-occupied the top spot after ceding it to the Japanese last year, and to the Chinese the year before. On a country basis, the U.S. houses 55% of the top 500 supercomputers, up from 43% just 12 months ago (by supercomputing power; note that the NSA – which houses some of the most powerful systems in the world – stopped reporting in 1998). Japan is second, with 12% of the world’s supercomputing power. Rounding out the top five are China (8%), Germany (6%), and France (5%). The UK, which ranked third just three years ago (with 5.5% of the world’s supercomputing power), is now in 6th place, housing just 4.5% of the world’s supercomputing power.

Aside from providing the most up-to-date supercomputing statistics, the semi-annual list also publishes the historical progress of global supercomputing power – as well as a reasonably accurate projection of what lies ahead. Following is a log chart summarizing the progression of the top 500 list since its inception in 1993, along with a ten-year projection:

Today, a desktop with an Intel Core i7 processor operates at about 100 gigaflops (note that we are ignoring the GPU in our calculations) – or the equivalent of an “entry-level” supercomputer on the top 500 list in 2001, or the most powerful supercomputer in the world in 1993. On the highest end, the power of the Titan supercomputer is equivalent to the combined performance of the world’s top 500 supercomputers just four years ago. Moreover, the combined performance of “Sequoia” and “Titan” makes up more than 20% of the combined performance of all the supercomputers in the top 500 list today. By the 41st semi-annual edition of the Top 500 supercomputers next June, the combined performance of the world’s 500 fastest supercomputers should exceed 200 petaflops (compared to 162 petaflops today, and just 74 petaflops a year ago).
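
Because the paragraph above mixes gigaflops, petaflops, and an aggregate projection, a small unit-bookkeeping sketch makes the scales and the implied one-year growth rate explicit. The inputs are the figures quoted in the text; the comparisons are straightforward ratios.

```python
# Unit bookkeeping for the figures quoted in the text.
GIGA, PETA, EXA = 1e9, 1e15, 1e18

desktop_i7 = 100 * GIGA        # ~100 gigaflops (Core i7 desktop, per the text)
list_total_now = 162 * PETA    # combined top 500 performance today
list_total_prior = 74 * PETA   # combined performance a year ago
exaflop_target = 1 * EXA       # the 2018-2019 sustained-exaflop goal

print(f"Desktops equivalent to one exaflop machine: {exaflop_target / desktop_i7:,.0f}")
print(f"Top 500 aggregate growth over the past year: {list_total_now / list_total_prior - 1:.0%}")
print(f"Exaflop target vs. today's aggregate list:   {exaflop_target / list_total_now:.1f}x")
```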

Simulations that would have taken 10 years of computing hours on the most powerful supercomputer two years ago take just a year on Titan (roughly, since Linpack—the benchmark used to measure supercomputing performance—is not exactly representative of real-world supercomputing performance). Tasks that take an immense amount of computing time today – such as precision weather forecasts, gene sequencing, airplane and automobile design, protein folding, etc. – will continue to be streamlined as newer and more efficient processors/software are designed. By 2018-2019, the top supercomputer should reach a sustained performance of an exaflop (i.e. 1,000 petaflops)—this is both SGI’s and Intel’s goal. IBM believes that such a system is needed to support the “Square Kilometre Array”—a radio telescope in development that will be able to survey the sky 10,000 times faster, and with 50 times more sensitivity, than any current radio instrument—and will provide better answers to the origin and evolution of the universe. The ongoing “democratization” of the supercomputing industry would also result in improvements in solar panel designs, better conductors, more effective drugs, etc. As long as global technology innovation isn’t stifled, the outlook for global productivity growth – and by extension, global economic growth and standard of living improvements – will remain bright for years to come. Advances in material designs would also propel the private sector’s efforts to commercialize space travel and reduce the costs of launching satellites. Should the quantum computer be commercialized soon (note that quantum computing advances are coming at a dramatic rate), we should get ready for the next major technological revolution (and secular bull market) by 2015 to 2020. Make no mistake: The impact of the next technological revolution will dwarf that of the first and second industrial revolutions.