Three Cryptocurrencies to Avoid as Initial Coin Offerings (ICOs) Enter a Frenzy

Bitcoin, cryptocurrencies, and blockchain, the underlying technology that has enabled the growth of the cryptocurrency and token markets, have exploded into the public consciousness over the last six months. Since the beginning of the year, the price of a Bitcoin rose from $900 to nearly $3,000 at its peak a month ago. The surge in the price of Ethereum, the second most popular cryptocurrency by circulating market capitalization, was even more impressive: from $10 at the beginning of 2017, the price of Ethereum surged to $400 a month ago. Ethereum is currently trading at around $200, half its price of a month ago but still up 1,900% since the beginning of 2017.

Cryptocurrencies’ underlying technology, the blockchain, originated from the concept of “linked timestamping,” first proposed by Stuart Haber and Scott Stornetta in 1990 as a way to better secure documents. The rationale was to create a more secure system than public-key-signature-based timestamping: linking each document’s timestamp to the previous one made timestamps impossible to change after the fact. A subsequent improvement replaced individually linked documents with a collection of blocks, all of which were then linked together in a chain. Instead of a linear connection, the documents within each block are linked together in a tree structure, which uses fewer resources to verify a document’s position in the history of the system. Bitcoin combined the idea of linked timestamping with the use of computational puzzles (“proof of work”) to regulate the creation of new currency units. Under the Bitcoin regime, a hacker cannot realistically change the history of transactions unless he or she can solve computational puzzles faster than all Bitcoin participants combined. This ensures the security and integrity of the system, and it was a breakthrough: for the first time, it solved the dilemma of potential “double spending” within a decentralized, digital payments system.
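The hash-linking idea above can be sketched in a few lines of Python. This is a toy illustration only, not Bitcoin’s actual data structures; the block fields (`data`, `prev_hash`, `timestamp`) and helper names are invented for the example:

```python
import hashlib
import json

def block_hash(block):
    # Hash the block's entire contents, including the previous block's hash
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash, timestamp):
    return {"data": data, "prev_hash": prev_hash, "timestamp": timestamp}

GENESIS = "0" * 64  # sentinel "previous hash" for the first block

def build_chain(documents):
    chain, prev = [], GENESIS
    for i, doc in enumerate(documents):
        block = make_block(doc, prev, timestamp=1000 + i)
        chain.append(block)
        prev = block_hash(block)
    return chain

def chain_valid(chain):
    # Re-derive every link; tampering upstream breaks a downstream link
    prev = GENESIS
    for block in chain:
        if block["prev_hash"] != prev:
            return False
        prev = block_hash(block)
    return True

chain = build_chain(["doc-a", "doc-b", "doc-c"])
assert chain_valid(chain)

# Altering an earlier block changes its hash, invalidating every later block
chain[0]["data"] = "doc-a-forged"
assert not chain_valid(chain)
```

Because each block commits to the hash of its predecessor, rewriting history requires recomputing every subsequent link, which is exactly the work Bitcoin’s proof-of-work puzzles make prohibitively expensive.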

The idea of a secure, frictionless, and decentralized system for settling cross-border payments and storing/sharing information is beginning to gain momentum among global businesses. This is especially the case in emerging market countries, where there is a profound lack of large, reliable centralized financial clearing/payments systems. For example, a group of Indian banks that includes the State Bank of India and ICICI Bank recently agreed to use Microsoft’s Azure blockchain-as-a-service solution to host their distributed ledger systems. Similarly, Bajaj Allianz, Allianz’s Indian joint venture, just began to implement a Blockchain solution to speed up insurance claims for travelers and motorists. Similar Blockchain-based initiatives which aim to expedite international, cross-border capital transfers include:

  • 60 banks are now commercially deploying enterprise software developed by Ripple, a San Francisco startup. In April, BBVA successfully completed a series of money transfers between Spain and Mexico through Ripple’s proprietary distributed ledger technology. According to American Banker, the transfers took seconds, compared to the four days they normally take. So far, Japanese banks have had the greatest uptake, with a consortium of 59 Japanese banks having successfully completed a pilot with the software; 40% of all Japanese customer banking accounts will have access to the Ripple-based blockchain solution by October this year;
  • Backed by Goldman Sachs and Baidu, Boston-based Circle Internet Financial launched an international money transfer service last month allowing zero-cost, cross-border transfers between the U.S. and Europe through its Blockchain-based peer-to-peer payment network;
  • In May, Bank of Tokyo-Mitsubishi UFJ began testing its own cryptocurrency (MUFG coin), which will allow users to instantly transfer money on a peer-to-peer basis via the app or to purchase goods and services at affiliated stores. Currently, around 200 of the bank’s employees are testing the MUFG coins; the bank has plans to expand the trial across its branches by the end of 2017.

Despite my long-term constructive outlook on Blockchain, however, I believe the recent surge in prices and the number of initial coin offerings (ICOs) represents significant exuberance (for this article, I will use the terms “cryptocurrency” and “tokens” interchangeably). As shown in the following chart, the total market capitalization of all cryptocurrencies embarked on a parabolic move beginning in January 2017. From $5.6 billion as of January 1, 2015 ($4.4 billion of which was Bitcoin’s market capitalization), the aggregate market capitalization (based on circulating supply) of all cryptocurrencies surged to $115 billion as of June 14, 2017. As of this writing, the aggregate market capitalization (encompassing the value of 811 cryptocurrencies) totals $80 billion.


In addition to the influx of capital and the rising number and size of ICOs (the record $232 million Tezos ICO being the latest to take advantage of such investor exuberance), there are other red flags which suggest that investors should be cautious of the cryptocurrency space:

  • Lack of transparency: In a regular U.S. initial public offering (IPO), the SEC requires the company issuing shares to file a Form S-1, which provides information regarding the use of IPO proceeds, the company’s business model and competition, as well as a prospectus disclosing the names of the company’s principals and financial information. There is no similar process in the cryptocurrency space. For example, Tezos published an 18-page position paper and a 17-page white paper describing Tezos as a “generic and self-amending crypto-ledger” that “supports meta upgrades,” i.e. “the protocols can evolve by amending their own code.” In other words, Tezos aims to be the be-all and end-all of the cryptocurrency world, but details are lacking. Even during the “irrational exuberance” days of the technology bubble, there was much more disclosure. Presumably, the founders of Tezos will get filthy rich once the token starts trading;
  • The ease of new ICO creation: A recent article asserts the “hard cap” on the lifetime supply of Bitcoins (21 million) and Litecoins (84 million) is inherently deflationary, i.e. both Bitcoins and Litecoins may be considered a long-term store of value similar to gold, New York real estate, or farmland. While Bitcoin and Litecoin have been in existence since 2009 and 2011, respectively, such track records are still comparatively short in the context of monetary history. With new and more innovative cryptocurrencies being created all the time, there is no guarantee that either Bitcoin or Litecoin will retain their value over time. The creation of truly anonymous cryptocurrencies such as Zerocoin may also render existing cryptocurrencies such as Bitcoin and Litecoin less attractive over time.
  • Lack of recourse: An investor who has purchased the debt or equity of a company has a legal claim and recourse against the company’s assets in the event of a bankruptcy filing. Should the value of a cryptocurrency plunge to zero for whatever reason, the holder has no legal recourse whatsoever.

Excitement about the future impact and practical applications of Blockchain technology has no doubt driven the latest exuberance in the cryptocurrency market and the hunger for new ICOs. While I believe Blockchain will fundamentally change the way we do business, store/exchange information, and move capital across borders, history has demonstrated that most early investments in a new technology do not work out. For example, the vast majority of start-up auto companies in the early 20th century and internet companies in the late 1990s went out of business, leaving investors with nothing. Specifically, based on my research, I would avoid the following three cryptocurrencies/tokens:

Ripple (XRP): XRP is designed as a “bridge currency” for use when a transaction between two currencies on the Ripple protocol cannot be made because one or both currencies are rarely traded. In other words, the Ripple protocol is designed primarily to be a currency exchange, enabling “secure, instant and nearly free global financial transactions of any size with no chargebacks.” Should liquidity between currency pairs on the protocol increase over time, there will be a decreasing need for XRP. In addition, of the 100 billion XRPs issued, more than 60 billion are held by Ripple Labs. While Ripple Labs has locked up 55 billion XRPs through a series of 55 “smart contracts” (with one contract expiring every month for a period of 55 months), that does not change the fact that the creators of Ripple still own the majority of XRPs and would stand to benefit the most from a surge in the cryptocurrency.

Numeraire (NMR): In February of this year, one million Numeraire tokens were issued to 12,000 data scientists as an incentive to collaborate on a profitable global long-short equity fund constructed with algorithms. According to the issuing firm, Numerai, financial data is “transformed and regularized” and then fed to its data scientists through an encryption process. This idea has three major red flags. First, hundreds of quantitative funds have been designing algorithms utilizing financial data for global equities for many years, if not decades. Financial data is a commodity, readily available through Compustat or FactSet; unstructured data such as granular, real-time weather forecasts (highly valuable if one were speculating in agricultural futures) or Twitter feeds gauging real-time traffic at Starbucks’ locations are unique, but based on public information, Numerai uses purely financial, and thus commoditized, data. Second, there is an asymmetry of information between the data scientists and the public, the latter of which could also purchase NMR. Presumably, the data scientists working on the algorithms have a better understanding of their utility, and could buy or sell NMR in advance to take advantage of this information asymmetry. Finally, the top data scientist on the platform is currently limited to a $54,000 annual payout, which is hardly an incentive for the most talented data scientists to work on the project, even on a casual basis.

Litecoin (LTC): While the Litecoin protocol has lower transaction fees and faster transaction times than Bitcoin’s, many speculators bought LTC recently due to an expected August 1st launch of an MIT project related to LTC. My sense is that this countdown page purportedly created by MIT is a hoax whose primary purpose is to pump up the price of LTC going into August 1st. First, the cryptocurrency space is unregulated, which means there is no avenue for prosecution if crypto-investors find out on August 1st that MIT isn’t launching a product after all. Second, it is out of MIT’s character to create such suspense knowing it would lead to greater speculation in LTC ahead of the August 1st date. The group behind MIT’s Digital Currency Initiative, along with Charlie Lee, the creator of Litecoin, have said they do not know anything about the countdown page. Since any MIT student can create a subdomain, the perpetrator may be an MIT student or someone related to an MIT student who is speculating on the price of LTC. Buyer beware.

The Coming Breakthroughs in the Global Supercomputing Race

In our August 31, 2015 article (“The U.S. Needs to Rejuvenate the Global Supercomputing Race”), we expressed our concerns regarding the state of the global supercomputing industry; specifically, from a U.S. perspective, the sustainability of Moore’s Law, as well as increasing competition from the Chinese supercomputing industry. Below is a summary of the concerns we expressed:

  • Technological innovation, along with increasing access to cheap, abundant energy, is the lifeblood of a growing, modern economy. As chronicled by Professor Robert Gordon in “The Rise and Fall of American Growth,” U.S. productivity growth (see Figure 1 below; sources: Professor Gordon & American Enterprise Institute)–with the exception of a brief spurt from 1997-2004–peaked during the period from the late 1920s to the early 1950s; by 1970, most of today’s everyday household conveniences, along with the most important innovations in transportation and medicine, had already been invented and diffused across the U.S. Since 1970, almost all U.S. productivity growth can be attributed to the adoption of and advances in the PC, investments in our fiber optic and wireless networks, and the accompanying growth of the U.S. software industry (other impactful technologies since the 1970s include the advent of hydraulic fracturing in oil & gas shale, ultra-deepwater drilling in the Gulf of Mexico, and the commercialization of alternative energy and more efficient battery storage systems, as we first discussed in our July 27, 2014 article “How Fracking Saved the U.S. Economy”). This means that a stagnation in the U.S. computing or communications industries would invariably slow U.S. and global productivity growth;


  • The progress of the U.S. supercomputing industry, as measured by the traditional FLOPS (floating-point operations per second) benchmark, had experienced a relative stagnation when we last wrote about the topic in August 2015. For example, in 2011, both Intel and SGI seriously discussed the commercialization of an “exascale” supercomputer (i.e. a system capable of performing 1 x 10^18 calculations per second) by the 2019-2020 time frame. As of today, the U.S. supercomputing community has pushed back its target time frame for building an exascale supercomputer to 2023;
  • At the country-specific level, the U.S. share of global supercomputing systems has been declining. As recently as 2012, the U.S. housed 55% of the world’s top 500 supercomputing systems; Japan was second, with 12%, and China (8%) was third. By the summer of 2015, the U.S. share of the world’s top 500 supercomputing systems had shrunk to 46%, with Japan and China tied for a distant second at 8% each. Today, the Chinese supercomputing industry has led an unprecedented surge to claim parity with the U.S., as shown in Figure 2 below.

Figure 2: China – Reaching Parity with the U.S. in the # of Top 500 Supercomputers

Since the invention of the transistor in the late 1940s and the advent of the supercomputing industry in the 1960s, the U.S. has always been the leader in supercomputing in terms of innovation, sheer computing power, and building the customized software needed to take advantage of that computing power (e.g. software designed for precision weather forecasting, gene sequencing, airplane and automobile design, protein folding, and now artificial intelligence). With U.S. economic growth increasingly dependent on innovations in the U.S. computing industry and communications network–and with China now threatening to surpass the U.S. in terms of supercomputing power (caveat: China’s HPC software industry is probably still a decade behind)–it is imperative for both U.S. policymakers and corporations to encourage and provide more resources for the U.S. to stay ahead in the supercomputing race.
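For scale, the exascale target mentioned above can be put in plain numbers. The short sketch below (illustrative only; the function name is ours) converts the 1 x 10^18 FLOPS exascale threshold into the speedup required from a petascale system:

```python
PETA = 10**15  # 1 petaflop = 10^15 floating-point operations per second
EXA = 10**18   # 1 exaflop  = 10^18 floating-point operations per second

def speedup_needed(current_flops, target_flops=EXA):
    """How many times faster a system must become to reach the target."""
    return target_flops / current_flops

# A 17.59-petaflop system (the fastest U.S. machine of the Titan era)
# is still roughly 57x short of the exascale threshold
assert round(speedup_needed(17.59 * PETA)) == 57
```

At Moore’s-Law-style doubling rates, a ~57x gap corresponds to nearly six doublings, which is consistent with a target date pushed out by roughly a decade.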

Unlike the tone of our August 31, 2015 article, however, we have grown more hopeful, primarily because of the following developments:

  • Moore’s Law is still alive and well: At CES 2017 in Las Vegas, Intel declared that Moore’s Law remains relevant, with a second-half target release date for its 10-nanometer microprocessor chips. At a subsequent nationally televised meeting with President Trump earlier this month, Intel CEO Brian Krzanich announced the construction of its $7 billion Fab 42 in Arizona, a pilot plant for its new 7-nanometer chips. Commercial production of the 7nm chips is scheduled for the 2020-2022 time frame, with most analysts expecting the new plant to incorporate more exotic technologies, such as gallium nitride as a semiconductor material. The next iteration is 5nm chips; beyond 5 nanometers, however, a more fundamental solution to extend Moore’s Law will be needed, e.g. commercializing a graphene-based transistor;
  • GPU integration into supercomputing systems: The modern-day era of the GPU (graphics processing unit) began in May 1995, when Nvidia commercialized its first graphics chip, the NV1, the first commercially available GPU capable of 3D rendering and video acceleration. Unlike a CPU, the GPU contains many parallel processing threads, allowing it to perform many times more simultaneous calculations than a CPU. Historically, the supercomputing industry had been unable to take advantage of the sheer processing power of the GPU, given the lack of suitable programming languages designed for GPUs. When the 1.75-petaflop Jaguar supercomputer was unveiled by Oak Ridge National Laboratory in 2009, it was notable as one of the first supercomputers to be outfitted with Nvidia GPUs. Its direct successor, the 17.59-petaflop Titan, was unveiled in 2012 with over 18,000 GPUs. At the time, this raised two concerns: 1) hosting over 18,000 GPUs within a single system was unprecedented and could doom the project to endless failures and outages, and 2) there was too little programming code available to take advantage of the sheer processing power of the 18,000 GPUs. These concerns have proven unfounded; today, GPUs are turning home PCs into supercomputing systems, while Google just rolled out a GPU cloud service focused on serving AI customers;
  • AI, machine-learning software commercialization: Perhaps one of the most surprising developments in recent years has been the advent of AI and machine-learning software, yielding results that were unthinkable just five years ago. These include: 1) Google DeepMind’s AlphaGo, which defeated three-time European Go champion Fan Hui 5-0 in 2015 and, finally, world Go champion Ke Jie earlier this year, 2) Carnegie Mellon’s Libratus, which defeated four of the world’s top poker players over 20 days of play, and 3) the inevitable commercialization of Level 5 autonomous vehicles on the streets of the U.S., likely in the 2021-2025 time frame. Most recently, Microsoft and the University of Cambridge teamed up to develop a machine-learning system capable of writing its own code. The advent of AI in the early 21st century is likely to be a seminal event in the history of supercomputing;
  • Ongoing research into quantum computing: The development of a viable, commercial quantum computer is gaining traction and is probably 10-20 years away from realization. A quantum computer is necessary for the processing of tasks that are regarded as computationally intractable on a classical computer. These include: 1) drug discovery and the ability to customize medical treatments based on the simulation of proteins and how they interact with certain drug combinations, 2) invention of new materials through simulations at the atomic level. This will allow us to build better conductors and denser battery systems, thus transforming the U.S. energy infrastructure almost overnight, and 3) the ability to run simulations of complex societal and economic systems. This will allow us to more efficiently forecast economic growth and design better public policies and urban planning tools.

The U.S. Needs to Rejuvenate the Global Supercomputing Race

Technology, along with increasing access to cheap energy, is the lifeblood of a growing, modern economy. As we discussed in our December 2, 2012 article (“The Global Productivity Riddle and the Supercomputing Race”), fully 85% of productivity growth in the 20th century can be attributed to technological progress and the increasing accessibility/sharing of cheap energy sources, driven by innovations in oil and natural gas hydraulic fracturing, ultra-deep water drilling, solar panel productivity, the commercialization of Generation III+ nuclear power plants, and the deployment of smart power grids.

Perhaps the most-cited example where the combined effects of technological and human capital investments have had the greatest economic impact is the extreme decline in computing and communication costs. Moore’s Law, the observation that computer engineers can double the amount of computing power in a given space roughly every two years, has held since the invention of the transistor in the late 1940s. Parallel to this has been the rise of the supercomputing industry. Started by Seymour Cray at Control Data Corporation in the 1960s, the supercomputing industry has played a paramount role in advancing the sciences, most recently in computationally intensive fields such as weather forecasting, oil and gas exploration, human genome sequencing, molecular modeling, and physical simulations aimed at designing more aerodynamic aircraft or better-conducting materials. No doubt, breakthroughs in more efficient supercomputing technologies and processes are integral to the ongoing growth in our living standards in the 21st century.
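A doubling every two years compounds quickly. The snippet below (a simple illustration of the growth arithmetic, not a model of any particular chip line) shows the implied growth factor:

```python
def moores_law_factor(years, doubling_period=2.0):
    """Growth factor in computing power over `years`, assuming one
    doubling every `doubling_period` years (Moore's Law as stated)."""
    return 2 ** (years / doubling_period)

# Five doublings per decade: a 32x gain every ten years
assert moores_law_factor(10) == 32

# Over the ~70 years since the transistor (late 1940s), that compounds
# to 2^35, i.e. tens of billions of times more computing power
assert moores_law_factor(70) == 2 ** 35
```

This compounding is why even a modest slowdown in the doubling period has large long-run consequences for computing costs and, by extension, productivity growth.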

Unfortunately, advances in both the U.S. and global supercomputing industries have lagged over the last several years. Every six months, a list of the world’s top 500 most powerful supercomputers is published. The latest list was compiled in June 2015; aside from providing the most up-to-date supercomputing statistics, the semi-annual list also tracks the historical progress of global supercomputing power, each country’s share of that power, and a reasonably accurate projection of what lies ahead. Figure 1 below is a log chart summarizing the progression of the top 500 list from its inception in 1993.

Figure 1: Historical Performance of the World’s Top 500 Supercomputers

As shown in Figure 1 above, both the aggregate computing power of the world’s top 500 systems and that of the #1-ranked supercomputer have remained relatively stagnant over the last several years. Just three years ago, there was serious discussion of the commercialization of an “exaflop” supercomputer (i.e. a supercomputer capable of 1 x 10^18 calculations per second) by the 2018-2019 time frame. Today, the world’s top computer scientists are targeting a more distant time frame of 2023.

From the U.S. perspective, the slowdown in the supercomputing industry is even more worrying. Not only has innovation slowed at the global level, but the U.S. share of global supercomputing power has been declining as well. Three years ago, the U.S. housed 55% of the world’s top 500 supercomputing power; Japan was second, with 12%. Rounding out the top five were China (8%), Germany (6%), and France (5%). Today, the U.S. houses only 46% of the world’s supercomputing power, with countries such as the UK, India, Korea, and Russia gaining ground.

Figure 2: Supercomputing Power Distributed by Country


Bottom line: Since the invention of the transistor in the late 1940s and the advent of the supercomputing industry in the 1960s, the U.S. has always led the supercomputing industry in terms of innovation and sheer computing power. With countries such as China and India further industrializing and developing their computer science and engineering expertise (mostly with government funding), U.S. policymakers must encourage and provide more resources to stay ahead in the supercomputing race. To that end, President Obama’s recent executive order calling for the creation of a National Strategic Computing Initiative–with the goal of building an “exascale” supercomputer–is a step in the right direction. At this point, however, whether the industry can deploy an energy-efficient exascale supercomputer even by the less ambitious 2023 time frame remains an open question.

The Robot Revolution Invades U.S. Casual Dining

In our July 15, 2007 commentary, we foresaw the potential for a dislocation in the U.S. casual dining industry through technological innovation and adoption. At the time, the iPad did not yet exist, but technologies such as touchscreens, high-speed WiFi, and the necessary software systems for automation were beginning to form. We believed the appearance of Microsoft’s “Surface” technology would herald a trend of full-blown automation in the casual dining industry. Quoting our July 15, 2007 commentary:

For a “dislocation” technology in the restaurant industry, look no further than the Microsoft “Surface” technology – which is simply an amazing piece of technology.  At first glance, kids will simply think of this as a cool technology to view photos or transfer videos, but give it a couple of years and many restaurants will start utilizing this technology as part of their food/drink ordering and clean-up system.  Besides having the ability to order food or drinks, the “Surface” will know what you are drinking and will ask you if you need a refill when your cup is half-empty.  Once all your plates are empty or nearly empty, the “Surface” will also alert the busboy so he can come and remove your dishes.  At the same time, you will be able to start ordering desserts as well, or of course, pay for your meal (either through your credit card or through Paypal).  In five years, the number of waiters needed in the restaurant industry will be halved, and 15% tips will no longer be needed (assuming a reasonable 40% decline in cost each year, these US$10,000 machines/surfaces will only cost US$750 by the end of 2012).  What outsourcing or off-shoring cannot do (i.e. displace workers whose jobs tend to be localized), technology will.

We were a little early, as we assumed adoption would begin by 2009-10; at the time, we did not foresee the severity of the 2008-09 financial crisis. With the over-expansion of many casual dining chains during 2004-06–and with liquidity, borrowing, and consumer spending suffering a major breakdown during the 2008-09 financial crisis–many casual diners simply stopped investing in new stores and technologies. Now that U.S. consumer discretionary spending is back to a 6-year high (and with stock prices of many casual diners, such as EAT, DIN, and CAKE, near all-time highs), investing in new technologies to streamline the ordering and payment process suddenly makes sense again.

For many casual dining chains (e.g. Olive Garden, T.G.I. Friday’s, Ruby Tuesday), there are few points of differentiation among their brands. As we discussed in our May 26, 2013 commentary (“The Generational Divide in Casual Dining Trends”), many traditional casual dining chains also suffer from three major strikes–at least among Gen-Ys: 1) a perception of a lack of quality service, 2) a perception of serving cheap, low-quality food, and 3) an outdated décor. These three strikes are especially glaring when compared to newer, healthier, and more convenient choices such as Panera Bread and Corner Bakery, or, at the higher end of the scale, Cheesecake Factory, RockSugar, and of course independent operators–especially those with high-end brand names or those serving more exotic (e.g. sushi) and adventurous cuisines. Simply put, what Gen-Ys settled for when they were kids would not work today.

Seen in this context, there is not much that casual dining chains can do to differentiate their products and increase profit margins–major strategic or product offering shifts notwithstanding. The main remaining option is cost-cutting through technology–in this case, automation technology that streamlines the ordering and payment processes.

Fittingly, Austen Mulinder, CEO of Ziosk and a former Microsoft executive, is now bringing tabletop tablets to the casual dining industry. Ziosk’s goal is to “revolutionize the experience and economics of Casual Dining,” and the company claims that over 100 million guests have already been served under its system. Most notably, Ziosk recently won a national contract to provide tabletop tablets to Chili’s, which will likely accelerate adoption by other national chains.

Exhibit 1: Ziosk Tablet – Order, Pay, and Play Loyalty-Related Games at the Table

Ziosk tablet

Out of the current installation base of over 1,200 locations, Ziosk claims that 80% of customers interact with the device in one way or another. The most frequent use is direct credit card payment, the second is survey questionnaires, and the third is ordering drinks, desserts, and appetizers. Ziosk claims that dessert sales are 20% to 30% higher for guests who use the device, with quicker table turns and increased chain loyalty when guests opt for email signup. All in all, restaurants that utilize the device experience 3% higher core food and drink sales on average.

One of Ziosk’s major pitches for the device is that its cost is “less than free,” as the cost of the devices can be subsidized by the gaming revenue they generate. A final area of benefit is reduced labor intensity, assuming more customers choose to order and pay through the devices. While restaurants deny that these devices will replace waiters, we believe this is where the casual dining industry is heading; for example, some Tokyo restaurants are already doing away with waiters. Make no mistake: the robot revolution has now spread to the casual dining industry.

Exhibit 2: California, Texas and Florida are the Focus Expansion Areas in 2014

Ziosk locations

The New Energy Regime: The Diffusion of Power

“While the civil society is often relegated to the back tier of social life … it is the primary arena in which civilization unfolds. There are no examples that I know of in history where a people first set up markets and governments, and then later created a culture. Rather, markets and governments are extensions of culture … The civil society is where human beings create social capital, which is really accumulated trust that is invested in markets and governance. If markets or governments destroy the social trust vested in them, people will eventually withdraw their support or force a reorganization of the other two sectors.” — Jeremy Rifkin, author of “The Third Industrial Revolution” and “The Hydrogen Economy”

Forty years ago this Wednesday, OPEC instituted its first embargo on countries (the U.S. and the Netherlands) that supported Israel during the Yom Kippur War. WTI crude prices shot up by 400%. The 1973 OPEC embargo represented a watershed in the global energy regime in two important ways: 1) It made clear to the world that America’s hegemony over oil policy and prices had ended. The “oil weapon” was successful in that it heralded an irreversible shift of oil policy and price-setting power from the Texas Railroad Commission to OPEC (at its peak, the TRC controlled over 40% of U.S. crude oil production; ironically, the TRC’s prorationing model served as a blueprint for OPEC’s); 2) It made clear to the Nixon Administration that a coherent national energy policy was needed–made all the more urgent as U.S. crude oil production had peaked in 1970 (although this would not be known for some time).

Ever since the 1973 and 1979 oil shocks, U.S. leadership has sustained the country’s energy needs through a combination of drilling more wells, better drilling and extraction technologies (e.g. horizontal drilling and shale fracking), importing more oil, and–at various times–experimenting with alternative sources such as wind, solar, biofuels, and even geothermal energy (California is the leading source of geothermal energy). In other words, there has been no coherent national policy (aside from massive fuel-switching from oil to natural gas by U.S. utilities during the 1980s) other than continuing the old oil-based energy regime. The forays into what we called “alternative energy” have been half-hearted–in many ways, a gimmick to satisfy the growing chorus of Americans demanding cleaner energy and self-sufficiency. The fact that many of these technologies were not economically efficient did not help either.

This chorus grew louder as U.S. oil imports and crude oil prices continued to rise–peaking at over 10 million barrels/day in 2005 for the former, and nearly $150 a barrel in summer 2008 for the latter. The constant calls for a new energy regime are more than just a yearning, or a tactical business decision for more efficient energy production. Yes, as financiers, we need to be cognizant of economic returns and of protecting our investors, and for the first time ever, CB Capital is seeing sound investment opportunities in various “clean technology” areas that are already, or soon to be, economical (more in later blog posts and research reports). But this ever-louder chorus–which began with a small group of highly committed “green” advocates in the wake of the OPEC crisis and the environmental movement–has turned into a broader social movement across significant parts of U.S. civil society (in particular, among the best-educated and most collaborative cohort, the Gen-Ys). To paraphrase Jeremy Rifkin, the civil society–which is the foundation of our present form of government and dictates how our market functions–is setting the stage for a complete transformation of how Americans think about energy, and consequently, how we produce and consume energy in the future.

We contend that the 1st Energy Regime lasted from the beginning of recorded human history to the start of the First Industrial Revolution (1760 to 1780). Humans derived most of their energy from the sun–whether directly, or indirectly through the consumption of plants and the burning of wood (both products of photosynthesis). Humans were more or less harmonized with nature on a daily basis. Societies that did not respect nature, or that consumed resources unsustainably, simply disappeared (e.g. the Rapa Nui people on Easter Island). The 2nd Energy Regime–which began with the adoption of the steam engine and the replacement of wood by coal as a primary energy source–transformed how societies thought about and produced/consumed energy. Humans and societies became more disassociated and de-harmonized from nature. We no longer witnessed on a daily basis how our energy was produced (or how our cattle were slaughtered). We know that the production of fossil fuels has carried an environmental price–but as they say, out of sight, out of mind. We also became desensitized to wars and conflicts fought in the name of energy and oil security. It is sheer madness that the U.S. still derives much of its crude oil imports from oppressive regimes and areas whose collective consciousness is still stuck in the pre-Enlightenment Middle Ages.

The U.S. is now ready for a 3rd Energy Regime–driven by both American civil society and sound economic principles. This 3rd Energy Regime has been 40 years in the making. By its 50th anniversary, we expect the U.S. energy infrastructure to have been completely transformed. We have in the past discussed the ongoing U.S. Energy Renaissance driven by horizontal drilling, fracking, and Lower Tertiary drilling; as well as more efficient energy consumption through the commercialization of additive manufacturing, self-driving cars, better battery storage technologies, and the eventual commercialization of the quantum computer. In ten years’ time, I expect a much higher adoption of renewables (California is on its way to sourcing 33% of its electricity from alternative sources), including solar, wind, and perhaps the re-introduction of hydrogen fuel cell vehicles into our auto fleet. While companies such as Google are continuing to make substantial investments in solar (over the last three years, Google has invested over $1 billion in solar generation), the largest clean tech investors are companies such as GE (which is known for investing only in high-ROE projects) and, interestingly, traditional fossil fuel companies. According to the Cleantech Group, three of the ten largest corporate investors in clean tech are traditional fossil fuel companies (see below table).

corporate clean tech investment

More importantly, commercializing technologies such as rooftop solar, more efficient battery storage, and smart grids would transform the U.S. power grid from a centralized to a decentralized one (see below diagram). In other words, power would literally flow back to individual family households. The Jeffersonian ideal and myth of the American self-sufficient yeoman farmer will thus come closer to realization. We have argued that part of the 21st century narrative is a diffusion and democratization of power from governments, institutions, and corporations back to the individual–as long as said individual utilizes the tools and global networks available to him. The diffusion and democratization of power began with the Internet Revolution in the late 1990s. The commercialization and adoption of alternative energy in the second decade of the 21st century–what we call the New Energy Regime–will not only provide a more sustainable source of power and lifestyle to Americans, but will further empower the individual. We believe the concept of U.S. energy independence is not a pipe dream. In fact, U.S. energy independence is not good enough. We believe we can achieve energy independence down to the individual community, or even household, level as the energy grid is decentralized–as long as American civil society continues to embrace renewable energy sources.

U.S. Consumer Deleveraging Definitively Over (and the U.S. Tech Restoration)

In our June 13, 2013 newsletter (please email us for a copy), we asserted that–after six years of U.S. household deleveraging–the confluence of many forces, such as growing U.S. energy independence, rising housing prices, and an improving private sector labor market, suggests U.S. consumer spending growth is still early in the cycle.

According to the BLS, the number of U.S. employed aged 16+ (seasonally adjusted) peaked at 146.6 million in November 2007. This fell to just 138.0 million at the December 2009 trough, i.e. the U.S. lost 8.6 million jobs during the recession. As of August 2013, the number of U.S. employed aged 16+ has recovered to 144.2 million. Since the beginning of 2013, the U.S. has added nearly 900,000 jobs, after adding 2.4 million jobs in 2012. Perhaps more importantly, the U.S. private sector has been driving most of this job growth. Assuming the private sector (which has recovered over 80% of the jobs it lost during the recession) sustains its current growth rate, the U.S. economy is less than a year away from recovering all of its private-sector jobs lost during the recession. Yes, it has taken four years, but we are making progress (e.g. the U.S. auto industry is operating at near full capacity and is embarking on a hiring spree).
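The arithmetic behind these BLS figures can be checked directly; a quick sketch using only the employment levels quoted above (in millions):

```python
# BLS household-survey employment levels (millions), as quoted above
peak_nov_2007 = 146.6
trough_dec_2009 = 138.0
aug_2013 = 144.2

jobs_lost = peak_nov_2007 - trough_dec_2009   # 8.6 million lost in the recession
jobs_regained = aug_2013 - trough_dec_2009    # 6.2 million regained by Aug 2013
recovery_share = jobs_regained / jobs_lost    # share of lost jobs recovered

print(f"Lost {jobs_lost:.1f}M, regained {jobs_regained:.1f}M ({recovery_share:.0%})")
```

The roughly 72% overall recovery is consistent with the 80%+ private-sector figure cited above, since government payrolls continued to shrink over the same period.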

From CB Capital’s vantage point, we are witnessing significant innovation and business activity among our tech clients–from Pasadena to Santa Monica to Orange County–all the way to San Diego. In our opinion, there is no question that U.S. tech and innovation will continue to lead this U.S./global economic recovery. The Schumpeterian forces driven by the U.S. capitalist system (yes, this does exist in California) are alive and well. As an aside, the sectors that led the last bull cycle, such as Emerging Markets, fossil fuels, precious metals, U.S. financials, etc., typically experience muted activity in the next bull cycle. This is not surprising. I first became bullish on gold and silver in late 2000–purchasing physical metal and mining stocks when gold and silver traded at $275 and $4.50 an ounce, respectively. This decision was driven by many factors, including the sustained lack of mining infrastructure investments, the Greenspan-led monetary easing policy, the downright hatred of precious metals as an investment, and both central bank selling and production hedging (short-selling) by major precious metals miners. Since late 2000, all of these trends–with the exception of global monetary easing–have reversed. A similar scenario has transpired in the fossil fuels industry, where I first became bullish on oil in August 2004. There are no prizes for following the herd or coming in last, and investors should not expect–and do not deserve–outsized returns by investing in industries that have already enjoyed a decade-long bull market.

Make no mistake: The next trend is in U.S. tech and innovation. The world can only innovate and increase productivity through the forces of Schumpeterian growth, aided by the U.S. capitalist system and its unique entrepreneurial, risk-taking spirit. Our recent work with clients in the UAV, video gaming analytics, online real-time bidding, and software-defined networking spaces has further convinced us of this case. Much of this growth will be driven by tech entrepreneurs near the campus of Caltech, the area around Santa Monica (a.k.a. Silicon Beach), and all the way down the I-405 to San Diego.

This bullish cycle in U.S. tech (better 4-D seismic equipment and drilling technologies are responsible for much of the recent increase in U.S. oil production) is a strong counter-force to U.S. consumer deleveraging. Along with ever-increasing domestic oil production, U.S. tech will lead the recovery of the U.S. private sector, as well as global economic growth.

Aside from a recovering U.S. private sector, perhaps one of the most encouraging signs for U.S. consumer spending growth is the state of U.S. household balance sheets. One of the main themes we have tracked since the early 2000s is the overleveraged U.S. household–and since early 2007–the inevitable deleveraging resulting from the housing crisis and the subsequent decline in household credit growth. As shown in Figure 1 below, the asset-to-liability ratio of U.S. households experienced a secular decline (i.e. U.S. households went on a credit spree) from over 14.0 in the early 1950s to just 5.7 at the end of 2007, and then to a post-WWII low of 5.0 at the end of 1Q 2009 as the global financial crisis peaked.

US household net worth

After endless rounds of global monetary easing, asset purchases, old-fashioned economic growth, and household/government deleveraging over the last five years, the asset-to-liability ratio has risen back to 6.5 at the end of 2Q 2013–its highest level since the end of 1Q 2002! Moreover, U.S. household net worth hit another record high of $74.8 trillion–$5.8 trillion higher than the pre-financial crisis peak of $69.0 trillion set at the end of 3Q 2007.
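The two published figures pin down the underlying balance-sheet totals. Since net worth is assets minus liabilities and the ratio is assets divided by liabilities, a little algebra recovers both sides; a sketch using only the numbers quoted above:

```python
net_worth = 74.8   # U.S. household net worth, $ trillions, end of 2Q 2013
ratio = 6.5        # asset-to-liability ratio at the same date

# net_worth = A - L and A = ratio * L  =>  L = net_worth / (ratio - 1)
liabilities = net_worth / (ratio - 1)     # ~$13.6 trillion
assets = ratio * liabilities              # ~$88.4 trillion

gain_over_2007_peak = net_worth - 69.0    # ~$5.8 trillion above the 3Q 2007 peak
```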

In other words, the balance sheets of U.S. households–despite anemic job growth and the prolific growth of student loans (now over $1 trillion outstanding)–are now in their best shape since early 2002! The “wealth effect” from this benign trend, as well as ongoing U.S. job growth, should provide a strong tailwind for further growth in U.S. consumer discretionary spending.

Timing, of course, is everything. History suggests that this ratio could rise even further. Should this occur, then U.S. household deleveraging (and a renewed cutback in consumer discretionary spending) may not be over. We are of the opinion that such fears are overblown. Firstly, much of the extraordinary growth in U.S. household debt over the last 20 years occurred in the housing mortgage sector. Indeed, for every percentage point of growth in U.S. household assets since 1952, U.S. households incurred 2.67 times as much growth in mortgage debt, compared to 1.93 times for all other household debt. Note that mortgage debt growth as a ratio of household asset growth is down from a peak of 3.81 times at the end of 1Q 2009, due to rising home prices and significant write-downs of mortgage debt over the last five years. Moreover–despite rising mortgage rates–U.S. housing activity and prices are maintaining their positive momentum. For example, over the last 12 months (ending July 2013), the Case-Shiller Index for 20 major U.S. metropolitan areas rose 12.4%. Given ongoing U.S. job growth, rising U.S. energy independence (U.S. domestic production has risen another 1.3 million barrels/day over the last 12 months), and the recent easing of bank lending standards, the positive momentum in U.S. housing activity and prices should last well into next year.
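The 2.67x and 1.93x figures are ratios of cumulative percentage growth: growth in a debt category divided by growth in household assets over the same period. A minimal sketch of the computation; the start and end values below are hypothetical, chosen only to illustrate the formula:

```python
def growth_multiple(debt_start, debt_end, assets_start, assets_end):
    """Percentage growth in debt per unit of percentage growth in assets."""
    debt_growth = (debt_end - debt_start) / debt_start
    asset_growth = (assets_end - assets_start) / assets_start
    return debt_growth / asset_growth

# Hypothetical illustration: debt grows 800% while assets grow 300%,
# which happens to reproduce the 2.67x mortgage-debt multiple cited above
m = growth_multiple(1.0, 9.0, 10.0, 40.0)
```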

US debt growth

Because much of U.S. household debt growth was incurred in the mortgage sector, a rising housing market would significantly reduce the probability of a renewed deleveraging of U.S. household balance sheets. In addition, the rebuilding of U.S. household balance sheets would also reduce the number of households underwater on their mortgages. Such a benign trend will further support consumer spending, as well as reduce U.S. systemic risk. We have no doubt that U.S. consumer discretionary spending is still in the early stages of a sustainable upward trajectory. U.S. corporations and consumer brands (with the glaring exception of J.C. Penney) will realize this by early 2014, and will then take advantage of this trend through store expansion, differentiated marketing, investing in new products, and a sustained round of new hiring.

A Technological Revolution in the Making – The U.S. Giant Awakens

Note 1: We asserted in our June 18th commentary that WTI crude oil will definitively rise above $100 a barrel this summer, driven by the ongoing U.S. economic recovery, steady oil demand from China (the country’s short-term credit crunch is over), and pockets of strength in the Euro Zone. WTI crude oil is at $103 as I am writing this. And no, it is not due to unrest in Egypt, as Brent crude did not rise much on a proportionate basis. The narrowing of the spread between the price of Brent and WTI crude is also the best evidence of a U.S. economic recovery.

Note 2: In our late January newsletter, we asserted that gold was due for a major correction. We advocated a short position in gold. With gold at $1,660 an ounce at the time, we argued for a 12- to 18-month price target of $1,100 to $1,300 an ounce. Today, the price of gold sits at $1,220 an ounce. In just five months, the price of gold has hit our price target. Bottom line: We are revising our price target for gold. Our new position calls for a 6- to 12-month price target of $1,000 to $1,200 an ounce. The two most reliable psychological indicators of a tradeable bottom in any asset class are: 1) panic, or 2) indifference. The best time to invest in any asset class is after years of investor indifference. That–along with other screaming buy indicators–was the reason why I invested in physical gold and unhedged gold miners at under $275 an ounce in late 2000. Of course–unless the U.S. mints a new currency–the price of gold will never see $275 an ounce again. So far, we have witnessed neither much investor panic nor indifference towards gold. With U.S. real interest rates hitting new cyclical highs (the ECRI Future Inflation Gauge just hit a 7-month low, even as long-term Treasury rates are spiking), we believe there is at least one more major sell-off in gold before a tradeable bottom can form. The Dow-to-Gold ratio today sits at 12.4. I would only consider investing if the Dow-to-Gold ratio rises to 15 or above. Avoid gold, for now.
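The Dow-to-Gold ratio is simply the Dow Jones Industrial Average divided by the dollar price of an ounce of gold. A quick check; the Dow level below (roughly 15,100) is an assumption chosen to be consistent with the 12.4 ratio and $1,220 gold price quoted above:

```python
dow = 15_100    # approximate mid-2013 DJIA level (assumed)
gold = 1_220    # gold price, $/oz, as quoted above

dow_to_gold = dow / gold        # ~12.4, matching the quoted ratio
buy_threshold = 15              # the commentary's stated entry level

# gold price at which the ratio reaches 15, holding the Dow constant
implied_gold = dow / buy_threshold   # ~$1,007/oz
```

Holding the Dow constant, the ratio only reaches 15 if gold falls to roughly $1,000 an ounce, which squares with the revised $1,000 to $1,200 price target.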

Now, let’s get on with our main commentary. About 400 years ago, Descartes famously remarked “I think, therefore I am.” Descartes tried to prove his own existence by linking his thoughts to his consciousness. In other words, Descartes argued that because he cannot be separated from his thoughts–and because thoughts exist–therefore, he exists.

But Descartes was wrong. Equating one’s consciousness with one’s thoughts is mere identification with one’s ego–a path to endless pain and suffering. We are at our most enlightened state when we live in the present. A glimpse of a beautiful object, attending a concert, or seeing your loved one for the first time in a long time–these can all quiet our minds just long enough to witness the beauty and truth in our own existence. Unfortunately, human beings–just like Descartes–have equated our identities with our own rigid sets of thought/belief systems for thousands of years. Such unconsciousness on a global scale has led to mass intolerance, discrimination, and hatred, directly resulting in mass genocide, global wars, and witch-burnings–down to petty arguments over politics and household chores. It is sheer madness. A madness that many societies (e.g. in the Middle East) still have not awakened from.

On a more practical level, an individual cannot invest successfully unless he awakens from such unconsciousness. Just like the natural laws of the universe, there are certain axioms any investor needs to follow; however, these axioms only provide the larger framework, and as of today, are not yet complete. Sir Isaac Newton explored the meaning of gravity, but lost his entire fortune in the aftermath of the Great South Sea Bubble. Investors are taught from an early age to follow benchmarks ranging from valuation ratios, cash flows, inflation, and GDP growth, to central bank policy, energy policy, and technological breakthroughs, and finally to more esoteric indicators such as the VIX, various investment surveys (useful from a contrarian perspective), and sentiment data via Twitter feeds and Google Trends. An investor who is unconscious–i.e. one who follows a rigid set of thoughts and belief systems–cannot make outsized returns, since most investors follow such rigid thoughts, and by definition, most investors cannot beat the market. Sir Isaac Newton tried to follow such physical laws whilst speculating in South Sea stock. Both the final run-up and the subsequent collapse caught him completely off-guard. Later on, he would remark, “I can calculate the movement of stars, but not the madness of men.” The final exponential run-up in technology stock prices in early 2000 offered yet another example. Investors who failed to acknowledge this “New Era” missed the bull market in 1996, 1997, 1998, and 1999; many blue-chip funds underperformed, and numerous money managers lost their jobs, because their rigid belief systems prevented them from owning technology stocks (at the peak in March 2000, the NASDAQ Composite traded at a P/E of 260). Of course, they were eventually proven right. But being “early” in the financial markets is just a nicer way of saying one was wrong.
Similarly, many investors who were caught in the 2001 to 2002 bear market did not realize the rules had changed yet again.

I made this same mistake when I began investing in college. I tried to predict stock prices with factor models using linear regression analysis. I studied modern portfolio theory and was fascinated by real options valuation models. I thought the bull market in technology stocks would go on forever. I was unconscious. Thankfully, I did not remain unconscious for long. I managed to catch the tail-end of the technology boom; sold all my technology stocks in early 2000 (and warned others to do the same), and was 100% short the NASDAQ by late March 2000. The lesson I learned: Regimes come and go; belief systems are overturned (even thousand year-old systems such as the Chinese dynastic system in 1911); and something faster, crazier and more unbelievable will always come along.

A study of human history yields an endless chronicle of conflicts, wars, famines, mass slaughters, rape & pillage, and general misery. For sure, such a dismal record has been punctuated by glimpses of human goodness and progress in mass consciousness: e.g., the unprecedented prosperity and the promotion of peace during the “New Kingdom” period in Ancient Egypt, the export of Greek culture during the Hellenistic period, the harnessing and control of new technologies during the Han Dynasty in China, and of course, the European Renaissance and the Enlightenment. But it was not until the adoption of the United States Declaration of Independence–inspired by the writings of John Locke, and documents such as the Magna Carta, the Petition of Right, and the English Bill of Rights–that a major society finally began to embrace the concept of human equality, freedom, and other basic, “inalienable” rights.

In my opinion, the 56 delegates who debated and signed the Declaration of Independence as part of the Second Continental Congress represented the gathering of the most talented, progressive, and yet pragmatic, men in all of history. The Declaration of Independence–riding on concepts clarified by Enlightenment philosophers such as John Locke, Voltaire, and Rousseau–is the definitive document which defines the United States of America to this day. Yes, the U.S. falls short in many places; that is to be expected as the U.S. represents an ideal–an ideal that all of us should continue to strive for. It is thus no accident that the U.S. remains the most attractive center for entrepreneurs, hard-working immigrants, innovators, and the world’s best and most creative minds–despite our shortcomings.

In the wake of the Pearl Harbor attack by the Empire of Japan, Admiral Yamamoto is alleged to have remarked: “I fear all we have done is to awaken a sleeping giant and fill him with terrible resolve.” Ever since the collapse of the technology boom and, subsequently, the events of September 11th, the U.S. has been rudderless. Over the last 12 years, both the U.S. political and corporate leadership have failed the world and broken too many promises. However, not all was lost. There have also been flashes of brilliance: e.g., the completion of the Human Genome Project in 2003, D-Wave’s progress in the development of a quantum computer, the advent of 3-D printing (a trend which I have tracked since 2007), shale fracking and horizontal drilling in the energy industry, and nanomedicine and nanotechnology in general–leading to advances in targeted cancer treatments, more efficient conductors, and stronger, lighter-weight materials.

As we have covered in our newsletters and commentaries, we are confident that the U.S. is on the cusp of a new technological revolution. It takes strong leadership, a functional financial industry, the right markets, and a bit of luck to commercialize the many revolutionary technologies that we have written about. The U.S. is already undergoing an energy revolution–the rise in domestic crude oil production over the next several years will surpass the last domestic oil boom of the 1950s and 1960s. That boom drove U.S. manufacturing and industry to unprecedented heights–and led to the creation of the U.S. middle class. The rise of 3-D printing, along with advances in 3-D scanning technology, means we could create our own tailored t-shirts in our own homes. I envision a timeline of just five years. Eventually, we will be able to “print” more complex objects with more differentiated parts. Together with cheap natural gas prices, the U.S. is already experiencing a renaissance in “in-shoring” and “in-sourcing,” beginning with low-labor-content goods.

Slowly but surely, the U.S. giant is awakening. The economic recovery since 2009 is merely a precursor–a big, giant yawn. Our expertise and networks in healthcare, technology, and energy have placed CB Capital right in the center of the next technological boom, driven by American ingenuity, focus, and honest hard work. We are looking forward to the ride.

The New 21st Century Capitalism: The Curious Case of Veronica Mars

The class struggle between the bourgeois capitalist and the proletariat—classified by Karl Marx as two distinct but interdependent social classes—is arguably Capitalism’s most famous “internal contradiction.” Marx argued 150 years ago that the constant competition for “surplus value” (whose definition is based on Marx’s misguided “Labor Theory of Value”) between capitalists and workers would result in ever-larger cycles and crises—ultimately bringing about the demise of the capitalist system.

Along with the constantly diminishing number of the magnates of capital, who usurp and monopolize all advantages of this process of transformation, grows the mass of misery, oppression, slavery, degradation, exploitation; but with this too grows the revolt of the working-class, a class always increasing in numbers, and disciplined, united, organized by the very mechanism of the process of capitalist production itself … Centralization of the means of production and socialization of labour at last reach a point where they become incompatible with their capitalist integument. This integument bursts asunder. The knell of capitalist private property sounds. The expropriators are expropriated.

Karl Marx gave a fine gift to those seeking to dissect and disprove his arguments, and to better understand the essence of capitalism. Political leaders who adopted Marxist thought in any practical manner would destroy their own countries and the lives of millions of people (Hugo Chavez also gave us a fine gift: the basket case that is Venezuela, despite $100 oil, provides ready ammunition for any dinner debate on the virtues of capitalism). In fact, Joseph Schumpeter devoted over 50 pages to discrediting Marxian thought in his seminal work “Capitalism, Socialism and Democracy.”

Counter-arguments to silly notions such as the Labor Theory of Value, distinct social classes such as the bourgeois capitalist and the proletariat, the proletariat’s overrated organizational skills, and the inevitability of revolution are common everywhere in the 21st century global economy. The quintessential example is Apple’s iPhone (Disclosure: I went long AAPL at $401.07 on April 23rd). The iPhone’s design—along with the organizational/marketing skills required to manufacture/sell such a product on a global scale—reigns supreme. Virtually all the “surplus value” accrues to the design, marketing, and organizational team at Apple—and rightly so—while the Chinese manufacturers and their laborers accrue little, if any. Furthermore, the sense of high drama required for any revolution—one sufficient to overthrow the bourgeois—is missing. The poetic sense of Revolution as conveyed by the likes of Thomas Jefferson and James Madison is simply not attractive in a society where the Forbes 400 list turns over once every 20 years, and where kids of all ages are addicted to video games. Capitalism, over time, naturally overthrows each and every bourgeois capitalist through the process of “creative destruction” and the lack of hereditary titles. In addition, the PC and internet revolutions—accompanied by the rise of global networks/corporations such as Amazon, eBay, and Google—have allowed anyone who dares to become a capitalist and entrepreneur. In fact, recent trends suggest that the global capitalist/entrepreneurial spirit is growing stronger than ever. For example, employee compensation as a percentage of U.S. GDP has declined to a new 58-year low (as shown on the following chart). Yes, U.S. corporations and businesses are squeezing more out of their workers, even as economic growth recovers. At the same time, this also suggests that more Americans are becoming entrepreneurs and starting their own businesses.


With the further advent of automation, 3-D printing, and globalization (including the democratization of education, communications, and labor mobility), U.S. labor will continue to be squeezed in the long run. Consider this: I know folks being paid six figures to press buttons that optimize an investment portfolio and make/reconcile trades. They have little understanding of the underlying quantitative investment/optimization process, and certainly cannot replicate or design a new system on their own. These folks are what Ayn Rand would call “the second-handers,” who are simply feeding off the scraps of the creative class—and thus do not deserve six-figure salaries. The market will recognize this over time.

Make no mistake: The capitalists will continue to rise and become ever more important as the 21st century progresses. French economist Charles Gave (of GaveKal) coined the term “Platform Company” to describe high-ROC (return on capital) companies focused on design, marketing, and organization, while spinning off their manufacturing responsibilities (again, the quintessential example is Apple). Of course, such a business model only works in a world that abides by the rules of free trade. In the first decade of the 21st century, the platform company model was the Holy Grail for capitalists: design and market; outsource manufacturing and produce everywhere. A platform company’s capital (the denominator in “ROC”) is mainly the people it hires. In most cases, the company did not have to invest in such capital; their parents and alma maters did most of the work. The success of the platform company model is exemplified by Apple’s extremely high margins (Apple makes up just 2.0% of total S&P 500 sales, but 5.6% of all profits), along with its high growth rates in recent years (following table courtesy of Goldman Sachs).
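The sales and profit shares quoted imply just how much fatter Apple's margins are than the index average: a company's share of index profits divided by its share of index sales equals its net margin as a multiple of the index-wide margin. A sketch using the two figures above:

```python
apple_sales_share = 0.020    # Apple = 2.0% of S&P 500 sales
apple_profit_share = 0.056   # Apple = 5.6% of S&P 500 profits

# Apple's net margin expressed as a multiple of the S&P 500 average margin
relative_margin = apple_profit_share / apple_sales_share   # 2.8x the index
```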


In the second decade of the 21st century, the platform company model itself is being turned on its head. The first inkling came with the creation of Wikipedia in 2001—allowing volunteers from all over the world to create value and compete with private interests through mass collaboration. Termed “Wikinomics,” this trend of creating value through mass voluntary collaboration would result in efforts such as the Amazon review system, the YouTube video sharing platform, and social networking. Corporations would seek to monetize the value of such mass voluntary collaboration. As long as the volunteers are happy (many of them are—either because they are simply seeking to find their voice, or an audience for their future work), companies like Amazon, eBay, and Facebook can keep their competitive advantages by maintaining their strong network effects. The ROC of such business models is extremely high—unlike, say, Apple, YouTube has no need to pay salaries to content creators.

In other words, Marx’s prediction that—one day—the proletariat will overthrow the bourgeois capitalist has been turned on its head. Far from overthrowing the capitalists, the proletariat (I use this term very loosely) is happy to contribute to the capitalists’ success on a voluntary basis. Writing an Amazon review or producing a YouTube video is far from the backbreaking work of the 19th century; more important, such activities allow each of us to express our individuality—or serve as a first step to monetizing our creativity. The success of the platform company and Wikinomics models is now leading into something else—which I call the “Veronica Mars” model. The Veronica Mars Movie Project was the first Kickstarter effort to revive a cult classic through its fan base. To the surprise of many (including its creator Rob Thomas, and the franchise owner, Warner Brothers), the Kickstarter effort reached its fund-raising goal of $2 million in less than 10 hours. By the end of the 30-day fund-raising period, it had raised over $5.7 million from more than 91,000 fans. The Veronica Mars Movie Project set a precedent (good or bad—you decide); actor/director Zach Braff subsequently jumped on the Kickstarter bandwagon. His project “Wish I Was Here” reached its fund-raising goal of $2 million after only three days.

Yes, both the movie studios and the creators have spent enormous amounts of capital on such franchises. Nevertheless, on a stand-alone basis, the ROCs of these fan-funded projects (whose returns mostly accrue to Warner Brothers and other publicly owned studios) are effectively infinite. So yes, Mr. Byron Wien, while there are practical limits, the ROCs of U.S. corporations can continue to rise. Capitalists—through strong creative, marketing, organizational, and outreach efforts to customers/fans—can now have our cake and eat it too. Business school textbooks and HBS cases alike are being revised as we speak. To paraphrase former President Richard Nixon (with a slight twist)—like it or not—we are all capitalists now. Either become one, or die.

The Global Productivity Riddle and the Supercomputing Race

Neoclassical (mainstream) economists define productivity as economic output (usually GDP) per unit of input, with the latter typically being capital and labor. An analysis of economic growth in the 20th century, however, suggests that growth in physical capital per worker accounted for at most 15% of the increase in output per worker. Fully 85% of productivity growth in the 20th century cannot be explained by mainstream economists.

Nobel Laureate Robert Solow was the first to propose that fully 85% of productivity growth in the 20th century was simply “a measure of our ignorance,” which he labeled technological progress. Robert Ayres, a renowned economist and physicist, asserts that the increasing consumption of energy explains nearly all of the productivity growth in the 20th century. We will explore this issue in a future newsletter, but not surprisingly, the future/survival of the global economy is intrinsically linked with the availability of (cheap) energy sources. With the exception of a few industries (computers, communications, healthcare) and advances over the last decade (solar PV, hybrid vehicles, carbon composites on the Boeing 787, the smart grid, etc.), productivity growth in industrial countries since the end of WWII has relied mostly on the increasing consumption of fossil fuels. While natural gas will act as a legitimate “bridge fuel” for at least the next decade, it is imperative to make ongoing investments in alternative energy technologies, as the “externalities” of burning fossil fuels remain high, especially in parts of China and other densely populated areas.
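The growth-accounting arithmetic behind the “85% unexplained” claim can be sketched in a few lines. This is a minimal illustration with made-up growth rates (the 3%, 2%, and 1% figures below are purely illustrative, not Solow’s actual data), assuming the standard Cobb-Douglas setup:

```python
# Growth accounting (Solow residual) sketch with illustrative numbers.
# Assumes a Cobb-Douglas production function Y = A * K^alpha * L^(1-alpha),
# so output growth decomposes as: gY = gA + alpha*gK + (1-alpha)*gL.
# The residual gA ("technological progress") is whatever growth the
# measured inputs cannot explain.

alpha = 0.3   # capital's share of income (a conventional estimate)
gY = 0.030    # observed output growth, 3.0% per year (illustrative)
gK = 0.020    # capital stock growth, 2.0% per year (illustrative)
gL = 0.010    # labor input growth, 1.0% per year (illustrative)

explained = alpha * gK + (1 - alpha) * gL
residual = gY - explained  # the Solow residual

print(f"growth explained by inputs: {explained:.4f}")      # 0.0130
print(f"Solow residual:             {residual:.4f}")       # 0.0170
print(f"share unexplained:          {residual / gY:.0%}")  # 57%
```

Even in this toy example, more than half of output growth is attributed to the residual rather than to measured capital and labor, which is the sense in which Solow called it “a measure of our ignorance.”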

Speaking of advances in the computing industry, supercomputing and supercomputing research remain an area of U.S. domination. Last month, the 40th semi-annual edition of the Top 500 list of the world’s most powerful supercomputers was published at the SC2012 supercomputing conference in Salt Lake City. Last year’s biggest surprise (at least to those outside the supercomputing community) was the ascendance of the Japanese in the rankings. At the time, the number one supercomputer was the “K Computer” built by Fujitsu, using its own proprietary SPARC64 VIIIfx CPUs. Powered by 88,128 CPUs (each with eight cores, for a total of 705,024 cores), the “K Computer” is capable of a peak performance of 10.51 petaflops.

Earlier this year, however, the U.S. regained the supercomputing crown when the IBM Blue Gene-powered “Sequoia” at Lawrence Livermore National Laboratory came online with a staggering peak performance of 16.33 petaflops. The latest November 2012 list ushered in a new number one ranked supercomputer—the AMD CPU/Nvidia GPU-powered monster “Titan”—coming in at 17.59 petaflops. More important, Titan is slated for civilian use. One of its first projects is to run simulations designed to improve the efficiency of diesel and biofuel engines.

On the other hand, China, which took the supercomputing crown in October 2010 with its Tianhe-1A supercomputer at the National Supercomputing Center in Tianjin (rated at 2.57 petaflops), has now sunk to 8th place.

From a geopolitical standpoint, the United States has re-occupied the top spot after ceding it to the Japanese last year, and the Chinese the year before. On a country basis, the U.S. houses 55% of the top 500 supercomputers, up from 43% just 12 months ago (measured by supercomputing power; note that the NSA—which houses some of the most powerful systems in the world—stopped reporting in 1998). Japan is second, with 12% of the world’s supercomputing power. Rounding out the top five are China (8%), Germany (6%), and France (5%). The UK, which ranked third just three years ago (with 5.5% of the world’s supercomputing power), is now in 6th place, housing just 4.5% of the world’s supercomputing power.

Aside from providing the most up-to-date supercomputing statistics, the semi-annual list also tracks the historical progress of global supercomputing power—as well as a reasonably accurate projection of what lies ahead. Following is a log chart summarizing the progression of the Top 500 list since its inception in 1993, along with a ten-year projection:

[Chart: Top 500 supercomputing performance since 1993, log scale, with ten-year projection]

Today, a desktop with an Intel Core i7 processor operates at about 100 gigaflops (note that we are ignoring the GPU in our calculations)—or the equivalent of an “entry-level” supercomputer on the Top 500 list in 2001, or the most powerful supercomputer in the world in 1993. On the highest end, the performance of the Titan supercomputer is equivalent to the combined performance of the world’s top 500 supercomputers just four years ago. Moreover, the combined performance of “Sequoia” and “Titan” makes up more than 20% of the combined performance of all the supercomputers on the Top 500 list today. By the 41st semi-annual edition of the Top 500 list next June, the combined performance of the world’s 500 fastest supercomputers should exceed 200 petaflops (compared to 162 petaflops today, and just 74 petaflops a year ago).
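The scale comparisons above reduce to simple unit arithmetic. Here is a quick sketch using only the figures quoted in this section (the half-year extrapolation at the end is our own back-of-the-envelope projection, assuming last year’s growth rate simply continues):

```python
# Quick flops arithmetic using the figures quoted above.
GIGA, PETA = 1e9, 1e15

desktop = 100 * GIGA   # Core i7 desktop, ~100 gigaflops (CPU only)
titan   = 17.59 * PETA # Titan, November 2012 list
sequoia = 16.33 * PETA # Sequoia
list_now, list_year_ago = 162 * PETA, 74 * PETA  # combined Top 500 totals

print(f"Titan vs. desktop:     {titan / desktop:,.0f}x")              # 175,900x
print(f"Sequoia+Titan share:   {(titan + sequoia) / list_now:.0%}")   # 21%

# If the combined list keeps growing at last year's pace (~2.2x per year),
# the total half a year from now (the 41st edition) would be roughly:
annual_growth = list_now / list_year_ago
projected = list_now * annual_growth ** 0.5
print(f"projected next June:   {projected / PETA:.0f} petaflops")     # ~240
```

The extrapolated figure comfortably clears the 200-petaflop mark cited above, which is why the projection is a conservative one.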

Simulations that would have taken ten years of computing time on the most powerful supercomputer of two years ago take just a year on Titan (roughly, since Linpack—the benchmark used to measure supercomputing performance—is not exactly representative of real-world supercomputing performance). Tasks that take an immense amount of computing time today—such as precision weather forecasts, gene sequencing, airplane and automobile design, protein folding, etc.—will continue to be streamlined as newer and more efficient processors and software are designed. By 2018-2019, the top supercomputer should reach a sustained performance of an exaflop (i.e., 1,000 petaflops)—this is both SGI’s and Intel’s goal. IBM believes that such a system is needed to support the “Square Kilometre Array”—a radio telescope in development that will be able to survey the sky 10,000 times faster, and with 50 times the sensitivity, of any current radio instrument—and will provide better answers to the origin and evolution of the universe. The ongoing “democratization” of the supercomputing industry would also result in improvements in solar panel designs, better conductors, more effective drugs, etc. As long as global technology innovation isn’t stifled, the outlook for global productivity growth—and by extension, global economic growth and standard of living improvements—will remain bright for years to come. Advances in material design would also propel the private sector’s efforts to commercialize space travel and reduce the costs of launching satellites. Should the quantum computer be commercialized soon (note that quantum computing advances are coming at a dramatic rate), we should get ready for the next major technological revolution (and secular bull market) by 2015 to 2020. Make no mistake: The impact of the next technological revolution will dwarf that of the first and second industrial revolutions.