Life After Google

by George Gilder

324 passages marked

From our minds open potential metaverses, infinite dimensions of imaginative reality: counterfactuals, analogies, interpretive emotions, flights of thought and creativity. The novelist Neal Stephenson, who coined the term "metaverse," and Jaron Lanier, who pioneered "virtual reality," were right to explore them and value them. Without dimensions beyond the flat universe, our lives and visions wane and wither.

This analogy of the "flat universe" had come to me after reading C. S. Lewis's essay "Transposition," which posed the question: If you lived in a two-dimensional landscape painting, how would you respond to someone earnestly telling you that the 2D image was just the faintest reflection of a real 3D world? Comfortable in the cave of your 2D mind, you had 2D theories that explained all you experienced in flatland: the pigments of paint, the parallax relationships of near and far objects, the angles and edges. The math all jibed. "Three dimensions?" you might ask. "I have no need for that hypothesis."

Around the time of Back to the Future: The Ride in the early 1990s, I was prophesying the end of television and the rise of networked computers. In the 1994 edition of Life after Television, I explained, "The most common personal computer of the next decade will be a digital cellular phone with an IP address...connecting to thousands of databases of all kinds." As I declared in scores of speeches, "it will be as portable as your watch and as personal as your wallet; it will recognize speech and navigate streets; it will collect your mail, your news and your paycheck." Pregnant pause. "It just may not do Windows. But it will do doors: your front door and doors of perception."

On a deeper level, the world of Google (its interfaces, its images, its videos, its icons, its philosophy) is 2D. Google is not just a company but a system of the world. And the Internet is cracking under the weight of this ideology. Its devotees uphold the flat-universe theory of materialism: the sufficiency of deterministic chemistry and mathematics. They believe the human mind is a suboptimal product of random evolutionary processes. They believe in the possibility of a silicon brain.

In defense, Silicon Valley seems to have adopted what can best be described as a neo-Marxist political ideology and technological vision.

Marxism is much discussed as a vessel of revolutionary grievances, workers' uprisings, divestiture of chains, critiques of capital, catalogs of classes, and usurpation of the means of production. At its heart, however, the first Marxism espoused a belief that the industrial revolution of the nineteenth century solved for all time the fundamental problem of production.

The first industrial revolution, comprising steam engines, railways, electric grids, and turbines (all those "dark satanic mills"), was, according to Marx, the climactic industrial breakthrough of all time.

In The German Ideology (1845), Marx fantasized that communism would open to all the dilettante life of a country squire: "Society regulates the general production and thus makes it possible for me to do one thing today and another tomorrow, to hunt in the morning, to fish in the afternoon, rear cattle in the evening, criticize after dinner, just as I have a mind, without ever becoming hunter, fisherman, shepherd or critic."

Marx was typical of intellectuals in imagining that his own epoch was the final stage of human history. William F. Buckley used to call it an immanentized eschaton, a belief that the "last things" were taking place in one's own time. The neo-Marxism of today's Silicon Valley titans repeats the error of the old Marxists in its belief that today's technology (not steam and electricity, but silicon microchips, artificial intelligence, machine learning, cloud computing, algorithmic biology, and robotics) is the definitive human achievement. The algorithmic eschaton renders obsolete not only human labor but the human mind as well.

AI is believed to be redefining what it means to be human, much as Darwin's On the Origin of Species did in its time. While Darwin made man just another animal, a precariously risen ape, Google Marxism sees men as intellectually inferior to the company's own algorithmic machines.

Life after Google makes the opposing case that what the hyperventilating haruspices Yuval Harari, Nick Bostrom, Larry Page, Sergey Brin, Tim Urban, and Elon Musk see as a world-changing AI juggernaut is in fact an industrial regime at the end of its rope. The crisis of the current order in security, privacy, intellectual property, business strategy, and technology is fundamental and cannot be solved within the current computer and network architecture.

In business, the ability to conduct transactions is not optional. It is the way all economic learning and growth occur. If your product is "free," it is not a product, and you are not in business, even if you can extort money from so-called advertisers to fund it.

These centralized fortresses violated the Coase Theorem of corporate reach. In a famous paper, the Nobel laureate economist Ronald Coase calculated that a business should internalize transactions only to the point that the costs of finding and contracting with outside parties exceed the inefficiencies incurred by the absence of real prices, internal markets, and economies of scale. The concentration of data in walled gardens increases the cost of security. The industry sought safety in centralization. But centralization is not safe.

The company store was not a great advance of capitalism during the era of so-called "robber barons," and it is no better today when it is dispersed through the cloud, funded through advertising, and combined with a spurious sharing of free goods. Marxism was historically hyperbolic the first time round, and the new Marxism is delusional today. It is time for a new information architecture for a globally distributed economy.

Alphabet, Google's holding company, is now the second-largest company in the world. Measured by market capitalization, Apple is first. Joined by Amazon and Microsoft, and followed avidly by Facebook in seventh, the four form an increasingly feared global oligopoly.

This increasing global dominance of U.S. information companies is unexpected. Just a decade ago leading the list of the companies with the largest market caps were Exxon, Walmart, China National Petroleum, and the Industrial and Commercial Bank of China. No Internet company made the top five. Today four of the top five are American vessels of information technology.

That phrase, "system of the world," which I borrow from Neal Stephenson's Baroque Cycle novel about Isaac Newton and Gottfried Wilhelm Leibniz, denotes a set of ideas that pervade a society's technology and institutions and inform its civilization.

In his eighteenth-century system of the world, Newton brought together two themes. Embodied in his calculus and physics, one Newtonian revelation rendered the physical world predictable and measurable. Another, less celebrated, was his key role in establishing a trustworthy gold standard, which made economic valuations as calculable and reliable as the physical dimensions of items in trade.

Since Claude Shannon in 1948 and Peter Drucker in the 1950s, we have all spoken of the information economy as if it were a new idea.

Newton's biographers typically underestimate his achievement in establishing the information theory of money on a firm foundation.

All wealth is the product of knowledge. Matter is conserved; progress consists of learning how to use it. Newton's knowledge, embodied in his system of the world, was what most critically differentiated the long millennia of economic doldrums that preceded him from the three hundred years of miraculous growth since his death. The failure of his alchemy gave him, and the world, precious knowledge that no rival state or private bank, wielding whatever philosopher's stone, would succeed in making a better money. For two hundred years, beginning with Newton's appointment to the Royal Mint in 1696, the pound, based on the chemical irreversibility of gold, was a stable and reliable monetary Polaris.

Under Newton's gold standard, the horizons of economic activity expanded. Scores of thousands of miles of railway lines spread across Britain and the empire, and the sun never set on the expanding circles of trust that underlay British finance and commerce. Perhaps the most important result of free commerce was the end of slavery. Reliable money and free and efficient labor markets made ownership of human laborers unprofitable. Commerce eclipsed physical power.

In the Google era, Newton's system of the world (one universe, one money, one God) is now in eclipse. His unitary foundation of irreversible physics and his irrefragable golden money have given way to infinite parallel universes and multiple paper moneys manipulated by fiat. Money, like the cosmos, has become relativistic and reversible at will. The three hundred years of Newtonian prosperity having come to an end, the new multiverse seems unable to repeat the miracle of a golden age of capitalism. It is now widely held that citizens are essentially owned by the state on which they depend. Slavery, in the form of servitude to governments, is making a comeback as money transactions become less trustworthy.

Fortunately the lineaments of a new system of the world have emerged. It could be said to have been born in early September 1930, when a gold-based Reichsmark was beginning to subdue the gales of hyperinflation that had ravaged Germany since the mid-1920s.

The site of the unnoticed birth was Königsberg, the historic seven-bridged Gothic city on the Baltic. The great mathematician Leonhard Euler had proved in the early eighteenth century that all seven bridges could not be traversed in a single walk without crossing at least one of them twice.

Mathematicians gathered in Königsberg that September for a conference of the Society of German Scientists and Physicians to be addressed by one of the giants of their field, David Hilbert. Himself a son of Königsberg and about to retire from the University of Göttingen, Hilbert was the renowned champion of the cause of establishing mathematics at the summit of human thought.

Hilbert had defined the challenge in 1900: to reduce all science to mathematical logic, based on deterministic mechanical principles. As he explained to the society, "The instrument that mediates between theory and practice, between thought and observation, is mathematics; it builds the connecting bridge and makes it stronger and stronger. Thus it happens that our entire present-day culture, insofar as it rests on intellectual insight into and harnessing of nature, is founded on mathematics."

And what was mathematics founded on? Responding to the Latin maxim ignoramus et ignorabimus ("we do not know and will not know"), Hilbert declared: "For us [mathematicians] there is no ignorabimus, and in my opinion none whatever in natural science. In opposition to the foolish ignorabimus our slogan shall be: 'We must know, we will know'" (Wir müssen wissen, wir werden wissen), a declaration that was inscribed on his tombstone.

Preceding the conference was a smaller three-day meeting on the "Epistemology of the Exact Sciences" addressed by the rising mathematical stars Rudolf Carnap, a set theorist; Arend Heyting, a mathematical philosopher; and John von Neumann, a polymathic prodigy and Hilbert's assistant. All were soldiers in Hilbert's epistemological campaign, and all, like Hilbert, expected the pre-conference to be a warmup for the triumphalist celebration of the main conference.

But the triumphs of information theory and technology had put an end to the idea of a determinist and complete mathematical system for the universe.

Gödel's argument was iconoclastic. But his method of proving it was providential. He devised a set of algorithms in which all the symbols and instructions were numbers. Thus in refuting the determinist philosophy behind the mathematics of Newton and the imperial logic of Hilbert, he opened the way to a new mathematics, the mathematics of information. From this demarche emerged a new industry of computers and communications currently led by Google and informed by a new mathematics of creativity and surprise.

As the philosopher William Briggs observes, "Gödel proved that axiomatizing never stops, that induction (intuition) must always be present, that not all things can be proved by reason alone." This recognition would liberate von Neumann himself. Not only could men discover algorithms, they could compose them. The new vision ultimately led to a new information theory of biology, anticipated in principle by von Neumann and developed most fully by Hubert Yockey, in which human beings might eventually reprogram parts of their own DNA.

More immediately, Gödel's proof prompted Alan Turing's invention in 1936 of the Turing machine, the universal computing architecture with which he showed that computer programs, like other logical schemes, not only were incomplete but could not even be proved to reach any conclusion. Any particular program might cause it to churn away forever. This was the "halting problem." Computers required what Turing called "oracles" to give them instructions and judge their outputs.

Gödel's insights led directly to Claude Shannon's information theory, which underlies all computers and networks today. Conceiving the bit as the basic unit of digital computation, Shannon defined information as surprising bits: bits not predetermined by the machine.

Shannon's canonical equation translated Ludwig Boltzmann's analog entropy into digital terms. Boltzmann's equation, formulated in 1877, had broadened and deepened the meaning of entropy as "missing information". Seventy years and two world wars later, Shannon was broadening and deepening it again. Boltzmann's entropy is thermodynamic disorder; Shannon's entropy is informational disorder, and the equations are the same.
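For reference, the two equations the passage compares can be set side by side; the notation below is the standard textbook form rather than Gilder's own.

```latex
H = -\sum_{i} p_i \log_2 p_i
\qquad \text{(Shannon: bits of missing information in a source with symbol probabilities } p_i\text{)}

S = k_B \ln W
\qquad \text{(Boltzmann: entropy of a system with } W \text{ equally likely microstates)}
```

For a uniform source (every p_i = 1/W), H reduces to log2 W, so the two expressions differ only by the constant factor k_B ln 2, which is the sense in which "the equations are the same."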

Gödel's work, and Turing's, led to Gregory Chaitin's concept of algorithmic information theory. This important breakthrough tested the "complexity" of a message by the length of the computer program needed to generate it. Chaitin proved that physical laws alone, for example, could not explain chemistry or biology, because the laws of physics contain drastically less information than do chemical or biological phenomena. The universe is a hierarchy of information tiers, a universal "stack," governed from the top down.
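In symbols, the complexity measure described here is the standard Kolmogorov-Chaitin definition (the notation is assumed, not quoted from the book):

```latex
K_U(x) = \min \{\, |p| \;:\; U(p) = x \,\}
```

where U is a universal computer, p a program, and |p| its length in bits. A compact set of physical laws carries few bits, which is why, on this measure, it cannot by itself generate the far larger information content of chemistry or biology.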

Cleaving all information is the great divide between creativity and determinism, between the information entropy of surprise and the thermodynamic entropy of predictable decline, between stories that capture a particular truth and statistics that reveal a sterile generality, between cryptographic hashes that preserve information and mathematical blends that dissolve it, between the butterfly effect and the law of averages, between genetics and the law of large numbers, between singularities and big data; in a word, the impassable gulf between consciousness and machines.

This new system of the world was consummated by the company we know as Google. Google, though still second in the market-cap race, is by far the most important and paradigmatic company of our time. Yet I believe the Google system of the world will fail, indeed be swept away in our time (and I am seventy-eight!). It will fail because its every major premise will fail.

As with the gentle image of cloud computing, Google's theory of money and prices seems at first utterly benign and even in some sense deeply Christian. For Google ordains that, at least within the realm under its direct control, there shall be no prices at all. With a few small (but significant) exceptions, everything Google offers to its "customers" is free. Internet searches are free. Email is free. The vast resources of the data centers, costing Google an estimated thirty billion dollars to build, are provided essentially for free.

This philosophy rules our economic lives in America and, increasingly, around the globe. With its development of "deep learning" by machines and its hiring of the inventor-prophet Raymond Kurzweil in 2014, Google enlisted in a chiliastic campaign to blend human and machine cognition. Kurzweil calls it a "singularity," marked by the triumph of computation over human intelligence. Google networks, clouds, and server farms could be said to have already accomplished much of it.

By exploring search, what Page called "the intersection between computer science and metaphysics," Google was plunging into profound issues of philosophy and neuroscience. Search implies a system of the world: it must begin with a "mirror world," as the Yale computer scientist and philosopher David Gelernter puts it, an authentic model of the available universe. In order to search something with a computer, you must translate its corpus into digital form: bits and bytes defined by Shannon as irreducible binary units of information. Page and Brin set out to render the world, beginning with its simulacrum, the World Wide Web, as a readable set of digital files, a "corpus" of accessible information, an enormous database.

As the years passed, Google digitized nearly all of the available books in the world (2005), the entire tapestry of the world's languages and translations (2010), the topography of the planet (Google Maps and Google Earth, 2007), down to the surfaces and structures on individual streets (StreetView) and their traffic (Waze, 2016). It digitized even the physiognomies of the world's faces in its digital facial recognition software (2006, now upgraded massively and part of Google Photos). With the capture of YouTube in 2006, Google commanded an explosively expanding digital rendition of much of the world's imagery, music, and talk.

Rollerblading down the corridors of Stanford's computer science pantheon in the madcap spirit of Claude Shannon, the Google founders consorted with such academic giants as Donald Knuth, the conceptual king of software, Bill Dally, a trailblazer of parallel computation, and even John McCarthy, the founding father of artificial intelligence.

By 1998, Brin and Page were teaching the course CS 349, "Data Mining, Search, and the World Wide Web." Sun founder Andy Bechtolsheim, Amazon founder Jeff Bezos, and Cisco networking guru Dave Cheriton had all blessed the Google project with substantial investments. Stanford itself earned 1.8 million shares in exchange for Google's access to Page's patents held by the university. (Stanford had cashed in those shares for $336 million by 2005).

Accessed through a password system named Gaia, after the earth goddess, this digital mirror world and its uncountable interactions comprised a dynamic microcosm worthy of a googolplex. As Page put it, "We don't always produce what people want; it's really difficult. To do that you have to be smart; you have to understand everything in the world. In computer science, we call that artificial intelligence."

An ordinary company of the previous system might have sold access to this information or collected royalties on licenses for the software needed to reach it. By developing efficient and hassle-free transactional systems, optimizing its computer processing, and driving down costs as it expanded in scale, Google might have garnered massive profits over the years. As little as a penny a search on its forty-two-kilohertz (forty-two thousand searches a second) find-and-fetch engine would produce some $13 billion of revenues per year, most of that falling to the bottom line. But as prices dropped, purchases would mount and accumulated profits would rise on the model of all capitalist growth.
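The $13 billion figure is straightforward arithmetic on the numbers just given, assuming roughly 31.5 million seconds in a year:

```latex
42{,}000 \ \tfrac{\text{searches}}{\text{second}} \times \$0.01 \times 3.15 \times 10^{7} \ \tfrac{\text{seconds}}{\text{year}} \approx \$1.3 \times 10^{10} \ \text{per year}
```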

Google, however, was not a conventional company. It made the fateful and audacious decision to make all its content and information available free: in economic terms, a commons, available to all, in the spirit of the Internet pioneer Stewart Brand, whose slogan was "Information wants to be free."

The Google philosophy smacks of disdain for the money-grubbing of bourgeois society. As the former engineering director, Alan Eustace, puts it, "I look at people here as missionaries, not mercenaries." Google doesn't sweat to supply goods and services for cash and credit. It provides information, art, knowledge, culture, enlightenment, all for no charge.

Yet, as everyone now knows, this apparently sacrificial strategy has not prevented Google from becoming one of the world's most valuable companies. Still in first place as of this writing is Apple, twenty years older, riding on the crest of the worldwide market for its coveted iPhones, but Google is aiming for the top spot with its free strategy. In 2006, it purchased Android, an open source operating system that is endowing companies around the globe, including itself, with the ability to compete with the iPhone.

Apple is an old-style company, charging handsomely for everything it offers. Its CEO, Tim Cook, recall, is the author of the trenchant insight that "if the service is 'free,' you are not the customer but the product." Apple stores make ten times more per square foot than any other retailer. If the market turns against its products, if Samsung or Xiaomi or HTC or LG or Lenovo or Tecno or Zopo or whatever Asian knockoff pops up in the market fueled by Google at an impossibly low price, Apple may slip rapidly down the list.

Page's and Brin's crucial insight was that the existing advertising system, epitomized by Madison Avenue, was linked to the old information economy, led by television, which Google would overthrow. The overthrow of TV by computers was the theme of my book Life after Television. If Google could succeed in its plan to "organize the world's information" and make it available, the existing advertising regime could be displaced.

Brin and Page began with the idea of producing a search engine maintained by a nonprofit university, operated beyond the corruption of commerce. They explained their view of advertising in their 1998 paper introducing their search engine:

Steven Levy's definitive book on Google describes the situation as Google developed its ad strategy in 1999: "At the time the dominant forms of advertising on the web were intrusive, annoying and sometimes insulting. Most common was the banner ad, a distracting color rectangle that would often flash like a burlesque marquee. Other ads hijacked your screen."

Google led the world in grasping that this model is not only unsustainable but also unnecessary. Brin and Page saw that the information conferred by the pattern of searches was precisely the information needed to determine what ads viewers were likely to welcome. From its search results, it could produce ads that the viewer wanted to see.

According to Levy, Google concluded that "the advertisement should not be a two-way transaction between publisher and advertiser but a three-way transaction including the user." But in practice, following its rule "to focus on the user and all else will follow," Google made it a one-way appeal to the user.

Google understood that unless the user actually wanted the ad, it would not serve the advertiser either and would therefore ultimately threaten the advertising intermediaries as well. In the terms of Life after Television, the promise of the Internet under Google's scheme would be that "no one would have to read or see any unwanted ads." Ads would be sought, not fought. To accomplish this goal, Google designated its ads as "sponsored links" and charged only for successful appeals measured by click-throughs. It used the same measure to calculate an ad's effectiveness and quality, forcing advertisers to improve their ads by removing those that did not generate enough click-throughs.
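A minimal sketch of the incentive structure just described, assuming, purely as an illustration and not as Google's actual auction formula, that ads are ranked by bid times observed click-through rate, charged per click, and culled when their click-through rate stays too low. The names and thresholds are invented.

```python
from dataclasses import dataclass

@dataclass
class Ad:
    advertiser: str
    bid_per_click: float      # what the advertiser pays for each click
    clicks: int = 0
    impressions: int = 0

    @property
    def ctr(self) -> float:
        """Observed click-through rate: the ad's measured 'quality'."""
        return self.clicks / self.impressions if self.impressions else 0.0

def rank_ads(ads: list[Ad]) -> list[Ad]:
    # Expected revenue per impression = bid * click-through rate, so a cheap ad
    # that users actually want can outrank an expensive one they ignore.
    return sorted(ads, key=lambda a: a.bid_per_click * a.ctr, reverse=True)

def cull(ads: list[Ad], min_ctr: float = 0.005) -> list[Ad]:
    # Ads that fail to generate enough click-throughs are dropped,
    # forcing advertisers to improve them.
    return [a for a in ads if a.impressions < 1000 or a.ctr >= min_ctr]

if __name__ == "__main__":
    ads = [Ad("A", 2.00, clicks=5, impressions=1000),
           Ad("B", 0.50, clicks=40, impressions=1000),
           Ad("C", 3.00, clicks=1, impressions=1000)]
    for ad in cull(rank_ads(ads)):
        print(ad.advertiser, round(ad.bid_per_click * ad.ctr, 4))
```

Ranking by expected value per impression rather than by bid alone is what makes an ad the user ignores worthless to the advertiser and to the intermediary alike.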

Google's new free economic model has penetrated even its corporate lunch rooms, the company having made the remarkable discovery that a cafeteria can be far more efficient if it does not bother to charge its patrons. At first Google set up a system of terminals to collect money from its employees for their food. The system itself cost money, and it led to queues of valuable Google engineers wasting company time as they waited to pay. Cheaper and easier and altogether trans-capitalistically cooler was simply giving away the food.

In 2009, the Stanford philosopher Fred Turner published a paper titled "Burning Man at Google: A Cultural Infrastructure for New Media Production," in which he unveiled the religious movement behind Google's system of the world.

Like Google, Burning Man might be termed a commons cult: a communitarian religious movement that celebrates giving (free offerings with no expectation of return) as the moral center of an ideal economy of missionaries rather than mercenaries. It conveys the superiority of "don't be evil" Google, in contrast to what Silicon Valley regards as the sinister history of Microsoft in the North.

Burning Man's website, like Google's, presents a decalogue of communal principles. Authored by the founder Larry Harvey in 2004, the "10 Principles of Burning Man" would seem on the surface incompatible with the ethos of a giant corporation raking in money and headed by two of the world's richest men:

Brin and Page see no contradiction between Burning Man's ethos and Google's. They attend Burning Man often, as does Eric Schmidt, whose hiring was allegedly eased by the knowledge that he was a fellow devotee. Google's headquarters, Building 43 in Mountain View, is often decorated with photographs of the desert rites. The first Google logo bore a burning man stick figure.

Echoing the 10 Principles of Burning Man is Google's corporate page presenting "Our Philosophy," a guide to its system of the world in the form of "ten things we know to be true." These ten principles, like Burning Man's, seem unexceptionable on the surface, but each item harbors a subversive subtext.

As Jerry Bowyer writes in Forbes, "If advertising dies [as support for media], then what we call media dies too. The whole system which started with newspapers, moved on to radio, then TV, and then various forms of blogging and streaming is basically the same business model:

For aggregating audiences and eyeballs, nothing works so well as giving services away for "free." Sergey Brin asked the crucial question early in Google's history: "How does the strategy change if the price is zero?" The answer turned out to be: "We win the entire market." In 2014, Google summoned Jeremy Rifkin to its lecture series to sum it all up. He heralded a "zero marginal cost society." Under the new regime, the price of every incremental good and service, from search to software, from news to energy, will plunge toward "free" as every device and entity in the world is subsumed in an Internet of Things, where exponential network effects yield a new economy of leisure and abundance. Rifkin assured his audience that it is indeed a Google world.

But not only is "free" a lie, as we've seen, but a price of zero signifies a return to the barter system, a morass of incommensurable exchanges that the human race left behind in the Stone Age. You pay not with money but with your attention.

Larry Page's burning ambition in starting Google, according to Doug Edwards, "Google Employee Number 59," was to "stop the world wasting his time." He may well have succeeded by now, save for the occasional subpoena from an officious regulator somewhere. But for the rest of us, all the free stuff leads to transactional tricks and traps:

It's the "Free World," and it is reaching past your wallet, spurning your earned money, to seize your time-which is actually your life.

Slowly but surely the advertising model is decaying. According to a 2014 study quoted by Needham & Company's Laura Martin, over the past seventy years daily media usage has doubled from five hours to ten hours per person. Free porn is both a vessel and a symbol of the addictive properties of free stuff. Meanwhile, ads delivered per person have remained stable at around 350 per day. Ads viewed per hour of media use, including print media, have dropped by half. In a world of digital devices, people are learning to cancel, mute, or avoid advertisements that they do not want to see. As soon as the next generation of innovators creates a new payment and security model, this trend will accelerate.
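The halving mentioned above follows directly from the two figures cited, as a quick check shows:

```latex
\frac{350 \ \text{ads/day}}{5 \ \text{hours/day}} = 70 \ \text{ads/hour}
\qquad \longrightarrow \qquad
\frac{350 \ \text{ads/day}}{10 \ \text{hours/day}} = 35 \ \text{ads/hour}
```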

Beyond the suppliers of ads that no one wishes to see, Google's main role is intermediator. Although Google's list of business principles leads off with "The customer comes first," Google has few end customers at all. Beyond the coddled purchasers of its ads, Google's customer base is tiny compared with Amazon's, which unlike Google was never shy about collecting money.

A blogger called Daniel Colin James came to my attention through a post on my Telecosm Lounge message board. Writing at a blog called Hacker Noon ("where hackers start their afternoons"), James has cogently documented Google's advertising vulnerabilities. His revelations begin with Apple's decision late in 2015 to introduce an ad blocker in its iPhone. This was a major blow against the online strategy of "aggregate and advertise," which was widely alleged to be Google's path to permanent near-monopoly. Since the iPhone is the source of some 75 percent of all Google's mobile ad revenues, Apple's move struck at the heart of Google's mobile strategy. Beyond its free, open source, "sharing-economy" Android platform, Google's response did not arrive until a year later. Then it chose deceptively to copy Apple.

Google's industry-leading advertising Analytics tools apparently revealed that its users liked the idea of blocking ads. Customers come first, so in its Chrome browser, Google introduced its own ad-blocker.

This success followed Amazon's coup in cloud services. Although Google by all measures commanded the world's leading cloud deployment, somehow Amazon defeated it in marketing cloud services by 57 percent to 16 percent as of 2017. This advance in collecting money from real customers must have been baffling to Google. It fought back, as it normally does, with a stream of YouTube speeches and technical presentations demonstrating the superiority of Google's cloud offerings, its global SQL reach, its facile user interfaces, its instant responses, its MapReduce, Hadoop, and "Spanner" big-database schemes, its massive fiber deployments and world-spanning data centers, its idealism, its tech conference éclat. But somehow when people had to choose a cloud service, they were turning not to Google but to Amazon Web Services. Who would have thunk it?

Google, meanwhile, under its new CEO, Sundar Pichai, pivoted away from its highly publicized "mobile first" mantra, which had led to its acquisitions of Android and AdMob, and toward "AI first."

If Google had been a university, it would have utterly outshone all others in AI talent. It must have been discouraging, then, to find that Amazon had shrewdly captured much of the market for AI services with its 2014 Alexa and Echo projects. It launched actual hardware to bring AI to everyone's household in the form of elegantly designed devices that answered questions and ordered products while eschewing ads.

The most important effect of free, though, is not avoidance of liabilities to real customers. It is escape from the challenges of security.

This very lack of concern with security, however, will be Google's undoing. For every other player on the Net, the lack of security is the most relevant threat to its current business model. The problem will be solved. Some thousands of companies you've never heard of are investing billions right now in that effort. Collectively they will give birth to a new network whose most powerful architectural imperative will be security of transactions as a property of the system rather than an afterthought. So fundamental will security be to this new system that its very name will be derived from it. It will be the cryptocosm.

Security means not some average level of surveillance at the network level but the safety of your own identity, your own device, and your own property. You occupy and control a specific time and space. You cannot be blended or averaged. Just as you are part of a biological ledger, inscribed through time in DNA codes and irreversible by outside power, your properties and transactions compose an immutable ledger. Just as you are bound in time, every entry in the cryptocosmic ledger is time-stamped.

The second rule of the cryptocosm derives from the first: "Centralization is not safe." Secure positions are decentralized ones, as human minds and DNA code are decentralized. Darwin's mistake, and Google's today, is to imagine that identity is a blend rather than a code, that machines can be a singularity but human beings are random outcomes.

The third rule is "Safety last." Unless the architecture achieves its desired goals, safety and security are irrelevant. Security is a crucial asset of a functional system. Requiring the system to be safe at every step of construction results in a kludge: a machine too complex to use.

The fourth rule is "Nothing is free." This rule is fundamental to human dignity and worth. Capitalism requires companies to serve their customers and to accept their proof of work, which is money.

The fifth rule is "Time is the final measure of cost." Time is what remains scarce when all else becomes abundant: the speed of light and the span of life. The scarcity of time trumps an abundance of money.

The sixth rule: "Stable money endows humans with dignity and control." Stable money reflects the scarcity of time. Without stable money, an economy is governed only by time and power.

The seventh rule is the "asymmetry law," reproducing biological asymmetry. A message coded by a public key can be decrypted only by the private key, but the private key cannot be calculated from the public key. Asymmetric codes that are prohibitively difficult to break but easy to verify give power to the people. By contrast, symmetrical encryption gives power to the owners of the most costly computers.

The eighth rule is "Private keys rule." They are what is secure. They cannot be blended or changed from on top any more than your DNA can be changed or blended from above.

The ninth rule is "Private keys are held by individual human beings, not by governments or Google." Private keys enforce property rights and identities. In a challenge-response interaction, the challenger takes the public key and encrypts a message. The private responder proves identity by decrypting, amending, and returning the message encrypted anew with his private key. This process is a digital signature.
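A toy numerical sketch of that challenge-response exchange, using textbook RSA with deliberately tiny primes so the asymmetry is visible. Real systems use vetted cryptographic libraries, padding, hashing, and keys thousands of bits long; every number and name below is illustrative only.

```python
# Toy RSA for illustration only: tiny primes, no padding, no hashing.
p, q = 61, 53
n = p * q                    # 3233, the public modulus
phi = (p - 1) * (q - 1)      # 3120
e = 17                       # public exponent (part of the public key)
d = pow(e, -1, phi)          # private exponent: e*d = 1 (mod phi) -> 2753

def encrypt_with_public_key(m: int) -> int:
    return pow(m, e, n)      # anyone can do this

def sign_with_private_key(m: int) -> int:
    return pow(m, d, n)      # only the holder of d can do this

# The challenger encrypts a nonce with the responder's public key...
challenge = encrypt_with_public_key(42)

# ...the responder proves identity by decrypting it with the private key,
# amending it, and signing the reply.
recovered = pow(challenge, d, n)          # 42 again
reply = recovered + 1                     # the amended message
signature = sign_with_private_key(reply)

# The challenger verifies the signature with the public key alone;
# deriving d from (e, n) would require factoring n.
assert pow(signature, e, n) == reply
print("identity verified:", recovered == 42)
```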

The tenth rule is "Behind every private key and its public key is the human interpreter." A focus on individual human beings makes meaningful security.

Google is hierarchical. Life after Google will be heterarchical.

In a 2005 technical article, Google's operations chief, Urs Hölzle, explained why. The price of high-end processors "goes up nonlinearly with [their] performance," he observed. That is, Intel's high-end microprocessors cost increasingly more than they are worth in incremental output. The chips hit what might be called Mundie's Wall. When he was Microsoft's technical chief, Craig Mundie said:

Nick Tredennick, the designer of a favorite "hot chip" of yore, the Motorola 68000 microprocessor behind Steve Jobs's Macintosh computer, used to say that the industry seeks to exploit the "leading edge wedge." Three overlapping design targets converged in this fertile crescent of chip design: zero delay (fast hot chips), zero power (cool low-energy devices), and zero cost (transistors going for billionths of a penny). Between the 1980s and 2017, chips have been migrating from the hot, fast end toward the cool, cheap end, a trend Dally has led.

While the maps come from Google, the processing comes from Nvidia GPUs. These chips compute the car's response to lidar, radar, ultrasound, and camera signals that free the missile to descend from the outer space of Elon Musk's domains and enter the ever-changing high-entropy world beyond Google Maps.

Machine learning, Dally points out, is mostly accomplished by graphics processing chips from Nvidia. Some advances in artificial intelligence spring from improvements in algorithms, but the real source of these capabilities is the explosive improvement in computer speed achieved through a combination of Moore's Law and parallel processing. Nvidia's graphics processors are the climax of Dally's long career as a prophet of parallel processing, which began thirty years ago at Virginia Tech, where he studied the virtues of multiple processors functioning together.

At a Hot Chips conference at Stanford in August 1991, Dally and Norm Jouppi first emerged as foils in fashioning the future philosophies of computation. Dally introduced his revolutionary massively parallel J-machine, and Jouppi, now at Google, then at Digital Equipment, touted the promise of revving up existing processor pipelines to "five instructions per clock cycle."

Twenty-six years later, Dally and Jouppi are still at it. At the August 2017 Hot Chips in Cupertino, all the big guys were touting their own chips for what they call "deep learning," the fashionable Silicon Valley term for the massive acceleration of multi-layered pattern recognition, correlation, and correction tied to feedback that results in a cumulative gain in performance. What they call "learning" originated in earlier ventures in AI. Guess, measure the error, adjust the answer, feed it back: these are the canonical steps followed in Google's data centers, enabling such applications as Google Translate, Google Soundwriter, Google Maps, Google Assistant, Waymo cars, search, Google Now, and so on, in real time.

As recently as 2012, Google was still struggling with the difference between dogs and cats. YouTube was famous for its cat videos, but it could not efficiently teach its machines to recognize the cats. They could count them; the data center dogs could dance; but it took sixteen thousand microprocessor cores and six hundred kilowatts. And it still was a dog, with a 5 percent error rate, not an impressive portent for Google's human face-recognition project or for car vision systems that need flawlessly to identify remote objects in real time.

In a joint appearance in 2012 in Aspen, Peter Thiel chided Eric Schmidt: "You don't have the slightest idea of what you are doing." He pointed out that the company had amassed some $50 billion in cash at the time and was allowing it to sit in the bank at near-zero interest rates while its vast data centers still could not identify cats as well as a three-year-old could. Thiel is the leading critic of Silicon Valley's prevailing philosophy of "inevitable" innovation. Page, on the other hand, is a machine-learning maximalist who believes that silicon will soon outperform human beings, however you want to define the difference. If the haphazard Turing machine of evolution could produce human brains, just imagine what could be accomplished by Google's constellation of eminent academics devoting entire data centers full of multi-gigahertz silicon to training machines on petabytes of data. In 2012, though, the results seemed underwhelming.

Simultaneously with the dogs and cats crisis in 2012, the leader of the Google Brain research team, Jeff Dean, raised the stakes by telling Urs Hölzle, Google's data center dynamo, "We need another Google."

The man who built the first crude graphics processor, the precursor of all of Google's data center neural networks, was Frank Rosenblatt, a psychology professor at Cornell. In 1958 he described his "perceptron" to the New Yorker: "If a triangle is held up to the perceptron's eye [photosensor], the association units connected with the eye pick up the image of the triangle and convey it along a random succession of lines to the response units [now called neurons], where the image is registered.... [A]ll the connections leading to that response are strengthened [i.e., their weights are increased], and if a triangle of a different size and shape is held up to the perceptron, its image will be passed along the track that the first triangle took. If a square is presented, however, a new set of random lines is called into play.... The more images the perceptron is permitted to scan, the more adroit its generalizations.... It can tell the difference between a dog and a cat."
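A minimal sketch of the learning rule Rosenblatt is describing: a single layer of weighted connections, strengthened or weakened according to whether the response unit should have fired. The two "retina" features and their labels below are invented for illustration.

```python
def train_perceptron(samples, epochs=20, lr=0.1):
    """Rosenblatt's rule for one layer: raise the weights on connections
    that should have fired, lower those that fired wrongly.
    samples: list of (features, label) pairs with label in {0, 1}."""
    n = len(samples[0][0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in samples:
            fired = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
            err = y - fired                              # +1, 0, or -1
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Two made-up "retina" features per shape; label 1 = triangle, 0 = square.
data = [([1.0, 0.5], 1), ([0.8, 1.0], 1), ([-1.0, -0.5], 0), ([-0.7, -1.0], 0)]
w, b = train_perceptron(data)
classify = lambda x: 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
print([classify(x) for x, _ in data])    # [1, 1, 0, 0] once the weights settle
```

A single layer can only separate what a straight line (or plane) can separate, which is the limitation Minsky exposed and which stacking layers was meant to overcome.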

Four years later, Ray Kurzweil, then sixteen, visited Rosenblatt after Kurzweil's MIT mentor Marvin Minsky exposed the limitations of the one-layer perceptron that Rosenblatt had built. Rosenblatt told Kurzweil that he could surmount these limitations by stacking perceptrons on top of one another in layers. "The performance improves dramatically," he said. Rosenblatt died in a boating accident eight years later, never having built a multilayered machine.

Now at Google, that omission was being remedied. Dally assigned Nvidia's software guru Frank Canizaro to work with Andrew Ng on upgrading Nvidia's proprietary software CUDA (Compute Unified Device Architecture) for use in its CUDA Deep Neural Network library (cuDNN).

Dally was proud of this achievement. The Nvidia machine was roughly 150 times as cost-effective as Google's previous setup. And that's not even taking into account the GPUs' enormous advantage in energy efficiency. Nvidia processors soon pervaded Google's data centers, giving unprecedented performance in the matrix multiplications and accumulations at the heart of machine learning.

None of this would faze Dally, except for one crucial change at Google. At the 2017 Hot Chips conference, the company, in a do-it-yourself mood, indicated that it would henceforth replace Nvidia's devices with its own special-purpose silicon. Jeff Dean celebrated Jouppi's souped-up "Tensor" matrix multiplier, which eschewed graphics and floating point, focusing on the machine-learning functions alone. It is a matrix-multiplier ASIC (application-specific integrated circuit). Without their Tensor processing unit, say the Google guys, they would have had to double the size of their data centers.

Since writing his college thesis in the late 1970s, Dally has rebelled against the serial step-by-step computing regime known as the von Neumann architecture. After working on the "Cosmic Cube" under Chuck Seitz for his Ph.D. at Caltech (1986), Dally has led the design of parallel machines at MIT (the J-machine and the M-machine), introduced massive parallelism to Cray supercomputers (the T3D and T3E), and pioneered parallel graphics at Stanford (the Imagine project, a streaming parallel device incorporating programmable "shaders," now ubiquitous in the industry's graphics processors from Nvidia and others).

In all these projects Dally was warring against the conventional computer architecture of step-by-step serial processing, associated with a memory problem called the "von Neumann bottleneck." You live in the real world, right? The real world offers intrinsically parallel problems such as images that flood the eye all at once, whether you're driving a car in the snow or summoning a metaverse with computer-generated graphics or pattern-matching in "machine learning" argosies across the seas of big data.

The von Neumann bottleneck was recognized by von Neumann himself. In response, he proposed a massively parallel architecture called cellular automata, which led to his last book before his death at age fifty-three. In The Computer and the Brain, he contemplated a parallel solution called neural networks, which were based on a primitive idea of how billions of neurons might work together in the human neural system.

Von Neumann concluded the brain is a non-von machine nine orders of magnitude slower than the gigahertz he prophesied for computers back in 1957. Amazingly, von Neumann anticipated the many-million-fold "Moore's Law" speedup that we have experienced. But he estimated that the brain is nine orders of magnitude (a billion times) more energy-efficient than a computer. This is a delta larger even than that claimed by the guys from Google Brain for their Tensor chip. In the age of Big Blue and Watson at IBM, the comparison remains relevant. When a supercomputer defeats a man in a game of chess or Go, the man is using maybe fourteen watts of power, while the computer and its networks are tapping into the gigawatt clouds on the Columbia River.

In the age of Big Data, the von Neumann bottleneck has philosophical implications. The more knowledge that is put into a von Neumann machine, the bigger and more crowded its memory, the further away its average data address, and the slower its functioning.

Hitting the energy wall and the light-speed barrier, the chip's architecture will necessarily fragment into separate modules and asynchronous and more parallel structures. We might term these processors time-space "mollusks," Einstein's word for entities in a relativistic world. Setting the size of the integrated circuit cell will be a measure comparable in the microcosm to light-years in the cosmos.

There is a critical difference between programmable machines and programmers. The machines are deterministic and the programmers are creative.

That means that the AI movement, far from replacing human brains, is going to find itself imitating them. The brain demonstrates the superiority of the edge over the core: it is not agglomerated in a few air-conditioned nodes but dispersed far and wide, interconnected by myriad sensory and media channels. The test of the new global ganglia of computers and cables, worldwide webs of glass and light and air, is how readily they take advantage of unexpected contributions from free human minds in all their creativity and diversity, which cannot even be measured by the metrics of computer science.

As the Silicon Valley legend Carver Mead of Caltech has shown in his decades of experiments in neuromorphic computation, any real artificial intelligence will likely have to use not silicon substrates but carbon-based materials. With some 200,000 compounds, carbon is more adaptable and chemically complex than silicon by orders of magnitude. Recent years have seen an efflorescence of new carbon materials, such as the organic light-emitting diodes and photodetectors now slowly taking over the display market. Most promising is graphene, a one-atom-deep sheet of transparent carbon that can be curled up in carbon nanotubes, layered in graphite blocks, or architected in C-60 "Buckyballs."

Graphene has many advantages. Its tensile strength is sixty times that of steel, its conductivity two hundred times that of copper. There is no band gap to slow it down, and it provides a relatively huge sixty-micron mean-free path for electrons. As the nanotech virtuoso James Tour of Rice University has demonstrated in his laboratory, graphene, carbon nanotube swirls, and their compounds make an array of nano-machines, vehicles, and engines possible. They offer the still-remote promise of new computer architectures such as quantum computers that can actually model physical reality and thus may finally yield some real intelligence.

The current generation in Silicon Valley has yet to come to terms with the findings of von Neumann and Gödel early in the last century or with the breakthroughs in information theory of Claude Shannon, Gregory Chaitin, Andrey Kolmogorov, and John R. Pierce. In a series of powerful arguments, Chaitin, the inventor of algorithmic information theory, has translated Gödel into modern terms. When Silicon Valley's AI theorists push the logic of their case to explosive extremes, they defy the most crucial findings of twentieth-century mathematics and computer science. All logical schemes are incomplete and depend on propositions that they cannot prove. Pushing any logical or mathematical argument to extremes, whether "renormalized" infinities or parallel-universe multiplicities, scientists impel it off the cliffs of Gödelian incompleteness.

Chaitin's "mathematics of creativity" suggests that in order to push the technology forward it will be necessary to transcend the deterministic mathematical logic that pervades existing computers. Anything deterministic prohibits the very surprises that define information and reflect real creation. Gödel dictates a mathematics of creativity.

One of the supremely seminal ideas of the twentieth century is the Markov chain. Introduced by the Russian mathematician and information theorist Andrey Markov in 1913, it became a set of statistical tools for predicting the future from the present.

From physics to economics, science has long had trouble coming to terms with time. Until Markov, the theory of probability, like the theory of physics, mostly avoided temporal considerations. As Amy Langville and Philipp von Hilgers write in a canonical essay, the dominant probability concepts failed to differentiate between serial and parallel processes, between "a thousand throws of a single die and a thousand dice each thrown once." Addressing the temporal dependencies between events, how one thing leads to another, Markov chains trace the probabilistic transitions from one state or condition to another, step by step through time.

Markov followed the lead of the nineteenth-century intellectual giants James Clerk Maxwell and Ludwig Boltzmann, who had pioneered this statistical mode of thought in physics. They invented probabilistic tools to describe physical phenomena, such as the hidden behavior of atoms and molecules, waves and particles, which could not be seen or measured by the scientific instruments of their day. Their statistical laws of thermodynamics provided theoretical physics a much-needed arrow of time derived from the concept of entropy.

Remarkably, the first man to expound and use these statistical tools, several years before they were publicly formulated by Markov, was Albert Einstein. In 1905, calculating the hidden behavior of molecules in Brownian motion, he showed that they occupied a chain of states that jiggled at a rate of around two gigahertz following a "random walk," as in Markov's concept. Showing the movements of atoms without seeing or measuring them, Einstein translated from what is now termed a Markov sequence of observable states of a gas to his proof of the then-still-hidden Brownian motion of the molecules.

Markov kept his head down during the Russian Revolution while working on his theory. By the time of his death in 1922, he had turned his precursors' improvisations into a full-fledged system. Markovian techniques, which pervade the science of information theory, are behind the dominant advances of the Google era, from big data and cloud computing to speech recognition and machine learning.

In an early triumph, a statistical study of Pushkin's poem Eugene Onegin, Markov showed that linguistic properties could be grasped mathematically and predicted without knowing the particular language. In focusing on patterns of vowels and consonants, Markov came close to anticipating Claude Shannon's information metric. Shannon's theory treated all transmitters across a communications channel as Markov processes.

Refining and extending Markov's discoveries through the twentieth century and into our own era were a series of transformative thinkers. Some, like Shannon, are widely celebrated. Andrew Viterbi is best known as a co-founder of Qualcomm, but perhaps his greatest feat was to develop a recursive algorithm for efficiently computing complex chains, overcoming computing costs that grew exponentially with the size of the chain.
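A minimal sketch of the dynamic-programming recursion associated with Viterbi's name: rather than scoring every possible state sequence, whose number grows exponentially with the length of the chain, it keeps only the best path into each state at each step. The two-state weather model and its probabilities are invented for illustration.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state path for an observation sequence.
    Cost grows as len(obs) * len(states)**2 instead of len(states)**len(obs)."""
    # best[s] = (probability, path) of the best path ending in state s so far
    best = {s: (start_p[s] * emit_p[s][obs[0]], [s]) for s in states}
    for o in obs[1:]:
        best = {
            s: max(
                ((p * trans_p[prev][s] * emit_p[s][o], path + [s])
                 for prev, (p, path) in best.items()),
                key=lambda t: t[0],
            )
            for s in states
        }
    return max(best.values(), key=lambda t: t[0])

# Invented toy model: hidden weather states, observed activities.
states = ("rainy", "sunny")
start_p = {"rainy": 0.5, "sunny": 0.5}
trans_p = {"rainy": {"rainy": 0.7, "sunny": 0.3},
           "sunny": {"rainy": 0.4, "sunny": 0.6}}
emit_p = {"rainy": {"walk": 0.1, "shop": 0.4, "clean": 0.5},
          "sunny": {"walk": 0.6, "shop": 0.3, "clean": 0.1}}

prob, path = viterbi(["walk", "shop", "clean"], states, start_p, trans_p, emit_p)
print(path, round(prob, 4))   # most probable weather sequence behind the observations
```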

The precocious MIT star Norbert Wiener, author of Cybernetics (1948), extended Markov sequences from discrete to continuous phenomena and contributed the idea of pruning improbable results. This advance helped calculations of rocket and airplane trajectories during World War II, using Markov math to predict the future location of moving objects by observing their current positions.

By every measure, the most widespread, immense, and influential of Markov chains today is Google's foundational algorithm, PageRank, which encompasses the petabyte reaches of the entire World Wide Web. Treating the Web as a Markov chain enables Google's search engine to gauge the probability that a particular Web page satisfies your search.

To construct his uncanny search engine, Larry Page paradoxically began with the Markovian assumption that no one is actually searching for anything. His "random surfer" concept makes Markov central to the Google era.

PageRank treats the Internet user as if he were taking a random walk across the Web, which we users know is not what we are doing.
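A minimal power-iteration sketch of that random walk: at each step the surfer either follows a link from the current page or teleports to a random one, and each page's long-run visit probability becomes its rank. The four-page web is invented, and the 0.85 damping factor is the value commonly cited for PageRank, assumed here rather than taken from the book.

```python
def pagerank(links, damping=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}                  # start the surfer anywhere
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}   # teleportation share
        for p, outs in links.items():
            targets = outs if outs else pages           # a dead-end page jumps anywhere
            share = damping * rank[p] / len(targets)
            for t in targets:
                new[t] += share                         # pass rank along the links
        rank = new
    return rank

# An invented four-page web.
web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
for page, score in sorted(pagerank(web).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))                        # C collects the most "votes"
```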

Beyond Web pages, Markov models treat the world as a sequence of "states": phonemes, words, weather conditions, consumer choices, transactions, security prices, sensor data, DNA bases, sports results, health indices, CO2 levels, bomb trajectories, Turing machine steps, chess positions, gambling prospects, computer performance, commodity markets, traffic reports (you name it), linked to other states by "transition probabilities." I drew three kings; what is the likelihood of a fourth? It snowed today; what is the probability that it will rain tomorrow? The opening price of an Amazon share is $1,421 at nine A.M.; what will be the price at 9:01? The transition probabilities may be calculated from previous data and updated with new observations. The Markovian world of random wanderings among the states is governed by the probability weights.
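As a small sketch of the snow-and-rain question in the passage, the chain below estimates its transition probabilities from a record of past days and then reads off the probability of tomorrow's state; the daily record is invented.

```python
from collections import Counter, defaultdict

# Invented record of daily weather states.
history = ["snow", "snow", "rain", "rain", "sun", "rain",
           "snow", "rain", "sun", "sun", "rain"]

# Count the observed transitions from each state to the next.
counts = defaultdict(Counter)
for today, tomorrow in zip(history, history[1:]):
    counts[today][tomorrow] += 1

# Normalize the counts into transition probabilities.
transition = {state: {nxt: c / sum(following.values())
                      for nxt, c in following.items()}
              for state, following in counts.items()}

# "It snowed today; what is the probability that it will rain tomorrow?"
print(transition["snow"].get("rain", 0.0))   # 2 of 3 snow days were followed by rain
```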

This approach freed analysts of the burden of figuring out people's intentions or plans or of working out the logical connections between events. All you need is a record of states and the probabilities between them. All else can be assumed to be random. In his contributions to the Central Limit Theorem in probability, Markov showed that all random events or data, independent or not, ultimately conform to normal distributions. Chains with dependencies over time are a tractable part of the mathematical universe. This is consistent with what we know about statistics: they predict group behavior without accounting for individual decisions or free will.

Hierarchical hidden Markov models enable multiple levels of abstraction, from phonemes up a neural network tree to words and phrases and meanings and models of reality. Ray Kurzweil, a Google vice president and Markov enthusiast, maintains that in recognizing speech or other patterns, hierarchical hidden Markov models are a guide to the mind: "essentially computing what is going on in the neocortex of a speaker, even though we have no direct access to that person's brain.... We might wonder, if we were to look inside the speaker's neocortex, would we see connections and weights corresponding to the hierarchical hidden Markov model computed by the software?" In his book How to Create a Mind, he concludes that "there must be an essential mathematical equivalence to a high degree of precision between the actual biology [of the brain] and our attempt to emulate it; otherwise these systems would not work as well as they do."

A defining property of a Markov chain is that it is memoryless. The history is assumed to be summed up by the current state and not by any past history of the chain. This feature greatly simplifies the computational process. Following a Markov model, a browser pursues a "random walk" of transitions from one position to another, bouncing off "reflecting states" (unwanted sites), moving through "transitional states" (Utah, Nevada), stopping at "absorbing states" (Google Mountain View headquarters!), all without needing to factor in intentionality or plan.

All the titans of the cloud from Amazon to Facebook have made heuristic use of Markov models to decide what customers are saying and to predict what they will do next. But the most impressive Markov warriors and Siren Servers are not at Google or Amazon or Facebook.

Remember Leonard Baum of the Institute for Defense Analyses?

The entire big data movement has its roots in the research of that industry-leading cohort at IBM, which took advantage of the company's vast troves of speech examples and world-class computer power to recognize human language better than anyone else. Applying the Markov tools to money and investing, the Renaissance team saw that if you can predict the next word in a sentence, you can predict the likely next price for stocks, commodities, or currencies. With clusters of supercomputers running at sufficient velocity, you could beat every short-term market you could access and measure. In 2009 Simons retired and named Mercer and Brown co-CEOs of his company.

Relying on its world-leading complement of mathematicians and physicists, Renaissance "avoids hiring anyone with even the slightest whiff of Wall Street bona fides," comments James Owen Weatherall in The Physics of Wall Street. Instead, it takes in vast troves of information from analyst reports, government reports, newspaper stories and newswires, in addition to prices and trades wherever it can find them.

I prefer to believe a non-Markov model will win. Why? Newton's insight that the "white light is a mixture, but the colored beams are pure" inspired Jean-Baptiste Joseph Fourier, a century later, to use Newton's infinite mathematical series to describe the effect of prisms.

As Peter Drucker said, "It is less important to do things right than to do the right things."

The Siren Server search for momentary correlations falls within the ambit of Stanford Nobel physicist Robert Laughlin's critique of the science of frothy phase changes. Parsing the chaotic ebullition of water as it comes to a boil, for example, is a fool's errand called "chaos theory."

Bagehot's Britain operated under Newton's gold standard and system of the world. The currencies that central banks manage today have no anchor in gold and thus suffer from the self-referential circularity of all logical systems not moored to reality outside of themselves.

At IBM, by contrast, Mercer and his colleagues under Jelinek achieved a permanent advance in computer science, information theory, and speech recognition. Their discoveries are behind the Siri system in your iPhone, hands-free calling in your car, and the growing success of machine translation. They enabled the ever-improving responsiveness of voice interfaces to the cloud computing technologies in the new generation of Internet progress.

In the process, Mercer and his team pioneered the field of big data, which dominates the current computer paradigm. Competing with Kurzweil and other pioneers of AI-based systems attempting to duplicate human experts-from chess to translation-the IBM team faced the possibility of refutation and failure. Their advances, therefore, exhibited the Popperian power of falsifiable knowledge, the source of all new wealth under capitalism.

The working assumption of the prevailing system of the world is that what looks random is random. As Shannon knew, however, in principle a creative pattern of data points, reflecting long and purposeful preparation and invention in the real world of imagination and will, is indistinguishable from a random pattern. Both are high-entropy, unexpected. Parsing of random patterns for transitory correlations fails to yield new knowledge. You cannot meaningfully study the market with an oscilloscope registering the time domain gyrations. You need a microscope, exploring inside the cells of individual companies to find the pure tones of true technology advance.

Since Einstein used the concept to calculate the spontaneous gigahertz jiggling of molecules, Markov chains accelerated to gigahertz frequencies have enabled scientists to dominate a world economy ruled by chaotic money creation from central banks. Now, in the Google system of the world, technologists imagine that computer velocity conveys computer intelligence, that if you shuffle the electrons fast enough you can confer consciousness and creativity on dumb machines.

The idea, however, that human brains, the world's most compact and effective thinking systems, are actually random machines is not really very bright. Markov models work by obviating human intelligence and knowledge. Whether analyzing speech without knowing the language (Shannon and Baum), gauging the importance of webpages without knowledge of either the pages or the evaluators (Page and Brin), measuring the performance of computing machines while ignoring 99 percent of the details of the system (A. L. Scherr), investing in stocks and bonds with no awareness of the businesses that issue them (Renaissance), or identifying authors without any knowledge of what they wrote or even the language they write in (Markov himself), these procedures are marked and enabled by their total lack of intelligence. You use big data statistics and Markov probability models when you don't know what is really happening. Markov models are idiot savants that can predict either a random pattern or a planned process without the slightest understanding of either. For its future, the industry must move beyond them.

All wealth is ultimately a product of long-term investment based on knowledge and discovery. There is no way to escape the inexorable conflict between savers who want liquidity and investors who constantly destroy it with enduring investments.

Google, on the other hand, escapes market irrationality and price discovery through its strategy of giving most of its goods away for free.

In 1965 I. J. Good, whom Turing taught to play Go at Bletchley Park while they worked on cracking the Enigma cipher, penned the first (and still the pithiest) warning:

"Thus," Good declared, "the first ultra-intelligent machine is the last invention that man need ever make, provided that it is docile enough to tell us how to keep it under control." The message of the Asilomar experts was that keeping it under control is still an unsolvable problem. When a new supreme intelligence emerges, it is hard to see how an inferior human intelligence can govern it. As Musk put it, "It's potentially more dangerous than nukes." Stephen Hawking pronounced: "The development of full artificial intelligence could spell the end of the human race."

Tegmark explains why a "breakout," in which the machines take over the commanding heights of the society and economy, is almost inevitable. When Homo sapiens came along, after all, the Neanderthals had a hard time, and virtually all animals were subdued. The lucky ones became pets, the unlucky lunch.

The advocates of super-AI believe that it can propel human intelligence out into the cosmos in the form of silicon digital devices, escaping the limits of space exploration by vulnerable carbon-based human beings. Ultimately the breakout will sweep into the galaxy, with the intelligent machines contriving ever more powerful rockets bearing ever more miraculous minds and bionic bodies. Tegmark speculates about what that will look like: "after spending billions of years as an almost negligibly small perturbation on an indifferent lifeless cosmos, life suddenly explodes onto the cosmic arena as a spherical blast wave expanding near the speed of light, never slowing down, and igniting everything in its path with the spark of life." In Tegmark's new creation story, digital machines become the dominant form of life.

Conducting what he calls a "semantic search" of all ten of Minsky's compendious books--that is, searching for specific associative meanings rather than blind "key words"-Kurzweil was able to get answers instantly from the deceased AI legend. Kurzweil has used the same program to explore his own works and rediscover insights that had slipped away over time, presumably displaced in memory by the newer concepts of his semantics program. If you have Gmail on your smartphone, you have seen the fruits of Kurzweil's semantic breakthroughs in the three proposed responses underneath each new email you receive.

As Kurzweil acknowledges, semantic search is an "extension of human intelligence" rather than a replacement for it. A human being reinforced by AI prosthetics is less likely, not more likely, to be ambushed by a usurper digital machine. Semantic search delays the machine-learning eschaton.

Because Go is a game of pure strategy, without differentiated pieces like chess, a computer can exhaust its solutions more efficiently than it can in chess, despite chess's smaller solution space. The Asilomar eschatologists miss the difference between computing speed and intelligence, between programmable machines and programmers.

Tegmark makes the case as well as it can be made that the attainments of AI programs-"Watson" the quiz-show winner and occasionally superior medical diagnostician; Deep Blue the chess champion; Google's DeepMind game players, which learned to outperform human players from scratch in dozens of electronic games; the face-recognizers; the natural language translators; the self-driving car programs-portend a super-intelligence that will someday be so superior to the human mind that we will no more comprehend its depths than a dog grasps the meaning of our own cerebrations. It is just a matter of time. Although shunning the dystopian interpretation, Kurzweil boldly offers a date: 2049. Tegmark likes to quote Edward Robert Harrison: "Hydrogen, given enough time, turns into people." People, given enough time, presumably turn into Turing machines, and Turing machines are essentially what people used to call "God." He isn't shy about the godlike powers this super-AI will have: "Whatever form matter is in, advanced technology can rearrange it into any desired substances or objects, including power plants, computers, and advanced life forms."

Life 3.0 and Asilomar are declarations of principles for a posthuman age. The conclusion is that the last significant human beings are the inventors of super-intelligent AI. People like Hassabis, Norvig, LeCun, and Page. Pay them tribute while you can and hope that they will be indulgent if you sign up for their movement. Life 3.0 is silicon-based and machine-generated.

The problem is not AI itself, which is an impressive technology with much promise for improving human life. What transforms "super-AI" from a technology into a religious cult is the assumption that the human mind is essentially a computer, a material machine.

Although Kurzweil and Tegmark are both smart enough or canny enough to dismiss the existence of extraterrestrial minds, most of the movement is intoxicated by the view that we are not alone. The usual conclusion is that intelligent life on other planets is so easy, so determined by material forces, that it is "inevitable." Expressing this assurance is SETI, the "search for extraterrestrial intelligence," a collective effort conducted on hundreds of thousands of computers around the globe searching through electromagnetic debris for a glint of mind elsewhere in the universe. Nothing has turned up in thirty-five years or so, but Yuri Milner, the great Russian physicist-investor, has pumped another $100 million into the cause in his "Breakthrough Listen" project.

All these pursuits reflect a breakdown of terrestrial intelligence.

All the AI scenarios assume the premise of AI super-intelligence with anthropomorphic consciousness, will, feelings, imagination, creativity, and independence. But in presenting every cockamamie view they can imagine, Tegmark and the other AI champions never come close to demonstrating that voltages, transistor gates, memory capacitors, and flip-flops can somehow know or learn anything, let alone become willful and conscious or independent of their human programmers.

The debating-point response of the super-AI proponents is that the human mind consists of electrical and chemical components that are unintelligent in themselves. But here we encounter the Gödel-Turing difficulty of self-reference. By referring back to their own brains, which they don't really understand, the AI scientists plunge directly into the self-referential Gödel perplex. By using their own minds and consciousness to deny the significance of consciousness in minds, they refute themselves.

Consciousness and free will are self-reference without determinism.

The blind spot of AI is that consciousness does not emerge from thought; it is the source of it. As Leibnitz, imagining a computer blown up to the size of a building, observed in the seventeenth century, inside the machine (the determinist scheme), you find cogs and gears but no cognition. The oracle programmer must be outside. How a software programmer can miss the essence of his own trade is a mystery, but Chesterton understood the myopia of the expert:

The materialist superstition is a strange growth in an age of information. Writing from his home, which he named "Entropy House,"

Gauged by the unexpected deformation of a regularity, information is neither fully determined nor fully random. As Shannon put it, information is stochastic, adapting a Greek word that means "to aim at." It combines probabilities with skills, and randomness with structure. Information is maximized in a high-entropy message borne by a low-entropy carrier, such as the modulated code-bearing light in a fiber-optic line.
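
The contrast between a high-entropy message and a low-entropy carrier can be made concrete with the standard Shannon entropy of symbol frequencies. The two strings below are invented purely for illustration.

```python
import math
from collections import Counter

def entropy_per_symbol(text):
    """Shannon entropy in bits per symbol, from observed symbol frequencies."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

carrier = "0101010101010101"        # regular, predictable: low entropy
message = "q7#kZ1!mT9@wX4&b"        # surprising, irregular: high entropy

print(entropy_per_symbol(carrier))  # 1 bit per symbol
print(entropy_per_symbol(message))  # 4 bits per symbol
```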

After von Neumann, Shannon was the most important figure in the establishment of the system of the world that Google now embodies. I would like to say that he showed the way out. But Shannon himself ended up enmeshed in the same materialist superstition that afflicts the Google Age. "I think man is a machine of a very complex sort," he wrote, "different from a computer, i.e., different in organization. But it could be easily reproduced-it has about ten billion nerve cells...."

An unconscious body is simply a hermetically logical system, which as both Gödel and Turing proved is necessarily incomplete and in need of an "oracle." Knowledge of this incompleteness is the human condition, felt intuitively and manifested in consciousness. The "I" emerges in the domain of faith beyond the machines of logic.

Real science shows that the universe is a singularity and thus a creation. Creation is an entropic product of a higher consciousness echoed by human consciousness. This higher consciousness, which throughout human history we have found it convenient to call God, endows human creators with the space to originate surprising things.

AI cannot compete with the human intelligence that connects symbols and objects. AI cannot do without the human minds that provide it with symbol systems and languages; programs it; structures the information it absorbs in training, whether word patterns or pixels; provides and formulates the big data in which it finds numerical correlations; and sets up the goals and reward schemes and target sequences that allow it to iterate, optimize, and converge on a solution.

Thinking is conscious, willful, imaginative, and creative. A computer running at gigahertz speeds and playing a deterministic game like chess or Go is only a machine. The idea that it is superhuman makes sense only if the abacus or calculator is superhuman. Artificial intelligence refers to the output of computer algorithms that consist of ingeniously arranged electronic elements-currents, voltages, inductances, and capacitances-that gain their meaning from Boolean logical schemes, tree structures, and "neural nets." They achieve their utility from human languages and other symbol systems, including the computer languages and mathematical reasoning that program them.

The 2017 Asilomar conference called to mind a conference held at the same place in February 1975, at which scientists warned about the future of technology-in that case, genetic engineering. They feared that experiments enabling molecular biologists to splice DNA from two different organisms, producing novel recombinant DNA molecules and chimeras, would threaten all human life. Within a decade, so the attendees prophesied, "scientists will be able to create new species and carry out the equivalent of 10 billion years of evolution in one year."

More than four decades later, the hopes and fears of the 1975 Asilomar conference are nowhere near to coming true. The roots of nearly a half-century of frustration reach back to the meeting in Königsberg in 1930, where von Neumann met Gödel and launched the computer age by showing that determinist mathematics could not produce creative consciousness. Von Neumann stepped forward to become the oracle of the age we are now consummating.

Reflecting on the 1975 conference, the eminent chemist-biologist Michael Denton concludes, "The actual achievements of genetic engineering are rather more mundane..., a relatively trivial tinkering rather than genuine engineering, analogous to tuning a car engine rather than redesigning it, an exploitation of the already existing potential for variation which is built into all living systems...." Thousands of transgenic plants have been developed with results "far from the creation or radical reconstruction of a living organism."4 All that the first Asilomar conference managed to achieve was triggering an obtuse paranoia about "genetically modified organisms" that hinders agricultural progress around the world.

Among the Deep Learners and Google brains at the AI Asilomar was Vitalik Buterin, a twenty-three-year-old college dropout with the same etiolated, wide-eared, boy-genius look that characterized Gödel and Turing. The assembled masters of the high-tech universe may have understood him about as well as the mathematicians in Königsberg understood the twenty-four-year-old Gödel in 1930, though the audience at Asilomar had advance notice of the significance of Buterin's work.

Buterin's vision for the blockchain has long been broader than cryptocurrency. Ethereum's contribution, its co-founder Joe Lubin predicts, will be an Internet without "a single powerful entity that controls the system or controls gatekeeping into the system." Wired magazine speculated in 2014 that smart contracts, such as Buterin designed Ethereum to facilitate, "could lead to the creation of autonomous corporations-entire companies run by bots instead of humans." If you were convening a summit of futuristic technologists in 2017, it would have been hard to avoid inviting the prophetic protagonist of Ethereum.

Perhaps Buterin, who launched Bitcoin Magazine while working as research assistant to the cryptographer Ian Goldberg, is the truest legatee of Shannon's vision. Like Shannon he can move seamlessly between the light and dark sides of information, between communication and cryptography. Shannon's information theory, like Turing's computational vision, began with an understanding of codes. His first major paper, "A Mathematical Theory of Cryptography" (1945) proved that a perfect randomized one-time pad constitutes an unbreakable code, a singularity. The theory of information deals with a continuum between white noise (purely random) and perfect order (predictable and information-free). Shannon's paper focused attention on the fertile domains of redundancy in between, which he dubbed "stochastic."

They decided to support him anyway. In November 2013, Buterin wrote the Ethereum white paper, and on June 5, 2014, Peter Thiel announced a new group of twenty Thiel Fellows, which included Buterin. A year later, Ethereum went live, with the announcement, "What bitcoin does for payments, Ethereum does for anything that can be programmed." It was another step in the decentralization of the Internet.

Just as Ethereum was entering the larger world, in July 2015, Strachman and Gibson were leaving the Thiel Fellowship to start a new but related project, the 1517 Fund, which would invest in Thiel Fellows and other high school and college-aged company founders.

1517 chose Pony Tracks for its annual meeting to exhibit Russell, who started Luminar at age seventeen and ranks with Buterin as the Thiel Fellow most capable of launching a reformation. While Buterin is unleashing a new global architecture for computing and finance, Russell is launching a revival of U.S. innovation in manufacturing hardware. Defying the frenzied hype of its software-based rivals in the self-driving sweepstakes, from Google to Tesla, Luminar's product is comparable in its transformative reach and ambition to Henry Ford's early-twentieth-century assembly lines at River Rouge in Detroit.

Buterin too was a child prodigy on a Mozartian scale. At four his favorite toy was an Excel spreadsheet. At seven, he instructed himself in Mandarin and today fluently debates in the language during trips to Shenzhen. Both the Californian and the Canadian benefited from early guidance from their fathers. Dmitry Buterin introduced his son to bitcoin, and Robert Russell, now CFO of Luminar, pushed Austin ahead in optics. Like Buterin, Russell is not cowed by the powerful forces arrayed against him.

To Russell, nothing matters but performance. There is no long-run demand for a system that kills people. Aim at performance, and low cost will follow. Aim at low cost, and you will not achieve sufficient performance to have an enduring business. After a sufficient system is devised, demand will foster economies of scale and learning curves that bring the price down over time.

Doing research at Beckman Laser Institute while joining with the photonic genius Jason Eichenholz of Open Photonics to fill out his company, Russell eventually raised $36 million from 1517 and other investors. Throwing out the conventional wisdom about lidar, his team researched two thousand different ways to build a fundamentally more effective system. In 2013, they decided to make a drastic change in the prevailing practice.

In January 2009, Satoshi Nakamoto announced "the first release of Bitcoin, a new electronic cash system that uses a peer-to-peer network to prevent double-spending. It's completely decentralized with no server or central authority." He went on to specify:

Hal Finney, cryptographer supreme, posted his congratulations and offered these thoughts:

Perhaps you like nested hierarchies of abstraction, recursions in recursions, like Matryoshka dolls-plays within plays, puzzles within puzzles. Perhaps you delight in logical paradoxes-Cretans asserting that all Cretans are liars, computer scientists insisting that their brains are merely computers, Byzantine generals outwitting treacherous messengers, physicists suggesting that they themselves are merely materialist meat-machines in one of an infinite number of parallel universes, and graphics virtuosos imagining that they are mere simulations in some gigantic virtual reality simulator controlled by a superior race of aliens.

What would you say if I told you that all of us, right now-in the United States and around the world-have been sucked up into the phantasmagorical pages of a novel by Neal Stephenson, the shy West Coast prodigy who is the twenty-first century's greatest writer and demiurge? You can say it is my wild opinion. You can call it metafact or mystifiction, a demented conspiracy theory, or even a massively multiplayer online game, or a virtual world. But you cannot prove that it isn't true-that the ardent logorrheic genius cyberludic scrivener with the ponytail and the kaleidoscopic brain and fiber-optic diet has not somehow infiltrated the very operating code of the system of the world.

In Stephenson's Reamde-his thousand-page thriller of 2011-a massively multiplayer online game (MMOG) called "T'Rain" has captivated the world's digerati. It is virtual-a computerized 3D "second life"-but it has a crucial dimension that allows it to overflow into what we quaintly term "real life." It has a currency-a monetary system-and it is based on a gold standard. It is a fake gold standard, but the emulation is accurate enough to create a gilded simulacrum of monetary value.

The man of the moment at the Bitcoin Summit in San Francisco in March 2014 is Marc Andreessen, perhaps the world's bellwether venture capitalist, who became famous as the curmudgeonly young inventor of the first popular Internet browser, Mosaic. Launched in 1992 from the National Center for Supercomputing Applications at the University of Illinois, Mosaic in six months brought a million new customers onto the World Wide Web. When Andreessen was gently ousted from the Mosaic project and denied access to his software ("to give others a chance to share the glory"), he co-founded Netscape, the second popular Internet browser and a spearhead of the explosive expansion of the World Wide Web in the mid-1990s.

A mysterious new technology emerges, seemingly out of nowhere, but actually the result of two decades of intense research and development by nearly anonymous researchers.

But the Forbes reporter objects: Warren Buffett dismisses bitcoin as a "mirage." Jamie Dimon, the CEO of JPMorgan Chase, calls it delusional. The Nobel laureate economist and Times columnist Paul Krugman declares it "evil."

In 31,000 lines of code, Bitcoin exploits a set of complex cryptographic algorithms that supposedly make it "as good as gold." Or maybe even better! Unlike gold, with the burden and glory of engagement in the real world, where it is rooted in the possibly changing time constants of extraction from deep beneath the earth, bitcoin is an entirely virtual, digital entity. Bitcoin, for better or for worse, is entirely an artifact of computers.

At the 2014 Bitcoin Summit, joining the fray of Satoshistas is Chamath Palihapitiya, a ferociously smart near-billionaire from Sri Lanka, a former force and friend of Mark (Zuckerberg) at Facebook. Virtually the only man at the summit wearing a suit and tie, Palihapitiya at the time owns some fifty million dollars' worth of bitcoins. Tall, dark, and lean, the physical opposite of the Nordic bear Andreessen, the Sri Lankan warns against "hyperbolic bullshit" from others. But as for himself, "When I buy bitcoin, I am using capital to support a way of ripping apart the financial system." Two years later, Palihapitiya was able to purchase a share of the Golden State Warriors NBA franchise.

But make no mistake. Bitcoin is really-that word again-a massively multiplayer online game. Mobilizing millions of players, hundreds of thousands of them online at a time, MMOGs such as World of Warcraft and EverQuest are a formidable force, generating more than two billion dollars in subscription revenues. By comparison, the so-called bitcoin community is small. But bitcoin is a game with attitude and altitude, ingeniously designed to infiltrate and transform our world. So far the winner of the game is its founder-inventor, the mysterious Satoshi Nakamoto. He has amassed coins estimated to be worth ten billion dollars, better even than Neal Stephenson's hero John Forthrast, the founder of T'Rain.

Satoshi explained: "Governments are good at cutting off the heads of centrally controlled networks like Napster, but pure peer-to-peer networks like Gnutella [music] and Tor [The Onion Router for encrypted email and video] seem to be holding their own." Indeed, earlier in the century, the BitTorrent system, developed by the entrepreneur Bram Cohen, comprised close to half of all Internet bits. BitTorrent takes a video file, for example, and splits it up, peer to peer, across as many as seven thousand computers on the Net.

In bitcoin, Satoshi made possible a digital asset that cannot be copied without lots of work, which he called "proof of work." The proof of work is provided by miners laboriously checking out all the transactions and compiling them into blocks. Thus Satoshi enabled the creation of money on the Net and used the money to pay "miners" to validate its transactions. Recorded in a public distributed ledger, the transactions are mathematically "hashed" into chains of blocks that form an immutable database published across the Internet.
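
A minimal sketch of the proof-of-work idea just described: hash a block of transactions together with a nonce until the digest falls below a difficulty threshold, and chain each block to the previous block's hash. The difficulty and the transactions are toy values, not Bitcoin's actual parameters.

```python
import hashlib, json

def hash_block(block):
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def mine(transactions, prev_hash, difficulty=4):
    """Search for a nonce whose hash starts with `difficulty` zero hex digits."""
    nonce = 0
    while True:
        block = {"prev": prev_hash, "txs": transactions, "nonce": nonce}
        digest = hash_block(block)
        if digest.startswith("0" * difficulty):
            return block, digest
        nonce += 1

# Each block commits to the previous block's hash, forming an immutable chain.
genesis, genesis_hash = mine(["coinbase -> miner: 50"], prev_hash="0" * 64)
block1, block1_hash = mine(["alice -> bob: 3"], prev_hash=genesis_hash)
print(genesis_hash)
print(block1_hash)
```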

As O'Hagan writes, "Satoshi was loved by bitcoin fans for making a beautiful thing and then disappearing. They don't want Satoshi to be wrong or contradictory, boastful or short-tempered, and they really don't want him to be a 45-year-old Australian called Craig."

Buterin developed his idea in 2013 during a visit to Israel, which he says is in the lead in cryptographic sciences. At the time, he was a committed bitcoin evangelist, writing for Bitcoin Magazine and focused on bitcoin's potential as a currency. But in Israel he met entrepreneurs experimenting with "colored" coins with special purposes, tokens that could be used to open new markets-Mastercoin for financial contracts, Bancor for exchanges of liquid tokens, and smart contracts based on the proposals of Szabo. Scarcely out of his teens, Buterin audaciously set out to develop a new blockchain that could serve as a security and identity substrate for an unlimited range of smart contracts.

To implement this plan, he devised a new programming language called Solidity, a new currency called ether, and a new blockchain, more flexible and capacious, called Ethereum. Each of these steps was fraught with inventions. The Solidity language would be Turing-complete, which meant it could express any set of algorithms possible on a computer. The currency would define a unit of account based on the amount of energy consumed by the computations entailed by the contract.
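
The "pay for computation" idea can be sketched in miniature: a toy interpreter that charges gas for each executed instruction and halts when the supplied gas runs out. The opcodes and gas prices below are invented for illustration; real Ethereum gas accounting is far more detailed.

```python
# Toy gas-metered interpreter: every instruction consumes gas.
# Opcodes and gas costs are made up; this is not Ethereum's instruction set.
GAS_COST = {"PUSH": 1, "ADD": 3, "STORE": 5}

def run(program, gas):
    stack, storage = [], {}
    for op, *args in program:
        if gas < GAS_COST[op]:
            raise RuntimeError("out of gas")
        gas -= GAS_COST[op]
        if op == "PUSH":
            stack.append(args[0])
        elif op == "ADD":
            stack.append(stack.pop() + stack.pop())
        elif op == "STORE":
            storage[args[0]] = stack.pop()
    return storage, gas

program = [("PUSH", 2), ("PUSH", 40), ("ADD",), ("STORE", "answer")]
print(run(program, gas=20))   # ({'answer': 42}, 10)
```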

Buterin does not believe that Wright is Satoshi, a position he backs with algorithmic logic. On a panel with Gavin Andresen, who had endorsed Wright's claim, Buterin retorted crisply, "I will explain why I think he is probably not Satoshi.... He had two choices of how to prove he is Satoshi. The easy choice was to write 'Craig Wright is Satoshi Nakamoto' and sign it with a private key known to be Satoshi's.

Under these inauspicious conditions, Buterin directed the completion of the Ethereum platform and launched more than a thousand new company projects. The average funding for each startup was more than two million dollars. They also invented something called the "ICO," which stood variously for initial cryptoasset offering, initial crowd offering, initial Cayman offering, or initial coin offering, depending on which name the lawyers thought was most likely to appease the baffled regulators.

The total funds raised-some eight billion dollars in less than one year-exceeded all money raised in IPOs or in venture capital for related ventures. The largest comparable venture outlay was $127 million for R3, led by ex-Googler Mike Hearn and Wright's associate Ian Grigg, an effort by the big banks to catch up in blockchain technology.

This incident was not the result of flaws in the Ethereum blockchain itself, but that didn't matter. As the ruler of the chain, Buterin could not escape the need to intervene. Proponents of Ethereum Classic contend that this arbitrary action undermined the immutability of the database and the principle of decentralization that is the heart of the blockchain. Ethereum Classic has not yet had much influence.

The key difference between the bitcoin and Ethereum blockchains is that bitcoin focuses on security and simplicity while Ethereum focuses on capability and functionality. Ethereum's superior functionality is transforming a number of industries. As Buterin puts it, "The Internet tended to displace workers doing routine work on the edge of the system; the blockchain tends to disintermediate executives in the center."

Smart contracts may disintermediate lawyers, accountants, and bankers who do not get aboard. As Buterin says, "The Internet displaced the jobs of taxi drivers; the blockchain may displace Uber." Indeed, an Ethereum company called Swarm is attempting to enable cab-drivers to transact directly with their customers through a cooperative scheme on the blockchain.

Bitcoin gains momentum with every governmental campaign against cash, which is the alternative peer-to-peer vessel for anonymous private transactions. Bitcoin appreciates every time a central bank promotes spurious growth with negative interest rates and inflation targets, raiding the retirement savings of pensioners.

The U.S. Federal Reserve's inflation "target" is currently 2 percent a year, a program of massive ultimate devaluation. As socialism advances in many countries, debauching their currencies, people incrementally flee to the one global and relatively secure haven accessible through the Internet. Traditionally the haven currency was the U.S. dollar, but since early 2018-from Greece to Venezuela, from Argentina to Zimbabwe-the haven has increasingly been bitcoin.

The dominant monetary standard and haven currency through much of human history has been gold. With a total financial market of some $2.4 trillion, gold still dwarfs bitcoin at $128 billion. The supply of gold has grown by an average of 2 percent per year for centuries, giving it a less remorseless deflationary bias-an advantage over bitcoin, the supply of which is capped at twenty-one million units in 2140, 80 percent of which had already been "mined" by 2018. As a unit of account and store of value, two of the key facets of money, gold is the ultimate standard.

It all begins with the "Metaverse" in Neal Stephenson's 1992 novel Snow Crash, a vision of a virtual world on top of the real world. A quarter-century later it still excites geek romantics with its prophetic music: "When Hiro first saw this place 10 years ago, the monorail hadn't been written yet. He and his buddies had to write car and motorcycle software in order to get around. They would take their software out and race it in the black desert of the electronic night."

Muneeb Ali quotes this passage from Snow Crash in the opening of his magisterial dissertation, "Trust-to-Trust Design of a New Internet," the product of his work with Ryan Shea and Jude Nelson and their Princeton adviser, Michael J. Freedman. This team went out into the electronic night and attempted to light it up with an architecture for a transformed Internet-a metaworld of trust beyond the seven layers of communications technology.

Ali's mentor at Princeton, the computer scientist-cryptographer Michael Freedman, had worked for two decades on the theory and practice of peer-to-peer networks. He was the co-author of two chapters in the standard textbook Peer to Peer, and, with Martin Casado, an author of the canonical OpenFlow software-defined networking (SDN) paper. Today he is chief technical officer of TimescaleDB, an acclaimed open-source time-series database.

The seven-layer model consists of a hierarchical stack in which lower functions are controlled by higher functions. At the bottom is the physical layer, the fiber-optic lines, microwave oscillators, mixers, 1550- and 900-nanometer lasers, photodetectors, silicon routers, erbium-doped amplifiers, and twisted-pair telephone wires, antennas, coaxial cables--the list is endless-that carry the data packets across the network at the behest of the layers above it. Hard to design and build, this layer of hardware devices is at the heart of the miracle of modern electronics. But by the time Ali was studying at Princeton, much of the industry was ignoring hardware to build Turing machines in the ether.

In the OSI stack, above the physical layer is the datalink. This is the medium where hardware becomes "firmware" and software that define the electrical specifications, timing rules, and electron-photon conversions that enable the transmission of information across a link from one node or computational address to the next. Switches operate here at layer two, passing packets only to the next node. Local area networks such as Ethernet or WiFi function at this level. If you shun the highways of the Internet, you can live your life transmitting your bits and bytes across the datalink layers, layer two.

The third layer is the network layer, the domain of routers, which combines with the transport layer (layer four) to establish the end-to-end links that constitute the TCP/IP Internet Protocols. This is the entire system of IP addresses and Transmission Control Protocol traffic shuffles that comprise the connections from end to end across the Net. Layer three does the headers on the packets, the identities and addresses; layer four does the actual transmission and reception of data packets and traffic management, load balancing and acks (I got it!) and naks (I'm still waiting) that assure connections. Layers three and four tend to be a bastion of central powers, where governments and their intelligence arms chase down domain names and addresses, entities such as ICANN and even the UN's ITU. When they discover a Silk Road or perhaps an AlphaBay, they track it down over layer three.

Above layer four is layer five-the all-important session layer-which governs a particular two-way communication from beginning to end, whether a video stream, a Skype call, a Session Initiation Protocol conference, a messaging exchange, an email post, or even-and this would prove fateful-a transaction.

Layers six and seven are the schemes for presentations and applications-user interfaces, windows, formats, operating systems, and so on. These are summed up in the ingenious schemes of hyperlinks (click on a word and go to a new page) and universal resource locators (URLs).
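
The layered description above can be condensed into a small lookup table, a sketch to keep the seven layers straight, with one-line roles paraphrased from the passages rather than from any formal standard.

```python
# The seven-layer OSI model as sketched in the preceding passages, bottom to top.
OSI_LAYERS = {
    1: ("Physical",     "fiber, lasers, routers' silicon, antennas, cables"),
    2: ("Data link",    "node-to-node transmission; switches, Ethernet, WiFi"),
    3: ("Network",      "routing, IP addresses, packet headers"),
    4: ("Transport",    "end-to-end delivery, TCP acks and naks, load balancing"),
    5: ("Session",      "a two-way exchange from beginning to end: a call, a stream, a transaction"),
    6: ("Presentation", "formats and data representation"),
    7: ("Application",  "user interfaces, hyperlinks, URLs"),
}

for number, (name, role) in OSI_LAYERS.items():
    print(f"Layer {number}: {name} -- {role}")
```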

As 70 percent of all links came to be handled through Google and Facebook, Berners-Lee feared that his Web was dying. He would become a Blockstack enthusiast. "When he heard what we were doing, he did a little dance," said Blockstack's software chief, Jude Nelson.

To describe the OSI stack in Telecosm, I used the example of a phone call. Pick up the handset and listen for a dial tone (physical layer signal), now often simulated; dial a number (every digit moves the call another link closer to the destination); listen for the ringtone (signifying a network connection and transport of signals). By the time you get someone on the line, you have gone through the first four layers of the OSI stack.

While a materialist might suppose that the physical layer is all, and a software triumphalist imagines that it is all in his head, the genius of networking is dualistic. Animated by trillions upon trillions of microchip transistors, vias, and traces, the physical layer is ultimately as opaque and unfathomable as it is ingenious and indispensable. Software logic proliferates in a hierarchy above and defines what the hardware does.

In the end, Google shunned most specialized networking hardware in favor of thousands of servers deployed across vast datacenters and integrated by software. The Turing machines were as immaterial and mutable as they were in Turing's mind. A router or computer or switch or Internet realization could be "virtualized" and have no specific hardware manifestation at all.

As Ali sensed, the rise of Netscape marked a turning point in the history of networking-the provision of new accessible roads for data.

Netscape's founders saw the Web as an arena for interlinked creative expression of all kinds, from photos to videos. Its co-founder Marc Andreessen and the investor Jim Clark, inventor of the 3D "geometry engine" at Silicon Graphics, both anticipated a 3D metaverse of games and virtual worlds. Through Netscape, Andreessen, Eich, Clark, and colleagues had given Ali the power to animate Web pages, share them with the world, and possibly make money on the Net.

It was a high point for technological entrepreneurship. After the year 2000, however, the number of startups would stagnate and IPOs nearly disappear for all but the biggest tech companies. In the wake of the Enron debacle, regulations under the Sarbanes-Oxley Act exacted a toll of some two million dollars for reaching public markets and imposed a rigid accounting regime high on paperwork and low on trust. It was entirely inimical to startup culture and finance.

By the time Ali arrived at Princeton in 2012, Netscape had foundered. Its browser gave way to Microsoft's Explorer, rolled out free and bundled with Windows 95. Initiating the now common practice of Internet leviathans' buying innovation, Microsoft was able to quell the Netscape challenge by acquiring the Spyglass browser. The leading designers of Spyglass, as it happened, were Netscape's Andreessen and Eric Bina, who had developed its basics as Mosaic while at the University of Illinois's supercomputer center. Microsoft acquired an elegant modular browser and made Netscape's inventors compete with themselves.

As Ali writes, "Currently, with frequent use of an online service, user data gets locked into 'data-silos,' e.g., data that is understood and stored by Facebook, Yahoo!, Google and others respectively but cannot be migrated across services. This leads to a centralized data model; the data silos inevitably get hacked eventually, e.g., the recent hack of 500 million Yahoo! users."

These silos, or "walled gardens," were what depressed Berners-Lee.

The Internet stack had become a porous and perforated scheme in which most of the money and power could be sucked up by the big apps at the top run by companies such as Google. What was needed was a blockstack that could keep the crucial IDs and personal data and pointers to storage addresses in a secure and immutable database on the blockchain.

Continuing to work on his "New Internet" Ph.D. thesis but running out of funds again, Muneeb Ali was more interested in actually building the new network. In 2012, he had the good fortune to meet Jude Nelson, a tall, blond-haired software engineer from Arizona who needed help in Manhattan on a storage project named Syndicate, an enterprise file-storage system based on blockchains. Using cryptographic mathematics-hashes across time-it provides safe pointers to computer storage locations and addresses. Syndicate could use storage facilities like Google Drive, Amazon's S3, and Microsoft Azure as utilities by storing pointers and ID in the blockchain, with the data's owners retaining control.

The Blockstack team wanted to reestablish the network on reliable, low-entropy foundations. Ali explains: "Decentralized identity systems enable users to control a unique identity recorded on the blockchain that can be recognized by any site." He contrasts this universal ID with the current "username and password combo that can only be recognized by the site that had you create an account." With the blockchain, users can log in to the websites by automatically proving ownership of their identity.
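
A minimal sketch of the challenge-response idea behind that kind of login, not Blockstack's actual protocol: the user's public key stands in for the identity "recorded on the blockchain," and any site can verify a signed challenge without ever storing a password. It assumes the third-party `cryptography` package (pip install cryptography).

```python
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# The user's keypair; in a Blockstack-style system the public key (or a hash
# of it) would be bound to a human-readable name on the blockchain.
user_key = Ed25519PrivateKey.generate()
registered_public_key = user_key.public_key()   # what any website can look up

# A website issues a random challenge; the user signs it to prove ownership.
challenge = os.urandom(32)
signature = user_key.sign(challenge)

try:
    registered_public_key.verify(signature, challenge)
    print("login accepted: user controls the registered identity")
except InvalidSignature:
    print("login rejected")
```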

Ali and Shea wanted to address the problem at a more fundamental level by developing a new secure protocol layer for the Internet by which the identification, money, power, and property could stay with their owner rather than be sucked up to the apps at the top. "I was impressed with Peter Thiel's perspective," says Shea. "Why would you want to compete with some existing company, be incrementally more efficient, and make the world only a slightly better place?"

In 2014, they took a $250,000 investment and moved out to Y-Combinator in Mountain View, California, for a stint and demo.

A little more than two weeks before, Ali's doctoral committee at Princeton had finally approved his dissertation, "Trust-to-Trust Design of a New Internet." Composed with Shea's help, it was comparable in its scope and ambition to Larry Page's "PageRank" thesis at Stanford.

A leaner and more constrained approach distinguishes Blockstack from Ethereum. Ryan Shea, the CEO, sums it up: "We are going for a much simpler system than Ethereum. With a larger attack surface, more things can go wrong. We are using blockchain and software for the core functions of naming, discovery of routing information, and payments.... [C]ore components like identity and discovery should not be done in a way that exposes a large attack surface."

The Blockstack movement is founded on seven key principles:

Maximum scalability: It assures performance and scalability by separating the control plane (insulated in the blockchain) from the data plane, which can be dispersed across the network. This principle saves the blockchain for critical path functions of identity, payments, security, and discovery, while relegating bulk data storage and complex processing to any number of diverse cloud and edge facilities.

Single prototype: It establishes property rights by upholding the principle and precedence of singular documents, time-stamped, recorded, and algorithmically allocated. Because each item-even copies-always bears different immutable time-stamps, property claims can always be differentiated.

Parallel complement: Its expansion confers benefits of privacy and property on its participants without directly threatening incumbents, whose facilities are used by the system as a utility. As the Blockstack realm grows, its influence and power will rise, and incumbents will be motivated to accommodate it.

Low-entropy carrier: It provides a stable, predictable, and monolithic foundation for the high-entropy metaverse on the edge. It avoids capricious changes of law and structure that confuse entrepreneurial planning on the edges and cause security problems.

Free migration: It allows unobstructed passage from one blockchain or network to another without locking in users. This crucial feature is enabled by Jude Nelson's coding for virtual chains, which run on top of the fundamental blockchain as Java Virtual Machines run on top of many operating systems.

End-to-end, trust-to-trust: All nodes rest on roots of trust that are not heavily dependent on outside authorities.

Nelson explains, "Soon we intend to give each application the ability to form its own blockchain securely. I'd argue that this makes it easier for average people to reap the business successes of blockchain-powered companies, since (1) there are more of them now, and (2) they all have tokens whose value may appreciate. Ethereum sort of does this with ERC20, but it doesn't work at scale. Plus, each ERC20 application is tied to the fate of Ethereum, which may make them less likely to survive in the long run." Unlike Ethereum with its esoteric Solidity, Blockstack makes its platform programmable in Brendan Eich's JavaScript, the world's most widespread computer language.

"Hello. I'm to blame for JavaScript." A slightly pudgy, affable, fifty-five-year-old American computer programmer stands on stage in the gilded Vienna Volkstheater. This is Brendan Eich, the cofounder of Mozilla and the inventor of the Firefox browser, starting his 2016 TEDx talk "How to Fix the Web." He bows, with his hands on his head miming his embarrassment.

After his sudden ouster from Mozilla, Eich seemed renewed.

These crypto-token sales, known as "initial coin offerings" (ICOs)

Now in 2018, however, everyone involved in the field is nervously watching the Securities and Exchange Commission. Having suppressed IPOs so thoroughly, the SEC has turned its sights on ICOs as well. The leading lawyers in this field believe that such presales of goods and services-to be supplied or even defined, for the most part, later-skirt the mandate of the SEC and even its jurisdiction. After all, companies sell goods and services all the time in a variety of ways without any thought of the SEC. But the commission begs to differ. Deciding that virtually all tokens are securities and thus indeed under its jurisdiction, the SEC threatens drastically to increase the legal cost of cryptographic innovation in the United States. The danger is that it will extend the doldrums into which it has led the entrepreneurial economy and drive the industry out of the country.

Having already had a fabulously successful token sale, Brendan Eich speaks from the other side of the looking glass. He is one of the few people in the field who seem totally unruffled. And his aim is high.

Brave's compendiously cogent and scrupulously documented white paper from March 2017 details this crisis of Internet advertising. The situation is winner-take-all. Ninety-nine percent of the growth goes to Google and Facebook. Publishers-whether of websites, books, games, or music-are left with the final 1 percent. It is fraught with fraud. In 2016, fake ad demand generated by Internet bots cost advertisers some $7.2 billion, with ad malware to trick users rising 132 percent since 2015.

As Jonathan Taplin documents relentlessly in Move Fast and Break Things, the Google regime of aggregate and advertise is drastically reducing the income of musicians, journalists, and other producers of the content that Google seeks to monetize with ads and search.

Identifying the forces at the heart of this situation, Eich quotes Herbert Simon, the Carnegie Mellon information theorist: “What information consumes is the attention of the recipients. A wealth of information creates a poverty of attention and a need to allocate that attention efficiently among the overabundance of information sources that might consume it."

When information becomes abundant, time remains scarce. What Herbert Simon, Esther Dyson, Tim Wu, and their many disciples call attention is essentially just another word for time. As I explain in The Scandal of Money, time translates into the economy as money.

But Google and others are discovering that human attention can be depleted. As Eich remarks, it sinks away "until dopamine levels recover." Already evident is an Internet epidemic of "banner blindness."

Into this breach Eich is hurling his one billion BATs-the unit of exchange for an ingenious new decentralized open-source and efficient digital advertising platform based on Vitalik Buterin's Ethereum blockchain. Advertisers award BATs to publishers based on the attention of users measured by the pattern of their usage. Users too will be paid in BATs for accepting ads that they want to see or choose to tolerate in exchange for micropayments. They can donate these BATs back to favored publishers or use them in exchange for content.

This transparent system keeps user data private while removing middlemen and delivering fewer but more relevant ads that users actually seek. Publishers gain a larger share of the returns while advertisers get better reporting and performance and customers get ads they explicitly accept for pay. Rather than manipulating the viewer to see ads, advertisers find viewers who are interested in their appeals. Eich concludes, "Brave will reset the online ad-based Web ecosystem, giving advertisers, publishers, and customers a win-win solution whose components and protocols can become future Web standards."
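
A toy accounting of the flow just described: an advertiser's escrowed tokens are divided among publishers, users, and the platform in proportion to measured attention. The 70/20/10 split and the attention figures are invented for illustration; Brave's actual rules differ.

```python
# Toy attention-token settlement; all numbers are invented for illustration.
def settle(budget_bats, attention_by_publisher, splits=(0.7, 0.2, 0.1)):
    pub_share, user_share, platform_share = splits
    total_attention = sum(attention_by_publisher.values())
    payouts = {
        "platform": budget_bats * platform_share,
        "users": budget_bats * user_share,
    }
    for publisher, seconds in attention_by_publisher.items():
        payouts[publisher] = budget_bats * pub_share * seconds / total_attention
    return payouts

attention = {"news_site": 1200, "music_blog": 300}   # seconds of verified attention
print(settle(budget_bats=1000, attention_by_publisher=attention))
```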

In a world of abundant information but scarce time, what do people value most? As Kevin Kelly declares, "The only things that are increasing in cost while everything else heads to zero are human experiences.... Cheap abundant [virtual reality] will be an experience factory."

OTOY's metaverse is something entirely new. Its virtual worlds will be almost indistinguishable for many purposes from the topology of the real world. As Kevin Kelly breathlessly describes it, "We'll use it to visit environments too dangerous to risk in the flesh, such as war zones, deep seas, or volcanoes. Or we'll use it for experiences we can't easily get to as humans-to visit the inside of a stomach, the surface of a comet. Or to swap genders, or [as Jaron Lanier wants] become a lobster. Or to cheaply experience something expensive, like a flyby of the Himalayas." Most importantly, we will use it for new social interactions in new settings-concerts, dances, theaters, stadia.

In this new virtual domain, how will ownership be identified and defended? Musical works, for example, combine contributions from composers, lyricists, performers, distributors, and other participants, all of whom have particular claims. As an OTOY white paper explains:

The key promise of OTOY, as John Carmack, CTO of Facebook/Oculus, observes, is an entirely new platform and user interface for the Internet, announced in 2015 at a joint press conference with Eich at the spectacular San Francisco headquarters of Autodesk, an OTOY investor.

With OTOY's ORBX rendering protocol translated into Eich's JavaScript, any browser can blossom into a three-dimensional space.

The fulfillment of Neal Stephenson's Metaverse is possible for the first time. Without leaving their homes, people will be able to travel anywhere in the world, or even beyond it, and enjoy a full visual and even haptic experience. They will be able to interact with images and buy and sell rights within the virtual domains. They will attend plays and films and sports and news events and new kinds of morphic narratives. They will hang glide through the Alps, fly to the moon, and cruise beyond it to life after Google.

In 2012, people knew that face recognition was coming to handsets, but no one had been able to make it compact and fast enough. Balaban's work caught the attention of the academic image gurus Zak Stone and Nicolas Pinto at Perceptio Corporation, and they hired him in November to develop mobile face recognition technology for the iPhone.

Like all such projects by then, this one would be based on deep neural-network processing. But it was mobile machine learning, Balaban explains, "meaning running face recognition and other neural nets on the phone's own graphics processing unit, not even uploading to the sky." He saw that artificial intelligence did not need to take place in giant data warehouses. This was a contrarian insight worthy of a Thiel Fellow (and by mid-2013 he was living with two of them, Austin Russell and Thomas Sohmers), but it took some years before he capitalized on it. "Basically I was learning deep learning."

Despite the disappointment, he still did not want to work on "something that wasn't mine." In early 2015, Gary Bradski-the robotics pioneer who developed computer vision at Intel, founded the Willow Garage robotics incubator (which convinced Wired's Kevin Kelly that "robots have wants"), and started Industrial Perception, whose "stevedore robots" could, as Stephen Balaban described it, "pick up and chuck a box so elegantly" that Google bought them-invited Balaban to join his deep-learning team at Magic Leap. Launched in 2010, the Google-funded virtual reality venture in Florida had heretofore raised half a billion dollars while generating more national magazine covers than virtual reality advances. Neal Stephenson had just joined the company (as chief futurist), but Balaban wasn't convinced, however magic and well-funded the leap.

The month before, Chris Olah, Vitalik Buterin's high school friend who beat him to the Thiel Fellowship and was now an intern at Google Brain, published a blog post with two Google software engineers. It was titled "Inceptionism: Going Deeper into Neural Networks."

To Balaban, the code and its results were a visual confirmation of what Yoshua Bengio, a colleague of Geoffrey Hinton in the Montreal crucible of AI, calls the "manifold learning hypothesis." Bengio sees the essential job of a neural network as learning a hierarchy of representations in which each new layer is built up out of representations resolved in a previous layer. The machine begins with raw pixels and combines them into lines and curves transitioning from dark to light and then into geometrical shapes, which finally can be encoded into elements of human faces or other targeted figures. Scramble this process at an early stage and you get an artfully inflected picture; scramble it higher up in the hierarchy and you get a phantasia of "dream and nightmare" images, as Bengio puts it. In dreams and nightmares, as in machine-learning feedback loops, no new information is perceived.
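
A hedged numpy sketch of the "scramble it higher up" move: take a tiny two-layer network with arbitrary random weights, then run gradient ascent on the input so that the higher layer's activations are amplified. That gradient ascent on the input is the basic mechanism behind the "Inceptionism" images; the sizes and weights here are invented, and real networks are vastly larger.

```python
import numpy as np

rng = np.random.default_rng(0)
W1 = rng.standard_normal((32, 64)) * 0.1   # toy "edge detector" layer
W2 = rng.standard_normal((16, 32)) * 0.1   # toy "shape" layer

def relu(z):
    return np.maximum(z, 0.0)

x = rng.standard_normal(64) * 0.1          # stand-in for an input image

# Gradient ascent on the input to maximize the energy of the higher layer's
# activations, with the gradient backpropagated by hand through both layers.
for _ in range(100):
    h1 = relu(W1 @ x)
    h2 = W2 @ h1
    grad_h1 = W2.T @ h2                    # d(0.5*||h2||^2)/dh1
    grad_x = W1.T @ (grad_h1 * (h1 > 0))   # through the ReLU, back to the input
    x += 0.1 * grad_x

h2 = W2 @ relu(W1 @ x)
print("amplified layer-2 energy:", 0.5 * float(h2 @ h2))
```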

It was less than two months before the 2015 Burning Man festival.

Then Stephen and Michael Balaban figured out how to support nearly one million users, each running his own little machine-learning gradient and editor. (Lambda Labs was still just the Balaban twins and their cousin.) Scaled up by a distributed queue-processing system, "it allowed us to add new nodes to the pool on demand." The flaw was that all the GPUs were controlled by Amazon Web Services, which had to be paid.

At that point, he felt a visceral resistance to sending any more money to Amazon. He simply couldn't do it. It was a "Zero to One"

Epitomizing the excesses of the elaborate Amazon setup, in Balaban's view, were top-of-the-line "machine learning" Tesla GPUs from Nvidia. He discovered that Nvidia's gaming chips were not only ten times cheaper but also faster. What mattered to Balaban's machine-learning algorithms were not all the custom "machine-learning features" but the number of usable floating-point operations per dollar. As Bill Dally had shown at Nvidia, machine learning is essentially a product of Moore's Law advances in processing speeds and parallelization.

Keeping his eye on the key metric of FLOPS per buck, Balaban calculated that the up-market Tesla chips cost around five thousand dollars and delivered 10.6 teraflops of Floating Point-32 performance. The gaming chips (GeForce GTX 1080 Ti) produced 11.3 teraflops and could be bought for $580 per module. It was not a close call. In Balaban's model of FLOPS per buck, the gaming chips were around twenty-four times better.

Among the people taking an interest in Balaban's progress was Georges Harik, a Silicon Valley titan who had, like Balaban, studied computer science at the University of Michigan and had then gone on to develop Google's AdWords. Harik observed, "I don't know how Dreamscope will go, but if you guys are good at Linux systems administration, what you can do is GPU cloud services." That was the number-ten Google employee recommending that they compete with Google in the cloud. It was an idea.

Balaban, however, decided to downgrade Dreamscope and go into the computer infrastructure market. He would sell boxes, like Michael Dell in his early years. Mixing up the family group, Balaban brought in a high school pal named Jackson Sengle, lured away from a Ph.D. in bioengineering at Dartmouth. He understood the ribosomes that manufacture all the proteins in your body. Why not defy the new Silicon Valley norm and, as Peter Thiel put it to the Fellows in 2014, "do something that everyone thinks is stupid"-sell homemade computers?

They began putting them together manually, step-by-step, deep into the night above Silicon Valley. With their big advantage in GPU costs at $580 per module, they would not have to be especially efficient in assembly. Their product was a GPU workstation containing four Nvidia GeForce gaming GPU modules, which Lambda priced at ten thousand dollars apiece; if you wanted them rack-mounted for your cloud, they cost twenty-five thousand.

Fiber-optic systems deploy lines of silica fiber that stretch unamplified across a distance the length of Long Island, combine thousands of threads in each cable and scores of data-bearing wavelengths in each thread, and are made of glass so pure that you could see through a pane of it forty miles thick. What Hölzle calls "low-power, high-density coherent optics" is one of the heroic feats of engineering in the Information Age, and it allowed him to increase the bandwidth across his data centers fifty-fold in six years. His forty-two-kilohertz global information finder-fetcher, completing forty-two thousand searches per second, each one entailing hundreds or thousands of computing steps, is a historic technological achievement.

Muneeb Ali of Blockstack explains the transition:

As bitcoin is a calculator for money transfers, Ethereum is a global computer for executing programs. As bitcoin is a recorder of debits and credits for "coins" on a public ledger, Ethereum is a "virtual machine" for framing and sending software instructions for smart contracts or conditional transactions. To pay for it all, it also supplies coins-ether.
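
To make the distinction concrete, here is a toy illustration in Python, not Ethereum's Solidity and not how the EVM actually executes contracts: a plain ledger entry versus a "conditional transaction" that settles only when its programmed condition holds. The class and field names are invented for the sketch.

```python
from dataclasses import dataclass
from typing import Callable, Optional

@dataclass
class LedgerEntry:
    """A bitcoin-style record: a simple debit/credit of coins on a ledger."""
    sender: str
    receiver: str
    amount: float

@dataclass
class ConditionalTransaction:
    """An Ethereum-style smart contract in miniature: funds plus a condition."""
    sender: str
    receiver: str
    amount: float
    condition: Callable[[], bool]   # e.g. "the goods were delivered"

    def settle(self) -> Optional[LedgerEntry]:
        # The transfer is recorded only if the programmed condition is met.
        if self.condition():
            return LedgerEntry(self.sender, self.receiver, self.amount)
        return None

escrow = ConditionalTransaction("alice", "bob", 2.0, condition=lambda: True)
print(escrow.settle())
```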

42. The Wright brothers-with a home library, no college degrees, and a bike shop-kicked off the age of flight.

36. There's no iron law of economics that says tuition should go up-and only up-year after year. By many measures, universities are the same or worse at teaching students as they were in the early 1980s. But now, students are paying four times as much as they did then. Imagine paying more every year for tickets on an airline whose planes flew slower and crashed more frequently, but that spent its revenue on one hell of a nice terminal and lounge instead. Would you put that sticker on your car's back window?

Squeezing into a dark jacket provided just in time by the University Club host, Berninger settles comfortably back into his chair, a rebel at rest in Washington, ready to tell his story. An electrical-systems engineer, Berninger has spent most of his business life as a cheerful disruptor of the telecom industry. He began at Bell Laboratories, where in 1995 he embarked on studies of how the voice-over-Internet protocol (VoIP) would affect the telecommunications companies. By allowing calls over the Internet, VoIP freed phones from some of the hustles and hassles of the public switched telephone network (PSTN).

Google began as an Internet company crawling the World Wide Web with its miraculous search capabilities. Today, as Eric Schmidt declares, it is moving from "search" to "suggest," using artificial intelligence to disintermediate the Web. Rather than being referred to a Web page by Google's system, you are increasingly provided with Google's own targeted response honed by its deep learning systems, its super-intelligent adaptation to your every whim.

Most of all, it will need the spread of 5G and the huge investments it entails. It will need the creativity of the new blockchain movement.

He acknowledged that "a data-structure founded on a string of small files containing hashes of previous files" may solve the nonproblem of establishing an immutable database. But who wants that?

Surely Vitalik Buterin doesn't want such a system. When the Ethereum DAO was hacked of some $150 million, he waded into his code base wielding his personal "hard fork" and rescinded the transaction.

Stinchcombe points out that "smart contracts" merely move the point of trust from lawyers writing quasi-English sentences in small print to software engineers on GitHub writing unintelligible code in one of scores of exotic languages. Solidity, anyone? Ruby-on-Rails?

Mathematics is not a closed or bounded system. It opens up at every step to a universe of human imagination. As Peirce's triadic logic illuminates, every symbol engenders its own infinity of imaginative interpretation. A symbol and its object are incoherent without an interpreter (Peirce says "interpretant"). Chaitin celebrates this regime as a new mathematics of creativity that opened up after Gödel and Turing disproved the Hilbert hypothesis of a complete and consistent mathematical universe.

The inevitable conclusion is that machines based on mathematical logic cannot exhaust the human domain; they can only expand it.

As I explained in The Scandal of Money, this rough cancellation of technical progress makes the price of gold a function of the time it takes to extract it. In a world full of subjective human passions and greed, hungers and desires, time is the one economic factor that is indisputably objective. Thus, time lends objective substance to the movements of money.

Money is essentially a measure of the inexorable scarcity of time in economics. Imagine a world of barter. The rates of exchange between apples and houses, for example, would be determined by the differential time periods needed to produce an incremental unit. As a barter economy becomes a commercial economy, these common time factors become manifested in money.
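
As a stylized illustration of that claim (the numbers are invented for the example): if producing one more house takes t_house hours and one more apple takes t_apple hours, the barter rate gravitates toward the ratio of those times.

```latex
\frac{\text{apples}}{\text{house}}
  \;\approx\; \frac{t_{\text{house}}}{t_{\text{apple}}},
\qquad
t_{\text{house}} = 10{,}000\ \text{hours},\;
t_{\text{apple}} = 0.1\ \text{hour}
\;\Longrightarrow\; 100{,}000 \text{ apples per house}.
```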

Now we live in a global economy in which myriad goods and services are exchanged in unfathomable patterns in time and space. To mediate the tradeoffs and priorities of economic choice, money must be scarce. What remains scarce when all else becomes abundant is time.

Satoshi believed that his mining algorithm was mimicking gold.

Denouncing "the siren song of con artists and court jester economists," Ammous declares, "Contrary to the most egregiously erroneous and central tenet of the state theory of money, it was not government that decreed gold as money, rather it is only by holding gold that governments could even issue any form of money at all."

Proceeding to bitcoin, he asserts: "Nakamoto invented digital scarcity... a digital good that is scarce and cannot be reproduced infinitely...a digital good whose transfer stops it from being owned by the sender....

A currency cannot be put on autopilot; it needs oracles to channel it to the most promising entrepreneurial uses.

"However, there's a reason why the world has not adopted a Rothbardian monetary system up until now. It won't work, and bitcoin as designed will not last as a functional currency."

As Kendall observes, "For the same reason you can't have a 100 percent gold standard-there's not enough gold for our vast international economy and it would be highly deflationary-bitcoin's fixed limit is also highly deflationary and unworkable. To come to this conclusion Wright has to ignore, or not understand, the combined three-hundred-year history of the British, American, and international gold standards and their proven success."

"No other basic unit of measure," says Kendall-whether it's the second, the meter, the ampere, or the kilogram-"changes in value with demand. They are standards" based on physical constants. If money is a measuring stick, it cannot respond to demand.

If people believe in the scheme, most of the world's wealth might flow to bitcoin in periodic buying panics like the tulip mania, the South Sea Bubble, and other derangements catalogued in 1841 by Charles Mackay in Extraordinary Popular Delusions and the Madness of Crowds. This outcome might be gratifying to current bitcoin holders, but it would obviously lead to government interventions, confiscations, crashes, and other reactions that would end this otherwise redemptive human project.

Throughout human history, people have understood that money plays a key role as a measuring stick. Contrary to Craig Wright's fervent view, currencies are not commodities, part of what they measure.

As Roberto Unger and Lee Smolin write in The Singular Universe and the Reality of Time, "Time is real. Indeed, it is the most real feature of the world... by which we mean that it does not emerge from any other aspect of nature.... Time does not emerge from space, although space may emerge from time."

Bitcoin, ten years old, is the first, with the highest market cap and the most robust and tested blockchain. It can serve as a store of value and a vessel for major international transactions. With a 21-million-unit cap, it is not a stable unit of account. Focusing on security, its "script" language is serviceable but not Turing-complete, lacking recursive loops. So it is limited in its ability to accommodate smart contracts, but limited also in its vulnerability to hacks. Its Lightning extension gives it potential scalability for smaller transactions. Its key figures are Satoshi and Nick Szabo, who foreshadowed it with bit gold.

Ethereum, launched in 2015, is still the most versatile platform for smart contracts and initial coin offerings. Its software language, Solidity, is Turing-complete. Its coin-ether-commands the second-highest market cap among cryptocurrencies. Its "gas" metric, specifying payments for the energy used in computation, shields the system against spam and denial-of-service attacks. But because gas responds to advances in computing, it does not make ether a stable unit of account. Ethereum's leader, Vitalik Buterin, is perhaps the most impressive figure so far produced in the cryptocosm. Others involved with Ethereum's early development include departed co-founder Charles Hoskinson.

Bitmain, a miracle-working Chinese startup, started designing application-specific integrated circuits (ASICs) for bitcoin mining in 2012. Five years later, in 2017, it apparently eclipsed Nvidia as the world's most profitable chip producer, with reported profits of nearly $4 billion. Bitmain's ASICs perform peta-hashes per second-that is, 10¹⁵ hashes, or thousands of tera-hashes-which by this measure makes them the world's most powerful computing devices. Under pressure from the Chinese government, Bitmain has since metastasized around the globe and launched a division to build artificial intelligence and machine-learning devices. Don't bet against them.

Blockstack has been in operation on the Net for four years, with hundreds of thousands of users. It is a platform for security and identity for a new distributed Internet. It provides a domain name service rooted in the bitcoin blockchain, a $25 million venture fund, and a scalable model that reserves the blockchain for pointers to memory addresses rather than for data storage itself. It thus uses the blockchain for what a blockchain can offer-security, identity, and trust-while freeing the blockchain from what it cannot offer-huge transaction speeds and storage space. Its key figures are Muneeb Ali, Ryan Shea, Luke Nelson, and Michael J. Freedman of Princeton.

NEO is an Ethereum rival that is dominant in China. A kind of operating system for OnChain, China's hyperledger project to make businesses compliant with regulatory regimes, NEO has a politically savvy CEO in Hong Fei. The source of DNA (China's "distributed network architecture"), NEO is the foundation for a smart-contract platform adaptable to the politics of Asia, which despite political resistance is arguably still at the heart of the crypto movement.

Cardano, in preparation around the globe, is named after Girolamo Cardano, who originated probability theory in the sixteenth century.

IOTA is not a blockchain at all but a tangle of chained transactions where each transactor verifies two other transactions to qualify his own. Stemming from a community of crypto geeks in Norway, it aims to be hugely scalable and adaptable for an Internet of Things. Money seems to be pouring in.

In contrast to other cryptocurrencies, Hedera lacks open-source programmability, immutability, and time-stamped permanence of records, but it is a galvanic and scalable invention that will make its mark. Making no change to the Internet architecture, it poses relatively little threat to the established order. But it will accommodate any form of smart contract or other application on top. It is programmable not only in Java but also in Ethereum's Solidity.

Disaggregated will be all the GAFAM conglomerates (Google, Apple, Facebook, Amazon, and Microsoft)-the clouds of concentrated computing and commerce. A new hardware paradigm is moving beyond digital and silicon to analog and carbon nanotubes and hybrid chips with sensors and 5G antennas galore. Even money itself is being disaggregated and reinvented. The clouds are dispersing into the skies-sky computing rendered on your laptop and smartphone, spread across blockchains, transparent and transformative.

In 1948, when Shannon developed his Information Theory at MIT and Bell Labs, the world was preoccupied with communication across a noisy channel. Questions of truth and consequences deferred to questions of signal and noise. Information Theory began with "Communication Theory of Secrecy Systems." This paper proved that a perfectly random one-time pad constitutes an unbreakable code. It is a pillar of the theory of information as it defines one pole of a continuum between noise (white noise, purely random) and perfect order (predictable, deterministic, and information-free).

Bitcoin: A method of secure transactions based on wide publication and decentralization of a ledger across the Internet. Current credit card systems, by contrast, are based on secrecy and centralization and use protected networks and firewalled data centers filled with the personal information of the transactors.

Blockchain: A database, similar to a cadaster of real estate titles, extended to events, covenants, patents, licenses, or other permanent records. All are hashed together mathematically from the origin of the series, each record distributed and publicized on decentralized Internet nodes.
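
A minimal sketch of that hash-chaining in Python, using the standard-library hashlib (the record contents are invented; real blockchains add signatures, consensus, and proof-of-work on top). Because each record carries the hash of its predecessor, altering any early record invalidates every hash that follows.

```python
import hashlib
import json

def make_record(data: dict, prev_hash: str) -> dict:
    body = {"data": data, "prev": prev_hash}
    # Hash is computed over the record's contents plus the previous hash.
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    return body

chain = [make_record({"title": "lot 14, deeded to A"}, prev_hash="0" * 64)]
chain.append(make_record({"title": "lot 14, deeded to B"}, prev_hash=chain[-1]["hash"]))

# Verification: recompute each hash from the record's own contents.
for i, rec in enumerate(chain):
    body = {"data": rec["data"], "prev": rec["prev"]}
    ok = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest() == rec["hash"]
    print(i, "valid" if ok else "TAMPERED")
```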

Boltzmann's entropy: Heat (the total energy of all molecules in a system) over temperature (the average energy of the molecules). Ludwig Boltzmann (1844-1906) identified this quantity with missing information, or uncertainty about the arrangement of the molecules, thus opening the way for Claude Shannon and information theory. Both forms of entropy register disorder. Boltzmann's entropy is analog and governed by the natural logarithm e, while Shannon's entropy is digital and governed by log 2.
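
In symbols, the two entropies the entry compares take their standard textbook forms (not quoted from the book), where W counts the molecular arrangements consistent with the macrostate and p_i is the probability of the i-th symbol:

```latex
S = k_{B}\,\ln W
\qquad\text{versus}\qquad
H = -\sum_{i} p_{i}\,\log_{2} p_{i}
```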

Chaitin's Law: Gregory Chaitin, inventor of algorithmic information theory, ordains that you cannot use static, eternal, perfect mathematics to model dynamic creative life. Determinist math traps the mathematician in a mechanical process that cannot yield innovation or surprise, learning or life. You need to transcend the Newtonian mathematics of physics and adopt post-modern mathematics-the mathematics that follows Gödel (1931) and Turing (1936), the mathematics of creativity.

Economic growth: Learning tested by falsifiability or possible bankruptcy. This understanding of economic growth follows from Karl Popper's insight that a scientific proposition must be framed in terms that are falsifiable or refutable. Government guarantees prevent learning and thus thwart economic growth.

Gödel's Incompleteness Theorem: Kurt Gödel's discovery in mathematical logic that any formal system powerful enough to express the truths of arithmetic will be incomplete and dependent on axioms not reducible to the system-truths that cannot be proved within the system itself. In developing his proof, Gödel (1906-1978) invented a mathematical machine that used numbers to embody axioms and thus anticipated the discoveries of computer science. By showing that mathematics could not be hermetically sealed or physically determinist, Gödel opened the way to postmodern mathematics: a mathematics of software and creativity. John von Neumann (1903-1957) was the first person to appreciate and publicize the importance of Gödel's demonstration in 1931 that mathematical statements can be true but unprovable.

Gold: The monetary element, atomic number 79, tested over centuries and found uniquely suitable as money. The five precious metals in the Periodic Table are rhodium, palladium, silver, platinum, and gold. Rhodium and palladium are rare elements that were not discovered until the eighteenth century. Platinum's melting point is three thousand degrees Fahrenheit, making it unworkable without advanced technology. Silver tarnishes and corrodes, and its reactivity makes it more tractable for most industrial purposes than gold. Only gold can function as a durable and unchanging measuring stick for value. Usually thought to be money because it is a useful commodity-pretty, shiny, divisible, portable, scarce, and convertible into jewelry-gold is in fact the monetary element because it is useless. Money is not valuable because it is really jewelry; jewelry is valuable because it is really money. Gold is a metric of valuation based on the time to extract an incremental ounce, which has changed little over the centuries, while gold has become more difficult to extract from deeper and more attenuated lodes. The gold metric is therefore not a function of technology and industrial progress, part of what it measures, but a pure gauge of value.

Hash: Conversion of a digital file of variable length into a string of characters of a fixed length-in the Secure Hash Algorithm used in Bitcoin's blockchain cryptography (SHA-256), the output is always thirty-two bytes (256 bits). Hashes are prohibitively hard to invert; knowledge of the hash does not convey knowledge of the file, but knowledge of the file is readily converted into the hash. Any change to the file drastically changes the hash result. Hashes therefore reveal any tampering with the hashed data. The most common hash is the checksum at the end of every Internet packet. Hashes are the enabling technology of blockchains and hashgraphs.
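
The fixed output length and tamper-evidence described here can be seen directly with Python's standard-library hashlib; the sample messages are illustrative.

```python
import hashlib

a = hashlib.sha256(b"Pay Alice 10 coins").hexdigest()
b = hashlib.sha256(b"Pay Alice 11 coins").hexdigest()

print(len(bytes.fromhex(a)))   # always 32 bytes, whatever the input length
print(a)
print(b)                       # a one-character change yields an unrelated digest
```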

Hashgraph: Use of chained blocks (called "rounds") of hashes in a treelike structure, with an ingenious algorithm called "virtual voting,"

Hypertrophy of finance: The growth of finance beyond the rate of growth of the commerce it measures and intermediates. For example, international currency trading is roughly seventy-three times more voluminous than all global trading in goods and services and an estimated one hundred times as voluminous as all stock market transactions. Oil-futures trading has risen by a factor of one hundred in some three decades, from 10 percent of oil output in 1984 to ten times oil output in 2015. Derivatives on real estate are now nine times global GDP. That's not capitalism, that's hypertrophy of finance.

Information Theory: Begun by Kurt Gödel when he made logic into functional mathematics and algorithms. Information theory evolved through the minds of Claude Shannon (1916-2001) and Alan Turing (1912-1954) into its current role as mathematical philosophy. It depicts human creations and communications as transmissions across a channel, whether a wire or the world, in the face of the power of noise, with the outcome measured by its "news" or surprise, defined as entropy and consummated as knowledge.

Entropy is higher or lower depending on the freedom of choice of the sender. It is a libertarian index. The larger the available alphabet of symbols-that is, the larger the set of possible messages-the greater the composer's choice and the higher the entropy and information of the message.

Metcalfe's Law: The value and power of a network grows by the square of the number of compatible nodes it links. Named for the engineer Robert Metcalfe (1946-), a co-inventor of Ethernet.

Moore's Law: Cost-effectiveness in the computer industry doubles every two years. This pace corresponds closely to a faster pace in the number of transistors produced, signifying a learning curve. Formulated by Intel founder Gordon Moore (1929-) and inspired by Caltech professor Carver Mead's research, Moore's Law was originally based on the biennial doubling of the density of transistors on a silicon chip.
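
In compact form (standard statements of the two laws, not taken verbatim from the book): Metcalfe's value of a network of n compatible nodes, and Moore's biennial doubling of cost-effectiveness over time t.

```latex
V(n) \;\propto\; n^{2}
\qquad\text{and}\qquad
\frac{\text{performance}}{\text{dollar}}(t) \;\propto\; 2^{\,t / 2\,\text{years}}
```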

Noise: Interference in a message; any influence of the conduit on the content; an undesired disturbance in a communications channel.

Peirce's Triad: The theorem of the mathematician and philosopher Charles Sanders Peirce (1839-1914) holding that all symbol and sign systems (such as software and mathematics) are meaningless without an interpreter. The triad consists of a sign (or symbol), an object, and a human interpreter. Removing the interpreter empties the triad, leaving it to be filled by ideology and artifice (e.g., "machine-learning" and "artificial intelligence").

Public Key Cryptography: Most cryptography is symmetrical: the same key (or string of digital numbers) both encrypts and decrypts the message. This is fine if you can personally give the key to the recipient.
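
For contrast with the symmetric case the entry describes, here is a minimal sketch of asymmetric (public-key) encryption in Python. It assumes the third-party `cryptography` package, which the book does not mention: anyone may encrypt to the published public key, but only the holder of the private key can decrypt.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# The recipient generates a key pair and publishes only the public half.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Anyone can encrypt without first sharing a secret with the recipient...
ciphertext = public_key.encrypt(b"meet at dawn", oaep)

# ...but only the private-key holder can decrypt.
print(private_key.decrypt(ciphertext, oaep))
```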

Real money: A measuring stick, a metric of value, reflecting the scarcity and irreversible passage of time-entropy-based, equally distributed, and founded on the physical limits of the speed of light and the span of life. Bitcoin and gold are both real money in this sense.

Sand Hill Road: The arboreal abode of California venture capitalists and their "unicorns," stretching from the Camino Real near Stanford to Route 280 and into the clouds and wealth of Woodside and Silicon Valley. Losing its leadership in entrepreneurial capital to China, Israel, initial coin offerings (ICOs) around the world, and other fundraising sites, this seat of luxury is filling up with lawyers and politicians.

Shannon Entropy: Most simply measured by the number of binary digits needed to encode a message, it is calculated as the negative sum, over the possible components of the message, of each component's probability multiplied by the base-two logarithm of that probability.
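
A direct computation of that definition in Python (the example distributions are illustrative). Note how a larger, more evenly used alphabet raises the entropy, as the earlier remark on the sender's freedom of choice suggests.

```python
from math import log2

def shannon_entropy(probs):
    # H = -sum p * log2(p), in bits per symbol; terms with p = 0 contribute nothing.
    return -sum(p * log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))    # fair coin: 1.0 bit
print(shannon_entropy([1/8] * 8))     # uniform 8-symbol alphabet: 3.0 bits
print(shannon_entropy([0.9, 0.1]))    # highly predictable source: ~0.47 bits
```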

Turing Machine: Inspired by Gödel's proof, Turing conceived an abstract universal computer model consisting of a control unit administering a set of instructions reading, writing, and moving one space at a time back and forth along an infinitely long tape divided into squares along its length. He proved that this hypothetical machine could perform any computable function. Silicon Valley has been cheering ever since, despite his further proof that most numbers cannot be generated by a computational process. Turing's universal computer could not calculate whether any particular program would ever halt.
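
As a sketch of the idea rather than Turing's own construction, here is a toy Turing machine in Python whose transition table increments a binary number on the tape; the state names and rules are invented for the example.

```python
def run(tape, state="scan", blank="_", max_steps=1000):
    """A control unit reads a symbol, writes a symbol, and moves one square."""
    tape = dict(enumerate(tape))      # sparse tape: position -> symbol
    head = 0
    rules = {
        # (state, symbol) -> (symbol to write, head move, next state)
        ("scan", "0"): ("0", +1, "scan"),   # move right to the end of the number
        ("scan", "1"): ("1", +1, "scan"),
        ("scan", "_"): ("_", -1, "add"),
        ("add", "1"): ("0", -1, "add"),     # add one, carrying leftward
        ("add", "0"): ("1", 0, "halt"),
        ("add", "_"): ("1", 0, "halt"),
    }
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)
        write, move, state = rules[(state, symbol)]
        tape[head] = write
        head += move
    return "".join(tape[i] for i in sorted(tape)).strip(blank)

print(run("1011"))   # -> "1100", i.e. 11 + 1 = 12 in binary
```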

Wealth: Tested knowledge. Physical law dictates that matter is conserved: material resources have not changed since the Stone Age.
