Increasingly, Computer Systems May Harness Us and Our Data to Machines, Often Without Our Knowledge. How Should We Regulate That?
Back in 1999, Tim Berners-Lee, inventor of the World Wide Web, envisioned a time when computers would be used “to create abstract social machines on the Web: processes in which the people do the creative work and the machine does the administration.” More than 15 years later, the idea of social machines remains both arcane and relatively unexamined. Yet such machines are all around us. Many are built on social networks such as Facebook, in which human interactions, from organizing a birthday party to protesting terrorist attacks, are underpinned by an engineered computing environment. Others are to be found in massively multiplayer online games, where a persistent online environment facilitates interactions between real people concerning virtual resources.
It seems that every day brings a new advance or discovery in the energy sector. From solar to fusion to thorium, it is hard to determine what the future of energy will look like and what impacts these advances will have on the world. Over the next few weeks, The Future of Energy series will attempt to explore how new sources of energy work (or could work), the obstacles to their adoption, and their potential geopolitical impact.
Of all the “alternative” energy sources pursued today, shale gas and tight oil are the best developed and most widely adopted. Shale is also the “oldest,” in that the extraction of natural gas and other hydrocarbons from shale has technically been practiced for over a hundred years.
In 1965, Dr. Gordon E. Moore wrote an article based on a trend he had noticed: the number of transistors in a dense integrated circuit (IC) doubles approximately every two years. This observation was later dubbed Moore’s Law, and, fueled by unrelenting demand for more complex software, faster games, and higher-bandwidth video, it has held true for nearly 50 years. It became the de facto roadmap against which the semiconductor industry drives its research and development. But that roadmap may now be faltering, owing to fundamental physics limitations at the incredibly small scales at which we fabricate chips. Can we find novel ways to circumvent these limits and thereby achieve a Moore’s Law 2.0? If we succeed, what implications might such computational capacity have for society?
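The power of that doubling is easy to underestimate. A minimal sketch of the arithmetic, assuming the simplified “doubling every two years” reading of Moore’s Law and using the Intel 4004 (circa 1971, roughly 2,300 transistors) as an illustrative baseline not taken from the article:

```python
def projected_transistors(base_count: float, base_year: int, year: int,
                          doubling_period_years: float = 2.0) -> float:
    """Transistor count implied by doubling every `doubling_period_years`."""
    doublings = (year - base_year) / doubling_period_years
    return base_count * 2 ** doublings

# Fifty years of doubling every two years is 25 doublings:
# a growth factor of 2**25, roughly 33.5 million.
growth_factor = 2 ** (50 / 2)
print(f"Growth factor over 50 years: {growth_factor:,.0f}")

# Illustrative projection from the assumed 1971 baseline to 2021.
print(f"Projected count in 2021: {projected_transistors(2_300, 1971, 2021):,.0f}")
```

The point of the sketch is only the scale: sustaining even a modest doubling period for half a century compounds into a multi-million-fold increase, which is why any slowdown in the doubling rate matters so much.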
Inexorable Concentration of Capital Undermines the Drive for 'Shared Prosperity'
Like seismic waves rippling outward after a tectonic shift, reverberations are roiling the economic-policy landscape after the US launch of a groundbreaking new analysis by Thomas Piketty, the scholar from the Paris School of Economics whose landmark tome, Capital in the Twenty-First Century, has jolted the economics profession.
Any Washingtonian or World Bank Group staffer who somehow missed the news of Piketty’s celebrated series of speeches and seminars last week, in Washington, New York, and Boston, received an unmistakable signal this week of what an important intellectual breakthrough Piketty has achieved. President Jim Yong Kim cited Piketty on Tuesday while putting the issue of economic inequality at the top of his list of priorities in his review of the Spring Meetings of the Bank and the International Monetary Fund. Noting that he was already about halfway through reading Piketty’s “Capital,” President Kim sent a clear message: the skewed global distribution of wealth, as analyzed by Piketty and emphasized by many officials at the two institutions’ semiannual conference, should be top of mind for policy-watchers at the Bank and beyond, indeed at every institution that hopes to promote shared prosperity.
With an Urgent New Focus on Overcoming Inequality
The challenge of promoting shared prosperity was one of the unifying themes of the recent Spring Meetings of the World Bank Group and International Monetary Fund, the whirlwind of diplomacy and scholarship that sweeps through Washington every April and October. A remarkable new factor energized this spring’s event, however: in a vivid evolution of the policy debate, the seminars, forums, and news-media coverage focused, to a greater degree than ever, not just on the economic question of generating overall growth but on what has traditionally been treated as a social question, the distribution of wealth.
Now Jeremy Rifkin endeavors to take us one step further. In The Zero Marginal Cost Society, he argues for the next step in the human journey: applying the principles and benefits of zero-marginal-cost virtual space to physical reality, through decentralized renewable energy production at near-zero post-investment cost, enveloped in ubiquitous wireless computing and sensing networks, the Internet of Things (IoT). The pervasive capitalist order, Rifkin maintains, is giving way to a hybrid economy, one that incorporates both traditional capitalism and the growing segment of technologically empowered peer-to-peer individuals that Rifkin so eloquently calls the "Collaborative Commons."
But could the Internet of future generations be even more revolutionary? Keeping in mind that the Internet evolved largely without central guidance (the US government recently announced it will “transition out of managing domain names and addresses for the Internet Corporation for Assigned Names and Numbers (ICANN)”), what new forms and functions will this global system take as technologies such as robotics, autonomous vehicles, and ubiquitous sensors move toward an online presence? To understand these changes, we trace the Internet through four major stages, from Web 1.0 to Web 4.0.