The official site of bestselling author Michael Shermer


Apocalypse A.I.

Artificial intelligence as existential threat


In 2014 SpaceX CEO Elon Musk tweeted: “Worth reading Superintelligence by Bostrom. We need to be super careful with AI. Potentially more dangerous than nukes.” That same year University of Cambridge cosmologist Stephen Hawking told the BBC: “The development of full artificial intelligence could spell the end of the human race.” Microsoft co-founder Bill Gates also cautioned: “I am in the camp that is concerned about super intelligence.”

How the AI apocalypse might unfold was outlined by computer scientist Eliezer Yudkowsky in a paper in the 2008 book Global Catastrophic Risks: “How likely is it that AI will cross the entire vast gap from amoeba to village idiot, and then stop at the level of human genius?” His answer: “It would be physically possible to build a brain that computed a million times as fast as a human brain…. If a human mind were thus accelerated, a subjective year of thinking would be accomplished for every 31 physical seconds in the outside world, and a millennium would fly by in eight-and-a-half hours.” Yudkowsky thinks that if we don’t get on top of this now it will be too late: “The AI runs on a different timescale than you do; by the time your neurons finish thinking the words ‘I should do something’ you have already lost.”

The paradigmatic example is University of Oxford philosopher Nick Bostrom’s thought experiment of the so-called paperclip maximizer, presented in his book Superintelligence: An AI is designed to make paperclips, and after running through its initial supply of raw materials, it utilizes any available atoms that happen to be within its reach, including humans. As he described in a 2003 paper, from there it “starts transforming first all of earth and then increasing portions of space into paperclip manufacturing facilities.” Before long, the entire universe is made up of paperclips and paperclip makers.


ClimeApocalypse

Or just another line item in the budget?

In the year 2393 a historian in the Second People’s Republic of China penned a book about how scientists, economists and politicians living in the 21st century failed to act on solid science that gave clear warnings of the climate catastrophe ahead. As a result, the world experienced the Great Collapse of 2093, bringing an end to Western civilization.

So speculate historians of science Naomi Oreskes of Harvard University and Erik Conway of the California Institute of Technology in their book The Collapse of Western Civilization: A View from the Future (Columbia University Press, 2014), a short scientific-historical fantasy. During the second half of the 20th century—the “Period of the Penumbra”—a shadow of anti-intellectualism “fell over the once-Enlightened techno-scientific nations of the Western world…preventing them from acting on the scientific knowledge available at the time and condemning their successors to the inundation and desertification of the late twenty-first and twenty-second centuries.”
