Superintelligence : paths, dangers, strategies (Book)

Superintelligence : paths, dangers, strategies / Nick Bostrom, Director, Future of Humanity Institute, Professor, Faculty of Philosophy & Oxford Martin School, University of Oxford.

Summary:

The human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. Other animals have stronger muscles or sharper claws, but we have cleverer brains. If machine brains one day come to surpass human brains in general intelligence, then this new superintelligence could become very powerful. As the fate of the gorillas now depends more on us humans than on the gorillas themselves, so the fate of our species then would come to depend on the actions of the machine superintelligence. But we have one advantage: we get to make the first move. Will it be possible to construct a seed AI or otherwise to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation? To get closer to an answer to this question, we must make our way through a fascinating landscape of topics and considerations. Read the book and learn about oracles, genies, singletons; about boxing methods, tripwires, and mind crime; about humanity's cosmic endowment and differential technological development; indirect normativity, instrumental convergence, whole brain emulation and technology couplings; Malthusian economics and dystopian evolution; artificial intelligence, and biological cognitive enhancement, and collective intelligence.

Record details

  • ISBN: 9780199678112
  • ISBN: 0199678111
  • Physical Description: xvi, 328 pages : illustrations ; 25 cm
  • Edition: First edition.
  • Publisher: Oxford, United Kingdom : Oxford University Press, 2014.

Content descriptions

Bibliography, etc. Note:
Includes bibliographical references (pages 305-324) and index.
Formatted Contents Note:
Past developments and present capabilities -- Paths to superintelligence -- Forms of superintelligence -- The kinetics of an intelligence explosion -- Decisive strategic advantage -- Cognitive superpowers -- The superintelligent will -- Is the default outcome doom? -- The control problem -- Oracles, genies, sovereigns, tools -- Multipolar scenarios -- Acquiring values -- Choosing the criteria for choosing -- The strategic picture -- Crunch time.
Subject: Artificial intelligence > Philosophy.
Cognitive science.

Available copies

  • 1 of 1 copy available at Sage Library System.

Holds

  • 0 current holds with 1 total copy.
  • Location: Pendleton Public Library
  • Call Number / Copy Notes: 006.301 B657 (Text)
  • Barcode: 37801000500833
  • Shelving Location: Adult Non-Fiction
  • Status: Available
  • Due Date: -

Electronic resources

  • Contributor biographical information: http://catdir.loc.gov/catdir/enhancements/fy1413/2013955152-b.html
  • Publisher description: http://catdir.loc.gov/catdir/enhancements/fy1413/2013955152-d.html
  • Table of contents only: http://catdir.loc.gov/catdir/enhancements/fy1413/2013955152-t.html

MARC Record

LDR 03369cam a2200373 i 4500
001 1519223
003 SAGE
005 20250508001659.0
008 131112s2014 enka b 001 0 eng d
010 . ‡a 2013955152
020 . ‡a9780199678112
020 . ‡a0199678111
035 . ‡a(OCoLC)857786110 ‡z(OCoLC)887687728
040 . ‡dUtOrBLW
042 . ‡alccopycat
082 04. ‡a006.301 ‡223
100 1 . ‡aBostrom, Nick, ‡d1973- ‡eauthor. ‡0(SAGE)1554031
245 10. ‡aSuperintelligence : ‡bpaths, dangers, strategies / ‡cNick Bostrom, Director, Future of Humanity Institute, Professor, Faculty of Philosophy & Oxford Martin School, University of Oxford.
250 . ‡aFirst edition.
264 1. ‡aOxford, United Kingdom : ‡bOxford University Press, ‡c2014.
300 . ‡axvi, 328 pages : ‡billustrations ; ‡c25 cm
336 . ‡atext ‡btxt ‡2rdacontent
337 . ‡aunmediated ‡bn ‡2rdamedia
338 . ‡avolume ‡bnc ‡2rdacarrier
504 . ‡aIncludes bibliographical references (pages 305-324) and index.
505 00. ‡tPast developments and present capabilities -- ‡tPaths to superintelligence -- ‡tForms of superintelligence -- ‡tThe kinetics of an intelligence explosion -- ‡tDecisive strategic advantage -- ‡tCognitive superpowers -- ‡tThe superintelligent will -- ‡tIs the default outcome doom? -- ‡tThe control problem -- ‡tOracles, genies, sovereigns, tools -- ‡tMultipolar scenarios -- ‡tAcquiring values -- ‡tChoosing the criteria for choosing -- ‡tThe strategic picture -- ‡tCrunch time.
520 . ‡aThe human brain has some capabilities that the brains of other animals lack. It is to these distinctive capabilities that our species owes its dominant position. Other animals have stronger muscles or sharper claws, but we have cleverer brains. If machine brains one day come to surpass human brains in general intelligence, then this new superintelligence could become very powerful. As the fate of the gorillas now depends more on us humans than on the gorillas themselves, so the fate of our species then would come to depend on the actions of the machine superintelligence. But we have one advantage: we get to make the first move. Will it be possible to construct a seed AI or otherwise to engineer initial conditions so as to make an intelligence explosion survivable? How could one achieve a controlled detonation? To get closer to an answer to this question, we must make our way through a fascinating landscape of topics and considerations. Read the book and learn about oracles, genies, singletons; about boxing methods, tripwires, and mind crime; about humanity's cosmic endowment and differential technological development; indirect normativity, instrumental convergence, whole brain emulation and technology couplings; Malthusian economics and dystopian evolution; artificial intelligence, and biological cognitive enhancement, and collective intelligence.
650 0. ‡aArtificial intelligence ‡xPhilosophy. ‡0(SAGE)1529510
650 0. ‡aCognitive science. ‡0(SAGE)1487934
856 42. ‡3Contributor biographical information ‡uhttp://catdir.loc.gov/catdir/enhancements/fy1413/2013955152-b.html
856 42. ‡3Publisher description ‡uhttp://catdir.loc.gov/catdir/enhancements/fy1413/2013955152-d.html
856 41. ‡3Table of contents only ‡uhttp://catdir.loc.gov/catdir/enhancements/fy1413/2013955152-t.html
999 . ‡ebook
905 . ‡uadmin
901 . ‡a857786110 ‡bOCoLC ‡c1519223 ‡tbiblio
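
For anyone who wants to work with the raw MARC data above programmatically rather than read it by eye, the tags and subfield codes map directly onto what a MARC library exposes. Below is a minimal sketch using the Python pymarc package; the filename superintelligence.mrc is a hypothetical binary export of this record, not a file supplied by the catalog.

from pymarc import MARCReader

# Read the exported bibliographic record (hypothetical file name).
with open("superintelligence.mrc", "rb") as fh:
    for record in MARCReader(fh):
        # Control field 001 holds the record identifier (1519223 above).
        print("Record ID:", record["001"].data)

        # Field 020 subfield a carries an ISBN; the field repeats, as here.
        for field in record.get_fields("020"):
            for isbn in field.get_subfields("a"):
                print("ISBN:", isbn)

        # Field 245 subfields a and b are the title proper and subtitle.
        title = record["245"]
        print("Title:", title["a"], title["b"])

        # The 856 fields are the electronic resource links listed earlier.
        for field in record.get_fields("856"):
            print(field["3"], "->", field["u"])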
