Harold Stoner Clark Lectures





Technology Strategy and Existential Risks

Tuesday, February 17, 2015
11:10 am, Samuelson Chapel

The concept of existential risk provides an important focusing lens for long-term global technology strategy. Existential risks are those that threaten to destroy Earth-originating intelligent life or permanently curtail its potential for desirable future development. But figuring out how to reduce existential risk is very difficult. Do we want faster or slower progress in particular technological areas? Faster or slower economic growth? More or less global integration?


Superintelligence

Tuesday, February 17, 2015
4:00 pm, Samuelson Chapel

If machine intelligence one day surpasses biological intelligence, a new era begins. Superintelligence would be the last invention humanity would ever need to make. Managing the transition to the machine intelligence era, however, will be an extremely difficult challenge. Bostrom will discuss some aspects of this challenge and how it relates to the wider macrostrategic situation in which we now find ourselves.




Nick Bostrom

Nick Bostrom, Ph.D., is a professor in the Faculty of Philosophy at Oxford University and the founding director of the Future of Humanity Institute, a multidisciplinary center for careful thought about global priorities and big questions for humanity. He has a background in physics, computational neuroscience, and mathematical logic as well as philosophy. He is the author of some 200 publications, most recently the book Superintelligence: Paths, Dangers, Strategies (Oxford UP, 2014).

Bostrom is a recipient of the Eugene R. Gannon Award (one person selected annually worldwide from the fields of philosophy, mathematics, the arts and other humanities, and the natural sciences). He has been listed in Foreign Policy's Top 100 Global Thinkers and on Prospect magazine's World Thinkers list, where he was the youngest person in the top 15 across all fields.