Brinty
07-15-2009, 01:57 AM
The following question was posed to David Morrison of NASA and his answer follows.
I read Seth Shostak's new book, and he said that it's highly probable that any advanced intelligent life would be taken over by the artificial intelligence it created (like computers). Do you agree?
This is an intriguing suggestion, and Seth Shostak is not the first person to make it. Obviously I don’t know whether or not it is true, but it sounds possible to me. We may find out: projections of computer power suggest that we will have computers that are really smarter than humans within the next 25 years. This point is called the technological singularity (or just the singularity for short). Wikipedia (en.wikipedia.org/wiki/Technological_singularity) says that “in 1965, I. J. Good first wrote of an intelligence explosion, suggesting that if machines could even slightly surpass human intellect, they could improve their own designs in ways unforeseen by their designers....In 1983, Vernor Vinge called this event ‘the Singularity’ as an analogy between the breakdown of modern physics near a gravitational singularity and the drastic change in society he argues would occur following an intelligence explosion.” You can also read about a new organization, Singularity University (singularityu.org/), which is enrolling its first students this summer.