Eliezer Yudkowsky, on the SIAI blog, posted his observations of the emergence of three “logically distinct” schools of thought related to the Singularity:
- Accelerating change (Ray Kurzweil, Alvin Toffler, John Smart): “technological change feeds on itself, and therefore accelerates” along a predictable curve.
- Event Horizon (Vernor Vinge): “Shortly, technology will advance to the point of improving on human intelligence (brain-computer interfaces, Artificial Intelligence). This will create a future that is weirder by far than most science fiction, a difference-in-kind that goes beyond amazing shiny gadgets.”
- Intelligence explosion (I.J. Good, Eliezer Yudkowsky [and, I’m sure, many others]): “the smarter you get, the more intelligence you can apply to making yourself even smarter.”
All three interpretations of the Singularity, Yudkowsky argues, require specific delineation to avoid being mashed together, and interpreted, as a single apocalyptic metanarrative in popular discourse. Perhaps, to better prepare educators for seemingly more absurd, ambiguous, and chaotic futures, we ought to build Singularity awareness, acceptance, and preparedness by serializing our conversations:
First, change is accelerating. The good news is that we can plot out, reasonably predict, and prepare for much of it. What changes are our schools prepared for?
Second, a smarter society will start to build smarter things. Human intelligence hasn’t increased, but knowledge distributed across society will help us build improved humans, successor species, and machines that outsmart us. Students enrolled in schools today will likely face a future where “natural” humans are no longer the most intelligent species on the planet. How can we prepare them?
Third, our future could be very, very weird. Period. Are we doing anything to prepare students for futures beyond anyone’s imagination?