Information Age Education
Issue Number 157
March 2015

This free Information Age Education Newsletter is edited by Dave Moursund and Bob Sylwester, and produced by Ken Loge. The newsletter is one component of the Information Age Education (IAE) publications.

All back issues of the newsletter and subscription information are available online. In addition, four free books based on the newsletters are available: Understanding and Mastering Complexity; Consciousness and Morality: Recent Research Developments; Creating an Appropriate 21st Century Education; and Common Core State Standards for Education in America. We are in the process of creating a free IAE book based on the Education for Students' Futures series of newsletters; it will be available in early April 2015.

This issue of the IAE Newsletter is the first of two newsletters on the topic of the Technological Singularity, and is part of the Education for Students' Futures series of newsletters.

Education for Students' Futures
Part 20: The Coming Technological Singularity

David Moursund
Professor Emeritus, College of Education
University of Oregon

“Any sufficiently advanced technology is indistinguishable from magic.” (Arthur C. Clarke; British science fiction author, inventor, and futurist; 1917-2008.)

This is the first of two IAE Newsletters about the increasingly rapid pace of technological change going on in our world. It is a slight modification of an earlier IAE Blog entry.

The term technological singularity refers to a time in the future when computers become much "smarter" than people. The pace of technological progress is currently rapid and still accelerating. We already have artificially intelligent computer systems that are more capable than humans in certain limited areas. However, we still seem far from a time in which computer intelligence exceeds human intelligence across the broad range of human intellectual endeavors.

Technological Singularity

Quoting from Wikipedia:

The first use of the term "singularity" in this context was by mathematician John von Neumann. In 1958, regarding a summary of a conversation with von Neumann, Stanislaw Ulam described "ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue."

Irving John ("Jack") Good was a mathematician and cryptologist who worked with Alan Turing. He believed an ultraintelligent computer might be built before the end of the 20th century. Quoting Good (1965):

Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an "intelligence explosion," and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make, provided that the machine is docile enough to tell us how to keep it under control.... It is more probable than not that, within the twentieth century, an ultraintelligent machine will be built and that it will be the last invention that man need make.

The idea of a technological singularity has been popularized by science fiction writer and mathematician/computer scientist Vernor Vinge (Vinge, 1993). Quoting Vinge:

The acceleration of technological progress has been the central feature of this century. I argue in this paper that we are on the edge of change comparable to the rise of human life on Earth. The precise cause of this change is the imminent creation by technology of entities with greater than human intelligence.

Ray Kurzweil

Ray Kurzweil is a computer scientist and engineer who has written and talked extensively about the coming singularity. See the two videos (Kurzweil, 4/28/2009; Kurzweil, 6/4/2014) and his book, The Singularity Is Near (Kurzweil, 2005). We are creeping up on the technological singularity. Here is a forecast from Kurzweil (6/4/2014):

"Jeopardy" is a very broad natural language game, and Watson got a higher score than the best two [human] players combined. It got this query correct: "A long, tiresome speech delivered by a frothy pie topping," and it quickly responded, "What is a meringue harangue?" And Jennings and the other guy didn't get that. It's a pretty sophisticated example of computers actually understanding human language, and it actually got its knowledge by reading Wikipedia and several other encyclopedias.

Five to 10 years from now, search engines will actually be based on not just looking for combinations of words and links but actually understanding, reading for understanding the billions of pages on the web and in books. So you'll be walking along, and Google will pop up and say, "You know, Mary, you expressed concern to me a month ago that your glutathione supplement wasn't getting past the blood-brain barrier. Well, new research just came out 13 seconds ago that shows a whole new approach to that and a new way to take glutathione. Let me summarize it for you."

Twenty years from now, we'll have nanobots, because another exponential trend is the shrinking of technology. They'll go into our brain through the capillaries and basically connect our neocortex to a synthetic neocortex in the cloud providing an extension of our neocortex.
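Kurzweil's forecasts rest on the idea that technological capability grows exponentially rather than linearly. As a minimal illustration (not part of Kurzweil's talk), the short Python sketch below assumes a hypothetical 18-month doubling period, roughly in the spirit of Moore's law, and shows how quickly such a trend compounds:

```python
# Illustrative only: compound growth under an assumed doubling period.
# The 18-month doubling time is a rough, Moore's-law-style assumption,
# not a figure taken from Kurzweil.

def growth_factor(years, doubling_months=18):
    """How many times larger a quantity becomes after `years`,
    if it doubles once every `doubling_months` months."""
    doublings = years * 12 / doubling_months
    return 2 ** doublings

for years in (5, 10, 20):
    print(f"{years:2d} years -> roughly {growth_factor(years):,.0f}x")
```

Under that assumed doubling rate, capability grows about tenfold in 5 years and roughly ten-thousandfold in 20, which is why exponential trends so routinely outrun our linear intuitions about the future.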

Many Are Concerned by the Technology Trend

Are you frightened by the concept of a technological singularity? Do you want the technological singularity to occur? If the technological singularity does occur, it certainly will disrupt humanity.

Stephen Hawking is one of Britain’s pre-eminent scientists (Cellan-Jones, 12/2/2014). Quoting from this BBC article:

Prof Stephen Hawking, one of Britain's pre-eminent scientists, has said that efforts to create thinking machines pose a threat to our very existence.

He told the BBC: "The development of full artificial intelligence could spell the end of the human race."

The theoretical physicist [Hawking], who has the motor neurone disease amyotrophic lateral sclerosis (ALS), is using a new system developed by Intel to speak.

Machine learning experts from the British company Swiftkey were also involved in its creation. Their technology, already employed as a smartphone keyboard app, learns how the professor thinks and suggests the words he might want to use next.

Prof Hawking says the primitive forms of artificial intelligence developed so far have already proved very useful, but he fears the consequences of creating something that can match or surpass humans.

Final Remarks

The possibility of a technological singularity is an interesting and challenging global problem. Kurzweil and others believe it is inevitable; they disagree only about when it might happen, and they point to the rapid progress that is already occurring. Hawking and others fear it happening, or argue that it will never happen. The naysayers contend that there are many aspects of a human being that can never be captured in a machine.

Certainly technological progress and the changes it is bringing to our world are a major, ongoing force affecting all of us. Thus, a modern education should include helping students understand the current and increasing pace of technological change, how it affects them now, and how it will affect them in the future. As a teacher or parent, you can stay informed about major technological changes and share your insights with the students and others in your life.

Cellan-Jones, R. (12/2/2014). Stephen Hawking warns artificial intelligence could end mankind. BBC News. Retrieved 2/22/2015.

Good, I.J. (1965). Speculations concerning the first ultraintelligent machine. In Franz L. Alt and Morris Rubinoff (Eds.), Advances in Computers, vol. 6, pp. 31-88. Academic Press.

Kurzweil, R. (6/4/2014). The hierarchy in your brain: Ray Kurzweil at TED 2014. Retrieved 2/23/2015.

Kurzweil, R. (4/28/2009). The coming singularity. (7:10 video.) Retrieved 2/23/2015.

Kurzweil, R. (2005). The singularity is near: When humans transcend biology. New York, NY: Viking.

Vinge, V. (1993). The coming technological singularity: How to survive in the post-human era. Retrieved 2/22/2015. The original version of this article was presented at the VISION-21 Symposium sponsored by NASA Lewis Research Center and the Ohio Aerospace Institute.


David Moursund is an Emeritus Professor of Education at the University of Oregon, and coeditor of the IAE Newsletter. His professional career includes founding the International Society for Technology in Education (ISTE) in 1979, serving as ISTE's executive officer for 19 years, and establishing ISTE's flagship publication, Learning and Leading with Technology. He was the major professor or co-major professor for 82 doctoral students. He has presented hundreds of professional talks and workshops. He has authored or coauthored more than 60 academic books and hundreds of articles; many of these books are available free online. In 2007, Moursund founded Information Age Education (IAE). IAE provides free online educational materials via its IAE-pedia, IAE Newsletter, IAE Blog, and books.


Reader Comments

We are using the Disqus commenting system to facilitate comments and discussions pertaining to this newsletter. To use Disqus, please click the Login link below and sign in. If you have questions about how to use Disqus, please refer to this help page.

Readers may also send comments directly to the editors via email.

About Information Age Education, Inc.

Information Age Education is a non-profit organization dedicated to improving education for learners of all ages throughout the world. Current IAE activities and free materials include the IAE-pedia, a website containing free books and articles, a blog, and the free newsletter you are now reading. All back issues of the Blog and the Newsletter are available online.