News

Viewing posts tagged artificial intelligence

Noel Sharkey on the inexorable rise of robots

From Silicon.com:

In this video interview, Noel Sharkey, professor of robotics and AI at the University of Sheffield, discusses developments in robotics – from the proliferation of robots in Japan’s automotive industry to the stair-climbing dexterity of Honda’s Asimo robot and beyond.

He also discusses ethical issues, which countries have the most robots, and some of the implications.

Read the original article…

The Education Futures timeline of education

Education Futures celebrates its first five years of exploring new futures in human capital development with a timeline of the history of education from 1657 to 2045. The timeline provides not only a glimpse into modern education, but also plots out a plausible future history for human capital development. That future history is intended to be edgy, but also to serve as a conversation starter on futures for education and futures thinking in human capital development.

As always, we invite your feedback and suggestions for further development! We expect many enhancements and updates to this resource in the near future.

Kurzweil's Transcendent Man

We haven’t yet had an opportunity to screen Transcendent Man, the film about Ray Kurzweil, but The Futurist magazine published a preview:

Scene: A movie theater on the west side of Manhattan during the Tribeca Film Festival. The audience teems with hip New York film students eager to see the world premiere of a new documentary. They’re joined, unexpectedly, by computer scientists, geneticists, and futurists from Denmark, the United Kingdom, and Hong Kong. The lights dim. After a brief opening, inventor Ray Kurzweil appears on the screen, looks squarely into the camera, and says, “I’m never going to die.”

So began the world premiere of Barry Ptolemy’s Transcendent Man, a feature-length film that chronicles Kurzweil’s ideas on the future of technological innovation. Chief among his forecasts: In the next 30 years, humans will use genomics, nanotechnology, and even artificial intelligence to escape death.

The film is in limited release, and we will post more about it and its implications for education as soon as we have an opportunity to view it. In the meantime, read more at The Futurist or visit the film’s website.

Singularity University

This past week:

A shockwave passed through the singularity community today with the public launch of Singularity University at the NASA Ames campus in Silicon Valley. Singularity University aims to assemble a world-class community of thought leaders, academics, and entrepreneurs across the many fields of exponentially advancing technologies (nanotechnology, genetics, medicine, artificial intelligence, etc.) in order to address humanity’s grand challenges.

With significant backing from Google and NASA, and with the participation of a renowned cast of faculty and advisors, Singularity University is poised to become, literally overnight, a world-class institution for the innovation, collaboration, and leadership that will allow the world to capitalize on the great promise of technology to solve the world’s greatest problems…

Founded by Ray Kurzweil, Peter Diamandis (X Prize Foundation), and Larry Page (Google), Singularity University focuses its curriculum on technologies surrounding:

  • Future Studies & Forecasting
  • Networks & Computing Systems
  • Biotechnology & Bioinformatics
  • Nanotechnology
  • Medicine, Neuroscience & Human Enhancement
  • AI, Robotics, & Cognitive Computing
  • Energy & Ecological Systems
  • Space & Physical Sciences
  • Policy, Law & Ethics
  • Finance & Entrepreneurship

What’s missing, however, is a human capital development focus.  As the world approaches the Technological Singularity, how can we design better human capital futures?  Moreover, what are the social, cultural, and educational elements we need to start studying and working on today to ensure our success? …our survival?

Is there room for term papers in the 21st century?

The flak I caught yesterday regarding SafeAssign got me thinking about term papers in the 21st century. Information and communications technologies (ICTs) make it easy and rewarding to share information. More recently, however, ICTs are allowing people to build creative and innovative products from the information available. We’re evolving into a “cut-and-paste society.” Some examples:

  • YouTube, which allows anybody to share videos that interest them with anybody in the world for free
  • Mogulus, which allows anybody to create their own TV station for free (something that very recently required a sizable staff and millions of dollars of funding)
  • GarageBand, which provides people with tools to record, mix and publish their own music
  • Hip-hop, which often mixes, juxtaposes and generates new meanings from music, images and texts

Academic culture and traditions have not caught up to 21st-century society. What real value is there for society if we continue to place a heavy focus on traditional term papers and police their content to make sure no influence from modern society is present?

Creative work is also increasingly being generated by machines. Two examples are Brutus and the 20th century’s MINSTREL (see Noah’s comments). Why should we worry about originality in student work if we are perhaps only a couple of years (or months?) away from machines that can write original essays, theses, novels, etc., for students? …and what if these machines could write those documents better than, and vastly outperform, most students?

Is there something else schools should focus on?

BT futurist on Nobels and alien thinking

Australia’s Computerworld jumps on the futures bandwagon and provides insight into the 21st century (in stark contrast to what others are writing about the future). In an interview with British Telecom futurist Ian Pearson, a few daring predictions emerged:

1. “Thinking” is going to seem very alien to many people:

We will probably make conscious machines sometime between 2015 and 2020, I think. But it probably won’t be like you and I. It will be conscious and aware of itself and it will be conscious in pretty much the same way as you and I, but it will work in a very different way. It will be an alien. It will be a different way of thinking from us, but nonetheless still thinking. It doesn’t have to look like us in order to be able to think the same way.

2. Some machine intelligences will outsmart humans by 2020, and they will begin winning Nobel Prizes.

This raises an important concern. Our schools are not preparing students to thrive in an environment with a plurality of creative and intellectual modalities. Rather, through regimes such as No Child Left Behind, students are being transformed into cookie-cutter automatons. The irony is that as machines become much more intellectually capable and creative, human capital is becoming more mechanistic. Which has the better potential to thrive through this century?

Three Singularities, three conversations

Eliezer Yudkowsky, on the SIAI blog, posted his observations of the emergence of three “logically distinct” schools of thought related to the Singularity:

  1. Accelerating change (Ray Kurzweil, Alvin Toffler, John Smart): “technological change feeds on itself, and therefore accelerates” along a predictable curve.
  2. Event Horizon (Vernor Vinge): “Shortly, technology will advance to the point of improving on human intelligence (brain-computer interfaces, Artificial Intelligence). This will create a future that is weirder by far than most science fiction, a difference-in-kind that goes beyond amazing shiny gadgets.”
  3. Intelligence explosion (I.J. Good, Eliezer Yudkowsky [and, I’m sure, many others]): “the smarter you get, the more intelligence you can apply to making yourself even smarter.”

All three interpretations of the Singularity, Yudkowsky argues, require specific delineation to avoid being mashed together and interpreted as a single, apocalyptic metanarrative in popular discourse. Perhaps to better prepare educators for seemingly more absurd, ambiguous, and chaotic futures, we ought to build Singularity awareness, acceptance, and preparedness by serializing our conversations:

First, change is accelerating. The good news is that we can plot out, reasonably predict, and prepare for much of it. What changes are our schools prepared for?

Second, a smarter society will start to build smarter things. Human intelligence hasn’t increased, but distributed knowledge across society will help us build improved humans, successor species and machines that will outsmart us. Students enrolled in schools today will likely face a future where “natural” humans are no longer the most intelligent species on the planet. How can we prepare them?

Third, our future could be very, very weird. Period. Are we doing anything to prepare students for futures beyond anyone’s imagination?
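
To make the difference between the first and third schools concrete, here is a toy numerical sketch. The growth rules, parameters, and function names are my own illustrative assumptions, not Kurzweil’s or Yudkowsky’s math: “accelerating change” is modeled as a fixed exponential trend, while the “intelligence explosion” lets the growth rate scale with current capability, so that curve runs away after a finite number of steps.

    # Toy sketch only; the growth rules and parameters are illustrative
    # assumptions, not anyone's actual model of the Singularity.

    def accelerating_change(steps=50, rate=0.10, start=1.0):
        """School 1: capability grows along a fixed exponential trend."""
        series = [start]
        for _ in range(steps):
            series.append(series[-1] * (1 + rate))
        return series

    def intelligence_explosion(steps=50, coupling=0.10, start=1.0, cap=1e9):
        """School 3: the growth rate itself scales with current capability,
        so the curve blows up after a finite number of steps."""
        series = [start]
        for _ in range(steps):
            series.append(series[-1] + coupling * series[-1] ** 2)
            if series[-1] > cap:  # the toy model has clearly "exploded"
                break
        return series

    if __name__ == "__main__":
        trend = accelerating_change()
        boom = intelligence_explosion()
        print("accelerating change after 50 steps:", round(trend[-1], 1))
        print("toy intelligence explosion passes 1e9 after", len(boom) - 1, "steps")

Both curves accelerate, but only the self-referential one produces the runaway, finite-time behavior that distinguishes the intelligence-explosion school from simple trend extrapolation.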

Six scenarios for the Technological Singularity

Two articles related to the Singularity Summit have appeared on how to prepare for the Technological Singularity:

First, Jamais Cascio writes about the Metaverse Roadmap Overview:

In this work, along with my colleagues John Smart and Jerry Paffendorf, I sketch out four scenarios of how a combination of forces driving the development of immersive, richly connected information technologies may play out over the next decade. But what has struck me more recently about the Roadmap scenarios is that the four worlds could also represent four pathways to a Singularity. Not just in terms of the technologies, but — more importantly — in terms of the social and cultural choices we make while building those technologies.

The scenarios explored are:

  1. Virtual Worlds: the combination of simulation and intimate (highly personalized) technologies
  2. Mirror Worlds: the intersection of simulation and externally-focused technologies
  3. Augmented Reality: the collision of augmentation and external technologies
  4. Lifelogging: brings together augmentation and intimate technologies to record the experiences and histories of objects and users (what Cascio refers to as the “participatory panopticon”)

Read more at Open the Future

Second, Bryan Gardiner writes on the Wired blog that Peter Thiel, co-founder of PayPal, multi-millionaire Facebook backer, and president of Clarium Capital Management, a global macro hedge fund, is devising a Singularity-aware investment strategy based on two polarized scenarios for a near-future world in which machines become smarter than humans:

  1. Negative scenario: where machines won’t need us and humans become expendable
  2. Positive scenario: where humans would still have a positive outlook

Regardless of which scenario plays out, Gardiner points out that the volatile booms and busts of recent years are indicative of the market’s attempts to align itself with near-Singularity transformations:

In essence, he argues that each of these booms represent different bets on the singularity, or at least on various things that are proxies for it, like globalization. What’s more, we’ve been seeing them now for over 30 years.

The markets are catching on to accelerating change. Why not bet on the Singularity in our schools as well?

The future of search?

The semantic web approaches!

Powerlabs, which will launch in early September, utilizes Powerset, a large-scale search engine that breaks the confines of keyword search and takes advantage of the structure and nuances of natural language, according to the company. At the moment, they’re accepting sign-ups for pre-release experimentation.
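
Powerset has not published how its engine works, so the snippet below is only a minimal sketch of the general idea the company describes: keyword search treats a document as a bag of words, while a structure-aware (“natural language”) approach also considers who did what to whom. The documents, function names, and crude positional “parse” are invented purely for illustration.

    # Minimal illustrative sketch; Powerset's actual technology is not public.
    # The documents, function names, and positional "parse" below are invented
    # purely to contrast keyword matching with structure-aware matching.

    DOCS = [
        "IBM acquired a startup in 2007.",
        "A startup acquired new office space from IBM.",
    ]

    def tokens(sentence):
        """Lowercase word list with the trailing period stripped."""
        return sentence.lower().rstrip(".").split()

    def keyword_match(query_terms, doc):
        """Bag-of-words matching: word order and grammatical role are ignored."""
        return set(query_terms) <= set(tokens(doc))

    def naive_subject_of(verb, doc):
        """Crude positional guess at the verb's subject (the word before it).
        Real systems use full linguistic parsers; this is only an illustration."""
        words = tokens(doc)
        return words[words.index(verb) - 1] if verb in words else None

    # Query idea: "What did IBM acquire?"
    for doc in DOCS:
        print(doc)
        print("  matches keywords {ibm, acquired}:", keyword_match({"ibm", "acquired"}, doc))
        print("  IBM is the one doing the acquiring:", naive_subject_of("acquired", doc) == "ibm")

Both documents match the keywords {ibm, acquired}, but only the first one answers a question like “What did IBM acquire?”, which is the kind of distinction a natural-language search engine is meant to capture.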

Singularity Institute blog launched

The Singularity Institute for Artificial Intelligence (SIAI) has launched a blog covering research and outreach updates, videos, articles, papers, events, goals, and relevant science and technology news.

SIAI is a not-for-profit research institute in Palo Alto, California, with three major goals: furthering the nascent science of safe, beneficial advanced artificial intelligence (self-improving systems) through research and development, research fellowships, research grants, and science education; furthering understanding of its implications for society through the AI Impact Initiative and the annual Singularity Summit; and furthering education among students to foster scientific research.