IBM CEO Ginni Rometty on the biggest misconception about intelligent machines
Posted by hkarner - 20 October 2016
Source: The Wall Street Journal
Subject: The Natural Side of A.I.
The rise of artificial intelligence has inspired both fascination and fear of the world to come. Some tech prophets envision a “singularity,” in which advances in AI trigger drastic technological growth, while others imagine that autonomous machines will someday turn on their creators and destroy us. But when you’re engaged in the science of machine intelligence, you understand that this is a false set of choices shaped by a misleading phrase.
The term “artificial intelligence” was coined in 1955 to convey the concept of general intelligence: the notion that all human cognition stems from one or more underlying algorithms, and that by programming computers to think in the same way, we could create autonomous systems modeled on the human brain.
At the same time, other researchers were taking a different approach. Their method—which worked bottom up to find patterns in growing volumes of data—was called IA, short for “intelligence augmentation.” Ironically, the methodology not modeled on the human brain has led to the systems we now describe as cognitive. IA is behind real-world applications such as language processing, machine learning and human-computer interaction. The term “AI” won out in the end, despite being a misnomer.
Whether you call them AI or IA, these cognitive systems, such as IBM’s Watson, can ingest vast amounts of data, learn from it, and then reason in the form of hypotheses or recommendations. They have transformed industry after industry, from law to medicine, from education to retail. Oncologists at Memorial Sloan Kettering use cognitive systems to help understand large swaths of medical data and then diagnose patients’ cancers; Macy’s uses them to send shoppers personalized offers as they walk through the door. Watson is working with “Sesame Street” to create a cognitive tutor for the next generation of collaborative learning tools; even fashion designers, movie editors and chefs use cognitive technologies as part of their creative process.
The world is producing an enormous amount of data every day—video, audio, impulses from sensors, medical journals, movie scripts, etc. About 80% of it is the unstructured kind that would be dark to traditional computers, which can capture the information but can’t understand what it means. Cognitive systems can.
These cognitive systems are neither autonomous nor sentient, but they form a new kind of intelligence that has nothing artificial about it. They augment our capacity to understand what is happening in the complex world around us.
Leaders of every industry and institution are sprinting to become digital, adopting digital products, operations and business models. But once everything becomes digital, who will win?
The answer is clear: It will be the companies and the products that make the best use of data. Data is the great new natural resource of our time, and cognitive systems are the only way to get value from all its volume, variety and velocity. Having ingested a fair amount of data myself, I offer this rule of thumb: If it’s digital today, it will be cognitive tomorrow.