How cognitive computing is driving the race to give machines a 'human brain'

From HAL and C-3PO (or Metal Mickey for kids who grew up on British TV in the 80s) to humanoid androids like Ash in Alien and Data in Star Trek, films and TV have for decades portrayed a vision of the future where machines have some level of human intelligence.

In scientific labs across the globe, engineers and scientists are trying to make this a reality, and it's called cognitive computing.

Historically, computers have had to be explicitly programmed and have worked to fixed algorithms. The next stage is about machine intelligence and self-learning systems. It's about the holy grail of trying to mimic the human brain. The scientists pursuing this are called neuromorphic engineers, and there are several big research projects dedicated to designing a computer that replicates the human brain.

One is the 10-year, $1.3bn EU-backed Human Brain Project, launched last year. It aims to create a computer 1,000 times faster than the fastest supercomputers we use today. This project, and others like it, aims to build data-driven models with learning rules that are as close as possible to the actual rules used by the human brain.

Those who watch too much sci-fi might say this is the point where the machines take over and become our masters. So why do we need this massive computing power and machine intelligence? The simple answer is the amount of data now being generated. As Big Data collides with trends such as the Internet of Things, we will need more intelligent machines to help us make sense of it all.

And it's not all lab theory. There are already early examples of this cognitive computing approach in the real world. The voice-activated personal digital assistants on our smartphones are a basic example of this type of computing, remembering and learning our interests, contacts, the places we regularly visit… and we can expect these to become far more sophisticated in the coming years.

Last year Google used neural network technologies to scan and identify a database of 10 million images without supervision. The system was able to train itself to identify cats and Google has used the technology to improve its image search service.
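Google's actual system was a large deep neural network, but the core idea of learning categories from unlabelled data can be illustrated with a much simpler unsupervised algorithm. The sketch below is an illustrative analogy only, not Google's method: a minimal k-means clustering routine in NumPy that groups points into categories without ever being told what the categories are.

```python
import numpy as np

def kmeans(points, k, iters=10):
    """Minimal k-means: group unlabelled points into k clusters."""
    # For a reproducible demo, initialise centres with k evenly spaced points
    centres = points[np.linspace(0, len(points) - 1, k).astype(int)]
    for _ in range(iters):
        # Assign each point to its nearest centre -- no labels required
        dists = np.linalg.norm(points[:, None] - centres[None, :], axis=2)
        labels = dists.argmin(axis=1)
        # Move each centre to the mean of the points assigned to it
        centres = np.array([points[labels == i].mean(axis=0) for i in range(k)])
    return labels, centres

# Two well-separated groups of 2-D points, given to the algorithm unlabelled
data = np.vstack([np.random.default_rng(1).normal(0, 0.5, (50, 2)),
                  np.random.default_rng(2).normal(5, 0.5, (50, 2))])
labels, centres = kmeans(data, k=2)
# Each half of the data ends up sharing one cluster label
```

The algorithm discovers the two groups purely from the structure of the data, which is the same principle, at toy scale, as a network teaching itself what a "cat" looks like from millions of unlabelled images.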

Another is IBM's Watson artificial intelligence system, which first made its mark in 2011 when it beat former record holders on the US TV quiz show Jeopardy! and won the $1m prize. One of the defeated players, Ken Jennings, even joked, "I, for one, welcome our new computer overlords".

While there's a novelty element to that example, Watson is also being used by US healthcare provider WellPoint to quickly process vast amounts of medical information. The system can sift through the equivalent of one million books, or 200 million pages of data, analyse it and provide a precise response in less than three seconds. This can help medical staff identify diagnoses and treatment options for patients, particularly in complex cases.

Cognitive computing offers us many benefits, and it's not about robots taking over the world; it's about machines and humans increasingly working together in ever smarter and more amazing ways. But let's hope someone has watched all those sci-fi films and remembers to build an off switch, just in case…

Photo credit: Ronny R

Sherif Haridy

Finance Director @ Reckitt | CFA, Cost Reduction

10y

Reminds me of HAL 9000

G. Dudnik

MSc in Bioinformatics / Data Nerd

10y

beautiful but scary at the same time

Mark A. Archer, PhD

IT Executive Manager, Strategist, Architect, Inventor, Developer & Technical Visionary

10y

We are still a long way from human-like intelligence. Pattern recognition is just one part of intelligence. We haven't gotten nearly as far with complex problem solving, contextual awareness and creativity. It's easy to get excited by the recent accomplishments, but there is way too much hype around this. See my Pulse post "Building Intelligence: Why Current Approaches to Artificial Intelligence Won't Work"

Jeff Ello

Purdue University, CISSP, GCFE, ITIL Master

10y

I hope we can do better than a human brain - we already have those.
