Artificially Intelligent or Intelligently Artificial
Introduction
We are living in the midst of a surge of interest and research into Artificial Intelligence (hereafter A.I.). It can seem like every week there is a new breakthrough in the field and a new record set in some task previously done by humans. Not too long ago, artificial intelligence seemed a distant dream reserved for especially interested researchers. Today it is all around us. We carry it in our pockets, it's in our cars and in many of the web services we use throughout the day. As this technology matures, every business must ask itself two central questions: how will this disrupt my industry, and are we using artificial intelligence or augmentative intelligence?
Throughout this research paper, we will investigate the possible interpretations of A.I. and its implications for the healthcare industry.
About Intelligence and Artificial Intelligence (A.I.)
If we ask Wikipedia: Intelligence is “the ability to perceive or infer information, and to retain it as knowledge to be applied towards adaptive behaviors within an environment or context.”
Image source: cognitivescale.com
There are two major groups of intelligence. Artificial intelligence (AI; also machine intelligence, MI) is intelligence demonstrated by machines, in contrast to natural intelligence (NI), displayed by humans and other animals and marked by complex cognitive feats and high levels of motivation and self-awareness.
For the purpose of this paper, think of A.I. technology as the type of technology you would use to do tasks that require some level of intelligence to accomplish, deployed in domains where there is a lot of uncertainty.
The ultimate goal of A.I. is to perform any intellectual task that a human being can. This form of A.I. is also referred to as "strong AI"[1], "full AI"[2], or as the ability of a machine to perform "general intelligent action"[3].
Some references emphasize a distinction between strong AI and "applied AI"[4] (also called "narrow AI"[1] or "weak AI"[5]): the use of software to study or accomplish specific problem solving or reasoning tasks. Weak AI, in contrast to strong AI, does not attempt to perform the full range of human cognitive abilities.
How A.I. differs from I.A.
Computer scientist John McCarthy coined the term “artificial intelligence” in 1955, when he laid out seven aspects machines would need in order to achieve true intelligence. In reality, even the most sophisticated technologies achieve only two or perhaps three of these; the rest is not yet technically possible.
- Simulating higher functions of the human brain
- Programming a computer to use general language
- Arranging hypothetical neurons in a manner so they can form concepts
- A way to determine and measure problem complexity
- Self-improvement
- Abstraction, the quality of dealing with ideas rather than events
- Randomness and creativity
So, what does it mean for healthcare SaaS platforms that want to act in an intelligently artificial way? It means applying practical rules based on the most applicable of the machine aspects laid out by John McCarthy.
Augment, don’t replace, work
Platforms should be able to take away the drudgery of everyday tasks, freeing people up to act on insights and add more human value, such as empathy and creativity.
Predictive and prescriptive analytics can have a permanent and significant impact. The deterministic model is prediction at its simplest: given one precise input, produce one precise outcome. Whenever that input appears, the outcome is always the same.
Much more interesting is stochastic modeling, or dealing with chance. The machine is trained not just to spot randomness, but to figure out what might happen as a result of it. That is when a platform starts helping people before they know they need help at all.
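To make the contrast concrete, here is a minimal sketch in Python. The risk rule, the age threshold, and the noise model are invented for illustration only, not taken from any real clinical system:

```python
import random

# Deterministic model: the same input always produces the same outcome.
def deterministic_risk(age: int) -> str:
    """Toy rule: flag patients over 65 for follow-up."""
    return "follow-up" if age > 65 else "routine"

# Stochastic model: the outcome is a probability, estimated by simulating
# random variation (here, hypothetical measurement noise) around the rule.
def stochastic_risk(age: int, trials: int = 10_000) -> float:
    """Estimate the chance that a patient ends up flagged for follow-up."""
    hits = 0
    for _ in range(trials):
        noisy_age = age + random.gauss(0, 5)  # assumed noise distribution
        if noisy_age > 65:
            hits += 1
    return hits / trials

print(deterministic_risk(70))  # always the same answer for the same input
print(stochastic_risk(64))     # a probability, reflecting uncertainty
```

The deterministic function can only say yes or no, while the stochastic version can warn that a patient just below the threshold still carries meaningful risk, which is the kind of early signal the text describes.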
Why I.A. in Healthcare
Disrupting healthcare by using Intelligently Artificial systems to augment Natural Intelligence is a far more compelling story than replacing Natural Intelligence with Artificial Intelligence. The combination of the three factors outlined below is the foundation for a generation of EHR platforms, most of which we have not yet seen.
Decreasing cost of computing power
Thinking, for all practical purposes, is computation, and simulating a system that is even remotely intelligent requires a great number of computations. GPUs, the chips used to generate computer graphics in video games, turned out to be eminently suited to the massively parallel computations needed to build augmented-intelligence architectures. In practical terms, this has meant that calculations that used to take several weeks now take less than a day, and that time keeps shrinking. Building intelligent applications would simply not be possible without the increase in cheap, available computing power we have been fortunate to witness over the last few decades.
Availability of data
It is no coincidence that the recent intense interest in augmented intelligence from the healthcare industry comes right after Big Data became a household word. By far the biggest common denominator is that healthcare organizations sit on truly massive amounts of data that they need to analyze.
Augmentative Intelligence, a.k.a. I.A., bears the promise of automatic analysis and management of this historical sea of data, which is exactly what the healthcare industry is looking for. There is an interesting symbiosis between intelligence research and data: just like the brain of a child, an intelligent system needs information to learn. And healthcare platforms that sit on significant volumes of information usually want to minimize the human effort of analyzing that data. This relationship is bound to fuel the development of intelligence going forward.
A nice example of this is an EMR SaaS Intelligently Artificial engine. It is a distributed cognitive system, meaning that it is spread out in the cloud, collecting information every time it is used, everywhere. This means that the more people use the engine, the better the system becomes at its job.
Better Algorithms
An Artificial Neural Network (ANN) is an attempt to model the network of nerve cells in a human brain on a computer. Loosely speaking, it is an interconnected web of artificial neurons that either fire or not depending on their input. A key part of building these networks is training them to do the correct thing when they see data. The study of these algorithms has now spawned its own subfield, known as Deep Learning, its name referring to the number of neuron layers in the networks.
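The core idea can be sketched with a single artificial neuron trained by the classic perceptron rule, the simplest ancestor of the training methods deep learning builds on. This is an illustrative toy (learning the logical AND function), not a description of any particular platform's algorithm:

```python
import random

# A single artificial neuron: it "fires" (outputs 1) when the weighted
# sum of its inputs plus a bias crosses zero, and stays silent otherwise.
def fire(weights, bias, inputs):
    activation = sum(w * x for w, x in zip(weights, inputs)) + bias
    return 1 if activation > 0 else 0

# Training nudges the weights whenever the neuron answers wrongly:
# the perceptron learning rule.
def train(data, epochs=20, lr=0.1):
    random.seed(0)  # fixed seed so the toy example is reproducible
    weights = [random.uniform(-1, 1) for _ in range(2)]
    bias = random.uniform(-1, 1)
    for _ in range(epochs):
        for inputs, target in data:
            error = target - fire(weights, bias, inputs)
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Toy dataset: the logical AND of two binary inputs.
AND = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train(AND)
print([fire(weights, bias, x) for x, _ in AND])  # -> [0, 0, 0, 1]
```

A deep network is, loosely, many layers of such neurons stacked together and trained with more sophisticated update rules, but the loop above captures the essence: show the network data, measure the error, and adjust the weights.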
Conclusion
There is a big gap between where businesses could be and where they actually are in terms of building internal competency with, and implementing, this technology. It is precisely because of this gap that we maintain the hypothesis that the disruptive force of I.A. is almost completely independent of the field's future progression: it has already arrived.