In recent months, the tech world has been buzzing with two words: artificial intelligence. At this point, it's no longer new to consumers, thanks to Google's use of AI in its latest flagship smartphones, the Google Pixel 2 and Pixel 2 XL. But, to borrow the familiar metaphor, that's just the tip of the iceberg.
And that's true, given that artificial intelligence has many more applications than improving smartphone camera photos. With this technological advancement, along with the introduction of the Internet of Things (IoT) and 5G, we can expect to see much broader use of AI, especially in the form of smart assistants.
That's right: smart assistants like Google Assistant, Apple's Siri, Amazon's Alexa, and the best friend nobody asked for, Samsung's Bixby, will soon become commonplace and a vital part of everyday life. But once again, that's just the tip of the iceberg.
Basically, for something to be deemed intelligent, at least as far as organisms go, it has to be able to learn.
According to the basic principles of psychology, learning is defined as "a long-lasting, or even permanent change in behavior brought about by previous experiences". An organism must not only be able to make sense of its environment (which sensors can do) and retain information (which storage drives can also do), but also assimilate that information so that it responds more effectively the next time it encounters the same situation.
It’s this very mechanism that keeps us from burning our hands on a hot stove repeatedly. We only have to experience it once and our brains automatically assimilate the information that the stove is not to be touched, which then explains our aversion to touching hot things.
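The hot-stove mechanism described above can be sketched in a few lines of code. This is a toy illustration, not a real machine-learning system; the `TouchLearner` class and its method names are invented for this example.

```python
# A minimal sketch of experience-driven learning, in the spirit of the
# "hot stove" example above. All names here are illustrative.

class TouchLearner:
    """Learns whether touching an object is worth repeating."""

    def __init__(self):
        # No prior experience: unknown objects start out neutral (value 0.0).
        self.value = {}

    def decide(self, obj):
        # Touch anything not yet known to be painful.
        return self.value.get(obj, 0.0) >= 0.0

    def experience(self, obj, reward):
        # Assimilate the outcome so future behavior changes --
        # the "long-lasting change brought about by experience".
        self.value[obj] = reward

agent = TouchLearner()
print(agent.decide("hot stove"))     # True: no experience yet, so it touches
agent.experience("hot stove", -1.0)  # painful outcome
print(agent.decide("hot stove"))     # False: one experience is enough
```

One negative experience permanently changes the agent's behavior, which is exactly the definition of learning quoted above.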
This concept is the main premise upon which artificial intelligence is built. We want to create something capable of learning, and therefore capable of self-improvement. This technology could be of enormous help in the healthcare environment, where there is a heavy emphasis on diagnosis, prognosis, treatment selection, and, of course, business processing.
Artificial intelligence can greatly increase a hospital’s efficiency by being able to assist doctors in their diagnosis. For example, when a patient walks in with a certain set of symptoms, the AI — assuming that it’s already been programmed to help with diagnosis — can then pull up a list of possible diseases that could be causing the patient harm.
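The kind of diagnostic assistance described above can be sketched as a simple symptom-matching routine. The knowledge base and scoring below are a toy illustration, not medical knowledge; a deployed system would be trained on clinical data rather than hand-written rules.

```python
# An illustrative sketch of symptom-based differential diagnosis.
# The disease/symptom sets below are toy data for the example only.

KNOWLEDGE_BASE = {
    "influenza":    {"fever", "cough", "fatigue", "body aches"},
    "common cold":  {"cough", "sneezing", "sore throat"},
    "strep throat": {"fever", "sore throat", "swollen glands"},
}

def differential(symptoms):
    """Rank candidate diseases by the fraction of their known symptoms present."""
    symptoms = set(symptoms)
    scored = [
        (disease, len(symptoms & known) / len(known))
        for disease, known in KNOWLEDGE_BASE.items()
    ]
    # Highest overlap first; a real system would also weight symptom
    # severity, disease prevalence, and patient history.
    return sorted(scored, key=lambda pair: pair[1], reverse=True)

for disease, score in differential({"fever", "sore throat"}):
    print(f"{disease}: {score:.2f}")
```

For a patient reporting fever and a sore throat, this toy model ranks strep throat highest, giving the doctor a prioritized list of possibilities rather than a verdict.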
Once a diagnosis is made, the rest of the process follows suit.
Now, one aspect of healthcare that nobody likes is paperwork. While there is plenty of business process management (BPM) software for healthcare, it is hard to deny that the software could be greatly improved if it were capable of adjusting to each patient's needs.
This would eliminate strenuous processes on both the patient's side and the healthcare company's side. And once again, all this is just the tip of the iceberg.
And while we've delved rather deeply into the potential applications of AI in the healthcare field, one question still remains.
“Why haven’t we adopted this system yet?”