Artificial Intelligence is becoming a major technology in digital health. Accenture calls AI “the new nervous system of healthcare” and estimates that medical institutions’ spending on the technology, currently about $6.6 billion, will grow by 40% annually. Consulting firms are also betting heavily on smart algorithms, predicting industry savings of $150 billion from AI applications by 2026. Anesthesiology, as part of medicine, stands to benefit from these solutions as well. Let’s take a look at how Artificial Intelligence and Machine Learning in custom healthcare software development are transforming this area of healthcare.
The development of the AI market in healthcare
The intense development of Big Data, high demand for healthcare services, and a shortage of human resources are driving healthcare institutions to embrace technological innovation. People demand that care be personalized and accessible.
The heads of clinics and hospitals see great promise for AI in solving these issues and are looking to invest in the technology. Statista estimates that the global AI market in healthcare will grow to $28 billion by 2025.
AI in healthcare is used for various purposes:
- Diagnosis of diseases.
The algorithm helps diagnose cancer by eliminating false-positive results. The system examines CT, MRI, and ultrasound images, highlighting abnormal areas that confirm the presence of disease.
AI also helps tackle the spread of tuberculosis in developing countries. To diagnose the disease, a doctor needs only a microscope and a smartphone with Internet access. The healthcare worker takes a picture of a stained sample of the patient’s sputum; AI algorithms scan the image and send the physician information about the number and types of tuberculosis bacteria in the sample.
Artificial Intelligence predicts Alzheimer’s disease before its onset. The system analyzes MRI images for various parameters (e.g. hippocampal characteristics, white matter density) and correlates these data with other patient parameters (e.g. severity of cognitive impairment, genetic characteristics). Based on all these signs, the machine model diagnoses the disease long before its onset and makes it possible to predict its course. In this task, the neural network outperforms current diagnostic methods.
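As a rough illustration of the idea, a model of this kind combines MRI-derived features with clinical parameters into a single risk score. The sketch below uses a toy logistic-regression formula with hand-picked weights; every feature name and weight here is illustrative, not taken from any real study:

```python
import math

# Hypothetical feature weights; in a real system these would be learned
# from labeled MRI and clinical data, not hand-picked.
WEIGHTS = {
    "hippocampal_volume": -2.0,   # smaller volume -> higher risk
    "white_matter_density": -1.5, # lower density -> higher risk
    "cognitive_score": -1.0,      # lower score -> higher risk
    "genetic_risk": 1.8,          # e.g. a genetic risk marker
}
BIAS = 0.5

def alzheimers_risk(patient: dict) -> float:
    """Logistic-regression-style risk score in [0, 1]."""
    z = BIAS + sum(WEIGHTS[k] * patient[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

# Features are assumed to be normalized to roughly [0, 1].
patient = {"hippocampal_volume": 0.4, "white_matter_density": 0.5,
           "cognitive_score": 0.3, "genetic_risk": 1.0}
print(round(alzheimers_risk(patient), 3))  # -> 0.611
```

A production system would learn such weights from thousands of labeled scans; the point of the sketch is only that imaging and clinical features end up in one combined score.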
- Drug development.
It takes an average of 10 years and $2.8 billion to discover and develop a new drug and obtain approval. For pharmaceutical firms whose pilot projects fail to meet requirements, this means huge losses. AI mitigates the problem, as it can sift suitable chemical compounds from tens of thousands of candidates faster than a human.
A smart algorithm can predict with high accuracy the structure of the target protein on which the treatment is based, the drug-protein interactions, and the likelihood of therapeutic success. The technology helps design a new drug molecule, selects subjects for the second and third phases of clinical trials, and supports other stages crucial for the production, quality assessment, and promotion of the drug.
- Hospital management.
AI improves hospital administrative processes. A clinic is a place where work is constantly in full swing and unforeseen situations often occur. How can managers determine how many wards, doctors, and pieces of equipment will be needed to provide proper care to all patients?
There are solutions on the market that address the problem. For example, the Qventus AI app can assess patients’ conditions based on their vital signs. To help a patient in time, the program schedules appointments with doctors and notifies them of critical situations.
Other algorithms analyze historical data on hospital workload and determine the periods in which the institution will need additional staff and equipment. This lets managers regulate the work of their organization efficiently and stay ready to provide quality care even during peak illness periods.
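The workload forecasting described above can be sketched very simply: average historical admissions for each calendar week across past years, then convert the forecast into a staffing estimate. The function name, data shape, and patients-per-nurse ratio below are all illustrative assumptions:

```python
import math
from collections import defaultdict

def forecast_staffing(history, patients_per_nurse=5):
    """history: list of (week_of_year, admissions) pairs from past years.
    Returns {week: nurses_needed}, based on average admissions per week."""
    by_week = defaultdict(list)
    for week, admissions in history:
        by_week[week].append(admissions)
    # Round staffing up: a fraction of a nurse is still one more nurse.
    return {week: math.ceil(sum(v) / len(v) / patients_per_nurse)
            for week, v in sorted(by_week.items())}

# Two years of (week, admissions) observations, plus one single-year week.
history = [(1, 40), (1, 50), (2, 80), (2, 90), (3, 45)]
print(forecast_staffing(history))  # -> {1: 9, 2: 17, 3: 9}
```

Real systems layer seasonality, trends, and external signals (e.g. flu surveillance) on top of this, but the core idea of learning a staffing curve from historical load is the same.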
- Combating COVID-19.
AI technology has proved itself during the pandemic. AI-powered apps help doctors diagnose the disease, identify strains of the virus, and provide people with information about the spread of COVID-19. Algorithms determine the degree of lung damage from CT scans. The NCBI electronic library published data showing that AI detected infected people even in cases where the human eye discerned no anomalies: doctors judged the patients healthy, while the system correctly identified 17 of 25 patients who later turned out to have the virus.
Use cases for AI in anesthesiology
During surgery, an anesthesiologist must monitor up to 100 vital signs of a patient. It’s quite challenging for the human brain to cope with such an amount of information, so Artificial Intelligence can become a doctor’s reliable assistant.
AI has a large number of applications in anesthesia:
- tracking the depth of anesthesia (DoA);
- anesthetic delivery control;
- risk assessment before and after surgery;
- pain management;
- operating room logistics.
Tracking the depth of anesthesia

The depth of anesthesia is the degree of depression of the central nervous system by the anesthetic. A doctor selects the medicine in accordance with the patient’s health and chooses the right dosage.
AI helps doctors monitor the depth of anesthesia. DoA is usually measured using the bispectral index (BIS monitoring) or an extended electroencephalogram (EEG), and this data can be analyzed by AI. The smart algorithm estimates sedation depth with 92% accuracy, which is higher than other monitoring methods, e.g. the response entropy index.
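For a sense of how BIS values relate to anesthesia depth, here is a minimal rule-based sketch. The 40-60 target band for general anesthesia is the commonly cited clinical range; the other thresholds and labels are simplified for illustration, and a real AI system would learn far richer patterns from the raw EEG rather than apply fixed cutoffs:

```python
def classify_doa(bis: float) -> str:
    """Map a bispectral index value (0-100) to a coarse
    depth-of-anesthesia band. Illustrative thresholds only."""
    if bis >= 80:
        return "awake / light sedation"
    if bis > 60:
        return "light anesthesia"
    if bis >= 40:
        return "adequate general anesthesia"  # commonly cited 40-60 target
    return "deep anesthesia"

for value in (95, 70, 50, 30):
    print(value, "->", classify_doa(value))
```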
Anesthetic delivery control
The anesthesiologist has three major tasks:
- Ensure patient safety during and after surgery;
- Reduce the postoperative recovery period;
- Reduce the cost of using medicines.
This can be achieved by controlling the delivery of the anesthetic to the human body. AI calculates the exact dosage of the drug for a specific operation based on preliminary laboratory studies. The technology predicts how long the drug will be absorbed into the blood, distributed throughout the body, and excreted from it. This data helps to plan the postoperative care of the patient.
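The absorption-distribution-elimination prediction mentioned above is classically described by a one-compartment pharmacokinetic model with first-order absorption, a standard textbook formula. The sketch below implements that formula with made-up parameters; it is an illustration of the kind of model involved, not a dosing tool:

```python
import math

def concentration(t, dose_mg, ka, ke, volume_l):
    """Plasma concentration (mg/L) at time t (hours) under a
    one-compartment model with first-order absorption rate ka and
    elimination rate ke. Simplified textbook model, not clinical advice."""
    return (dose_mg * ka / (volume_l * (ka - ke))) * (
        math.exp(-ke * t) - math.exp(-ka * t))

# Illustrative parameters, not for any real drug:
for hour in (0, 1, 2, 4, 8):
    c = concentration(hour, dose_mg=100, ka=1.2, ke=0.3, volume_l=40)
    print(f"t={hour}h  C={c:.2f} mg/L")
```

The concentration starts at zero, peaks as absorption outpaces elimination, then decays, which is exactly the curve a planner would use to time postoperative care.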
Risk assessment before and after surgery

Even before surgery, AI in anesthesiology detects patients who are likely to have a difficult recovery. It identifies those who may develop a reaction to the administered drugs, e.g. pain, vomiting, depression, or respiratory complications. The algorithm suggests a plan for preventing or managing these problems during and after the surgery, so the surgical team can better prepare for such events and spare the patient the side effects of anesthesia.
Pain management

Artificial Intelligence is able to determine which patients are likely to have severe postoperative pain and who will need long-term opioid support or other types of pain relief. It must be remembered that long-term use of opioid painkillers can cause drug dependence. To prevent this, anesthesiologists draw up an anesthesia plan based on AI data using non-opioid methods of pain relief, e.g. nerve blocks, epidural anesthesia, and other alternatives.
To identify such patients today, anesthesiologists have to interview them before surgery and have them fill out lengthy questionnaires. Algorithms instead analyze electronic medical records and identify people who are likely to experience long-term pain. Studies show these are often young people, overweight patients, women, and people who have previously used opioids.
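A minimal rule-based version of such an EHR screen, using only the risk factors listed above, might look like the following. The thresholds and field names are illustrative assumptions; a learned model would weigh many more variables:

```python
def chronic_pain_risk_flags(record: dict) -> list:
    """Flag the long-term postoperative pain risk factors reported in
    the studies cited above. Thresholds here are illustrative only."""
    flags = []
    if record["age"] < 40:
        flags.append("younger patient")
    if record["bmi"] >= 30:
        flags.append("overweight")
    if record["sex"] == "female":
        flags.append("female")
    if record["prior_opioid_use"]:
        flags.append("prior opioid use")
    return flags

patient = {"age": 32, "bmi": 31.5, "sex": "female",
           "prior_opioid_use": True}
print(chronic_pain_risk_flags(patient))
```

Running a screen like this over a whole schedule of upcoming surgeries is what replaces the long pre-operative questionnaires.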
Operating room logistics
According to Cardinal Health’s survey of surgeons and anesthesiologists, the OR needs an improved supply chain management system. Good control helps to avoid situations when there are not enough anesthetics or equipment for the planned operation.
As a result, 27% of respondents had witnessed the use of an expired drug on a patient, while 23% had encountered patients whose surgery was postponed due to a lack of consumables.
AI will improve supply chain management by warning ahead of time when drug stocks are running low. Automated RFID systems make it possible to monitor medical inventory and quickly locate free equipment when an emergency operation is needed.
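The kind of early warning such a system raises can be sketched as a simple inventory check. The item names, thresholds, and data layout below are assumptions for illustration; an RFID-backed system would feed real-time counts into checks like these:

```python
from datetime import date, timedelta

def inventory_alerts(stock, today, min_units=10, expiry_window_days=30):
    """stock: {item: (units_on_hand, expiry_date)}. Return alert strings
    for items that are running low or expiring soon."""
    alerts = []
    horizon = today + timedelta(days=expiry_window_days)
    for item, (units, expiry) in sorted(stock.items()):
        if units < min_units:
            alerts.append(f"{item}: low stock ({units} units)")
        if expiry <= horizon:
            alerts.append(f"{item}: expires {expiry.isoformat()}")
    return alerts

stock = {
    "propofol": (4, date(2030, 1, 1)),
    "sevoflurane": (25, date(2024, 1, 10)),
}
print(inventory_alerts(stock, today=date(2024, 1, 1)))
```

Catching the two failure modes from the survey (expired drugs used, surgeries postponed for missing consumables) comes down to exactly these two checks, run continuously instead of during manual stocktakes.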
Obstacles to the implementation of AI in anesthesiology and the future of technology
Perhaps by 2030 anesthesiologists will have assistant bots able to adjust set parameters, e.g. anesthetic dose, blood pressure, heart rate, and respiratory rate. The doctor will give commands and monitor the progress of the operation.
So far, studies note an imbalance between the hype around AI and real technological progress. The possibilities of AI in anesthesiology and other areas are still limited, and years will pass before doctors even partially trust algorithms with tasks as serious as anesthesia control.
An important reason for this mistrust is the opacity of the algorithms. Doctors see only the result of the AI “analysis”: forecasts and recommendations. The machine provides no explanation or supporting facts for why they should act in a certain way. As the technology becomes more transparent, its acceptance in the medical ranks will accelerate.
HIMSS has identified four major barriers to artificial intelligence:
- AI is in its infancy;
- forecasts must be supported by numbers;
- organizations’ infrastructure is not ready for AI;
- not all businesses are able to efficiently collect and manage data.
As soon as medical institutions cope with these barriers and see the benefits of AI in practice, a new era in anesthesiology will begin.
Most AI-based custom medical software in anesthesiology is still being developed and tested in clinical settings. It’s difficult to say how useful these tools will be for clinicians and whether they will solve the open issues in anesthesiology. The technology is evolving and learning to work with large amounts of data. Researchers believe that algorithms will take on part of doctors’ workload, augmenting their expertise. Thus, doctors will have more time to communicate with patients and more opportunities to do their job well.