
3 Questions To Dr. Anne-Laure Rousseau


3 Questions To… Dr. Anne-Laure Rousseau, vascular physician in Paris, President and Co-founder of NHANCE, an NGO whose objective is to improve patient access to care by configuring tools and by training and evaluating artificial intelligence algorithms on abdominal ultrasound, in order to develop an ultrasound assistance tool.




1.

You created NHANCE, a non-profit community that builds free AI tools for ultrasound interpretation. In your view, how can AI be used to enhance the quality of care for everyone?


Reaching the best diagnosis for a patient is no trivial matter for a doctor who has to guess, with his hands and a stethoscope (a tool that dates back 200 years), what is going on inside another person's body. In some situations this feat must be accomplished in a few minutes; sometimes there are only a few seconds to save a life, and you cannot be wrong. That is why, before the end of my studies, I became interested in ultrasound: it brings a major asset, the ability to see, non-invasively, what is happening inside another person's body.


"In 2019, we achieved an accuracy of nearly 80% in our algorithms for detecting suspicious kidney lesions."

In parallel with the spread of ultrasound into every medical field, artificial intelligence has developed considerably in recent years in the area of visual recognition. These algorithms are extremely efficient and sometimes exceed human performance at image interpretation. Artificial intelligence can do nothing by itself, however; the algorithms must be trained by qualified people to produce results.


To take a concrete example, today in almost 80% of cases kidney cancer is discovered by chance, during an examination carried out for another reason. If the cancer is discovered at an early, localized stage, it is possible to perform a partial nephrectomy, i.e. surgery to remove only the tumor, and this will cure the patient. If the cancer is discovered incidentally but already extends beyond the kidney, or if clinical symptoms such as lumbar pain or blood in the urine lead the patient to consult, the cancer is most often already at an advanced or even metastatic stage, which makes it harder to cure and worsens the prognosis. Hence the need to detect this cancer as early as possible, an issue to which algorithms could contribute greatly.


We therefore set up a research project to develop artificial intelligence algorithms that identify a suspicious lesion on ultrasound images of the kidney, in order to contribute to the early detection of patients at risk of developing cancer. We also wanted to detect dilatation of the pyelocaliceal cavities, which is very useful in an emergency context: it means identifying kidneys whose excretory pathways are obstructed or compressed, so that the patient must be treated within a short time. In 2019, we achieved an accuracy of nearly 80% in our algorithms for detecting suspicious kidney lesions. Our accuracy was just over 60% for the detection of dilatation of the pyelocaliceal cavities, and we continue to work to improve these results.
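For readers curious about what such work involves in practice, here is a minimal, purely illustrative sketch (not NHANCE's actual code; the network, image sizes and synthetic data are invented for illustration) of how a small convolutional classifier can be trained to flag ultrasound frames as "suspicious lesion" versus "normal", and how its accuracy, the kind of figure quoted above, is measured on held-out images:

```python
# Illustrative sketch only (not NHANCE's pipeline): train a small CNN to label
# ultrasound frames as "suspicious lesion" vs. "normal" and report accuracy.
import torch
import torch.nn as nn

# Placeholder data: in practice these would be expert-labeled ultrasound images.
images = torch.randn(512, 1, 64, 64)          # 512 grayscale 64x64 frames
labels = torch.randint(0, 2, (512,))          # 1 = suspicious lesion, 0 = normal
train_x, train_y = images[:400], labels[:400]
test_x, test_y = images[400:], labels[400:]

model = nn.Sequential(                        # tiny convolutional classifier
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Conv2d(8, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
    nn.Flatten(), nn.Linear(16 * 16 * 16, 2),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):                        # a few passes over the training set
    for i in range(0, len(train_x), 32):
        x, y = train_x[i:i + 32], train_y[i:i + 32]
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()

with torch.no_grad():                         # accuracy on held-out images
    preds = model(test_x).argmax(dim=1)
    accuracy = (preds == test_y).float().mean().item()
print(f"test accuracy: {accuracy:.2%}")
```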


2.

Two thirds of the world's population does not have access to medical imaging. In that regard, mobile ultrasound could be a sea change. Indeed, you argue that ultrasound has the potential to become the new stethoscope for doctors all over the world. Could you explain why?


When a patient comes for consultation, the physical examination alone is most often insufficient to reach a diagnosis. Ultrasound is the only imaging technique which:


  • is non-invasive,

  • is without side effects (in particular no radiation-induced cancer),

  • allows real-time diagnosis,

  • is inexpensive.


It is for these reasons that this technique is a method of choice:


  • in an emergency,

  • in consultation, for patient follow-up or a quick first-look examination,

  • in public health for population screening.


Ultrasound is not a recent technology; however, it is undergoing a revolution linked to the emergence of the smartphone. Like computers, ultrasound machines are steadily shrinking in size and price. In the next few years we will move from the bulky hospital ultrasound machines that cost several hundred thousand euros to a simple probe, connected to a smartphone, costing a few hundred euros. We anticipate a massive deployment of these technologies in the near future; they have the potential to become tomorrow's stethoscope in almost every condition of medical practice.


We estimate that at least one billion additional ultrasound examinations could be performed each year to improve the quality of care. The real challenge in making these billion additional exams possible is increasing the number of qualified operators: ultrasound is said to be "operator-dependent", meaning that the quality of the examination depends on the training and attention of the person performing it.


Currently, learning ultrasound is complex and does not scale. The WHO has published and put online the "Manual of Diagnostic Ultrasound", a 650-page book in two volumes, so that all doctors can take up ultrasound and improve "health for all" on the planet. A doctor in France who wishes to train, at the end of ten years of studies, can enroll in the Inter-University Diploma in General Ultrasound, which lasts two years and includes more than 500 hours of practice in different hospitals, about 90 hours of theoretical courses, and 10 exams. We are well aware of this, since two of the association's founding members went through this course and met there.


"Deep learning techniques have only recently been applied to ultrasound with small databases. These experimental results show that these artificial intelligence techniques are more efficient than all previous techniques in the field of medical image segmentation."

In parallel with this expansion of ultrasound, another revolution is taking place: the much more publicized revolution of artificial intelligence, whose performance has improved considerably in recent years and is now highly relevant to image recognition. In particular, deep learning has become essential for every visual-detection problem, and visual detection is the basis of ultrasound analysis.


Two publications have made a strong impression in the field of artificial intelligence applied to medicine in recent years. The first, published by a Stanford team in Nature in 2017, reports the results of an artificial intelligence trained to recognize skin tumors in photographs; the reliability of the algorithms was as good as that of medical specialists. The second, published in JAMA at the end of 2016, reports comparable performance for the screening of diabetic retinopathy on fundus images. Deep learning techniques have only recently been applied to ultrasound, with small databases (hundreds of images, more rarely thousands). These experimental results show that these artificial intelligence techniques are more efficient than all previous techniques in the field of medical image segmentation. Before going any further, our team wanted to establish a proof of concept of the value of these techniques in the everyday practice of medical ultrasound.
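As a side note on how such segmentation results are typically scored (illustrative only, not taken from the publications cited above): a standard measure is the Dice overlap between the algorithm's predicted mask and an expert's hand-drawn mask, which the short sketch below computes.

```python
# Illustrative sketch: Dice overlap, a standard score for medical image
# segmentation (1.0 = perfect agreement with the expert's mask, 0.0 = none).
import numpy as np

def dice_score(pred_mask: np.ndarray, true_mask: np.ndarray) -> float:
    """Dice = 2|A ∩ B| / (|A| + |B|) for two binary masks."""
    pred = pred_mask.astype(bool)
    true = true_mask.astype(bool)
    intersection = np.logical_and(pred, true).sum()
    total = pred.sum() + true.sum()
    return 1.0 if total == 0 else 2.0 * intersection / total

# Toy example: a predicted organ mask vs. an expert-drawn ground truth.
pred = np.zeros((64, 64), dtype=np.uint8)
true = np.zeros((64, 64), dtype=np.uint8)
pred[20:40, 20:40] = 1
true[25:45, 25:45] = 1
print(f"Dice overlap: {dice_score(pred, true):.2f}")
```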


"It is our responsibility to make progress in this research".

We chose to name our project NHANCE after "enhance", which means "to improve" and is used in particular when talking about improving image quality. We want to improve patient care. The goal of our early projects was to help recognize an organ in an ultrasound image. Our first encouraging results led us to conduct projects in partnership with the Assistance Publique Hôpitaux de Paris, the Health Data Hub, INRIA and the Interdisciplinary Institutes of Artificial Intelligence.


3.

In your view, AI could become a tool for radiologists to enhance the relevance of their interpretations and help them spend more time with patients. In the near future, how will AI transform and impact health professions?


Improving the innovation cycle in France is urgent. By comparison, for every study on AI and health conducted here there are dozens in Asia, Canada and the USA, and it would be better not to end up as mere users of products developed elsewhere when we can develop them here.


"AI will make it possible to do well certain time-consuming tasks, with little added value".

We are fortunate today to have revolutionary tools at our fingertips, and a path is emerging to improve health: reduce the stress linked to the exponential growth of medical knowledge, give time back to caregivers, and allow them to be more reliable. If we want this to be done ethically, so that the greatest number of patients benefit in keeping with the founding principles of the French healthcare system, it is our responsibility to make progress in this research. The system is running a bit out of breath.


In addition to helping with diagnosis, AI will make it possible to do well certain time-consuming, low-added-value tasks that are nowadays assigned to doctors. Physicians spend a staggering amount of time entering data into a computer system (clinical examination, vital signs, patient history, laboratory tests, genetics, imaging), and the volume of that data is growing exponentially. The physician then spends much of each hour classifying and cross-referencing the data, and only finally concentrates on interpreting it, the step that has real value in medical reasoning. The medical profession suffers, with an extremely high rate of depression and burnout: in the USA, almost one doctor in two is burned out. Patients also suffer from these cold, mechanical consultations in which the modern doctor's eyes are glued to the screen three times more than to the patient.


The assistance of artificial intelligence tools can free doctors from certain tasks that end up dehumanizing medicine. In doing so, it restores value to the founding aspects of medicine: empathy, the doctor-patient relationship, a holistic view of the patient. Artificial intelligence must be used to give medicine a more human face.



