As artificial intelligence (AI) makes its way into cancer care – and into discussions between doctors and patients – oncologists grapple with the ethics of its use in medical decision-making. In a recent survey conducted by researchers at the Dana-Farber Cancer Institute, more than 200 oncologists in the United States largely agreed on how AI can be responsibly integrated into aspects of patient care, but they also expressed concern about how to protect patients from hidden biases in AI.
The survey, described in an article published March 28 in JAMA Network Open, found that 85% of respondents said oncologists should be able to explain how AI models work, but only 23% thought patients needed the same level of understanding when considering treatment. Just over 81% of respondents said patients should provide consent before AI tools are used to make treatment decisions.
When the survey asked oncologists what they would do if an AI system selected a different treatment regimen than the one they planned to recommend, the most common response, offered by 37% of respondents, was to present both options and let the patient decide.
When asked who should bear responsibility for medical or legal problems arising from the use of AI, 91% of respondents cited AI developers. This figure was much higher than the 47% who said the responsibility should be shared with doctors, or the 43% who said it should be shared with hospitals.
And while 76% of respondents said that oncologists should protect patients from biased AI tools – which reflect inequities in representation in medical databases – only 28% were confident they could identify AI models that contain such biases.
“The results provide a first look at how oncologists are thinking about the ethical implications of AI in cancer care,” says Andrew Hantel, MD, a faculty member in the leukemia and population sciences divisions of the Dana-Farber Cancer Institute, who led the study with Gregory Abel, MD, MPH, chief medical officer at Dana-Farber. “AI has the potential to produce major advances in cancer research and treatment, but stakeholders – doctors and others who will use this technology – have not had much information about what its adoption will mean for their practice.
“It is essential that we assess now, in the early stages of AI’s application to clinical care, its impact on that care and what we need to do to ensure it is deployed responsibly. Oncologists need to be part of this conversation and begin to build a bridge between the development of AI and the ethical expectations and obligations of its end users.”
Hantel notes that AI is currently used in cancer care primarily as a diagnostic tool – to detect tumor cells on pathology slides and identify tumors on X-rays and other radiological images. However, new AI models are being developed that can assess a patient’s prognosis and may soon offer treatment recommendations. This capability has raised concerns about who or what is legally responsible if an AI-recommended treatment results in harm to a patient.
“AI is not a licensed medical professional, but it may one day make treatment decisions for patients,” notes Hantel. “Will AI become its own practitioner? Will it be licensed, and who are the humans who could be held responsible for its recommendations? These are the kinds of medicolegal questions that need to be resolved before the technology is implemented.”
“Our survey found that while almost all oncologists felt that AI developers should bear some responsibility for AI-generated treatment decisions, only half felt that this responsibility also fell on oncologists or hospitals,” added Hantel. “Our study provides insight into where oncologists currently stand on these and other ethical issues related to AI and will, we hope, serve as a springboard for further examination of these issues in the future.”
Financial support for the study was provided by the National Cancer Institute of the National Institutes of Health (grants K08 CA273043 and P30 CA006516-57S2); the Dana-Farber McGraw/Patterson Research Fund for Population Sciences; and a Mark Foundation Emerging Leader Award.