Reply. Developing decision support in both primary care and hospital care is an important issue. The recently published debate article in Läkartidningen [1] is unfortunately unclear, because it does not distinguish between AI as a way of developing algorithms/decision rules for decision support and decision support based on more general AI.
The article also lacks a discussion of the difference (under current circumstances) between decision support based on evidence-based algorithms developed by consensus within healthcare and decision support where AI has been used to develop algorithms that can then be reviewed. This is, of course, a crucial question for how we continue to work with decision support. Both methods will likely be needed, but the pros and cons of the different solutions, and what artificial intelligence adds in different situations, should be clarified.
More important, however, is the need not only to test the internal validity of decision support, but also to verify that it is relevant and meets needs in clinical use – external validity. This is especially important when AI is used to develop and test algorithms at certain units and they are then put into use at other, similar clinics. Otherwise, hopes can turn into risks to patient safety. For example, the AI-based decision support for sepsis in acute care in the Epic scoring system was temporarily withdrawn after it was shown to have significant shortcomings when used more widely in similar units [2].
The article makes no reference to the very active discussion taking place among AI and computer science researchers about the difficulties in validating AI-based decision support [3, 4]. Two of the authors appear, by affiliation, to represent the company whose decision support was highlighted in the debate article; for the other two, however, a conflict-of-interest declaration is lacking. AI brings new opportunities, but the debate article does not give me sufficient basis to evaluate what is said about the company's medical technology product or about AI in these contexts.
I look forward to more articles on decision support, with and without elements of AI, that highlight not only the expected benefits but also the benefits and risks identified in relevant clinical trials and risk analyses when implemented in healthcare. There is also a need for a discussion about how healthcare should act in a future with many forms of decision support from different providers in the same work environment, each with different reasoning behind its signals and recommendations [5].
Potential ties or conflicts of interest: none declared.