FDA Makes New Rules for Explaining AI in Medical Devices

In 2021, the FDA, Health Canada, and the UK's Medicines and Healthcare products Regulatory Agency (MHRA) jointly issued 10 guiding principles for good machine learning practice. One of those principles was that users should receive clear, essential information about these devices, but the principles did not spell out how to achieve that.

Now, the same agencies have published a new guidance document focused on transparency, which they define as the degree to which appropriate information about a machine learning-enabled medical device is clearly communicated to the relevant audiences.

Sometimes these devices produce a result, such as a diagnosis, without explaining how they arrived at it. A device has "explainability" if it can convey the reasoning behind its output.

According to the FDA, it is useful to provide users with the logic of the algorithm “when this information is available and easily understood.” This helps users to “critically assess the device and its output when making decisions about patient care.”

The new document says that effective transparency means thinking about what information users need and the best way to share it. The FDA and its partners want device makers to consider the who, why, what, where, when, and how of transparency.

The "what" refers to the type of information companies should share. Best practices include giving a clear, accurate description of the product and of how it is used in healthcare.

Other parts of the document address where, when, and how to provide information. For example, information can be delivered through the device's user interface, and the agencies recommend making that interface easy to understand. They also suggest providing "timely notifications" when a device is updated or new information becomes available.

Source: MedTech Dive