Advice to the Profession on the Responsible Use of Artificial Intelligence in the Practice of Medicine

July 08, 2024


Advice to the Profession

 

CPSM provides advice to the profession to support registrants in implementing CPSM’s Standards of Practice, Practice Directions, and the Code of Ethics and Professionalism. This advice document does not define a Standard of Practice, nor should it be interpreted as legal advice.

 

In general, advice documents are dynamic and may be edited or updated for clarity at any time. Please refer to this article regularly to ensure you are aware of the most recent advice. Major changes will be communicated to registrants through the CPSM newsletter; however, minor edits may only be noted within the documents.

 

Preamble:

It is important for registrants to educate themselves about the responsible and ethical use of artificial intelligence (AI) in practice. This document primarily addresses generative artificial intelligence (GenAI), though most principles have broad application to other forms of AI. The advice provided is centred on the importance of education, accountability, transparency, informed consent, confidentiality, and equity in healthcare. Systems issues are also addressed.


  1. Context for this document
  2. Potential applications for GenAI in medical practice 
  3. Ensuring compliance with applicable professional requirements 
  4. Continuing professional development is necessary 
  5. Responsible use of GenAI tools and accountability 
  6. Disclosure and informed consent
  7. Complete and accurate documentation
  8. Bias, equity, and respect for persons
  9. Confidentiality and data security
  10. Data sovereignty
  11. Health system harm
  12. Patients using GenAI
  13. Useful resources

 

1. Context for this document

Recent advances in AI1, such as the public release of new GenAI2 technologies like ChatGPT (OpenAI) and Copilot (Microsoft), have stimulated significant discussion around the impact AI will have on the delivery of healthcare. GenAI is in focus given its remarkable ability to create new content, such as diagnoses, treatment plans, and encounter notes (e.g., AI scribes)3.

While CPSM acknowledges the enormous potential for GenAI to improve accessibility, enhance quality, and reduce administrative burden, we also recognize the significant ethical, legal, and professional challenges that accompany its use, including those relating to over-reliance. This document is intended to provide advice to assist registrants in addressing these challenges.

Overall, the use of GenAI in healthcare should minimize potential data-related harm and promote the equitable delivery of safe quality care. As explained below, appropriate precautions anchored in the Code of Ethics and Professionalism are key to the responsible use of AI.


2. Potential applications for GenAI in medical practice 

AI tools used to support diagnosis, monitoring, treatment planning, and documentation have been around for many years. These include dictation and speech recognition software, electronic medical record (“EMR”) macros or templates, and automated pathways, protocols, and clinical scores. Software exists that can analyze diagnostic information or identify pharmacological interactions and contraindications. These more traditional tools often rely on machine learning (ML)4 technology and well-circumscribed datasets.

Large language model (LLM)5-based GenAI is a major evolution in AI technology that builds upon advances in complex algorithms, advanced data analytics, and ML. Significantly, GenAI tools can create new content based on existing data and training. Given this capability, independent clinical judgment and professional accountability are crucial if GenAI is adopted in practice. In this context, there are concerns surrounding the transparency of underlying models and source data in LLM-based GenAI that must be understood and managed.

GenAI tools that support ambient capture of patient consultations and clinician dictations are already being incorporated into medical practice to assist with documentation. These tools are commonly referred to as AI Scribes and, to the best of our understanding, usually do not fall under the definition of a medical device requiring Health Canada approval.6 That flows from their characterization as tools to assist with documentation, rather than as clinical decision support tools.
 
Other potential applications for GenAI in the practice of medicine include tools for generating diagnoses, differential diagnoses, prognoses, prescriptions, treatment plans, educational materials, and medical reports.7 These clinical decision support applications would require Health Canada approval. While there are many examples of approved medical devices that rely on AI and ML,8 it is understood that no medical devices using LLM-based GenAI have been approved as of the last update to this document.9
 
The use of GenAI in healthcare is a rapidly evolving area. As a result, there is currently limited research-based evidence to guide an approach to the use of GenAI tools by registrants, though many studies are underway. Laws and regulations specific to this field are contemplated or in development at various levels of government.10 It is anticipated that the use of GenAI in the practice of medicine will become commonplace in the coming years.
 
 

3. Ensuring compliance with applicable professional requirements 

3.1.      Registrants are reminded that requirements and principles from existing legislation, regulations and professional expectations apply when using any type of AI in their practice. This includes CPSM’s Code of Ethics and Professionalism, Standards of Practice of Medicine, and Practice Directions.   

 

3.2.      If the AI tool fits the definition of a medical device, then registrants should ensure it is approved by Health Canada. As of the date of this document, no LLM-based GenAI medical devices have been approved. CPSM understands AI Scribes usually do not fall under the definition of a medical device.

 


 

4. Continuing professional development is necessary

4.1.      Registrants have a professional and ethical duty to maintain a requisite level of skill, knowledge, and judgement to provide competent and safe care in their professional practice. Continuing professional development is key to meeting this obligation.

 

4.2.      As AI tools, which may include GenAI tools, are increasingly incorporated into the healthcare system, it is important that registrants follow significant developments and strive to understand how the technology works, along with its limitations, benefits, risks, and privacy implications.

 


 

5. Responsible use of GenAI tools and accountability 

5.1.      CPSM does not regulate medical devices, tools, or technologies; our role is to regulate registrants who use these tools in their professional practice. The GenAI tools registrants use must be physician-guided to create positive and transparent interactions that instill trust. Consistent with the prevailing standards for any technology used in care, registrants are ultimately responsible for their use of GenAI tools and may be held accountable for any harms that flow from that use.

5.2.      Care provided by a registrant should always reflect their own clinical reasoning and professional judgment. The Canadian Medical Protective Association (CMPA) makes it clear that AI technologies are intended to assist and complement clinical care. They are not a replacement for clinical reasoning and professional judgment.11 CMPA also has helpful guidance on the use of AI scribes.12 Registrants should take steps to avoid over-reliance on GenAI tools to such a degree that it jeopardizes independent professional judgment and vigilance.

 

5.3.      The extent to which a registrant may be held professionally accountable for their use of a GenAI tool will depend on the relationship between the GenAI being used and the risk that it may create patient harm or otherwise impact the professional obligations of the registrant. As GenAI technologies perform functions that more closely model the practice of medicine, the risk to patients of their application generally increases. The appropriate level of accountability will increase accordingly.

 

5.4.      If a registrant chooses to use a GenAI tool in their professional practice, then it is important they understand the tool’s intended purpose, limitations, risks, and benefits to ensure safe and competent use. Registrants should:

5.4.1.        critically assess whether the tool suits its intended purpose in the context of the registrant’s practice setting and professional practice, 

5.4.2.        ensure that the tool is:

5.4.2.1.        up-to-date, valid, and reliable,

5.4.2.2.        transparent respecting the data it is trained on,

5.4.2.3.        explainable to patients, including respecting limitations, and

5.4.2.4.        capable of producing good and interpretable output, and

5.4.3.        if not using a closed/locked GenAI system, then acknowledge the potential risks of the open-source nature of GenAI tools and the issues they could create.

 

5.5.      An adequate understanding of a GenAI tool’s design, the training data used in development, and the nature of the tool’s outputs is necessary to assess validity and reliability, and to identify and mitigate potential areas of bias. Depending upon the technology, underlying data sources and training may not be transparent, making it difficult to understand the validity and reliability of the technology. Technology that is not transparent should be avoided or, at a minimum, approached with heightened vigilance and caution.

 

5.6.      GenAI outputs may include information that is inaccurate, incomplete, outdated, harmful, biased, or otherwise inappropriate for the medical problem. Registrants must be aware of and responsive to this issue. Such content carries the risk of undermining trust and accountability or violating ethical and professional standards.

 

5.7.      The duty of competence requires more than the detection and elimination of false GenAI results. Competence requires the continuous application of clinical reasoning and analysis regarding all potential options and impacts, including those that are included in or omitted from GenAI outputs. Risks associated with the use of GenAI include:

5.7.1.        reducing human agency and autonomy by replacing or influencing human decision-making or behaviour; for instance, through complacency or frame-of-reference thinking,

5.7.2.        potentially compromising privacy and security by exposing sensitive and confidential data to unauthorized parties or malicious attacks, and

5.7.3.        skills degradation over time.

 

5.8.      If a GenAI tool produces clinical decision support or advice related to a specific patient’s care, the registrant accepts the responsibility for the care delivered. Whether GenAI clinical decision support is followed or not, registrants should document their rationale for following or deviating from the output of an AI tool designed to give specific advice or guidance.13

 

5.9.      Registrants must be aware of conflict-of-interest concerns and guard against them when using AI. The Code of Ethics and Professionalism and CPSM’s Standard of Practice for Conflicts of Interest apply in this respect. This is particularly important if AI is used to recommend treatment options, for example specific medications. 

 

5.10.   Practice settings should establish applicable policies and procedures for the responsible use of GenAI and update privacy policies accordingly. This should include policies and processes to identify, report, and address concerns about the use of GenAI tools, and the potential for raising issues with vendors or operating organizations.

 


 

6. Disclosure and informed consent 

6.1.      With a novel technology such as GenAI, CPSM recognizes that it may be challenging to communicate the risks and benefits accurately and comprehensively. As a starting point, registrants are reminded that provisions of the Code of Ethics and Professionalism and CPSM’s Standard of Practice for Good Medical Care apply respecting disclosure and informed consent regarding the use of GenAI tools in providing care.  

 

6.2.      Informed consent is not a list of AI-generated risks and benefits, but rather a meaningful dialogue and shared decision-making process between the physician and patient. One of the primary goals of the informed consent process as a component of good care is to ensure patient autonomy in clinical decision-making. This is accomplished both by informing patients about the care they are receiving, including assessment and management decisions, and by safeguarding patient privacy. Use of GenAI has implications for these issues and must be disclosed. GenAI may be used to assist in this process, but the ultimate responsibility rests with the registrant.

 

6.3.      For informed consent to be valid, a patient must be adequately informed about their diagnosis and treatment options, the risks and benefits involved, and reasonable alternatives. If a GenAI tool is used in clinical decision support, registrants should:

6.3.1.        disclose how the GenAI tool was used,

6.3.2.        discuss the capabilities and limitations of the tool,

6.3.3.        discuss safeguards that have been put in place to manage bias and ensure validity and reliability, and

6.3.4.        be able to independently explain the components of the diagnosis and treatment options to fulfill their professional responsibilities relating to the informed consent process.

 

6.4.      A lack of transparency regarding the role that GenAI has played in the delivery of care, or an inability on the physician’s part to explain that role to the patient, can undermine trust and may serve to highlight the registrant’s lack of understanding of how the GenAI tool works.

 

6.5.      Informed consent is also important for privacy reasons. Because data received during a patient encounter may be entered into GenAI tools, registrants must receive a patient’s informed consent in advance of use.14 This is necessary for tools such as AI Scribes that record patient encounters. The discussion should include the reasons for making the recording, how patient data may be accessed, used, or shared, as well as potential risks involving data integrity and privacy. Patients also need to be informed about their right to refuse, withdraw, or modify consent, and their access and copying rights under The Personal Health Information Act (PHIA) if the recording is maintained as part of the patient record.15

 


7. Complete and accurate documentation

7.1.      Currently, the most common use for GenAI is as a clinical scribe. This software interacts with the patient’s EMR to automate documentation of care. Without proper oversight, this may lead to incomplete or inaccurate documentation and subsequent patient harm.

 

7.2.      Registrants must be mindful that the requirements of CPSM’s Standard of Practice for Documentation in Patient Records apply to documentation created with the support of GenAI. The Standard requires registrants to ensure that the patient record accurately and completely reflects their involvement in care. As such, registrants must not rely on content created by a GenAI system as the sole or final source of information. Rather, they must verify and validate any GenAI-generated content to ensure it is accurate and complete before it is entered into the patient record. Registrants should not attempt to hide the use of GenAI in professional practice.

 

7.3.      Recordings of an encounter as captured by a GenAI tool (e.g., audio, video, and/or transcript) are not explicitly addressed in CPSM’s Standard of Practice for Documentation in Patient Records or the Standard of Practice for Maintenance of Patient Records in All Settings. This will need to be considered when those Standards are next reviewed. Meanwhile:

7.3.1.        The key is satisfying the requirements of the Standard of Practice for Documentation in Patient Records.

7.3.2.        Registrants should assume they are required to retain the complete recording, or transcript if one is generated, as part of the patient record. Additionally, this would be prudent for medico-legal reasons, including if the completeness or accuracy of the encounter notes comes into question.16

7.4.      Registrants are expected to record the context in which documentation is generated. This should include the author’s identity, for example the registrant, learner, scribe, or allied healthcare provider. Documentation should include notation of any assistive technology used to generate the note, for example, dictation software or an AI scribe.


8. Bias, equity, and respect for persons

8.1.      A key professional responsibility in medicine has always been the assurance that clinical decisions and recommendations are not biased. Potential biases and risks related to the use of GenAI can arise from the source data. Biased training data incorporated into GenAI tools may ultimately impact patient care, and there is the potential that GenAI could perpetuate, rather than eliminate, bias in healthcare.

8.2.      AI systems encumbered by prejudiced, false, or inaccurate information may carry a bias that can be detrimental to providers and harmful to patients. Registrants should therefore make reasonable efforts to identify and address such biases before using GenAI systems in patient care. Registrants should use caution in interpreting AI-generated content, accounting for the demographics and health context of the patient they are assessing.

8.3.      Registrants are expected to respect the dignity, diversity, cultural values, and rights of patients and colleagues, and avoid using GenAI to create or disseminate content that is discriminatory, offensive, or harmful.

 


 

9. Confidentiality and data security 

9.1.      Registrants are expected to ensure compliance with applicable federal and provincial laws, including PHIA, as well as applicable policies within the practice setting.

9.2.      Registrants are legally and ethically required to ensure confidential information, including the patient’s personal information and personal health information, is adequately protected.

9.2.1.        Registrants should take care not to expose a patient’s personally identifiable information when using GenAI to support their clinical care.

9.2.2.        Even without names or personal health numbers, a patient’s privacy may be exposed by the clinical uniqueness of a case.

9.2.3.        A privacy impact assessment (PIA) should be conducted if AI is introduced into practice.

9.2.4.        The practice setting’s Securing Personal Health Information Policy should address GenAI if it is used in the practice setting. 

9.3.      The use of GenAI in documenting care requires these systems to access and review personal health information. Registrants should be aware of what security measures are in place to ensure the information provided to AI systems remains secure and in compliance with existing provincial and federal laws, as well as the patient’s preferences. Registrants retain their duty to audit patient records in accordance with CPSM’s Standard of Practice for Maintenance of Patient Records in All Settings. Registrants should consult with their IT or cybersecurity expert to ensure that any AI system used has appropriate data security, confidentiality, and retention protocols.


10. Data sovereignty 

10.1.  Principles of ownership, control, access, and possession (“OCAP”) assert that First Nations have control over data collection processes and that they own and control how this information can be used. Registrants should consider whether the Securing Personal Health Information Policy within their practice setting is respectful of data sovereignty for Indigenous peoples. The Canadian Institute for Health Information (CIHI) provides some guidance in this area.17,18,19


11. Health system harm

11.1.      Cost overruns and system inefficiency arising from poor data design and use are a material source of harm in the healthcare system. The use of GenAI to support documentation and clinical decision-making has the potential to improve efficiency and access to care while reducing costs. However, it must be carefully evaluated to avoid unintended harm to patients or the uploading of inaccurate or unverified information to provincial databases.


12. Patients using GenAI

12.1.  People are using GenAI in their personal lives, including for health advice. ChatGPT and other chatbots have the potential to significantly impact how patients acquire medical information online, including using these tools to track health statistics and to help understand signs and symptoms. In some cases, patients may wish to bring this information to the attention of their physician. In fact, many GenAI products advise users to review the information provided with a qualified medical practitioner. Registrants should be prepared for this to occur more often in practice and should consider a respectful approach to handling the situation.


13. Useful resources 


References: 


1 Artificial Intelligence (AI) is an umbrella term that refers to the ability of a machine (e.g., computer) to perform tasks associated with intelligent beings, such as reasoning, language comprehension, and decision making.

2 Generative AI (GenAI) refers to advanced AI systems that can be prompted to generate new content, including audio, images, text, and videos, in response to prompts from users. GenAI relies on massive datasets and complex underlying algorithms and computer models.

3 Examples of these technologies include IBM’s Merative (formerly Watson Health) and Google’s Med-PaLM.

4 Machine learning (ML) refers to a machine’s ability to learn from its own experiences rather than relying exclusively on explicit programming. ML involves the use of algorithms that can analyze data, make predictions based on that data, and establish models when applied to data. ML can be supervised or unsupervised.

5 Large language models (LLMs) are a type of AI that use massive data sets, algorithms, and deep learning to understand context, and generate and predict new content. Models can be fine-tuned on specialized data to produce improved outputs. All LLMs are a form of GenAI.

6 See the federal Food and Drugs Act and associated Medical Devices Regulations.

7 Experimental examples include DxGPT, which is described as diagnostic support software, and Med-PaLM (Google Research), which is an LLM designed to provide high-quality answers to medical questions.

8 Medical devices that use ML to achieve their intended medical purpose are known as machine learning-enabled medical devices (MLMDs). The term "Medical purpose" is defined in the federal Food and Drugs Act. MLMDs are subject to the Food and Drugs Act and associated Medical Devices Regulations.

9 It is understood that recently approved AI assessment tools for diabetic retinopathy screening and dermatology differential diagnosis suggestions do not rely on LLMs.  

10  See the Office of the Privacy Commissioner of Canada’s publication on ‘Principles for responsible, trustworthy and privacy-protective generative AI technologies’ at https://www.priv.gc.ca/en/privacy-topics/technology/artificial-intelligence/gd_principles_ai/

11 Canadian Medical Protective Association. The emergence of AI in healthcare, 2019 (revised 2023).

12 CMPA’s ‘AI Scribes: Answers to frequently asked questions’, retrieved from https://www.cmpa-acpm.ca/en/advice-publications/browse-articles/2023/ai-scribes-answers-to-frequently-asked-questions

13 For example, some GenAI tools used for this purpose, which are noted to be in their research phase, generate an extensive list of differential diagnoses and the rationale for each diagnosis listed. This output should be included in the patient record akin to a consultation.  

14 Express consent is always required to record an encounter with a patient. Depending on the practice setting, it may be wise to share the practice setting’s policy and information about the AI in advance.

15 See CMPA’s ‘Recording clinical encounters with patients: What physicians need to know’, retrieved from https://www.cmpa-acpm.ca/en/advice-publications/browse-articles/2023/recording-clinical-encounters-with-patients-what-physicians-need-to-know

16 See CMPA’s ‘Recording clinical encounters with patients: What physicians need to know’ retrieved from https://www.cmpa-acpm.ca/en/advice-publications/browse-articles/2023/recording-clinical-encounters-with-patients-what-physicians-need-to-know

17 See the Government of Canada’s ‘Pan-Canadian Health Data Charter’ retrieved from https://www.canada.ca/en/health-canada/corporate/transparency/health-agreements/shared-health-priorities/working-together-bilateral-agreements/pan-canadian-data-charter.html

18 See CIHI’s ‘A Path Forward: Toward Respectful Governance of First Nations, Inuit and Métis Data Housed at CIHI’, retrieved from https://www.cihi.ca/sites/default/files/document/path-toward-respectful-governance-fnim-2020-report-en.pdf

19 See FNIGC’s ‘The First Nations Principles of OCAP’, retrieved from https://fnigc.ca/ocap-training/