Can Paramedics Use AI to Write Their PCR Narratives?

This is a question I have been getting a lot recently, and the answer is, frustratingly, maybe, sometimes. As artificial intelligence tools like ChatGPT become more capable and more accessible, paramedics are (understandably) looking to those tools to solve one of the most time-consuming writing tasks they have: the narrative portion of the patient care report (PCR). In a busy system, a paramedic may have 12-14 narratives to write in a single shift, and the idea of having a computer write them for you is certainly appealing! But whether doing so is legal and ethical depends on a number of issues.

No Matter What, the (Human) Author of the Report Is Responsible for the Content of the Report

AI tools—particularly large language models (LLMs) like ChatGPT—are excellent for assisting with grammar, writing structure, and summarization. However, they do not understand medicine or clinical context the way a trained provider does, and (obviously) they weren’t there on the call. Narratives must accurately reflect what happened during the call, and the final responsibility for that content lies with the primary provider. If a narrative is inaccurate, incomplete, or misleading due to AI-generated content, liability may fall squarely on the EMS provider or agency.

Another issue is that AI tools like ChatGPT are designed to “fill in” content when they don’t have all the information available. Take, for example, this narrative written by an AI tool:

Dispatched emergently to 123 Main Street for a 72-year-old male, identified as John Smith, reporting shortness of breath. Upon arrival, patient was found seated in a tripod position, alert and oriented x3, with audible wheezing and moderate respiratory distress. Patient has a known history of COPD and states that symptoms began worsening over the past several hours. He denied fever, chest pain, or recent illness, but reported increased sputum production and difficulty breathing at rest.

Primary assessment revealed patent airway, labored breathing with accessory muscle use, and diffuse expiratory wheezing in all lung fields. Skin warm and dry. Radial pulses strong and regular. SpO₂ 88% on room air, respiratory rate 28/min, HR 96 bpm, BP 138/86 mmHg. Patient speaking in short phrases.

O₂ administered via nasal cannula at 3 LPM, then upgraded to nebulized albuterol/ipratropium (2.5 mg/0.5 mg in 3 mL NS) via handheld nebulizer en route. Patient tolerated treatment well with some improvement in air movement and decreased work of breathing noted. SpO₂ improved to 94% on oxygen.

Patient transported in semi-Fowler’s position to Big City Hospital without incident. Continuous monitoring performed during transport. Report given to ED staff on arrival; care transferred without complications. No acute changes noted during transport.

Looks good, right? The problem is, the prompt given to the AI tool to write the narrative was “write a narrative for a paramedic patient care report for patient John Smith who lives at 123 Main Street who had COPD and was short of breath with wheezing throughout and who was given a nebulizer treatment enroute to Big City Hospital.” See the issue? The AI tool completely made up the patient’s age, most of the assessment findings, the vital signs, the medication doses, the response to treatment, and more.

Practice Tip: EMS agencies, in their AI policies, should remind staff that using AI to generate or polish narratives is acceptable only if the provider personally reviews, verifies, and edits the content before submission.

HIPAA and Data Privacy Concerns

Unless your agency has vetted an AI platform and has a Business Associate Agreement (BAA) in place with that platform, entering any personally identifiable information (PII) or protected health information (PHI) into these systems is likely a HIPAA privacy violation. If you are going to use an AI platform to help with a narrative, there is a significant legal difference between asking it to “write a narrative for a patient care report for a patient who tripped and was taken to the hospital,” which may be acceptable from a HIPAA perspective, and asking it to “write a narrative for patient John Smith who lives at 123 Main Street who had COPD and was short of breath with wheezing throughout and who was given a nebulizer treatment enroute to Big City Hospital.” The second prompt may get you a much better narrative, but it almost certainly violates the HIPAA Privacy Rule.

Practice Tip: Agencies should prohibit providers from uploading any PII or PHI to non-secure, non-compliant AI platforms. Instead, look for EMS-specific tools developed with HIPAA compliance in mind or ensure any AI vendor will sign a BAA.

Agency Policies and EMS Regulations

Most EMS agencies and state EMS regulators have not issued guidance on AI use by paramedics. Agencies should consider developing internal policies before there is an issue, addressing:

  • Whether AI use is permitted

  • How AI may be used

  • Which platforms are approved or prohibited

  • Training and documentation expectations

Practice Tip: EMS leaders should consult legal counsel when developing internal AI policies and before allowing or prohibiting AI tools, and should ensure those policies are clearly communicated to staff.

AI tools have real potential to reduce the documentation burden in EMS. Agencies should develop clear policies, train staff, and involve legal counsel to ensure compliance with privacy laws and documentation standards.

