Clinically Intelligent is written by Juwon Akinyande in a personal capacity and is not affiliated with, endorsed by, or representative of Barts Health NHS Trust or any other NHS organisation. Content is for general educational purposes only. It does not constitute clinical, legal, or information governance advice. Before applying any guidance to your own practice, consult your Trust information governance lead, your Caldicott Guardian, your line manager, and your professional body.
Who's taking minutes? Every meeting starts with that question. The person who ends up doing it is also expected to contribute, which means trying to listen, speak, and type at the same time. Nobody does any of those three things well. The minutes that come out the other side are a thin record of what was actually said. Sometimes they land in your inbox by the end of the week. Other times two weeks later. Occasionally they never come at all.
I had this exact problem when I was planning to get feedback from someone I previously managed. In the past I had tried to capture feedback sessions by typing while I listened. The output was always thin. I was not listening properly because I was focused on typing. I was not typing properly because I was focused on responding. The staff member left with a vague sense of being heard. I left with a vague set of notes that took me another hour to write up.
This time I used Otter.
This Week's Tool
Otter is a note taking tool that captures and converts audio to text in real time. I have been using it for meetings for a while because when I am in a conversation I want to be focused on the person in front of me, not on a keyboard.

[Otter welcome page on the app]
Setup is fast. Download Otter from the App Store or Google Play, or go to otter.ai on a browser. Create a free account. The free tier gives you 300 minutes of transcription per month, which is more than enough for the use case I am writing about. Open the app and you land on the home screen.

[Otter Home Screen]
With the setup done, the difference in the feedback session was immediate. I asked more questions because I was actually listening. He spoke more freely because he could see I was not racing to capture his words. By the end of the meeting Otter had a full transcript and a draft action list. I reviewed it, edited it, and sent the minutes out the same afternoon. What previously took an hour was finished by the time the meeting ended.
Otter also has a chat function that turns the transcript into something searchable. Instead of scrolling line by line to find what was said about a topic, you ask the question and it pulls the line for you. What used to take five minutes of hunting now takes ten seconds.
Otter is not just for in person meetings. You can have it join a virtual meeting through your calendar so it captures Teams, Zoom, or Google Meet without you having to do anything. You can also import audio you have recorded locally if you want to transcribe something you captured outside the app.

[Option to import audio after clicking plus sign on home screen]

[Otter calendar option]

[Otter channels option]
The Limitations
Three things worth knowing before you rely on it.
Speaker labels. Otter does not label speakers automatically. It assigns Speaker 1, Speaker 2, Speaker 3, and you label them yourself afterwards. In a one to one this is quick. In a group meeting you have to do the work before the transcript becomes useful.
Pronunciation and vocabulary. The transcript misses words, especially with strong accents or technical vocabulary. NHS terminology, drug names, and place names may take a hit. You will always need to clean the transcript before sharing it.
Multiple voices at once. If two people speak over each other, Otter usually catches one and loses the other. The recording is intact so you can listen back, but the transcript itself is incomplete in those moments.
Information Governance
The use case in this issue is narrow. A one to one feedback meeting with someone I previously managed, with explicit consent, no patient information discussed. That is a meeting type most clinicians have regularly, and it is the only use case this issue covers.
A few things worth knowing before you try it yourself.
Consent has to be explicit. Tell the person you want to record before the meeting starts. Get their agreement on the recording itself so it is captured in the transcript. If they say no, do not record. Recording someone in a workplace meeting without their knowledge engages privacy law and Trust policy, and no amount of time saved on minutes is worth breaching either.
Patient information stays out of the tool. Otter is a consumer product. If a meeting involves case discussion, MDT, or supervision where patients come up, this is not the right tool for it. That is a different conversation with your Trust's IG team and probably a different tool entirely.
Otter stores audio and transcripts on Amazon Web Services in the United States and uses transcription content to train its AI models. I could not easily find a clear opt out. Until that changes, the safe assumption is that anything you record may inform their models. That alone is reason enough to keep real patient context out of the tool.
Where I am with this. I am taking this workflow to my Trust's IG team before I scale my use of it. The question I am putting to them is whether a consumer tool on a personal device for non patient meetings with consent fits inside Trust policy. When I have an answer I will pass it on.
The same reminder as always. Otter is a consumer product. It has not been through NHS procurement or been assessed against DCB0129 or DCB0160. Until your IG lead tells you otherwise keep real patient context out of the tool.
If you are ever unsure, ask your IG lead before you record anything.
Where To Start This Week
Your next one to one. Supervision. A check in with a colleague. A meeting where it is just you and one other person and you would normally walk out trying to remember what was agreed.
Tell them at the start that you would like to record the conversation so you can capture the action points properly. Ask if they are comfortable with that. If they say yes, press record. If they say no, do not record.
After the meeting, look at the transcript. See whether it captured what you would have struggled to write down at the time. Notice whether the action list comes out cleaner than the one you would have built from memory.
Ten minutes of setup. One real conversation. You will know within an hour of the meeting ending whether this is something you want to keep using.
Prompts Worth Saving
No prompt this week. Otter does not need one. You press record and it transcribes the meeting in real time. The work the prompt would do has been built into the tool.
If you want to take the transcript further, you can drop it into Claude or ChatGPT and ask the tool to pull the action points out for you. I have not done this with my own transcripts yet, but if you want another perspective on what the meeting covered, here are two prompts worth trying.
For action points.
Pull out the action points from this meeting transcript. For each action include who agreed to do it, what the action is, and any deadline mentioned. If something was discussed but no clear action was agreed, list it separately as an open item.
For theme summary.
Summarise this meeting transcript by theme. For each theme include the main points raised, the conclusions reached, and any disagreements that were not resolved. Keep it brief and clinically practical.
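If you are comfortable with a little code, the action-points prompt can also be run against a transcript programmatically rather than pasted into a chat window. A minimal sketch in Python, under stated assumptions: the function name is my own invention, the model choice is illustrative, and the same IG rule applies, so no patient-identifiable information goes into the transcript.

```python
# Sketch: package the action-points prompt and a transcript as chat
# messages for an LLM API. Building the messages needs no network
# access; actually sending them (commented out below) would require
# the openai package and an API key.

ACTION_PROMPT = (
    "Pull out the action points from this meeting transcript. "
    "For each action include who agreed to do it, what the action is, "
    "and any deadline mentioned. If something was discussed but no clear "
    "action was agreed, list it separately as an open item."
)

def build_action_points_messages(transcript: str) -> list[dict]:
    """Combine the standing prompt with one meeting transcript."""
    return [
        {"role": "system", "content": "You are a careful minute taker."},
        {"role": "user", "content": f"{ACTION_PROMPT}\n\nTranscript:\n{transcript}"},
    ]

# To actually send it (assumes the openai package and an OPENAI_API_KEY):
# from openai import OpenAI
# client = OpenAI()
# reply = client.chat.completions.create(
#     model="gpt-4o-mini",
#     messages=build_action_points_messages(open("transcript.txt").read()),
# )
# print(reply.choices[0].message.content)
```

The same structure works with the theme-summary prompt: swap the prompt text and keep the rest.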
Opinion
I record meetings for three reasons. Accuracy. Time. And to make sure both parties leave with the same action plan.
Most meetings end with two different accounts. The person who took the notes types theirs up. The other person waits to find out if theirs made it in. By the time the minutes land, sometimes the next day, sometimes two weeks later, the conversation has already faded. What gets actioned is whatever survived one person's memory and whatever they managed to type while also trying to contribute to the meeting.
That is the problem. Whoever writes the minutes decides what the meeting was. Not deliberately. Just because writing is selective and memory fades. The person taking notes is usually trying to listen, contribute, and capture at the same time. Something always gets lost. The version that gets sent out is their version. Shaped by what they caught, what they understood, and what they had time to type.
Otter does not change how meetings run. It changes what survives them. The transcript holds every word. The action list comes from the conversation itself, not from your reconstruction of it the next day. The hour you used to spend rebuilding the meeting in your head is now an hour you can spend on the work the meeting was about.
The tool is not for minute taking. It is for shared memory.
In Case You Missed It
Federated Data Platform expansion. NHS England has continued rolling out the Federated Data Platform across Trusts, connecting operational data to support theatre scheduling, discharge coordination, and patient flow. The platform is now live in a growing number of organisations with measurable operational use cases. Source. NHS England / Digital Health, April 2026.
The NHS is not just getting better data. It is starting to use systems to shape how patients move through care. Clinicians still make the decisions, but they will be making them within systems that shape what gets prioritised. Most services are not designed for that yet. That gap is where issues will show up.
NHS moves to close source repositories. NHS England has moved to restrict public code repositories over concerns that advanced AI tools could identify and exploit vulnerabilities more easily. Teams have been asked to reduce exposure of open source code while security approaches are reviewed. Source. The Register, May 2026.
The closure is understandable. AI tools can scan open code in minutes and surface weaknesses humans took years to find. Control reduces risk. But it slows innovation in the part of the NHS that has actually been moving. The NHS will always pick control. The question that closure does not answer is what AI tools can already see in everything else the NHS has built.
That is all for Issue 05. Every week I will bring you something practical you can use and a view on where this space is heading. Next week: the tool that turns the page nobody reads into the picture they cannot forget. If someone you know would find this useful, pass it on.
Clinically Intelligent drops every Wednesday. If you are not yet subscribed you can join free at clinicallyintelligent.com.
The AI tools discussed in Clinically Intelligent are consumer products. They have not been independently assessed by the author against DCB0129 or DCB0160 clinical risk management standards, and they may not be approved for clinical use by your employer. Before using any tool described in this newsletter in connection with your clinical practice, you must satisfy yourself that its use is permitted under your Trust information governance policy, your DSP Toolkit obligations, your professional registration requirements, and any applicable contractual terms with your employer. The author accepts no liability for use of any tool or workflow described in this publication. Patient identifiable information must not be entered into any consumer AI tool under any circumstances, irrespective of any guidance contained in this newsletter.

