Apple Intelligence’s Transcriptions Are Very Good

I have been using MacWhisper to record and transcribe my conference calls and interviews for quite a while now. I have also tried FlowVoice.AI for responding to emails and text messages by voice and for short dictations. Now that I have upgraded to the latest version of the Mac operating system, I decided today to put the new Apple Intelligence voice transcription to the test.

I dropped an audio file of a 58-minute interview into a note I created in Apple Notes. Less than a minute later, the conversation was transcribed. I then clicked on the audio file and inserted the transcript into the current note.

It did a great job of organizing the conversation — questions and answers — in a streamlined fashion. In general, it did a much better job of formatting the text in a readable manner: a series of related comments showed up as a single paragraph. There was literally no difference in the actual text generated by Apple Intelligence and MacWhisper. I tried some older files, and the results were equally good. The Voice Memos app on the iPhone also does a great job of transcribing conversations.

While, in theory, MacWhisper should work more effectively with OpenAI, I hit so many timeouts that I stopped trying. I also encountered errors about the context window. Despite having API access, it never really worked. As a result, I default to the built-in models.

It pains me to say this, considering my bias toward indie developers, but there seems to be no reason for me to open MacWhisper going forward. Except for old habits, and a soft spot for the app.

Addendum: You can also dictate directly into Notes, or, if you want to record your interview, just hit the new “audio recording” icon in the Notes menu. The results are very good. However, apps like FlowVoice do a much better job of summarizing a call or a dictation.

November 4, 2024. San Francisco