Startup No Barrier is using AI to improve medical translation
The company just closed a $2.7 million seed funding round on Nov. 17.
Cassie McGrath is a reporter at Healthcare Brew, where she focuses on the inner workings and business of hospitals, unions, policy, and how AI is impacting the industry.
There are 26 million US residents with limited English proficiency, according to health policy research organization KFF, which also found that people with language barriers are more likely to report their physical health as “fair” or “poor” than those proficient in English.
Language services are also in demand: Kent State University reports that the market is expected to reach $76.8 billion in 2025 and grow to $98.1 billion by 2028.
Startup No Barrier is using AI to close some of these language gaps. The HIPAA-compliant platform is currently in use at more than 100 healthcare sites across 12 states and can translate in real time into 40+ languages, including Spanish, Mandarin, and Arabic, according to the company’s website.
Fresh off a $2.7 million seed funding round on Nov. 17—led by venture capital firms A-Squared Ventures, Esplanade Ventures, Rock Health Capital, and Fusion—CEO and co-founder Eyal Heldenberg spoke with us about how the technology works and his overall goals for the company.
This interview has been edited for clarity and length.
How exactly does the technology work?
The problem is the workflow today. Every time a patient doesn’t speak English, there is a whole workflow just to get a human interpreter, and sometimes one isn’t available and the patient has to bring in a family member instead.
Even if you have a human interpreter, this is a speech-to-speech [issue]. The provider says something, then the medical interpreter thinks about the best way to convey the message to the patient, and vice versa. So we need a speech-to-speech experience. We need to take into account everything that a human interpreter would. For example, gender. In some languages, gender changes how a sentence is phrased.
To summarize, it’s a speech-to-speech, real-time, audio-to-audio pipeline where we take into consideration different parameters…even to stop the conversation and say, ‘Hey, I didn’t get the medication name. Could you repeat it?’ This is another characteristic of human interpreters. So this is what we are basically trying to do: take the human brain and put it in a piece of software.
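To make the shape of that pipeline a little more concrete, here is a minimal, illustrative sketch of a speech-to-speech translation loop with a clarification step. This is not No Barrier’s implementation; every function name (transcribe, translate, synthesize, looks_garbled) and the clarification logic are assumptions added for illustration only.

```python
# Illustrative sketch only -- not No Barrier's actual code.
# All function names and the clarification logic are hypothetical.
from dataclasses import dataclass


@dataclass
class Turn:
    speaker: str  # "provider" or "patient"
    audio: bytes  # raw audio for one utterance


def transcribe(audio: bytes, language: str) -> str:
    """Hypothetical speech-to-text step."""
    raise NotImplementedError


def translate(text: str, source: str, target: str, patient_gender: str) -> str:
    """Hypothetical translation step; gender is passed because some
    target languages phrase sentences differently by gender."""
    raise NotImplementedError


def synthesize(text: str, language: str) -> bytes:
    """Hypothetical text-to-speech step."""
    raise NotImplementedError


def looks_garbled(text: str) -> bool:
    """Hypothetical confidence check, e.g. an unclear medication name."""
    return False


def interpret_turn(turn: Turn, source: str, target: str, patient_gender: str) -> bytes:
    """Translate one utterance, or ask the speaker to repeat unclear details."""
    text = transcribe(turn.audio, language=source)
    # Like a human interpreter, pause and ask for a repeat when a
    # critical detail (such as a drug name) did not come through.
    if looks_garbled(text):
        return synthesize("I didn't get the medication name. Could you repeat it?", source)
    translated = translate(text, source=source, target=target, patient_gender=patient_gender)
    return synthesize(translated, language=target)
```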
What are your long-term goals for the company?
In the clinic itself, the patient has four to five touch points: reception, nurse, doctor, X-ray. So visits are one thing. Telehealth is another thing. What about documents? What about websites? What about chats? What if I need to sign a consent form? What about patient education? A hospital or health network needs to make sure there is an accurate, safe, compliant layer to facilitate all of it. And the stories we heard [show us] it’s hard.
We believe that it is actually solvable for the majority of encounters and for the majority of languages, though not for every language. American Sign Language is not there yet because of the visual element. But we do believe that in the next three or four years, you’re going to see mass adoption and more affordable, more accessible, and more private [communication via different languages]. We are eager to see the market dynamics here and the adoption of these technologies.