Code-switching is not a bug — it's how multilingual people actually think
The neuroscience behind why bilingual professionals switch languages mid-sentence, and why fighting it costs your team real productivity.
Every bilingual person has lived this moment: you're explaining something technical in English, and mid-sentence your brain reaches for a word that only exists in your other language. Not because you forgot the English word. Because the English word doesn't capture what you mean.
I watched it happen last Tuesday. A product manager on a call with her team in Tokyo was walking through a launch timeline in English. Then she hit the part about stakeholder alignment and switched to Japanese: "nemawashi ga mada desu" (literally, "the nemawashi isn't done yet"). Three words that would take two paragraphs to explain in English — the informal consensus-building process where you individually sound out each decision-maker before the formal meeting so nobody is surprised. There's no English equivalent that carries the same weight.
Her transcript in a traditional tool? A gap. An error. A failure to transcribe.
That's the problem we kept running into when we started building MangoFinch. Every existing transcription tool treats a language switch as a mistake to correct. We think it's the opposite.
Your brain is running two languages simultaneously
Here's something that surprised me when I first read the research: bilingual people don't "turn off" one language when speaking another. Both languages are active simultaneously. All the time.
A 2012 study published in *Psychological Science* by Viorica Marian and Anthony Shook found that when bilinguals hear a word in one language, their brains also activate the phonological neighbors in their other language. Say "marker" to a Spanish-English bilingual and their brain also lights up "marca." This isn't a bug in human cognition. It's parallel processing.
The inhibitory control model, developed by David Green at University College London, explains what's actually happening: your brain is constantly managing competition between two active language systems. Speaking in one language requires actively suppressing the other. That suppression takes cognitive effort — real, measurable effort that shows up on fMRI scans.
This is why forcing someone to operate exclusively in their second language isn't just uncomfortable. It's computationally expensive for their brain.
The real reasons people switch mid-sentence
Linguists have been studying code-switching for decades, and the data consistently shows it's not random and it's not laziness. It follows specific patterns.
**Precision switching.** Some concepts are sharper in one language. A German engineer switching to "Feierabend" instead of saying "the culturally specific feeling of the work day being done and leisure time officially beginning" is being more precise, not less. A single word replaces an entire explanatory sentence.
**Emotional register switching.** Research by Jean-Marc Dewaele at Birkbeck, University of London found that bilinguals consistently rate emotional expressions as more intense in their first language. When something matters — frustration, excitement, concern — people reach for their L1. A developer saying "esto no puede ser" ("this can't be") when hitting a blocking bug isn't being unprofessional. They're being honest.
**Cultural context switching.** Some ideas are culturally embedded. Try explaining the Portuguese concept of "saudade" — the melancholic longing for something absent — in English. You can approximate it, but you lose the cultural resonance. The word carries centuries of Portuguese maritime history and literature. "I miss it" doesn't come close.
**Efficiency switching.** Sometimes one language is simply faster for a given phrase. Korean has single-syllable words for concepts that take five English words. Arabic has verb forms that encode information English needs a whole subordinate clause for. Bilinguals optimize for communication speed without consciously deciding to.
**Topic-associated switching.** If you learned calculus in Mandarin, your mathematical vocabulary lives in Mandarin. If you learned management theory in English, those frameworks feel native in English. People naturally gravitate toward the language they originally encoded the knowledge in.
The productivity tax of English-only meetings
Here's where it gets expensive for companies.
A 2018 study by Neeley and Dufresne at Harvard Business School found that non-native English speakers in English-only multinational companies reported spending roughly 20% of their cognitive effort on language management rather than on the actual content of their work. One in five brain cycles going to translation instead of problem-solving.
That's not a rounding error. For a team of ten where six members are non-native English speakers, you're losing the equivalent of more than one full-time employee's cognitive output to language overhead.
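The arithmetic behind that claim is simple enough to write down. This is a back-of-envelope sketch, assuming the 20% overhead figure applies uniformly to every non-native speaker (a deliberate simplification — real overhead varies by fluency, topic, and meeting format):

```python
def language_overhead_fte(team_size: int, non_native: int, overhead: float = 0.20) -> float:
    """Estimate full-time-equivalent cognitive output lost to language management.

    Assumes each non-native speaker loses the same fixed fraction of
    their effort -- a simplification for illustration only.
    """
    assert 0 <= non_native <= team_size
    return round(non_native * overhead, 2)

# The ten-person team from the text: six non-native speakers at 20% overhead.
print(language_overhead_fte(10, 6))  # 1.2 -- more than one full-time employee
```

Six people each spending a fifth of their attention on translation adds up to 1.2 FTE of lost output, which is where the "more than one full-time employee" figure comes from.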
I've seen it play out in real meetings. The native English speakers talk faster, interrupt more, and dominate the conversation — not because their ideas are better, but because the medium is their home court. The non-native speakers wait for openings, simplify their points to avoid grammatical mistakes, and often stay quiet on complex topics where their L2 vocabulary isn't strong enough to do their thinking justice.
A senior data scientist at a fintech company told me: "In English, I sound like a junior. In Mandarin, I sound like myself." She had eight years of experience and was consistently passed over in meetings because she couldn't articulate her insights with the same authority in English.
Concepts that refuse to translate
Every language has them. Words and phrases that encode an entire worldview.
**Japanese: "nemawashi" (根回し).** The process of quietly building consensus before a formal decision. You talk to each stakeholder individually, understand their concerns, adjust your proposal, and by the time the official meeting happens, everyone already agrees. Trying to explain this in English every time it comes up in a mixed-language meeting is exhausting.
**Portuguese: "saudade."** Not just missing someone. A bittersweet, existential longing — for a person, a place, a time, even a future that never happened. Brazilian team members reach for this word when English's emotional vocabulary falls short.
**German: "Feierabend."** The moment the workday ends and personal time begins. It's not just "quitting time" — it carries cultural weight about work-life boundaries that the English phrase doesn't have. When a German colleague says "Feierabend" at 5 PM, they're invoking a social contract.
**Korean: "nunchi" (눈치).** The art of reading a room — sensing the collective mood, understanding unspoken dynamics, knowing when to speak and when to stay quiet. English has no single word for this, and describing it as "emotional intelligence" misses the social and collective dimension.
**Arabic: "tarab" (طرب).** The state of musical ecstasy, when music moves you so deeply you feel transported. It's not "enjoyment." It's a specific altered state that Arabic music culture has named and cultivated for centuries.
When a bilingual person reaches for one of these words mid-meeting, they're not code-switching out of laziness. They're reaching for precision that their other language simply doesn't offer.
What this means for meeting tools
Most transcription tools are built on a monolingual assumption: one meeting, one language. When they encounter a switch, they do one of three things — try to force the audio into the "expected" language (producing garbage), flag it as an error, or just go silent.
All three responses send the same message: the way you naturally communicate is wrong.
We built MangoFinch on the opposite assumption. Code-switching is the norm for multilingual teams, not the exception. Our system detects language boundaries in real time and applies the correct recognition model to each segment. When someone switches from English to Japanese mid-sentence, both parts get transcribed accurately — because we expected that to happen.
Building this was harder than building a monolingual transcription tool. Our speech engine handles individual languages well, but the transition points between languages — where one phonological system fades and another begins — are where most systems break. We spent months tuning our language detection at the segment level to handle these boundaries without dropping words.
The result is a transcript that reads the way the meeting actually sounded. Mixed languages, natural switches, and all.
The cost of pretending everyone is monolingual
Here's what I keep coming back to: the companies with the most diverse, multilingual teams are often the ones with the strictest English-only policies. And those policies don't make the multilingualism go away. They just push it underground.
People code-switch in side conversations. In Slack DMs. In the hallway after the meeting where they actually hash out what they were trying to say. The "official" meeting becomes a performance in English, and the real work happens in the spaces between.
That's an organizational design failure, not a language problem.
The 20% cognitive overhead for non-native speakers isn't just about individual productivity. It compounds. Ideas that would have been shared get swallowed. Nuances that would have been expressed get flattened. The person who has the clearest mental model of a problem stays quiet because explaining it in their L2 would take three times as long and sound half as convincing.
Building for how people actually communicate
When we started MangoFinch, the working thesis was simple: multilingual meetings are normal, and the tools should catch up.
Ten thousand meeting minutes later, the data confirmed what every bilingual professional already knew — people switch languages for good reasons, they do it in predictable patterns, and the switches carry information that monolingual transcripts lose entirely.
A language switch at a topic boundary often signals that the speaker is about to share domain knowledge they originally learned in that language. A switch to L1 during a heated discussion signals emotional investment. A switch to a shared minority language between two participants signals trust and in-group communication.
Flattening all of that into "error: language not recognized" isn't just technically wrong. It erases meaning.
We're not the first people to notice this. Linguists have been writing about the sophistication of code-switching since the 1970s. What's new is that the speech recognition technology has finally caught up enough to handle it in real time. And we think that changes what's possible for multilingual teams.
Not by making everyone speak one language. By making it practical for everyone to speak the way they actually think.
Try MangoFinch free
Real-time transcription and translation for multilingual teams. No credit card required.
Start a free meeting