- An analysis of the in-app code for the Android “Google Translate” app v10.2.43 update has further clarified the specifications for the “Audio Mode” being introduced to “Live Translate.”
- The code defines whether translated audio for the user and for the other party is played or displayed on the Android device, the “Pixel Buds” series, or “Android XR” glasses in each of the “Conversation,” “Listening,” and “Silent” modes.
- When the user selects headphones (Pixel Buds) or glasses (Android XR) as the device for “Live Translate,” the output destinations for the user’s translation (Your language) and the partner’s translation (Their language) are assigned according to the wearable in use.
Previously, we reported that the “Live Translate” feature in the Android version of the “Google Translate” app—which automatically translates conversations between two languages to facilitate communication—would soon be available on “Android XR” smart glasses in addition to the Google Pixel Buds series. We also noted the upcoming introduction of an “Audio Mode” that allows users to choose from three modes: “Conversation,” “Listening,” and “Silent.”
Furthermore, an analysis of the in-app code for the Android “Google Translate” app v10.2.43 update, released around Thursday, January 29, 2026, has further clarified the specifications for the “Audio Mode” coming to “Live Translate.”
<string name="cl_live_translate_with_headphones">Live translate with headphones</string>
<string name="cl_live_translate_with_headphones_prompt">Connect headphones to hear translations in real-time.</string>
<string name="cl_audio_mode_conversation_with_glasses_summary">Conversation with glasses</string>
<string name="cl_audio_mode_conversation_with_headphones_summary">Conversation with headphones</string>
<string name="cl_audio_mode_listening_summary">Listening</string>
<string name="cl_audio_mode_conversation_with_glasses_your_language_in_glasses">Translations in your language (%1$s) will play on glasses. Translations in their language (%2$s) will play out loud.</string>
<string name="cl_audio_mode_conversation_with_headphones_their_language_in_headphones">Translations in your language (%1$s) will play out loud. Translations in their language (%2$s) will play on your headphones.</string>
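The %1$s and %2$s tokens in these strings are standard positional format arguments; at runtime the app fills them with the two language names. A minimal illustration using plain Java's String.format (the template is copied from the resource above, but the language pair and class name are just stand-ins; on Android the string would be resolved via getString() with format arguments):

```java
public class FormatDemo {
    public static void main(String[] args) {
        // Template taken from the decompiled resource string above.
        String template = "Translations in your language (%1$s) will play on glasses. "
                        + "Translations in their language (%2$s) will play out loud.";
        // Hypothetical language pair, purely for illustration.
        String filled = String.format(template, "English", "Japanese");
        System.out.println(filled);
    }
}
```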
What has been revealed from the v10.2.43 update code is the definition of how translated audio for both the user and the other party will be played or displayed across Android devices, the “Pixel Buds” series, and “Android XR” glasses for each mode: “Conversation,” “Listening,” and “Silent.” Specifically, when a user selects either headphones (Pixel Buds) or glasses (Android XR) for “Live Translate,” the output for “Your language” and “Their language” will be clearly routed depending on the wearable device being worn.
The specifications are as follows:
■ When glasses (Android XR) are selected (Conversation with glasses)
➜ The user reads the translation of the partner’s words on the glasses (Android XR), while the translated audio for the partner plays through the Android device’s speaker.
・Translation for the user: played/displayed on the glasses (Android XR) worn by the user.
・Translation for the partner: played from the Android device’s speaker and displayed on the screen.

■ When headphones (Pixel Buds) are selected (Conversation with headphones)
➜ The user hears the translation of the partner’s words through the headphones (Pixel Buds), while the translated audio for the partner plays through the Android device’s speaker.
・Translation for the user: played through the headphones (Pixel Buds) worn by the user.
・Translation for the partner: played from the Android device’s speaker and displayed on the screen.
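The routing rules above reduce to a small decision function. The sketch below is hypothetical (none of these type or method names appear in the app's code); it only encodes the behavior the leaked strings describe, in which the user's language goes to the selected wearable and the partner's language always plays out loud:

```java
public class AudioRouting {
    // The two wearables the app lets the user select.
    enum Wearable { GLASSES, HEADPHONES }   // Android XR / Pixel Buds
    // Possible render targets for a translation.
    enum Output { GLASSES_DISPLAY, HEADPHONES_AUDIO, PHONE_SPEAKER }

    // Your language: routed to whichever wearable was selected
    // (displayed on glasses, played as audio on headphones).
    static Output routeYourLanguage(Wearable w) {
        return w == Wearable.GLASSES ? Output.GLASSES_DISPLAY
                                     : Output.HEADPHONES_AUDIO;
    }

    // Their language: always played out loud on the phone speaker
    // so the other party can hear it.
    static Output routeTheirLanguage(Wearable w) {
        return Output.PHONE_SPEAKER;
    }

    public static void main(String[] args) {
        for (Wearable w : Wearable.values()) {
            System.out.println(w + ": yours -> " + routeYourLanguage(w)
                    + ", theirs -> " + routeTheirLanguage(w));
        }
    }
}
```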
In other words, it has become clear that “Audio Mode” is not just a simple mode toggle, but a feature where the app automatically configures the optimal audio input and output based on the device selected by the user. Additionally, in “Listening” mode, it appears that detailed output settings will be possible depending on the situation, such as listening to both languages through the glasses (Android XR) or listening only to the partner’s language through the headphones (Pixel Buds).
Another interesting point is the discovery of a new description for an output device option called “Partner’s Headphones.” This suggests that in face-to-face “Live Translate” sessions, if the other party is also using compatible headphones or during device sharing, the user may be able to control the audio output destination for the partner as well.
This means that “Live Translate” in the Android “Google Translate” app could potentially offer much more seamless conversations by utilizing headphones (Pixel Buds) and glasses (Android XR).