How Will Artificial Intelligence Solve the Metaverse's Language Problem?
With Onemeta AI, artificial intelligence is paving the way to solving the metaverse's language problem
The metaverse is the inevitable evolution of the internet. It envisions a virtual world where billions of people live, work, shop, learn, and interact with each other, all from the comfort of their couches in the physical world. At CES 2023, Onemeta AI presented its answer to the metaverse's language problem, smoothing the path toward that vision.
The metaverse's intrinsic democratization is one of its primary features: in theory, everyone on Earth has access to the same virtual environment, without the typical entry restrictions imposed by location. Except for one. If the people meeting in the metaverse don't speak the same language, sharing an experience gets difficult. Excellent text and speech translation services like Google Translate and Skype Translator already exist, but the problem quickly becomes one of scale: most such services are built for one-on-one interactions, whereas a metaverse experience frequently seeks to include dozens, if not hundreds, of people. If everyone is speaking their own language, finding a solution is challenging.
Onemeta AI is here to solve the metaverse's language problem. Its Verbum service, which made its debut at CES 2023, can translate conversations among up to 50 speakers of different languages in real time; the company claims support for 82 languages and 40 dialects. And it doesn't just deliver real-time transcripts: Onemeta AI can provide voice, too. "You could have 50 people on a Zoom call, and they could each have their own native tongue," said David Politis, spokesperson for Onemeta AI. "They would hear someone speak in Japanese, but they would then hear it in English or in Italian or in Russian, and onscreen they would see it in their language as well."
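Onemeta has not published a public API for Verbum, but the workflow Politis describes maps onto a familiar three-stage pipeline: speech recognition, machine translation, and speech synthesis, fanned out once per target language rather than once per listener. The Python sketch below is a minimal, hypothetical illustration of that structure; every function and class name here is our own assumption, and the actual recognition, translation, and synthesis engines are stubbed out.

```python
from dataclasses import dataclass

# Stubbed stages. A production system would back these with real
# speech-recognition, translation, and text-to-speech engines.

def transcribe(audio: bytes, language: str) -> str:
    """Speech-to-text in the speaker's own language (stub)."""
    return "<transcript>"

def translate(text: str, source: str, target: str) -> str:
    """Machine translation between two languages (stub)."""
    return f"<{text} rendered in {target}>"

def synthesize(text: str, language: str) -> bytes:
    """Text-to-speech in the listener's language (stub)."""
    return b"<audio>"

@dataclass
class Participant:
    name: str
    language: str  # e.g. "ja", "en", "it", "ru"

def deliver(p: Participant, caption: str, voice: bytes) -> None:
    """Stand-in for pushing a caption and audio track to one client."""
    print(f"{p.name} [{p.language}]: {caption} (+{len(voice)} bytes audio)")

def fan_out(speaker: Participant, audio: bytes, room: list[Participant]) -> None:
    """Translate one utterance once per target language, then deliver a
    caption and a synthesized voice track to every other participant."""
    transcript = transcribe(audio, speaker.language)
    # Cache per-language results so 50 listeners who share a language
    # cost one translation and one synthesis, not 50.
    cache: dict[str, tuple[str, bytes]] = {}
    for p in room:
        if p is speaker:
            continue
        if p.language not in cache:
            text = translate(transcript, speaker.language, p.language)
            cache[p.language] = (text, synthesize(text, p.language))
        caption, voice = cache[p.language]
        deliver(p, caption, voice)

room = [Participant("Aiko", "ja"), Participant("Ed", "en"), Participant("Irina", "ru")]
fan_out(room[0], b"...one utterance of audio...", room)
```

The per-language cache is the detail that makes a 50-person call tractable: the cost of an utterance grows with the number of distinct languages in the room, not the number of participants.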
We had a chance to demo Verbum at CES on Thursday night. As we spoke through a headset with a woman in Central America, the system translated our words into Spanish and her responses into English. Even with a slight delay, the conversation felt natural and flowed well. Words were transcribed within a second of being spoken, and the artificial-intelligence voice, which was as good as if not better than the TikTok narrator, came in about a second after that. Onemeta is initially targeting Verbum at group meetings for global teams, but it could also serve metaverse experiences: consider a massively multiplayer online role-playing game (MMORPG) where players from all over the world need to communicate quickly in real-time situations (think Call of Duty), or an esports competition where spectators want to follow the action and interact with one another at the same time.
English is the most widely used language, according to Politis. "However, if Portuguese or Russian is your mother tongue, your English will probably not sound exactly like it. Thus, there will inevitably be a breakdown in communication. Nearly all of that can be done away with." There is undoubtedly a need for what Onemeta is providing with Verbum, but whether others, particularly Microsoft and Google, who have resources that Onemeta does not, rise to the occasion will determine Verbum's success.