Google I/O 2018 kicked off on Tuesday and unveiled the expected updates, including new tools for developers. Still, we were all waiting for that one headline-grabbing stage moment Google delivers every year. The company finally showcased its new AI, Duplex, with a live demonstration that left everyone at the Shoreline Amphitheatre awestruck. The AI system can talk on a phone call on a human's behalf, calling a local business to make reservations and appointments.
But that's not what amazed us most; it's the way Google Duplex communicates, with the ability to speak like a human. The next-level AI sounds remarkably human, using imperfections like 'umm' and 'ahh' and colloquialisms like 'I gotcha' instead of a typical computer voice. The voice was so human that it fooled the people on the other end into thinking they were talking to an actual person. We simply don't expect a computer to speak so casually, and that's what makes Google Duplex more dangerous than we think.
Image source: Google
Google had been working on Duplex for quite some time and finally demonstrated it on the Google I/O stage. The theater full of spectators was stunned when the Google AI booked a restaurant appointment without letting the person on the other end know that it wasn't a human talking. Taking artificial intelligence to this human-like level is quite a breakthrough for the company, but should we applaud it or worry about it? Technology acquiring human capabilities is something that connects the dots to the dystopia Mr. Musk keeps warning about.
Duplex calling a restaurant:
The Duplex technology will be integrated into Google Assistant to simplify tasks for users. “Another benefit for users is that Duplex enables delegated communication with service providers in an asynchronous way, e.g., requesting reservations during off-hours, or with limited connectivity. It can also help address accessibility and language barriers, e.g., allowing hearing-impaired users, or users who don’t speak the local language, to carry out tasks over the phone,” Google mentioned in its blog.
Duplex asking for holiday hours:
Google also explained how it achieved the natural sound that makes Duplex more human: a combination of concatenative text-to-speech (TTS) and a synthesis TTS engine, which lets the system respond appropriately to the circumstances.
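The idea of mixing the two TTS approaches can be sketched roughly as follows. This is a hypothetical illustration, not Google's actual implementation: the phrase inventory, function names, and the disfluency-injection step are all assumptions made for the example. Common phrases would be played from recorded human speech (concatenative), everything else would fall back to a synthesis engine, and filler sounds like "umm" would be mixed in to make the output feel less robotic.

```python
import random

# Hypothetical phrase inventory of prerecorded human speech units
# (an assumption for this sketch, not Google's actual data).
RECORDED_UNITS = {"hello", "thank you", "goodbye"}

def render(phrase: str) -> str:
    """Pick a rendering path: recorded audio if available, else synthesis."""
    if phrase.lower() in RECORDED_UNITS:
        return f"[concatenative] {phrase}"
    return f"[synthesized] {phrase}"

def speak(phrases, rng=None, disfluency_rate=0.3):
    """Render each phrase, occasionally injecting a filler sound first."""
    rng = rng or random.Random()
    out = []
    for phrase in phrases:
        if rng.random() < disfluency_rate:
            out.append("[concatenative] umm")  # filler from recorded speech
        out.append(render(phrase))
    return out

print(speak(["Hello", "I'd like to book a table for four"]))
```

In a real system the bracketed tags would be audio buffers, and the choice of engine would depend on prosody and context rather than a simple lookup, but the hybrid routing idea is the same.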
Example of a complex statement:
We are already struggling to distinguish fake news and doctored images. A human-like talking AI will make it harder to trust our ears too; doesn't that scare you? I understand that technology needs to evolve for the betterment of the world and to find new ways to make our lives more convenient. But what good is giving the ability to pose as a human to a technology that could one day think on its own? Maybe Google Duplex isn't as dangerous as we fear right now, but the advancement shows that it is possible for AI to become much more human-like.
Google said it is still working on Duplex, and the final product will be available in the summer. We have no idea how Duplex will work, or what it will be capable of, once Google finishes making it sound even more human.
This demonstration at Google I/O left us wondering: when we call someone in the future, how will we know whether a human or a bot is answering?
Akash Singh Chauhan is a senior writer at MobileAppDaily, where he covers the latest happenings and tweaks in mobile app technology. An engineering graduate, he is drawn to technology and tries to discover new trends in the tech world. Beyond tech news, he also never misses an episode of ‘Dragon Ball’.