The Weird Way Amazon Wants To Reunite Family Members
Amazon wants to reunite family members in a really unconventional way with the help of a new feature on its Alexa devices.
Amazon’s recently revealed upcoming Alexa update has some questioning whether it crosses a moral line. Amazon recently hosted an event showcasing the company’s advancements in artificial intelligence and machine learning, and that is where the newest Alexa feature was demonstrated. The feature gives Alexa the ability to mimic people’s voices, including those of individuals who have passed on, and in doing so aims to, in a sense, reunite people with their deceased loved ones.
Amazon’s Senior Vice President and Head Scientist for Alexa, Rohit Prasad, demonstrated the company’s vision for how the new feature would work. At the event, Prasad showed an example of Alexa reading a story to a young child in the voice of the child’s grandmother, who had passed away. Prasad emphasized that the feature is one way Amazon is aiming to make AI more valuable to humans: while hearing the voice of a deceased loved one can never eliminate the pain of death, it can preserve memories and, perhaps, promote healing. “While AI can’t eliminate that pain of loss, it can definitely make the memories last,” Prasad said at the event.
According to Prasad, early testing suggests that the new feature will allow Alexa to learn and mimic a voice after hearing as little as one minute of audio. The demonstration was certainly impressive, albeit slightly morally questionable. As of now, Amazon has not released any details about when the feature will officially roll out to the public.
Moreover, Engadget noted that while Amazon’s new Alexa feature is certainly impressive, it also comes with plenty of potential pitfalls. Technology like this could easily be exploited for malicious purposes, such as fraud. Many banking systems use voice recognition to verify client identities, and a convincing voice clone could be used to illegally gain access to someone’s financial assets. It is scary to think how much easier Amazon may be making it for nefarious folks to commit crimes. Putting tech like that into the hands of the public should not be taken lightly. Then again, the fact that no specific release date has been announced is a good sign that Amazon recognizes the potential risks of unleashing a beast like this.
That being said, circling back to morality, it would also be wise for Amazon to conduct extensive testing on the possible psychological effects this type of technology could have on grieving individuals. Amazon is traversing an uncharted sector of technology, one that comes with plenty of uncertainty, and those uncertainties should not be brushed aside. Regardless, it should be incredibly interesting to see how this project evolves in the months and years ahead.