Crast.net

Amazon’s Alexa takes a turn for the creepy with a new feature that mimics the voices of dead people

by Tracy Lopez
June 24, 2022
in News

Your scientists were so preoccupied with whether or not they could, they didn’t stop to think if they should

Amazon Alexa

Nowadays, virtual assistants like Google Assistant or Amazon Alexa are with us wherever we go, even though that can seem a little intimidating at times. It looks like Amazon is trying to take things to another level, as the company took to the stage this week to boast about Alexa’s new ability to mimic the voices of dead people.

During Amazon’s re:MARS conference in Las Vegas (via Android Authority), the company showed off a feature that allows the assistant to quickly learn someone’s voice from a recording of less than a minute and then emulate it. The intended use case is to let people hear the voices of deceased relatives once again. In a video shown during the keynote, a child asks Alexa if their grandmother can finish reading The Wizard of Oz, prompting Alexa to speak in the child’s grandmother’s voice.


Also read: Best Alexa games in 2022

Some people certainly find solace and comfort in hearing the voice of a deceased relative they dearly miss, especially through video recordings or other stored material they have at hand. That is not the same as an AI reconstruction of their voice based on a short recording, which is instead, at best, a bit disturbing and, depending on the person, could come across as emotionally manipulative or even harmful.

It doesn’t look like the feature will be making its way to Amazon’s voice assistant anytime soon, as the company didn’t give a specific release timeline. It’s possible that Amazon simply wanted to flex its AI muscles in front of Google and other competitors in the space. If that was the purpose, it’s undoubtedly one of the top “weird flex, but OK” moments we’ve seen to date.

Source

