Voice cloning technologies can generate a near-perfect copy of someone's voice from just a short audio clip. The technology has the potential to help people, for example those who have lost the ability to speak, by offering them a powerful and valuable means of communication. But in the wrong hands, voice cloning can do harm.
Take, for example, the family emergency scam, where an impostor pretends to be a distressed relative. A scammer could clone a voice that sounds just like your loved one. Scammers could also clone the voice of a CEO or other company executive, then trick employees into transferring large sums of money.