
"How could a computer possibly know you sound like a Debbie Downer? Amazon said it spent years training its tone AI..."


"... by having people categorize voice recordings. The company held internal trials and says it tried to address any biases that might arise from varying ethnicity, gender, or age. In our experience, the Halo could detect ups and downs in our voice, but seemed to misinterpret situations regularly. And some of the feedback feels, ironically, a bit tone-deaf — especially when judging a woman’s voice. Our sample size of two isn’t sufficient to conclude whether Amazon’s AI has gender bias. But when we both analyzed our weeks of tone data, some patterns emerged. The three most-used terms to describe each of us were the same: 'focused,' 'interested' and 'knowledgeable.' The terms diverged when we filtered just for ones with negative connotations. In declining order of frequency, the Halo described Geoffrey’s tone as 'sad,' 'opinionated,' 'stern' and 'hesitant.' Heather, on the other hand, got 'dismissive,' 'stubborn,' 'stern' and 'condescending.'... The very existence of a tone-policing AI that makes judgment calls in those terms feels sexist. Amazon has created an automated system that essentially says, 'Hey, sweetie, why don’t you smile more?'"


I think it would be cool to have a wristband that informed me what my tone was... and even cooler to have a conversation with a trusted companion while we both had these tone-police wristbands on and could see each other's display. But wait... why do we need wristbands? Why can't I have this AI in my iPhone and monitor the tone of anybody I happen to be talking to, and why shouldn't I assume that anyone listening to me can be generating this information? Is this alarming? 

If it's alarming, is it because we're going to be off-loading our human judgment that makes us so brilliantly sensitive to the infinite tones of the human voice? Is it because a machine will seem objective, so that you won't just wonder whether someone is being condescending to you, you'll have a scientific/"scientific" verification of condescension or whatever? Is it because you'll figure out how to train your voice to get words you like to appear on the screen and you won't quite be you anymore? Is it because you won't know the extent to which others have trained their voice to disguise their real intentions and the value of our gift for the understanding of speech will crash? 

Oh, by the way, I'm an Amazon Associate, so when — if — you buy a Halo wristband, I'd appreciate it if you'd use this link.


Source: https://welcometoamerican.blogspot.com/2020/12/how-could-computer-possibly-know-you.html
