Amazon Deploying New And Very Dangerous Facial Recognition Technology
by: Off The Grid News
TDC Note – What I don’t understand is why anyone would ever PAY for the privilege of having Amazon or, especially, Google spy on your conversations in your own home. These devices can now paint such a detailed picture of each person who enters a home that it should scare the bejeezus right out of you. But, no, buy more.
America’s favorite retailer, Amazon, has entered the surveillance business with new recognition technology. Many are now accusing Amazon of selling that recognition technology to the government (deep state) to monitor its citizens.
“The company has developed a powerful and dangerous new facial-recognition system and is actively helping governments deploy it,” the American Civil Liberties Union (ACLU) charged. The system, called Amazon Rekognition, uses artificial intelligence to identify individuals by their faces.
“Powered by artificial intelligence, Rekognition can identify, track, and analyze people in real time and recognize up to 100 people in a single image,” the ACLU charged.
Amazon Rekognition Technology Is Watching You
“Amazon Rekognition today announces three new features: detection and recognition of text in images, real-time face recognition across tens of millions of faces, and detection of up to 100 faces in challenging crowded photos,” an Amazon Web Services (AWS) press release states.
“Customers using Amazon Rekognition to detect objects and faces in images have been asking us to recognize text embedded in images,” the AWS press release states. That means Amazon Rekognition is able to read t-shirts with political messages, tattoos, or even the titles of books.
Among other things, Amazon Rekognition can identify faces of individuals in a crowd. That means the new recognition technology can pinpoint individuals participating in a political rally or attending a gun show.