Privacy cannot be overridden by the need for safety, without probable cause.
Later, Child Endangerment. Law Enforcement (oh, you know, those helpful Child Protective Services types) will stop by and take away your child.
rocketjoe79 says: Privacy cannot be overridden by the need for safety, without probable cause.
Isn't there something in the 4th Amendment about that kind of thing?
Will it find the pictures of Tim Cook in butt chaps?
Just saw this on the news. So totalitarian. Time to get freedom phone I guess.
I don't do anything illegal
WookieMan says: I don't do anything illegal
What if they make it illegal to leave your house unless you submit to the jab?
Too many high stakes districts where they'd lose in a landslide in 2022 and 2024.
Only if they fix the voting system. At this point I question the legitimacy of ALL politicians.
No practical way to enforce.
I'd agree we're in weird times, but it will never be legal to do so.
Laws don't matter anymore, like elections. We don't have rule of law anymore, nor democracy.
Grabs the photos and covertly sends them to Tim Cook and other pedos. After doing this for a few months, it turns the perp over to the police so they can say they're helping society.
Conclusion
Perceptual hashes are messy. The simple fact that image data is reduced to a small number of bits leads to collisions and therefore false positives. When such algorithms are used to detect criminal activities, especially at Apple scale, many innocent people can potentially face serious problems.
My company’s customers are slightly inconvenienced by the failures of perceptual hashes (we have a UI in place that lets them make manual corrections). But when it comes to CSAM detection and its failure potential, that’s a whole different ball game. Needless to say, I’m quite worried about this.
https://archive.is/ys5Q5
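The collision problem the quoted author describes is easy to demonstrate. Below is a minimal sketch of an average-hash (aHash) style perceptual hash — one of the simplest such algorithms, not Apple's NeuralHash — assuming images have already been downscaled to an 8x8 grayscale grid. Because each pixel is reduced to a single bit (above or below the mean), visibly different images can produce identical hashes:

```python
def average_hash(pixels):
    """Reduce an 8x8 grayscale grid to a 64-bit hash:
    each bit is 1 if the pixel is above the grid's mean, else 0."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

# Two different grids that collide: the hash only records which
# pixels are above the mean, not by how much.
img_a = [[10] * 8] * 4 + [[200] * 8] * 4   # high-contrast: dark top, bright bottom
img_b = [[90] * 8] * 4 + [[120] * 8] * 4   # low-contrast version of the same pattern
print(average_hash(img_a) == average_hash(img_b))  # True — a false-positive match
```

The same lossiness that makes these hashes robust to resizing and compression is exactly what produces false positives: entirely unrelated images can land on the same bit pattern, which is the failure mode the author worries about at Apple's scale.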