If a human is involved in anything whatsoever, some bias will be transferred, including your own; you have biases just as everyone else in the world does.
How do you "transfer bias" into facial recognition in particular?
DASKAA says: How do you "transfer bias" into facial recognition in particular?
Humans have to program and write the software, and there is no such thing as an unbiased human.
Think of it this way: "bias" is how each of us sees the world, and none of us sees it exactly the same way. Much as we would like to, we are not going to be able to get rid of it. We can perhaps mitigate and minimize it, but not eliminate it.
Perhaps at some point in the future, when robots/AI many generations removed from their creators are able to construct their own successors, bias may be eliminated. Maybe. But then there would still be the possibility of bias against humans.
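To make the point above concrete, here is a toy sketch (not a real face-recognition system, and all the numbers and group names are made up for illustration) of how bias gets "transferred" even though the model itself is just math: the humans decide what training data to collect, and an under-represented group ends up with a worse error rate.

```python
# Toy illustration: bias enters a "math-driven" model through the data
# humans choose to collect. We fit a 1-D threshold classifier for
# "is a face" on samples that over-represent group A, then measure
# error rates separately per group. Groups and numbers are hypothetical.
import random

random.seed(0)

def sample(group, n):
    # Hypothetical feature: group A faces cluster near 1.0, group B
    # faces near 0.6; non-faces cluster near 0.0 for both groups.
    center = 1.0 if group == "A" else 0.6
    faces = [(random.gauss(center, 0.15), 1) for _ in range(n)]
    nonfaces = [(random.gauss(0.0, 0.15), 0) for _ in range(n)]
    return faces + nonfaces

# Biased data collection: 100 face/non-face pairs from group A,
# only 5 from group B.
train = sample("A", 100) + sample("B", 5)

def error(th, data):
    # Fraction of samples misclassified by threshold th.
    return sum((x >= th) != bool(y) for x, y in data) / len(data)

# "Training": pick the candidate threshold minimizing training error.
threshold = min((x for x, _ in train), key=lambda th: error(th, train))

# Evaluate on balanced test sets: the model inherits the collection bias,
# so the under-represented group B sees a higher error rate.
err_a = error(threshold, sample("A", 500))
err_b = error(threshold, sample("B", 500))
print(f"error on group A: {err_a:.1%}, on group B: {err_b:.1%}")
```

The "algorithm" here is a perfectly neutral optimization; the skew comes entirely from the humans who decided which samples to gather.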
Humans have to program and write the software, and there is no such thing as an unbiased human.
2 - it says the truth, but it hits a topic where, for social reasons, we don't want to stick to the exact "truth".
Heraclitusstudent says: 2 - it says the truth, but it hits a topic where, for social reasons, we don't want to stick to the exact "truth".
Reality is biased in that case.
In the first place, what's wrong with it?
There are 2 types of biases:
200 cognitive biases rule our everyday thinking
She was mocked for this claim on the grounds that "algorithms" are "driven by math" and thus can't be biased—but she's basically right. Let's take a look at why.
First, some notes on terminology—and in particular a clarification for why I keep putting scare quotes around the word "algorithm." As anyone who has ever taken an introductory programming class knows, algorithms are at the heart of computer programming and computer science. (No, those two are not the same, but I won't go into that today.) In popular discourse, however, the word is widely misused.
Let's start with the Merriam-Webster definition of "algorithm":
[a] procedure for solving a mathematical problem (as of finding the greatest common divisor) in a finite number of steps that frequently involves repetition of an operation; broadly: a step-by-step procedure for solving a problem or accomplishing some end
It is a step-by-step procedure. The word has been used in that sense for a long time, and extends back well before computers. Merriam-Webster says it goes back to 1926, though the Oxford English Dictionary gives this quote from 1811:
It [sc. the calculus of variations] wants a new algorithm, a compendious method by which the theorems may be established without ambiguity and circumlocution.
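The definition's own example, finding the greatest common divisor, shows what an algorithm in this classical sense looks like; here is Euclid's procedure, a sketch of the standard method:

```python
# Euclid's algorithm: a step-by-step procedure that repeats one
# operation (take the remainder) and terminates in a finite number
# of steps for any positive integers.
def gcd(a: int, b: int) -> int:
    while b != 0:
        a, b = b, a % b  # repeat until the remainder is zero
    return a

print(gcd(1071, 462))  # → 21
```

Nothing about such a procedure is mysterious: every step is fixed in advance, and the same inputs always give the same output.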
Today, though, the word "algorithm" often means the mysterious step-by-step procedures used by the biggest tech companies (especially Google, Facebook, and Twitter) to select what we see. These companies do indeed use algorithms—but ones having very special properties. It's these properties that make such algorithms useful, including for facial recognition. But they are also at the root of the controversy over "bias."
More: https://arstechnica.com/tech-policy/2019/01/yes-algorithms-can-be-biased-heres-why/
#Algorithms #Bias #SciTech