Yeah... Somehow I doubt that, if there was an issue, it was intentional. I mean, an optical tracking system having difficulty tracking darker colours?
Who would have thought it... (And since it has an IR projection pattern, that would probably make it worse.)
But the Kinect technology is rife with potential discriminatory issues; it contains a lot of image-recognition features that are related to biometrics.
Considering it can recognise and load your user profile, it means it can determine who you are based on how you look.
But, from the rumours I've heard, it can also recognise people without a profile and classify them according to gender...
And if you think failing to track dark-skinned individuals is racism, what about when it misclassifies your gender?
Anyone care to think about the reaction you'd have if the system decided you were a girl?
(Or male, if you happen to be female.)
Whenever you intentionally program a system to classify people into groups, you're far closer to discrimination issues than when your system merely fails to recognise certain groups because of technical limitations.
One is an unintended consequence of how the technology works; the other is explicitly programming the device to discriminate, and it will probably piss off anyone whose appearance is at all ambiguous.
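To see why hard classification is the riskier design, here's a purely hypothetical sketch (nothing to do with Kinect's actual algorithm, and the feature score is an invented stand-in for whatever biometric measurements such a system might use). A hard binary classifier has no way to say "unsure", so everyone near the decision boundary still gets forced into one of two bins:

```python
# Hypothetical sketch -- NOT Kinect's real algorithm. "feature_score"
# is an invented placeholder for some biometric measurement in [0, 1].

def classify_gender(feature_score, threshold=0.5):
    """Toy hard classifier: one threshold, no way to abstain."""
    return "female" if feature_score < threshold else "male"

# Two nearly identical measurements land on opposite sides of the
# threshold and get opposite labels -- the "ambiguous person" problem.
print(classify_gender(0.49))  # female
print(classify_gender(0.51))  # male

def classify_with_abstain(feature_score, low=0.4, high=0.6):
    """Softer design: decline to guess inside the ambiguous band."""
    if feature_score < low:
        return "female"
    if feature_score > high:
        return "male"
    return "unknown"

print(classify_with_abstain(0.51))  # unknown
```

The point of the second function is just that an abstain option is a design choice; a system that always emits a binary label is choosing to guess about exactly the people it's most likely to offend.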
Still, how many features of Kinect can we list that have the potential to be offensive to some groups of people?