> then you can filter by those characteristics without the automated process being "based on the physical characteristics of an individual's face" -- you'd just be filtering on a DB column like any other.
A case agent looks at the still photo from security footage and concludes the person is male and white. They put that information into the mug shot database, and it returns the subset of all mugshots that are male and white. They have just used a semi-automated system to assist in ascertaining the identity of the person in the security footage based on the physical characteristics of their face. This is forbidden by the ordinance.
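To make that step concrete, here is a minimal sketch of the query in question, assuming a SQLite table; the table and column names (mugshots, sex, race) are hypothetical:

    # A sketch of "filtering on a DB column": the agent's visual judgment
    # becomes an ordinary WHERE clause over stored attribute columns.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE mugshots (id INTEGER, name TEXT, sex TEXT, race TEXT)")
    conn.executemany(
        "INSERT INTO mugshots VALUES (?, ?, ?, ?)",
        [(1, "Alice", "female", "white"), (2, "Bob", "male", "white")],
    )

    # The subset returned to the agent: every record matching the description.
    rows = conn.execute(
        "SELECT id, name FROM mugshots WHERE sex = ? AND race = ?",
        ("male", "white"),
    ).fetchall()
    print(rows)  # [(2, 'Bob')]

No face geometry is computed anywhere here; the narrowing happens purely on stored attribute columns, which is why the quoted comment treats it as ordinary filtering. The reply's point is that the agent's eyeballing of the photo is still an identification step based on facial characteristics.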
>what if your system is much more accurate guessing ages for some racial groups than others, or more accurate in distinguishing between white and latinx than between latinx and asian? Suddenly your "filtering" process can also have a bias problem, where some people are more likely than others to be included in the results for a query which shouldn't actually include them.
What if it isn't? It doesn't matter. It's a tool for attempting to match two photos. If the tool is less effective because of the way light interacts with different skin tones, that just limits the tool's leverage; it doesn't make the tool wrong, and it isn't an infringement of anyone's rights. We use DNA matching even though it can't distinguish twins, and sometimes it comes back saying a sample is a close match for Bob, which leads investigators to suspect Bob's brother's family, e.g. the Golden State Killer case. I fail to see why a tool has to be perfect to be useful.