> I don't understand how Google's algorithm can be misled into finding sexiness in those.
I'm reminded of a paper whose authors generated images of static that fooled neural network image classifiers into confidently identifying them as various objects: https://arxiv.org/abs/1412.1897
Wired summary: https://www.wired.com/2015/01/simple-pictures-state-art-ai-s...
> Computer vision and human vision are nothing alike. And yet, since it increasingly relies on neural networks that teach themselves to see, we’re not sure precisely how computer vision differs from our own. As Jeff Clune, one of the researchers who conducted the study, puts it, when it comes to AI, “we can get the results without knowing how we’re getting those results.”
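To make the "confidently" part concrete, here's a rough sketch of the general idea. It's not the paper's exact method (they mostly used evolutionary algorithms), just plain gradient ascent on random noise against a pretrained torchvision classifier, with the target class index picked arbitrarily:

```python
# Sketch: start from pure static and nudge the pixels so a pretrained
# classifier becomes highly confident the image is some chosen class,
# even though it still looks like noise to a human.
import torch
import torchvision.models as models

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()

target_class = 954  # arbitrary ImageNet class index ("banana")
img = torch.rand(1, 3, 224, 224, requires_grad=True)  # random static
optimizer = torch.optim.Adam([img], lr=0.05)

for step in range(200):
    optimizer.zero_grad()
    logits = model(img)
    # Maximize the target class's logit; the image stays noise-like,
    # but the network's confidence in "banana" climbs anyway.
    loss = -logits[0, target_class]
    loss.backward()
    optimizer.step()
    img.data.clamp_(0, 1)  # keep pixel values in a valid range

confidence = torch.softmax(model(img), dim=1)[0, target_class].item()
print(f"classifier confidence in target class: {confidence:.1%}")
```

The point is that the optimization only cares about what moves the network's output, not about what a person would recognize, which is why the result can look like static and still score near 100% confidence.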