
Yes, when I’m preparing my prestigious project with my Ivy League peers, I make sure not to sanitize or vet my inputs, so as not to bias the results of my programs or my publication success. Truly it is remarkable that the training set we chose to use is racist, but that no single person who could have stopped this, or be held responsible, exists.


If your data set was compiled by a single person, well, easy enough.

If your data set was compiled by thousands of Mechanical Turk workers, well, you have a lot of people to blame a little bit each. As the saying goes, "everybody is a little racist sometimes..." and apparently that shows up in a big enough data set.


Amazing how, when even more people are involved in selecting and tagging the input, it becomes even less likely that any one person is to blame.



