
I don't think extinction is a risk; that is an exaggeration. But I do think some of the warnings are plausible. At worst, it could mean civilizational collapse. The most likely negative outcome is having to deal with disruption in every area of life: government, economics, and so on.

The milestone to worry about is when AI gains proper agency, which is both the most useful and the most dangerous aspect of AI. We want intelligent machines to make decisions and work on their own toward high-level goals with little supervision. That is, of course, a double-edged sword, and where costly mistakes can be made. While I don't think we're there yet, we are certainly closing in on it.
