
> Having spent a lot of time with GPT-4 recently, it's possible the new generation will not have the same issues we did. For example they can write some code and say, "adjust for best practice".

Not iteratively forever. The only feedback it will (eventually) get is its own, because humans will be following the AI's best practices, and what it deems best practice is almost certainly going to be too opaque for us mere mortals.



Training data is key; a lot of people will be employed creating, curating, and labeling it. They will use AI to help them, but as a partner. If AI reaches the point of creating and using its own training data, you can hang up your hat and call it a day, because no human input or work will be needed at that point.


Yeah, kind of an interesting journey. We went from people writing content for people, to bots writing content for people, to people curating content for bots (ultimately for people). The only permutation left is bots curating/writing content for bots, with people consuming only bot-generated content. In a way we are partially there, since some of the content in these models was written by bots.


Nah, it'll still be human-readable. That was the whole point of C and Java: let the computers do the hard stuff in bytecode or asm. Compiler development will get even more complex, though.
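The split described here — readable source on top, machine-oriented bytecode underneath — is easy to see with Python's standard `dis` module (a Python sketch, though the comment names C and Java; the JVM's `javap -c` shows the same thing):

```python
import dis

# A perfectly human-readable function; the compiler, not the
# human, is responsible for the bytecode underneath it.
def add(a, b):
    return a + b

# dis.Bytecode exposes the opcodes the CPython compiler
# produced from the readable source above.
ops = [ins.opname for ins in dis.Bytecode(add)]
print(ops)
```

The exact opcode names vary between Python versions, but the point stands: nobody hand-writes or reviews this layer — the human-readable layer is the interface.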



