


Even GPT3 has better humor.


The LoC count has nothing to do with the amount of processing required for training -- if anything, they're probably inversely related.

The title (correctly) describes a small LoC count, which makes no assertions about the other things you mentioned.


The article DOES NOT describe GPT3; the title is misleading, and the author is lying.


Anger… fear… aggression. The dark side are they. Easily they flow, quick to join you in a fight. If once you start down the dark path, forever will it dominate your destiny, consume you it will, as it did Obi-Wan’s apprentice. -- Master Yoda, Return of the Jedi


What's the lie? The article title is "GPT in 60 lines of NumPy".

I'd agree with you if it said "GPT3 in 60 lines of NumPy" - but it doesn't say that.



That's a bit overly harsh and misses the point of the article.


Terrible joke and COMPLETELY missing the point.


I would rather use the PyTorch API than this bogus code. https://pytorch.org/docs/stable/generated/torch.nn.Transform...
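For context, the linked PyTorch API gets you a working transformer encoder in a few lines. This is only a hedged sketch of that usage, not code from the article; the `d_model`/`nhead` values are illustrative:

```python
# Minimal sketch of the torch.nn Transformer API (assumes a recent
# torch release with batch_first support); hyperparameters are arbitrary.
import torch
import torch.nn as nn

layer = nn.TransformerEncoderLayer(d_model=64, nhead=4, batch_first=True)
encoder = nn.TransformerEncoder(layer, num_layers=2)

x = torch.rand(8, 10, 64)   # (batch, sequence, features)
out = encoder(x)            # output has the same shape as the input
```

Of course, this gives you the layer, not an understanding of what's inside it -- which is arguably what the article is for.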


Is this because you don't know how to implement the weight updates in NumPy yourself?
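To be fair, the weight update itself is only a few lines of NumPy. A minimal sketch of plain SGD on a single linear layer (this is my own toy example, not anything from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
W = rng.normal(size=(3, 1))                # weights of one linear layer
x = rng.normal(size=(16, 3))               # a batch of inputs
y = x @ np.array([[2.0], [-1.0], [0.5]])   # targets from a known linear map

lr = 0.1
for _ in range(200):
    pred = x @ W
    grad = x.T @ (pred - y) / len(x)       # gradient of the squared error
    W -= lr * grad                         # the SGD weight update

# W should end up close to [2, -1, 0.5]
```

Backprop through a full transformer is obviously more bookkeeping than this, but the update rule at the end is the same.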


Also, for the curious mind, here is a more authentic tutorial on building GPT (the precursor to GPT3) by Andrej Karpathy: https://www.youtube.com/watch?v=kCc8FmEb1nY My point is, if you want to spend time, spend it on authentic material, not some bogus material.



