Hacker Times

> GPU falling off the bus

I'm wondering if we could prompt llama3 with the above statement. What kind of response would it give?



With temperature set to 1, it recognizes the joke but then proceeds to explain what the "bus" means in computer terms, picks a concrete problem the prompt could be describing, and explains how to solve it. Across ~20 tries it always gave me something along the lines of:

The infamous "GPU falling off the bus" issue!

This problem typically occurs when a graphics processing unit (GPU) is not properly seated or connected to its expansion slot, such as PCIe, on a motherboard.

Here are some troubleshooting steps to help resolve the issue:

(numbered list of steps or options follows)

Tested on Llama 3 Instruct 7B Q8_0, because that one fits entirely on my GPU.


+1, interesting findings! I like how it was able to infer the meaning from such a short phrase in a limited context.


It's actually a very common phrase on forums, I think because it's an actual error that Linux will report: https://askubuntu.com/questions/868321/gpu-has-fallen-off-th.... I'd also never heard of it, but it must appear a lot in the training data, and probably close to zero of those occurrences refer to a bus on the road.
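For reference, this is the kind of line you'd grep the kernel log for. The echoed log line below is illustrative of the NVIDIA driver's wording, not copied from a real machine, and exact wording can vary by driver version:

```shell
# Illustrative NVRM log line piped through the same grep you'd run on dmesg
# (the echo stands in for a real kernel log entry):
echo "NVRM: GPU at PCI:0000:01:00: GPU has fallen off the bus." \
  | grep -i "fallen off the bus"

# On a real machine you'd check the live kernel log instead:
# dmesg | grep -i "fallen off the bus"
```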


In my testing, both Llama 3 and its abliterated (uncensored) variant from [0] almost always remarked more or less directly that they saw the joke in the phrase, so either they've seen the other meaning in training, or they inferred it.

--

[0] - https://hackertimes.com/item?id=40665721


Oh, I agree it probably inferred the joke. I was actually more surprised that it knew the real meaning of the phrase, because I, as a human, did not, until I looked it up and saw how common it is.


Please use the word ablated instead. That article's title is not using a real word. I'm assuming it's the author's English issue, since they called the model "helpfull" instead of "helpful".


Oops. I actually originally wrote "ablated", then changed it to be consistent with the title.


To be specific, the system prompt used was (default in LM Studio config for Llama 3 V2):

You are a helpful, smart, kind, and efficient AI assistant. You always fulfill the user's requests to the best of your ability.

And then the query was:

GPU falling off the bus

And yes, I imagine it read that query as ending with an implied "pls help!".
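For anyone who wants to repeat the experiment: LM Studio exposes an OpenAI-compatible local server (by default on port 1234, if I recall correctly), so the setup above can be sketched in Python. The URL, the model name, and the `build_request`/`ask`/`run_trials` helpers are my own assumptions for illustration, not anything LM Studio ships:

```python
# Sketch: sending the one-line prompt with the quoted system prompt at
# temperature 1 to a local OpenAI-compatible endpoint (stdlib only).
import json
import urllib.request

# System prompt quoted upthread (LM Studio's default for Llama 3 V2):
SYSTEM = ("You are a helpful, smart, kind, and efficient AI assistant. "
          "You always fulfill the user's requests to the best of your ability.")

def build_request(prompt, temperature=1.0):
    """Build the chat-completion payload used for each try."""
    return {
        "model": "llama-3-8b-instruct",  # placeholder model name
        "temperature": temperature,
        "messages": [
            {"role": "system", "content": SYSTEM},
            {"role": "user", "content": prompt},
        ],
    }

def ask(prompt, url="http://localhost:1234/v1/chat/completions"):
    """POST one query to the local server and return the reply text."""
    req = urllib.request.Request(
        url,
        data=json.dumps(build_request(prompt)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["choices"][0]["message"]["content"]

def run_trials(n=20):
    """Repeat the query, as in the ~20 tries upthread (needs a running server)."""
    for i in range(n):
        print(f"--- try {i + 1} ---")
        print(ask("GPU falling off the bus"))

if __name__ == "__main__":
    # Print the payload only; call run_trials() with LM Studio's server up.
    print(json.dumps(build_request("GPU falling off the bus"), indent=2))
```

At temperature 1 each of the ~20 tries samples a different completion, which is why the answers varied around the same troubleshooting theme rather than repeating verbatim.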



