this post was submitted on 03 Dec 2023
3 points (100.0% liked)

hmmm

8041 readers

For things that are "hmmm".

Rule 1: All post titles except for meta posts should be just plain "hmmm" and nothing else, no emotes, no capitalisation, no extending it to "hmmmm" etc.

founded 2 years ago
 
top 8 comments
[–] TropicalDingdong@lemmy.world 0 points 2 years ago (1 children)

I can't get it to reproduce.

[–] DoucheBagMcSwag@lemmy.dbzer0.com 0 points 2 years ago (1 children)

That's because it's fake. "inspect element" HTML memes are hilarious /s

[–] squeakycat@lemmy.ml 0 points 2 years ago (1 children)

How is it fake if it's seemingly hosted on openai's website? Is this an actual history log or some sort of paste service?

[–] crazyCat@sh.itjust.works 0 points 2 years ago (1 children)

“Seemingly” is the key here. Right-click on something on the web, change the text in the inspector panel, and take a screenshot. Et voilà, you've got whatever you want to portray.

[–] squeakycat@lemmy.ml 1 points 2 years ago

Whoops. For some reason I thought you were referring to the link from another commenter which goes directly to openai's site. Tbh that one is much crazier.

[–] Fallenwout@lemmy.world 0 points 2 years ago (1 children)
[–] glorious_albus@lemmy.world 0 points 2 years ago (1 children)

Holy fucking shit. Anyone have explanations for this?

[–] Seasm0ke@lemmy.world 1 points 2 years ago* (last edited 2 years ago)

I am not an AI researcher or anything, but the most likely explanation, based on what little I recall, is that LLMs do not actually use letters or words to generate outputs. They use tokens that each represent a word or number, and they iterate over those tokens to build up the output. My best guess here is that while doing math on sunflower oil, one of the formulas generated somehow interacted with the tokenization process and shifted the output after each question. Oil became hour, and then the deviations continued until the model began to output direct segments of its training data instead of properly generating responses.

Again this is absolutely speculation on my part. I don't have much of a direct understanding of the tech involved
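To make the token-shift idea above concrete, here's a toy sketch. This is nothing like OpenAI's actual tokenizer (real models use BPE over large learned vocabularies, and the vocabulary and the off-by-one shift below are entirely made up for illustration), but it shows how a model that only ever sees integer token IDs could turn "oil" into "hour" if something nudged those IDs:

```python
# Toy illustration: LLMs operate on integer token IDs, not letters.
# A tiny hypothetical vocabulary where "oil" and "hour" are adjacent IDs.
toy_vocab = ["sunflower", "oil", "hour", "the", "math"]
to_id = {word: i for i, word in enumerate(toy_vocab)}

def encode(words):
    """Map words to token IDs, the way a tokenizer would."""
    return [to_id[w] for w in words]

def decode(ids):
    """Map token IDs back to words."""
    return [toy_vocab[i] for i in ids]

ids = encode(["sunflower", "oil"])
shifted = [i + 1 for i in ids]  # a hypothetical one-token shift

print(decode(ids))      # ['sunflower', 'oil']
print(decode(shifted))  # ['oil', 'hour']  -- "oil" becomes "hour"
```

The point is just that a small perturbation at the token-ID level swaps whole words rather than individual letters, which is roughly the flavor of glitch being speculated about.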