_g_be

joined 2 years ago
[–] _g_be@lemmy.world 4 points 2 days ago (2 children)

This one is fun to say.

Calamari exists in English too, I wonder where it was borrowed from

[–] _g_be@lemmy.world 3 points 3 days ago

You'll have to speak up, I can't hear you over these rotund tits

[–] _g_be@lemmy.world 3 points 3 days ago

My fav pokemon species

[–] _g_be@lemmy.world 17 points 4 days ago (2 children)

Damn autists, taking everything literally

[–] _g_be@lemmy.world 3 points 4 days ago

Both can be true.

The data can say whatever you want if you torture it enough

[–] _g_be@lemmy.world 3 points 5 days ago

No I think it's because they can be arranged in an ∞ figure

[–] _g_be@lemmy.world 15 points 1 week ago (1 children)

First I can't own the libs, now I can't own a gun. The commies have taken everything from me

[–] _g_be@lemmy.world 8 points 1 week ago

It's the argument style, and the "it's not X, it's Y" conclusion. Very common in LLM speak. Doesn't mean it is, though

[–] _g_be@lemmy.world 4 points 1 week ago

"You got games on your phone" but it's carrots in your fridge

[–] _g_be@lemmy.world 3 points 1 week ago

If you can't afford bones for everyone, then you can't bring bones to class

[–] _g_be@lemmy.world 6 points 2 weeks ago (7 children)

ᕕ(گرج )ᕗ

Sorry if insensitive lol

[–] _g_be@lemmy.world 3 points 2 weeks ago

I think the heart of the OP's sentiment is wanting to see all of the true results for the question, whereas an LLM can only provide imagined possibilities. There's nothing inherently more accurate about AI output than human imagination, since it's trained on human output in the first place.
