this post was submitted on 18 Oct 2024
1 points (100.0% liked)

Programming Horror


Welcome to Programming Horror!

This is a place to share strange or terrible code you come across.

For more general memes about programming there's also Programmer Humor.


top 5 comments
[–] orca@orcas.enjoying.yachts 0 points 2 years ago (1 children)

From my experience with ChatGPT:

  1. It will NEVER consistently give you only the value in the response. Sooner or later it adds introductory text, as if it's talking to a human. No matter how many times I tried to get just the bare answer back, it never did so reliably.
  2. ChatGPT is terrible with numbers. It can’t count, do math, none of that. So asking it to do byte math is asking for a world of hurt.

If this isn’t joke code, that is scary.
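For what it's worth, the byte math being outsourced to ChatGPT is exactly what `sizeof` gives you deterministically at compile time. A minimal sketch in C (the `record` struct and `bytes_needed` helper are made up for illustration, since the original snippet isn't shown in the thread):

```c
#include <stdlib.h>

/* Hypothetical record type standing in for whatever the original
   code was allocating. */
struct record {
    int id;
    char name[32];
};

/* sizeof does the byte math at compile time -- no LLM round-trip,
   no parsing a chatty response, no guessing. */
size_t bytes_needed(size_t count)
{
    return count * sizeof(struct record);
}
```

The compiler already knows the exact size of every type it compiles, so there is never a reason to ask an external service for it.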

[–] dev_null@lemmy.ml 0 points 2 years ago (1 children)

I refuse to believe you are not certain this is a joke

[–] orca@orcas.enjoying.yachts 0 points 2 years ago

I know it is, but I’ve also seen people try to use ChatGPT for similar things as a serious endeavor.

[–] zaphod@sopuli.xyz 0 points 2 years ago (1 children)

You don't need to cast the return value from malloc.
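Right, in C (unlike C++) `malloc` returns `void *`, which converts implicitly to any object pointer type. A minimal sketch, with a made-up `dup_bytes` helper for illustration:

```c
#include <stdlib.h>
#include <string.h>

/* Hypothetical helper: duplicate a buffer of n bytes.
   Note there is no (unsigned char *) cast on the malloc call --
   the void * result converts implicitly in C. */
unsigned char *dup_bytes(const unsigned char *src, size_t n)
{
    unsigned char *copy = malloc(n);
    if (copy != NULL)
        memcpy(copy, src, n);
    return copy;   /* caller must free() */
}
```

Leaving the cast off also means that if you forget to include `stdlib.h`, older compilers warn about the implicit declaration instead of silently hiding it behind the cast.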

[–] Subverb@lemmy.world 0 points 2 years ago* (last edited 2 years ago)

This isn't malloc though. I have to assume the cast is because the user has experience with the output from an LLM being untrustworthy.