Feyd

joined 2 years ago
[–] Feyd@programming.dev 0 points 1 week ago (1 children)

I'd be way more concerned about other things in that interview than nitpicking the language when Firefox *checks notes* listened to feedback.

[–] Feyd@programming.dev 0 points 1 week ago (3 children)
  1. You have to long-press a link or open it from the control menu to get a link preview.
  2. The link preview itself has a direct link to the settings page where you can disable it.
  3. It's true that the link preview content shown before you activate the AI summary is not AI. It's just the Open Graph content, same as any chat app or whatever.
  4. If you like the Open Graph content but don't want the AI summary, you can hit the chevron to fold the summary panel down to window-decoration size, and it stays folded for all future link previews, even through reboots.
  5. The AI model doesn't download until first use.

I don't like that link previews are how they spent their time, but the misinformation and the subsequent overreaction to it are insane. I only know this stuff, or that the feature exists at all, because I figured surely it wasn't as egregious as people were saying, so I checked it out, and boy was I right.

I'd be surprised if users who don't talk about Firefox on the internet even know link previews exist.

[–] Feyd@programming.dev 0 points 1 week ago (12 children)

In Firefox they are enabled by default in the sense that you can reach them via the UI unless you flip some settings, but they don't do anything unless you use them; the models aren't even downloaded until first use. So yes, I would prefer Firefox not have them (except local translation, which runs a local model instead of shipping your text off to a remote service and is a useful feature, better than the alternative ways to do it), but I wouldn't go as far as calling it hypocrisy.

[–] Feyd@programming.dev -3 points 1 week ago (1 children)
  1. Floating point math is deterministic.
  2. Systems don't have to be programmed with race conditions. That is not a fundamental aspect of an LLM, but a design decision.
  3. Systems don't have to be programmed to tie-break with random methods. That is not a fundamental aspect of an LLM, but a design decision.

This is not hard stuff to understand if you understand computing.
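Points 2 and 3 can be shown with a toy greedy decoder (hypothetical code, not from any real inference engine): deterministic tie-breaking is the natural default, and randomness has to be added on purpose.

```python
import random

def greedy_pick(logits):
    """Deterministic tie-break: max() returns the first index with the
    highest score, so equal logits always resolve the same way."""
    return max(range(len(logits)), key=lambda i: logits[i])

def random_tiebreak_pick(logits, rng):
    """Nondeterministic by design: ties are broken with an RNG."""
    best = max(logits)
    ties = [i for i, v in enumerate(logits) if v == best]
    return rng.choice(ties)

logits = [0.1, 0.9, 0.9]              # tokens 1 and 2 tie
print(greedy_pick(logits))            # always 1, every run
print(random_tiebreak_pick(logits, random.Random()))  # 1 or 2, varies
```

Nothing about the model forces the second function's behavior; it's a choice made by whoever writes the sampler.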

[–] Feyd@programming.dev 0 points 2 weeks ago

Haibane is one of my favorites. I wish it would get put on a streaming service so I could get people to check it out more easily.

[–] Feyd@programming.dev -3 points 2 weeks ago* (last edited 2 weeks ago) (3 children)

You can, and that will cause the same output on the same input as long as there is no variation in floating point rounding. (That holds if exactly the same code runs, but optimizations can easily change which way a value rounds, and if two tokens' probabilities are very close the output will diverge.)

There are more sources of randomness, such as race conditions and intentionally nondeterministic tie-breaking when tokens have the same probability, apparently.
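The rounding point is easy to demonstrate: floating point math is deterministic, but it is not associative, so if an optimization regroups a sum the result can land on the other side of a rounding boundary (toy numbers, not taken from any real model):

```python
a, b, c = 1e16, -1e16, 1.0

left = (a + b) + c    # the big terms cancel first, then c survives
right = a + (b + c)   # c is absorbed into b's rounding and vanishes

print(left, right)    # 1.0 0.0: same inputs, different grouping
```

Either grouping gives the same answer every time you run it (that's the determinism), but the two groupings disagree, which is exactly the kind of tiny difference that flips the winner when two tokens' probabilities are nearly equal.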

I actually think LLMs are ill-suited for the vast majority of things people are currently using them for, and there are obvious ethical problems with data centers bringing new fossil fuel power sources online, but the technology is interesting in and of itself.

[–] Feyd@programming.dev -3 points 2 weeks ago* (last edited 2 weeks ago)

I said I was wrong, in my initial response to you, in that my statement was overly broad and not applicable to the systems most people are using. I then clarified that nondeterminism is not an intrinsic characteristic of the technology at large, just something the most-used implementations have.

You apparently think conversations are battles with winners and losers, so the fact that you were right that the biggest systems are nondeterministic for reasons beyond temperature configuration means, to you, that it doesn't matter why, doesn't matter that those factors don't have to apply to every inference system, and doesn't matter that you have no idea what determinism means.

In any case, talking to you seems like a waste of time, so enjoy your sad victory lap while I block you so I don't make the mistake of assuming you're an earnest interlocutor in the future.

[–] Feyd@programming.dev -3 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

> Deterministic systems are always predictable, even if you never ran the system. Can you determine the output of an LLM with zero temperature without ever having ran it?

You don't have to understand a deterministic system for it to be deterministic. You are making that up.

> And even disregarding the above, no, they are still NOT deterministic systems

I conceded that setting temperature to 0 on an arbitrary system (including all the remote ones most people are using) does not make it deterministic, after reading about other factors that influence inference in those systems. That does not mean there are no deterministic implementations of LLM inference, and repeating yourself with NO additional information and using CAPS does NOT make you more CORRECT lol.

[–] Feyd@programming.dev -1 points 2 weeks ago (4 children)

> You also have to run the model with the input to determine what the output will be, no way to determine it BEFORE running. With a deterministic system, if you know the code you can predict the output with 100% accuracy without ever running it.

This is not the definition of determinism. You are adding qualifications.

I did look it up, and I see now there are other factors that aren't under your control if you're using a remote system, so I'll amend my statement: you can have deterministic inference systems, but the big ones most people use cannot be configured by the user to be deterministic.

[–] Feyd@programming.dev -1 points 2 weeks ago (11 children)

You can actually set it up to give the same outputs for the same inputs (temperature = 0). The variability is on purpose.
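Roughly what temperature = 0 means, as a toy sketch (hypothetical code, not any real engine's sampler): sampling divides the logits by the temperature before the softmax, and temperature 0 is treated as a plain argmax, so the same input always yields the same token.

```python
import math
import random

def sample_token(logits, temperature, rng):
    """Temperature 0: greedy argmax. Otherwise: softmax sampling."""
    if temperature == 0:
        return max(range(len(logits)), key=lambda i: logits[i])
    scaled = [x / temperature for x in logits]
    m = max(scaled)                           # subtract max for stability
    weights = [math.exp(x - m) for x in scaled]
    return rng.choices(range(len(logits)), weights=weights)[0]

logits = [2.0, 1.0, 0.5]
rng = random.Random()
picks = {sample_token(logits, 0, rng) for _ in range(100)}
print(picks)   # {0}: at temperature 0, always the same token
```

At any temperature above 0 the RNG can pick a lower-probability token, which is the intentional variability.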

[–] Feyd@programming.dev 1 points 2 weeks ago

It isn't just that Democrats are winning seats; in many cases it's young progressive Democrats.

[–] Feyd@programming.dev 14 points 2 weeks ago (2 children)

I like fighting games, and I also think Mortal Kombat should be a different genre.
