Qwen 3.5 can be run via Ollama
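For anyone curious, it's typically a couple of commands. This is a sketch assuming Ollama is already installed and the model is published under a tag like `qwen3.5` — that tag is an assumption here, so check the Ollama model library for the exact name:

```shell
# Hypothetical model tag — look up the exact name in the Ollama library first
ollama pull qwen3.5

# Start an interactive chat with the model
ollama run qwen3.5
```

Ollama handles downloading the weights and serving the model locally, so no data leaves your machine.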
WolfLink
Qwen 3.5 is one of the best open-weight (self-hostable) models right now. It’s not as good as some of the much larger proprietary models, like the bigger Claude models.
These days we have websites for exactly that purpose.
A lot of the major furniture in my apartment came from people getting rid of stuff that I found via free-and-for-sale pages.
Sure, but to get the communication started you would start with facts you’d agree on, like the positions of stars or basic chemistry.
The model we currently have for the universe goes well beyond anything we could learn with our natural senses and the way we intuitively think about the world because of those senses.
It’s true that we keep refining our models and it’s very possible that an alien would have slightly different models, but at the end of the day, we are trying to describe the same universe and those models are going to overlap a lot because of that.
First of all, there has been a lot of research into the minimal set of assumptions you need to reproduce what we consider “basic math”, and also into what happens if you tweak those assumptions.
Second of all, the main goal for science and the type of math we use for science is to effectively model the world we live in.
Any aliens that live in the same universe are subject to the same physics, and any civilization advanced enough to detect our messages will know some basic universal facts about the world, and those facts are what we hope to use as the basis for starting communication.
Just make it “dee” like how people pronounce “D20” and it can keep “dice” as its plural.
Signal already has that setting. It’s up to the user to decide their level of convenience vs security.

NileRed is basically an actually skilled scientist trying to do alchemy with the vibes of a high school student.
Data security in that case had nothing to do with the LLM.
That’s kinda my point.
“I don’t trust companies to keep their promises” is a very different argument from:
“LLMs are inherently bad at data security and there is no way these companies can, in good faith, promise HIPAA compliance.”
It is certainly possible to implement a secure LLM service.
You can run Linux on the ARM MacBooks.