Because their proximity in space and time to where it originated put them preferentially at the top of the pyramid scheme.
BioMan
Not really part of the back and forth, but I find this illuminating regarding their recent travails, on it not being a step too far to prevent them from posting:
"This isn’t super relevant since it’s not like the standards are super high but ever since the enormous onslaught of LLM psychosis posters, the default of people who try to post to LW is to get rejected from posting here"
Sounds like the mods have had to deal with a lot of unbalanced people lately, and are not having it.
- Devs, following a completely different set of people with delusions of grandeur
2a) Fiasco, by Stanislaw Lem, just so I can see a priest and a supercomputer talking about blowing up a moon
2b) There Is No Antimemetics Division, for the brain-breaking horrors
-
Sunshine, because of its wildly different parts and how they clash and how it plays with light
-
Dune, to further round out the wildly different interpretations
The link to the guide for setting up a retrofitted box truck, stocked with local copies of the Internet Archive, in which to continue AI alignment research after civilization collapses in 2025 is fun
Uh oh
https://www.rollingstone.com/culture/culture-features/gwen-danielson-zizians-interview-1235552043/
Going on a media tour now
“So how does the Epstein drive work? Very well."
― James S.A. Corey, Leviathan Wakes
Is the crappy dragon fursona related to Peter Thiel being an anagram for "the reptile"?
I'm a huge fan of Greg Egan's fiction and a huge fan of him pissing off the rats. He's been explicitly needling them and making fun of them in his fiction for over a decade, and calmly contradicting them for over two decades, ever since he noticed weirdos among his fans.
Friend of Ziz and cofounder of the 'rationalist fleet' pops up out of the woodwork trying to clear Ziz's name
I find myself noticing things rather detached from the typical Ziz funny business more strongly than I notice the stuff about that whole situation.
"I'm Gwen Danielson, a neuroscientist and bioengineer, who decided as a child that I would end Death (and bring people back if I could) and that I would become a dragon and help generally facilitate a fantastical transhumanist future."
"I dream of non-Euclidean geometries, of countless worlds visible and accessible in the daytime sky, of competent infrastructure, of soul forges continually working to bring back the dead... I dream of reaching through warps in the spacetime fabric to save the dying across time"
"Signed, the dragon of creation Creatrei (cree-AH-trey) also known as Gwen Danielson or as Char and Astria (when referring to my hemis as distinct individuals)"
The reactions are fun. "This post is not actually doing a good job of making me trust you and think this conversation is safe to have[1], and I notice that as I am saying this that I am afraid that this will now somehow result in someone trying to murder me in my sleep"
I will never forget the time I calculated the energy output on one of the torpedo engines of The Expanse and realized it was higher than the total wattage of all human civilization in 2020
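A back-of-the-envelope sketch of that calculation. None of these numbers come from the books; the torpedo mass, burn acceleration, and exhaust velocity are my own ballpark assumptions about how the Epstein drive is portrayed, and ~18 TW is a rough figure for civilization's total primary power use in 2020:

```python
C = 299_792_458.0          # speed of light, m/s

mass = 4_000.0             # kg -- assumed torpedo mass
accel = 100 * 9.81         # m/s^2 -- assumed ~100 g burn
v_exhaust = 0.05 * C       # m/s -- assumed mildly relativistic exhaust

thrust = mass * accel                   # F = m * a
drive_power = 0.5 * thrust * v_exhaust  # jet power: P = F * v_e / 2

world_power_2020 = 18e12   # W -- rough total power use of human civilization

print(f"drive: {drive_power:.2e} W, civilization: {world_power_2020:.2e} W")
print(drive_power > world_power_2020)
```

With these guesses the drive comes out around 3e13 W, comfortably above the whole planet's power draw; swap in your own assumptions and the conclusion is fairly robust.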
I have a vague hypothesis that I am utterly unprepared to make rigorous that the more of what you take into your mind is the result of another human mind, rather than the result of a nonhuman process operating on its own terms, the more likely you are to have mental issues.
On the low end this would include the documented protective effect of natural environments against psychotic episodes compared to urban environments (where EVERYTHING was put there by someone's idea). But computers... they are amplifiers of things put out by human minds, with very short feedback loops. Everything is ultimately in one way or another defined by a person who put it there, even if it is then allowed to act according to the rules you laid down.
And then an LLM is the ultimate distillation of the short feedback loop, feeding whatever you shovel into it straight back at you. Even just mathematically: the whole 'transformer' architecture is a way of taking the imputed semantic meanings of tokens early in the stream and jiggling them around to 'transform' that information into the later tokens of the stream. No new information really enters it; it just rearranges what you put in and feeds it back to you in a different form.
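The "rearranging" point can be seen in a toy single-head self-attention step: every output row is a softmax-weighted (convex) combination of the projected input rows, so nothing appears in the output that wasn't already in the input. The matrices here are random stand-ins, not a trained model:

```python
import numpy as np

def attention_weights(X, Wq, Wk):
    """Softmax attention weights; each row sums to 1."""
    Q, K = X @ Wq, X @ Wk
    scores = Q @ K.T / np.sqrt(K.shape[-1])
    e = np.exp(scores - scores.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

rng = np.random.default_rng(0)
X = rng.normal(size=(5, 8))                       # 5 tokens, 8-dim embeddings
Wq, Wk, Wv = (rng.normal(size=(8, 8)) for _ in range(3))

W = attention_weights(X, Wq, Wk)
out = W @ (X @ Wv)   # outputs are reweighted mixtures of the projected inputs

print(np.allclose(W.sum(axis=-1), 1.0))           # every row is a convex mix
```

Stacking layers and adding nonlinear projections makes the recombination much richer, but the basic move stays the same: mix the token representations you were given.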
EDIT: I also sometimes wonder if this has a mechanistic relation to mode collapse when you train one generative model on output from another, even though nervous systems and ML systems learn in fundamentally different ways (with ML resembling evolution much more than it resembles learning)
The existence of a currency that everyone can transact in is fundamentally a public good. It must be maintained with the public good in mind, rather than the good of whoever happens to have lots of it or whoever has the ability to personally influence it.