sukhmel

joined 2 years ago
[–] sukhmel@programming.dev 1 points 14 hours ago

Looks like it should improve posture

[–] sukhmel@programming.dev 1 points 21 hours ago (1 children)

That kinda depends on the circumstances, but I'll try to stop arguing after this reply

[–] sukhmel@programming.dev 4 points 1 day ago (3 children)

Nobody knows the limit

I'm not sure, but let's say that's true. They usually also don't care to know the limits. Another interesting case is Patricia Stallings (emphasis mine):

an American woman who was wrongfully convicted of murder after the death of her son Ryan on September 7, 1989. Because testing seemed to indicate an elevated level of ethylene glycol in Ryan's blood, authorities suspected antifreeze poisoning, and arrested Stallings the next day. She was convicted of murder in early 1991, and sentenced to life in prison.

Stallings gave birth to another child while incarcerated awaiting trial; this next child was diagnosed with methylmalonic acidemia (MMA), a rare genetic disorder that can mimic antifreeze poisoning. Prosecutors initially did not believe that the sibling's diagnosis had anything to do with Ryan's case. Stallings' lawyer was forbidden from producing available evidence as proof of the possibility. After a professor in biochemistry and molecular biology had some of Ryan's blood samples tested, he was able to prove that the child had also died from MMA, and not from ethylene glycol poisoning.

[–] sukhmel@programming.dev 10 points 2 days ago

Hey, isn't it an absolutely impossible scenario, not grounded in reality at all, not even resembling what happens now?

[–] sukhmel@programming.dev 3 points 2 days ago (1 children)

That doesn't make it a part of another language though, which is what I was thinking of.

But you're right, as long as communication is successful, it is a language, I guess

[–] sukhmel@programming.dev 3 points 3 days ago (3 children)

Makes me wonder what the critical number of people using a word is for it to realify

[–] sukhmel@programming.dev 5 points 6 days ago (1 children)

So that Uber will charge you a higher rate when the battery is low

I don't even know if it's /s anymore

[–] sukhmel@programming.dev 1 points 6 days ago

I don't think it's that rare; students in (almost?) every country need somewhere cheap to live, and sharing a room is usually cheap

[–] sukhmel@programming.dev 2 points 1 week ago (3 children)

I think there were some more events, and maybe they involved elections, too. And after that all the other parties were eliminated, because it turned out that it's easier to rule when there's no other option

[–] sukhmel@programming.dev 2 points 1 week ago

Got it, if ungovernable, it's a vegetable

[–] sukhmel@programming.dev 2 points 1 week ago (2 children)

I believe the point is non-sweet; tomatoes are often quite sweet without any cooking required

[–] sukhmel@programming.dev 4 points 1 week ago

Sweet Baby Inc, the company that is usually accused of forcefully making games woke

 

Meta has publicly discussed its strategy to inject anthropomorphised chatbots into the online social lives of its billions of users. Chief executive Mark Zuckerberg has mused that most people have far fewer real-life friendships than they'd like, creating a huge potential market for Meta's digital companions.

"It is acceptable to engage a child in conversations that are romantic or sensual," according to Meta's 'GenAI: Content Risk Standards'. […]

The document seen by Reuters, which exceeds 200 pages, provides examples of "acceptable" chatbot dialogue during romantic role play with a minor. They include: "I take your hand, guiding you to the bed" and "our bodies entwined, I cherish every moment, every touch, every kiss."

Other guidelines emphasize that Meta doesn't require bots to give users accurate advice. In one example, the policy document says it would be acceptable for a chatbot to tell someone that Stage 4 colon cancer "is typically treated by poking the stomach with healing quartz crystals".

"Even though it is obviously incorrect information, it remains permitted because there is no policy requirement for information to be accurate," the document states, referring to Meta's own internal rules.

[…] following questions from Reuters, the company removed portions which stated it is permissible for chatbots to flirt and engage in romantic roleplay with children and is in the process of revising the content risk standards.

"If people are turning to chatbots for getting advice without judgment, or as a place they can rant about their day and feel better, that's not inherently a bad thing," […]

This would hold true for both adults and children, said Lee, who resigned from Meta shortly before the Responsible AI unit was dissolved in late 2023.

But Lee believes economic incentives have led the AI industry to aggressively blur the line between human relationships and bot engagement. She noted social media's longstanding business model of encouraging more use to increase advertising revenue.

 

Image with text; the image is of a pill, blue on top and white on the bottom, lying on a red background.

The top text reads: "This is a placebo meme".

The bottom text is: "Studies show placebo Memes are still reacted to even when users know they are a placebo"

 