

I checked the link, the thread elaborates a bit more:
to which my understanding is: this is absolutely true as well.
Only Bayes Can Judge Me
Hey, as long as we can learn from this, then we are doing our best to keep this bar nazi-free.
I think it’s accurate that the US and England didn’t join to stop the Holocaust. Sentence 3 is a little oversimplified, and sentence 4 is straight-up lunacy.
Yes, exactly. The same capitalist/colonialist/imperialist forces that exploit the global south also want AI to automate everyone into abject poverty. We can do better and show class solidarity by not perpetuating these racist stereotypes.
Let’s drop the “indian” part of this and come up with something else, please.
Don’t look up who the president is, it might come as a shock /s
Making a startup named Sphinctr
You’re welcome!
Pre-watch: With the prior knowledge that her main job before this was wrestling promoter, the ol’ overtonussy is preeeetty loose
Post-watch: lol
New animation (1:20) by theforestjar about AI art.
How much of this is the AI bubble collapsing vs. Ohiophobia
slophistry
JFC I click on the rocket alignment link, it’s a yud dialogue between “alfonso” and “beth”. I am not dexy’ed up enough to read this shit.
Spooks as a service
Utterly rancid linkedin post:
Why can planes “fly” but AI cannot “think”?
An airplane does not flap its wings. And an autopilot is not the same as a pilot. Still, everybody is ok with saying that a plane “flies” and an autopilot “pilots” a plane.
This is the difference between the same system and a system that performs the same function.
When it comes to flight, we focus on function, not mechanism. A plane achieves the same outcome as birds (staying airborne) through entirely different means, yet we comfortably use the word “fly” for both.
With Generative AI, something strange happens. We insist that only biological brains can “think” or “understand” language. In contrast to planes, we focus on the system, not the function. When AI strings together words (which it does, among other things), we try to create new terms to avoid admitting similarity of function.
When we use a verb to describe an AI function that resembles human cognition, we are immediately accused of “anthropomorphizing.” In some way, popular opinion dictates that no system other than the human brain can think.
I wonder: why?
lmao fuck off
It’s an anti-fun version of listening to Dark Side of the Moon while watching The Wizard of Oz.
You didn’t link to the study; you linked to the PR release for the study. This and this are the papers linked in the blog post.
Note that the papers haven’t been published anywhere other than on Anthropic’s online journal. Also, what the papers are doing is essentially tea leaf reading. They take a look at the swill of tokens, point at some clusters, and say, “there’s a dog!” or “that’s a bird!” or “bitcoin is going up this year!”. It’s all rubbish, dawg
This needed a TW jfc (jk, uh, sorta)
some parrots are more stochastic than others