  • Wikipedia also just upped their standards in another area - they’ve updated their speedy deletion policy, enabling admins to bypass standard Wikipedia bureaucracy and swiftly nuke AI slop articles that meet either of two conditions:

    • “Communication intended for the user”, referring to sentences directly aimed at the promptfondler using the LLM (e.g. “Here is your Wikipedia article on…”, “Up to my last training update…”, and “as a large language model”)

    • Blatantly incorrect citations (examples given are external links to papers/books which don’t exist, and links which lead to something completely unrelated)

    Ilyas Lebleu, who contributed to the policy update, has described this as a “band-aid” that leaves Wikipedia in a better position than before, but not a perfect one. Personally, I expect this solution will be sufficient to permanently stop the influx of AI slop articles. Between promptfondlers’ utter inability to recognise low-quality/incorrect citations, and their severe laziness and lack of care for their """work""", the risk of an AI slop article being subtle enough to dodge speedy deletion is virtually zero (a rough sketch of what screening for the first condition could look like is below).
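    A minimal, purely illustrative sketch of what screening for the first condition might look like - this is not Wikipedia’s actual tooling (speedy deletions are carried out by human admins), and the phrase list is just the examples quoted above, nothing official:

    ```python
    # Purely illustrative sketch - Wikipedia's speedy deletions are performed by
    # human admins, not by a script. The phrase list is assumed from the examples
    # given in the policy, and is neither official nor exhaustive.

    TELLTALE_PHRASES = [
        "here is your wikipedia article on",
        "up to my last training update",
        "as a large language model",
    ]


    def flags_llm_phrasing(article_text: str) -> bool:
        """Return True if the text contains phrasing aimed at the prompting user."""
        lowered = article_text.lower()
        return any(phrase in lowered for phrase in TELLTALE_PHRASES)


    # Example: flags_llm_phrasing("As a large language model, I cannot ...") returns True
    ```

    The second condition (fabricated citations) can’t be caught this mechanically; checking whether a cited paper or book actually exists still takes a human, or at least a lookup against an external index.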




  • I’m sure you can think of hypothetical use cases for Google Glass and Meta AI RayBans. But these alleged non-creepshot use cases already failed to keep Google Glass alive. I predict they won’t be enough to keep Meta AI RayBans alive.

    It’s not an intended use case, but it’s an easy way to identify people I should keep far, far away from.

    It turns out normal people really do not like this stuff, and I doubt the public image of tech bros has improved between 2014 and 2025. So have fun out there with your public pariah glasses. If all you get is strong words, count yourself lucky.

    On a wider note, I wouldn’t be shocked if we heard of Rapist RayBan wearers getting beaten up or shot in the street - if the torching of Waymos in anti-ICE protests, the widespread vandalism of Cybertrucks, and Luigi Mangione’s status as a folk hero are anything to go by, I’d say the conditions are right for cases of outright violence against anyone viewed as supporting the techbros.

    EDIT: Un-fucked the finishing sentence, and added a nod to Cybertrucks getting fucked up.





  • I don’t need that; in fact, it would be vastly superior to just “steal” from one particularly good implementation that has a compatible license you can simply comply with. (And better yet, avoid copying the code at all and find a library if possible.) Why in the fuck even do the copyright laundering on code that is under an MIT or similar license? The authors literally tell you that you can just use it.

    I’d say it’s a combo of them feeling entitled to plagiarise people’s work and fundamentally not respecting the work of others (a point OpenAI’s Studio Ghibli abomination machine demonstrated at humanity’s expense).

    On a wider front, I expect this AI bubble’s gonna cripple the popularity of FOSS licenses - the expectation of properly credited work was a major aspect of the current FOSS ecosystem, and that expectation has been kneecapped by the automated plagiarism machines. Programmers are likely gonna be much stingier about sharing their work because of it.
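    As an aside on the MIT point in the quoted comment: compliance really is as trivial as it sounds. A made-up, minimal example of reusing MIT-licensed code legitimately (the project and author names are placeholders, not a real project) - copy the code, keep the copyright and permission notice:

    ```python
    # Hypothetical example - "some_project" and "Jane Doe" are placeholders, not a
    # real project or person. Vendored from some_project under the MIT License;
    # compliance boils down to keeping the copyright and permission notice:
    #
    #   Copyright (c) 2023 Jane Doe
    #   Permission is hereby granted, free of charge, to any person obtaining a
    #   copy of this software ... (full notice kept verbatim alongside the code)


    def clamp(value: float, low: float, high: float) -> float:
        """Clamp value into the inclusive range [low, high] (vendored helper)."""
        return max(low, min(high, value))
    ```

    That notice requirement is exactly the “properly credited work” expectation the comment above is talking about.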