Want to wade into the sandy surf of the abyss? Have a sneer percolating in your system but not enough time/energy to make a whole post about it? Go forth and be mid: Welcome to the Stubsack, your first port of call for learning fresh Awful you’ll near-instantly regret.

Any awful.systems sub may be subsneered in this subthread, techtakes or no.

If your sneer seems higher quality than you thought, feel free to cut’n’paste it into its own post — there’s no quota for posting and the bar really isn’t that high.

The post-Xitter web has spawned so many “esoteric” right wing freaks, but there’s no appropriate sneer-space for them. I’m talking redscare-ish, reality-challenged “culture critics” who write about everything but understand nothing. I’m talking about reply-guys who make the same 6 tweets about the same 3 subjects. They’re inescapable at this point, yet I don’t see them mocked (as much as they should be).

Like, there was one dude a while back who insisted that women couldn’t be surgeons because they didn’t believe in the moon or in stars? I think each and every one of these guys is uniquely fucked up and if I can’t escape them, I would love to sneer at them.

(Credit and/or blame to David Gerard for starting this.)

  • mirrorwitch@awful.systems · 11 hours ago

    @cityofangelle.bsky.social comments:

    HAHAAHHAHAAHHAAA

    Anthropic has posted two jobs, both paying $200K+.

    FOR WRITERS. (Looks like a policy/comms hybrid.)

    ANTHROPIC.

    IS WILLING TO PAY HALF A MILLION A YEAR.

    FOR WRITERS.

    Whatsamatter boys, can’t your plagiarism machine make a compelling case for you?

    LOL. LMAO, even.

  • scruiser@awful.systems · 18 hours ago

    Continuation of the lesswrong drama I posted about recently:

    https://www.lesswrong.com/posts/HbkNAyAoa4gCnuzwa/wei-dai-s-shortform?commentId=nMaWdu727wh8ukGms

Did you know that post authors can moderate their own comments sections? Someone disagreeing with you too much but getting upvoted? You can ban them from responding to your post (but not block them entirely???)! And, the cherry on top of this questionable moderation “feature”, guess why it was implemented? Eliezer Yudkowsky was mad about highly upvoted comments responding to his post that he felt didn’t get him or didn’t deserve that, so instead of asking moderators to step in on a case-by-case basis (or, acausal God forbid, considering whether the communication problem was on his end), he asked for a modification to the lesswrong forums to enable authors to ban people from their posts (and delete the offending replies!!!). It’s such a bizarre forum moderation choice, but I guess habryka knew who the real leader is and had it implemented.

    Eliezer himself is called to weigh in:

    It’s indeed the case that I haven’t been attracted back to LW by the moderation options that I hoped might accomplish that. Even dealing with Twitter feels better than dealing with LW comments, where people are putting more effort into more complicated misinterpretations and getting more visibly upvoted in a way that feels worse. The last time I wanted to post something that felt like it belonged on LW, I would have only done that if it’d had Twitter’s options for turning off commenting entirely.

    So yes, I suppose that people could go ahead and make this decision without me. I haven’t been using my moderation powers to delete the elaborate-misinterpretation comments because it does not feel like the system is set up to make that seem like a sympathetic decision to the audience, and does waste the effort of the people who perhaps imagine themselves to be dutiful commentators.

    Uh, considering his recent twitter post… this sure is something. Also: “it does not feel like the system is set up to make that seem like a sympathetic decision to the audience”. No shit, Sherlock, deleting a highly upvoted reply because it feels like too much effort to respond to is in fact going to make people unsympathetic (at the least).

  • froztbyte@awful.systems · 20 hours ago

    sorry my brain’s already clocked out for the day (/week?), so [insert sneer here]

    context is this

    Meanwhile, the $800 million Kraken raised across its two recent rounds of financing further solidifies the company’s balance sheet before its planned IPO next year… Prior to the two rounds, Kraken had raised only $27 million in venture capital.

    simply a mere 29.6x amplification. perfectly normal. 12000000% in line with the economic market shift between checks notes Less Than A Decade Ago and now

    • istewart@awful.systems · 3 hours ago

      Reeks to me of a mad dash for the all-important “exit” on the part of the VC firms who invested. Especially as public-market IPOs have become so rare compared to M&A deals or SPAC bullshit.

    • lagrangeinterpolator@awful.systems · 2 days ago

      In my experience most people just suck at learning new things, and vastly overestimate the depth of expertise. It doesn’t take that long to learn how to do a thing. I have never written a song (without AI assistance) in my life, but I am sure I could learn within a week. I don’t know how to draw, but I know I could become adequate for any specific task I am trying to achieve within a week. I have never made a 3D prototype in CAD and then used a 3D printer to print it, but I am sure I could learn within a few days.

      This reminds me of another tech bro many years ago who also thought that expertise is overrated, and things really aren’t that hard, you know? That belief eventually led him to make a public challenge that he could beat Magnus Carlsen in chess after a month of practice. The WSJ picked up on this and decided to sponsor an actual match between him and Carlsen. They wrote a fawning article about it, but it did little to stop his enormous public humiliation in the chess community. Here’s a reddit thread discussing that incident: https://www.reddit.com/r/HobbyDrama/comments/nb5b1k/chess_one_month_to_beat_magnus_how_an_obsessive/

      As a sidenote, I found it really funny that he thought his best strategy was literally to train a neural network and … memorize all the weights and run inference with mental calculations during the game. Of course, on the day of the match, the strategy was not successful because his algorithm “ran out of time calculating”. How are so many techbros not even good at tech? Come on, that’s the one thing you’re supposed to know!

      • CinnasVerses@awful.systems · 11 hours ago

        So pre-teen me reading the Biggles books with the gag about the pilot who tries to do ballistic calculations during a dogfight was saving me from being as stupid as a Californian?

      • Soyweiser@awful.systems · 1 day ago

        He will train a neural network on GM games, then memorize the algorithm and compute the moves in his head.

        The Rationalists.

      • rook@awful.systems · 1 day ago

        Lord grant me the confidence of a mediocre white man, etc.

        alt text

        A screenshot of a tweet by YouGov, a UK-based organisation, showing the results of a survey which says:

        One in eight men (12%) say they could win a point in a game of tennis against 23-time grand slam winner Serena Williams

        Included in the screenshot is a response by longwall26:

        Confident in my ability to properly tennis, I take the court. I smile at my opponent. Serena does not return the gesture. She’d be prettier if she did, I think. She serves. The ball passes cleanly through my skull, killing me instantly

      • fullsquare@awful.systems · 1 day ago

        This reminds me of another tech bro many years ago who also thought that expertise is overrated, and things really aren’t that hard, you know?

        lmao, what’s his lesswrong username?

  • o7___o7@awful.systems · 2 days ago

    Off topic: if the Culture novels are ever adapted for television, all of the Culture people/ships/etc should have Scottish accents.

  • blakestacey@awful.systems · 2 days ago

    Hey, remember Grokipedia?

    Its article on Newton’s law of gravity is, like, 50% rendering errors by weight.

    screenshot of Grokipedia's article "Newton's law of universal gravitation", showing garbled text and math formulas

    • Soyweiser@awful.systems · 1 day ago

      I saw (because for some reason grokipedia is now high in google search) that grok calls Roko (of Roko’s basilisk) pseudo-anonymous, but isn’t he just using his full name and face on twitter? Such a weird small error (and a change from the Wikipedia page). Didn’t click the link, obv.

    • BlueMonday1984@awful.systems (OP) · 2 days ago

      The Audio Mods are doing God’s work keeping the portal slop-free. It’s good to know there’s at least one place where human-made work is still valued.

    • mirrorwitch@awful.systems · 1 day ago

      What you say is something along these lines: “Oh wow, I have this amazing investment opportunity for someone like you. Nobody has seen it yet, but with your intelligence and business acumen, we will get rich quick…”

    • CinnasVerses@awful.systems · 2 days ago

      Show them the RationalWiki page where Scott Alexander promised that he could absorb only the smart racism from crazy bloggers and ignore the stupid stuff, Elizabeth Sandifer warned him this was like drinking sewer water with just one filter, and then Alexander posted about how all of a sudden he was feeling more conservative and maybe the things he was reading were connected to that.

      https://rationalwiki.org/wiki/Scott_Alexander (also archive.is and other backups)

    • zogwarg@awful.systems · 2 days ago

      Assuming they have any amount of good faith, I would make the illustration that using AI is like the Dunning-Kruger effect on steroids. It’s especially dangerous when you think you know enough, but don’t know enough to know that you don’t.

    • froztbyte@awful.systems · 2 days ago

      it sucks when you learn a thing like that about a person. it’s like blowing an efuse: generally one way, not easy to go back without a lot of work, and you may not want to bother

      • froztbyte@awful.systems · 2 days ago

        I’m sure specifics and timing don’t matter. can always tune that to be efficient later, amirite?!

        (although, once again, very much an “I am the scream” reaction at how rapidly this toxic waste is becoming rampant in the commons :|)