• Thaurin@lemmy.world · 6 points · 16 days ago

      A supercomputer running Windows HPC Server 2008 actually ranked 23 in TOP500 in June 2008.

      • tate@lemmy.sdf.org · 2 points · edited · 16 days ago

        I always forget that Windows Server even exists, because the name is so stupid. “windows” should mean “gui interface to os.”

        edit: fixed redundancy.

          • KubeRoot@discuss.tchncs.de · 6 points · 16 days ago

            I’d say having a GUI is not inherently stupid. The stupid part is, if I understand it correctly, the GUI being a required component and the primary access method.

          • dan@upvote.au · 2 points · 16 days ago

            The GUI is optional these days, and there’s plenty of Windows servers that don’t use it. The recommended administration approach these days is PowerShell remoting, often over SSH now that Windows has a native SSH server bundled (based on OpenSSH).
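            For example, a minimal Python sketch of driving PowerShell over SSH from an admin workstation. The host name `winbox` and the remote command are hypothetical, and it assumes the bundled OpenSSH server is enabled on the Windows box with PowerShell on the remote PATH:

```python
import subprocess

def remote_pwsh(host: str, command: str) -> list[str]:
    """Build an ssh invocation that runs a PowerShell command on a
    headless Windows Server machine via its bundled OpenSSH server."""
    # -NoProfile keeps startup fast and reproducible for automation.
    return ["ssh", host, "powershell", "-NoProfile", "-Command", command]

def run_remote(host: str, command: str) -> str:
    """Execute the command remotely and return its stdout."""
    result = subprocess.run(remote_pwsh(host, command),
                            capture_output=True, text=True, check=True)
    return result.stdout

# Example (not executed here): query the OS caption on a host named "winbox".
# run_remote("winbox", "(Get-CimInstance Win32_OperatingSystem).Caption")
```

            The same command list works unchanged whether the remote shell is `powershell` or the newer `pwsh`, since only the executable name differs.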

            • yogurtwrong@lemmy.world · 1 point · edited · 15 days ago

              That gives me the idea of Windows Server installed on bare metal, configured as a lightweight game runner (much like a minimal Linux distro with a lightweight WM).

              I’ve seen people using slightly modified Windows Server as an unbloated gaming OS, but I’m not sure if running a custom minimal GUI on Windows Server is possible. You seem knowledgeable on the subject: is it possible with enough effort?

  • grue@lemmy.world · 92 points · edited · 17 days ago

    So basically, everybody switched from expensive UNIX™ to cheap “unix”-in-all-but-trademark-certification once it became feasible, and otherwise nothing has changed in 30 years.

    • Allero@lemmy.today · 39 points · edited · 17 days ago

      Except this time the Unix-like took 100% of the market.

      It was clear this thing is just better.

          • Grimpen@lemmy.ca · 28 points · 17 days ago

            I think it was the PS3 that shipped with “Other OS” functionality, and was sold a little cheaper than production costs would indicate, to make it up on games.

            Only thing is, a bunch of institutions discovered you could order a pallet of PS3s, set up Linux, and have a pretty skookum cluster for cheap.

            I’m pretty sure Sony dropped “Other OS” not because of vague concerns of piracy, but because they were effectively subsidizing supercomputers.

            Don’t know if any of those PS3 clusters made it onto Top500.

      • whaleross@lemmy.world · 8 points · edited · 17 days ago

        Apple built its current desktop environment for its proprietary ecosystem on BSD with their own twist, while supercomputers are typically multiuser parallel computing beasts, so I’d say it is really fucking surprising. Pretty and responsive desktop environments and breathtaking number crunchers are polar opposites as products. Fuck me, you’ll find UNIX roots in Windows NT, but my flabbers would be ghasted if Deep Blue had dropped a Blue Screen.

    • Spezi@feddit.org · 61 points · 16 days ago

      Those were the basic entry level configurations needed to run Windows Vista with Aero effects.

      • Psythik@lemmy.world · 5 up / 13 down · 16 days ago

        Meh, you just needed a discrete GPU, and not even a good one. Just a basic, bare-bones card with 128MB of VRAM and Pixel Shader 2.0 support would have sufficed, but sadly most users didn’t even have that back in ’06-’08.

        It was mostly the consumers’ fault for buying cheap garbage laptops with trash-tier iGPUs in them, and the manufacturers’ for slapping a “compatible with Vista” sticker on them and pushing those shitboxes on consumers. If you had a half-decent $700-800 PC then, Vista ran like a dream.

        • porl@lemmy.world · 17 points · 16 days ago

          No, it was mostly the manufacturers’ fault for implying that their machines would run the operating system they shipped with well. Well, that and Microsoft’s fault for strong-arming them into pushing Vista on machines that weren’t going to run it well.

          • Psythik@lemmy.world · 1 point · 15 days ago

            APUs obviously weren’t a thing yet, and it was common knowledge back then that contemporary iGPUs were complete and utter trash. I mean they were so weak that you couldn’t even play HD video or enable some of XP’s very basic graphical effects with most integrated graphics.

            Everyone knew that you needed a dedicated graphics card back then, so you can and should in fact put some blame on the consumer for being dumb enough to buy a PC without one, regardless of what the sticker said. I mean I was a teenager back then and even still I knew better. The blame goes both ways.

            • porl@lemmy.world · 3 up / 1 down · 15 days ago

              No, if you weren’t “involved in the scene” and only had the word of the person at the store, then you had no idea what an iGPU was, let alone why it wasn’t up to the task of running the very thing it was sold with.

              You were a teenager at a time when teenagers’ average tech knowledge was much higher than before. That is not the same as someone who had just learnt they now need one of those computer things for work. Not everyone had someone near them who could explain it to them. Blaming them for not knowing the intricacies of the machines is ridiculous. It was pure greed by Microsoft and the manufacturers.

        • olympicyes@lemmy.world · 2 points · 16 days ago

          Most computers sold are the lowest end models. At work we never got anything decent so it was always a bit of a struggle. Our office stayed with XP for way longer than we should have so we skipped Vista altogether and adopted Windows 7 a few years late.

  • Z3k3@lemmy.world · 33 up / 1 down · 17 days ago

    As someone who worked on designing racks in the supercomputer space about 10 years ago, I had no clue Windows and Mac even tried to enter the space.

  • ipkpjersi@lemmy.ml · 23 points · 16 days ago

    Wow, that’s kind of a lot more Linux than I was expecting, but it also makes sense. Pretty cool tbh.

    • superkret@feddit.orgOP · 26 points · 17 days ago

      I think you can actually see it in the graph.
      The Condor Cluster with its 500 Teraflops would have been in the Top 500 supercomputers from 2009 till ~2014.
      The PS3 operating system is a BSD, and you can see a thin yellow line in that exact time frame.

    • A7thStone@lemmy.world · 4 points · 17 days ago

      Yes, in the Linux stat. The OtherOS option on the early PS3 allowed you to boot Linux, which is what most, if not all, of the clusters used.

      • BallsandBayonets@lemmings.world · 3 points · 17 days ago

        How can there be N/A though? How can any functional computer not have an operating system? Or does just reading off the really big MHz number of the CPU count as being a supercomputer?

        • superkret@feddit.orgOP · 5 points · 16 days ago

          Early computers didn’t have operating systems.
          You just plugged in a punch card or tape with the program you wanted to run, and the computer executed those exact instructions and nothing else.
          Those programs were written specifically for that exact hardware (not even for that model, but for that machine).
          To boot up the computer, you had to put a number of switches into the correct positions (0 or 1) to bring its registers into the correct state to accept programs.

          So you were the BIOS and bootloader, and there was no need for an OS because the userspace programs told the CPU directly what bits to flip.
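          A toy Python sketch of that workflow; the machine, its opcodes, and the switch layout are all invented for illustration, not modeled on any real front-panel computer:

```python
# Toy front-panel machine: no OS and no bootloader. The operator seeds the
# machine state by hand, then the raw program drives the "CPU" directly.

def toggle_switches(bits: str) -> int:
    """The operator sets a row of 0/1 switches; this stands in for the BIOS."""
    return int(bits, 2)

def run(program, start_address):
    """Load a 'punch card' program at start_address and execute it directly."""
    memory = {}
    acc = 0                      # accumulator register
    pc = start_address           # program counter, seeded from the switches
    for addr, instruction in enumerate(program, start=start_address):
        memory[addr] = instruction
    while True:
        op, arg = memory[pc]
        pc += 1
        if op == "LOAD":         # load a literal into the accumulator
            acc = arg
        elif op == "ADD":        # add a literal to the accumulator
            acc += arg
        elif op == "STORE":      # write the accumulator to a memory address
            memory[arg] = acc
        elif op == "HALT":
            return memory
        # ...and nothing else: the program is the entire software stack.

# Operator flips switches to set the start address...
start = toggle_switches("0000001000")   # binary 1000 = address 8
# ...then feeds in a program written for this exact machine.
mem = run([("LOAD", 2), ("ADD", 3), ("STORE", 100), ("HALT", 0)], start)
print(mem[100])  # 5
```

          The point of the sketch: nothing mediates between the program and the machine, so there is simply no layer for an OS name to attach to.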

        • sep@lemmy.world · 2 points · 16 days ago

          They of course had one, probably Linux or Unix. But that information about the cluster is not available.

  • Mwa@lemm.ee · 16 up / 1 down · 16 days ago

    Maybe Windows is not used in supercomputers often because Unix and Linux are more flexible for the CPUs they use (POWER9, SPARC, etc.).

  • Sanctus@lemmy.world · 8 points · 17 days ago

    We’re gonna take the test, and we’re gonna keep taking it until we get one hundred percent in the bitch!

    • superkret@feddit.orgOP · 39 points · 16 days ago

      Unix is basically a brand name.
      BSD had to be completely re-written to remove all Unix code, so it could be published under a free license.
      It isn’t Unix certified.

      So it is Unix-derived, but not currently a Unix system (which is a completely meaningless term anyway).

          • dan@upvote.au · 5 points · 16 days ago

            > Microsoft could technically get Windows certified as UNIX.

            I don’t think they could now that the POSIX subsystem and Windows Services for UNIX are both gone. Don’t you need at least some level of POSIX compliance (at least the parts where POSIX and Unix standards overlap) to get Unix certified?

        • dustyData@lemmy.world · 2 points · 16 days ago

          It means nothing; it’s just a check you sign, and then you get to say “I certify my OS is Unix.” The slightly more technical part is POSIX compliance, but modern OSs are such massive and complex beasts today that those compliance layers are tiny parts, very slowly but very surely becoming irrelevant over time.

          Apple got OS X Unix certified because it was cheap and it got them off the hook in a lawsuit. That’s it.

    • dev_null@lemmy.ml · 6 up / 1 down · 16 days ago

      To make it more specific, I guess: what’s the problem with that? It’s like having a “people living on boats” category and a “people with no long-term address” category. You could include the former in the latter, but then you’d just be conveying less information.