• originalfrozenbanana@lemm.ee · ↑70 ↓1 · 16 days ago

      Don’t worry, this post was written by a first-year computer science student who just learned about C. No need to look too closely at it.

    • Mac@mander.xyz · ↑6 · 16 days ago

      I bet an LLM could have written this meme without making that mistake.

      Embarrassing.

      • krimson@lemmy.world · ↑52 ↓1 · edited · 16 days ago

        If you create a meme like this, at least inform yourself about how computers and software work.

      • flamingos-cant@feddit.uk · ↑27 ↓1 · 16 days ago

        Screaming at my single-threaded, synchronous web scraper “Why are you so slow, I have a 4090!”
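
        A minimal sketch of the same idea, using only the Python standard library; the URLs below are placeholder examples, not anything from the post. The scraper is network-bound, so overlapping the waiting is what speeds it up, not a faster GPU:

        ```python
        # I/O-bound scraping: the CPU (and certainly the GPU) mostly sits idle
        # waiting on the network, so run the requests concurrently instead.
        from concurrent.futures import ThreadPoolExecutor
        from urllib.request import urlopen

        # Hypothetical pages to fetch.
        urls = [f"https://example.com/page/{i}" for i in range(20)]

        def fetch(url: str) -> int:
            """Download one page and return its size in bytes (0 on error)."""
            try:
                with urlopen(url, timeout=10) as resp:
                    return len(resp.read())
            except OSError:
                return 0

        # A thread pool overlaps the slow network requests instead of
        # running them one after another.
        with ThreadPoolExecutor(max_workers=10) as pool:
            for url, size in zip(urls, pool.map(fetch, urls)):
                print(url, size, "bytes")
        ```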

      • Grumpy@sh.itjust.works · ↑3 · 16 days ago

        What makes you think Python is unoptimized and bloated?

        Did you know the vast majority of AI development happening right now is in Python? The kind of work that literally consumes billions of dollars’ worth of even-beefier-than-4090 GPUs like the A100. Don’t you think that if they could do this more efficiently in C or assembly, they would? They would save billions.

        The reality is that there’s no benefit to moving away from Python to lower-level languages. The poor optimization you seek isn’t there. In fact, if they tried this in lower-level languages, it would take even longer to optimize and yield worse results.
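
        A minimal sketch of why this holds, assuming NumPy is installed (the timings are whatever your machine gives, not real benchmarks): the interpreted Python is just glue, and the heavy numeric work runs in compiled kernels, the same way PyTorch dispatches to GPU kernels.

        ```python
        # Compare a pure-Python loop with the same work dispatched to NumPy's
        # compiled kernels: the interpreter overhead is paid once per array
        # operation, not once per element.
        import time
        import numpy as np

        n = 10_000_000
        a = [float(i) for i in range(n)]
        b = [float(i) for i in range(n)]

        start = time.perf_counter()
        slow = [x + y for x, y in zip(a, b)]   # every addition goes through the interpreter
        print(f"pure Python loop: {time.perf_counter() - start:.2f}s")

        a_np, b_np = np.array(a), np.array(b)
        start = time.perf_counter()
        fast = a_np + b_np                     # one call, the loop runs in C
        print(f"NumPy:            {time.perf_counter() - start:.2f}s")
        ```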

  • Prunebutt@slrpnk.net · ↑55 ↓1 · 16 days ago

    I’m happy if it’s actually running in Python and not as a JavaScript app in Electron.

    • LANIK2000@lemmy.world · ↑13 · 16 days ago

      Idk, it’s rare for an Electron app to literally not even run. Meanwhile, I have yet to encounter a Python app that doesn’t require me to Google what specific environment the developer had and recreate it.

  • Diplomjodler@lemmy.world · ↑44 ↓9 · 16 days ago

    Ah yes, those precious, precious CPU cycles. Why spend one hour writing a Python program that runs for five minutes when you could spend three days writing it in C++ so it finishes in five seconds? Way more efficient!

    • BCsven@lemmy.ca · ↑31 ↓1 · edited · 16 days ago

      Because when it comes to actually getting paid work done, all the bloat adds up, and those 3 days upfront could shave weeks or months off your yearly tasks. XKCD has a comic about how much time you can spend on a problem before the effort outweighs the productivity gains. If the task is daily or hourly, you can actually spend a lot of time automating it and still come out ahead.

      And note that this is for one instance of the task. Imagine a team of people all using your code to do it: you get a quicker ROI, or you can multiply the justifiable dev time by the number of people.
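
      A rough back-of-the-envelope version of that XKCD math, as a sketch; the task sizes and team numbers below are made-up examples:

      ```python
      # How many hours can you spend automating a task before the effort
      # outweighs the time saved over a given horizon?
      def hours_worth_spending(minutes_saved_per_run: float,
                               runs_per_day: float,
                               horizon_days: int = 5 * 365,
                               team_size: int = 1) -> float:
          total_minutes_saved = (minutes_saved_per_run * runs_per_day
                                 * horizon_days * team_size)
          return total_minutes_saved / 60

      # A 5-minute daily task, one person, over 5 years: ~152 hours.
      print(hours_worth_spending(5, 1))
      # The same task shared by a team of 10: ~1,520 hours.
      print(hours_worth_spending(5, 1, team_size=10))
      ```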

      • deegeese@sopuli.xyz · ↑17 · 16 days ago

        That also goes to show why you shouldn’t waste 3 days to shave 2 seconds off a program that gets run once a week.

          • BCsven@lemmy.ca · ↑1 · edited · 15 days ago

          Agreed. Or look at the manual effort: is it worth coding at all, or better to just do it manually for one-offs? A coworker would write code for a bunch of mundane, single-occurrence problems, whereas I would check whether it would actually save time or just manipulate the data manually myself.

      • Diplomjodler@lemmy.world · ↑6 ↓1 · 16 days ago

        You can write perfectly well-structured and maintainable code in Python and still be more productive than in other languages.

      • _pi@lemmy.ml · ↑1 · edited · 15 days ago

        The SDLC can be made inefficient to maximize billable hours, but that doesn’t mean the software is inherently badly architected. It could just have a lot of unnecessary boilerplate that you could optimize out, but it’s soooooo hard to get tech debt prioritized on the roadmap.

        Killing your own velocity can be done intelligently; it’s just that most teams aren’t killing their velocity because they’re competent, they’re doing it because they’re incompetent.

        And note that this is for one instance of the task. Imagine a team of people all using your code to do it: you get a quicker ROI, or you can multiply the justifiable dev time by the number of people.

        In practice, it’s only a quicker ROI if your maintenance plan is nonexistent.

      • Diplomjodler@lemmy.world · ↑6 ↓2 · 15 days ago

        Welp, I’m not saying you should use Python for everything. But for a lot of applications, developer time is the bottleneck, not computing resources.

    • Ephera@lemmy.ml · ↑3 · 15 days ago

      So, I’ve noticed this tendency for Python devs to compare against C/C++. I’m still trying to figure out why, but yeah, other/better languages are available. 🙃

    • eldavi@lemmy.ml · ↑1 ↓1 · 16 days ago

      Exactly! I prefer Python or Ruby or even Java MUCH more than assembly, and maybe more than C.

      • menemen@lemmy.ml · ↑10 · edited · 16 days ago

        I mean, I’d say it depends on what you do. When I see grad students writing numeric simulations in Python, I do think it would be more efficient for them to learn a language better suited to that. And I know I’ll be triggering many people now, but there is a reason why C and Fortran are still here.

        But if it is for something small, yeah of course, use whatever you like. I do most of my stuff in R and R is a lot of things, but not fast.

        • eldavi@lemmy.ml · ↑3 · 16 days ago

          But if it is for something small, yeah of course, use whatever you like.

          or if you have a deadline and using something else would make you miss that deadline.

  • Postmortal_Pop@lemmy.world · ↑13 · 16 days ago

    I know it makes me sound like an old man shouting at clouds, but the other day I installed Morrowind and was genuinely blown away by how smoothly and reliably it ran, and by all the game’s content fitting in 2 GB of space. Skyrim requires me to delete my other games to make room, and still needs a whole second game’s worth of mods to match the stability and quantity of content in Morrowind.

  • pewpew@feddit.it · ↑12 ↓1 · 15 days ago

    “Python is bloat”? Wait until you look at a NodeJS node_modules folder.

  • vane@lemmy.world · ↑5 · 15 days ago

    Amateurs. I only write software with my hammer, by punching holes in steel plates.

  • UnfortunateShort@lemmy.world · ↑3 · 16 days ago

    It used to be pretty terrible, but the frameworks are getting there, starting with the languages they are based on.

    Believe it or not, Java has been optimized a ton and can be written to be very efficient these days. Another great example of a high-level, high-efficiency language is Julia. And then there is Rust, of course, which basically sacrifices only memory efficiency to get C-level speeds with Python-esque comfort. It’s getting better.

  • ThirdConsul@lemmy.ml · ↑3 ↓1 · 15 days ago

    Tbh this all seems to be related to following principles like SOLID or following software design patterns. There are a few articles about CUPID, SOLID performance hits, etc.

    • they all suggest that following software design patterns costs about a decade of hardware progress.
    • _pi@lemmy.ml · ↑2 · edited · 15 days ago

      Absolutely not lol.

      If SOLID is causing you performance problems, it’s likely completely solvable.

      Most companies throwing out shitty software have engineers who couldn’t tell you what SOLID is without looking it up.

      Most people who use this line of reasoning don’t have an actual understanding of how often patterns are applied or misapplied in the industry and why.

      SOLID might be a bottleneck for software that needs to be real-time compliant with stable jitter and ultra-low latency, but the vast majority of apps are just spaghetti code.

  • superkret@feddit.org · ↑3 ↓5 · 15 days ago

    “bloat” is just short for “your computer sucks”.
    Dump your peasant-tier shit and go fill up that 42U rack.