• Lizardking27@lemmy.world · +92 · 4 months ago

    Ugh. Can I just say how much I fucking HATE how every single fucking product on the market today is a cheap, broken, barely functional piece of shit.

    I swear to God the number of times I have to FIX something BRAND NEW that I JUST PAID FOR is absolutely ridiculous.

    I knew I should’ve been an engineer, how easy must it be to sit around and make shit that doesn’t work?

    Fucking despicable. Do better or die, manufacturers.

    • Doombot1@lemmy.one · +24 · 4 months ago

      Most of the time, the product itself comes out of engineering just fine and then gets torn up and/or ruined by the business side of the company. That said, sometimes people do make mistakes - in my mind, it's more about how they're handled by the company (oftentimes poorly).

      One of the products my team worked on a few years ago required us to spin up our own ASIC. We spun one up (in the neighborhood of $20-30M USD), and a few months later found a critical flaw in it. So we spun up a second ASIC, again spending $20-30M, and when we were nearly ready to release the product, we discovered a bad flaw in that one too. The products worked for the most part, but of course not always, as the bug would sometimes get hit. My company did the right thing and never released the product, though.

      • /home/pineapplelover@lemm.ee · +6/-1 · edited · 4 months ago

        It’s almost never the engineers’ fault. That whole NASA spacecraft that exploded was due to bureaucracy and pressure to push the mission forward.

      • Allonzee@lemmy.world · +9 · edited · 4 months ago

        Capitalism: “Growth or die!”

        Earth: I mean… If that’s how it’s gotta be, you little assholes🤷👋🔥

        It’s kind of gallows hilarious that for all the world’s religions worshipping ridiculous campfire ghost stories, we have a creator, we have a remarkable macro-organism mother consisting of millions of species, her story of hosting life going back 3.8 billion years, most living in homeostasis with their ecosystem.

        But to our actual creator, Earth - not some fucking ridiculous work of lazy fiction - we literally choose to treat her like our property to loot, rape, and pillage thoughtlessly, and we continue to act as a cancer upon her, eyes wide open. We as a species are so fucking weird, and not the good kind.

      • volodya_ilich@lemm.ee · +2 · 4 months ago

        Not really, and I say this as a communist myself. Capitalism just requires extracting the maximum profit from the capital investment. Sometimes that leads to what you said; sometimes it leads to the opposite (e.g. no real difference between a 1st-gen i5 and an 8th-gen i5).

    • Buddahriffic@lemmy.world · +10 · 4 months ago

      It’s not easy to make shit that doesn’t work if you care about what you’re doing. I bet there are angry debates between engineers and business majors behind many of these enshittifications.

      Though, for these Intel ones, they might have been less angry and more “are you sure these risks are worth taking?”, because they probably felt like they had to push them to the extreme to compete. The angry conversations probably happened 5-10 years ago, before AMD brought the pressure, when Intel was happy to assume it had no competition and didn’t have to improve things much to keep making a killing. At this point, it’s just a scramble to make up for those decisions and catch up - which their recent massive layoffs won’t help with.

    • 31337@sh.itjust.works · +3 · 4 months ago

      I’ve put together two computers in the last couple of years, one Intel (12th gen, fortunately) and one AMD. Both had stability issues, and I had to mess with the BIOS settings to get them stable. I actually had to underclock the RAM on the AMD build (probably something to do with maxing out the RAM capacity, but I still shouldn’t need to underclock, IMO). I think I’ll get workstation-grade components the next time I need to build a computer.

    • InputZero@lemmy.ml · +2 · 4 months ago

      So this doesn’t apply to the Intel situation, but a good lesson to learn is that the bleeding edge cuts both ways. Meaning that for anyone buying the absolute latest technology, there’s going to be some friction with usability at first. It should never amount to broken hardware like the Intel CPUs, but buggy drivers for a few weeks/months is kinda normal. There’s no way of knowing what’s going to happen when a brand new product is released. The producer must do their due diligence and test for anything catastrophic, but weird things happen in the wild that no one can predict. Like I said at the top, this doesn’t apply to Intel’s situation because that was a catastrophic failure, but if you’re ever on the bleeding edge, assume you’re eventually going to get cut.

    • Red_October@lemmy.world · +1 · 4 months ago

      Welcome to capitalism. Infinite growth is required, and when a market is well and truly saturated the next step is cutting more and more costs.

      Incidentally, Cancer also pursues a similar strategy.

  • arefx@lemmy.ml · +43 · edited · 4 months ago

    Ryzen gang

    My 7800x3d is incredible, I won’t be going back to Intel any time soon.

        • felsiq@lemmy.zip · +14 · 4 months ago

        To put this into context, the zen5 X3D chips aren’t out yet so this isn’t really an apples to apples comparison between generations. Also, zen5 was heavily optimized for efficiency rather than speed - they’re only like 5% faster than zen4 (X series, not X3D ofc) last I saw but they do that at the zen3 TDPs, which is crazy impressive. I’m not disagreeing with you about the 7800X3D - I love that chip, it’s def a good one - just don’t want people to get the wrong idea about zen5.

        • SuperIce@lemmy.world · +3 · 4 months ago

        Not sure how much longer I’ll be using the 5950X tbh. We’ve reached a point where mobile processors (the AI 370, for example) have faster multicore performance than the 5950X without gulping down boatloads of power.

    • r00ty@kbin.life · +5 · 4 months ago

      Also on the 7800X3D. I think I switched at just the right time. I’d been on Intel since the Athlon XP days. The next buy would have been 13th/14th gen.

    • Rakonat@lemmy.world · +3 · 4 months ago

      Me, who bought an AMD CPU and GPU last year for my new rig, cause fuck the massive markup for marginal improvement over last-gen stats.

      • LeadersAtWork@lemmy.world · +12 · edited · 4 months ago

        tldr: The flaw can give a hacker access to your computer only if they’ve already bypassed most of the computer’s security.

        This means continue not going to sketchy sites.

        Continue not downloading that obviously malicious attachment.

        Continue not being a dumbass.

        Proceed as normal.

        Because if a hacker got that deep your system is already fucked.

        • Blue_Morpho@lemmy.world · +2 · edited · 4 months ago

          It’s more serious than normal because if your PC ever gets owned, a wipe and reinstall will not remove the exploit.

          “Nissim sums up that worst-case scenario in more practical terms: ‘You basically have to throw your computer away.’”

      • arefx@lemmy.ml · +6 · 4 months ago

        I’m not that worried about it affecting me lol, I would be more concerned about my Intel CPU dying, especially since it’s been around for decades.

  • linkhidalgogato@lemmy.ml · +25 · 4 months ago

    I’m a fan of no corporation, especially not fucking AMD, but they’ve been so much better than Intel recently that I’m struggling to understand why anyone still buys Intel.

    • Angry_Autist (he/him)@lemmy.world · +16/-2 · 4 months ago

      Of all the CPU and GPU manufacturers out there, AMD is the most consistently pro-consumer with the least corporate fuckery, so I take mighty exception to your “especially not fucking amd” comment.

    • WarlordSdocy@lemmy.world · +4/-1 · 4 months ago

      Most of the shopping I’ve been helping people with lately has been for laptops. And while there are slightly more AMD options than before, laptops are still dominated by Intel for the most part - especially if you’re trying to help someone pick something on a tighter budget.

      • linkhidalgogato@lemmy.ml · +1 · edited · 4 months ago

        That’s fair - if you’re looking for the cheapest laptops, basically nothing is AMD. Also, I bet most people don’t know what those “powered by X” stickers even mean, nor care, and honestly why should they? I didn’t consider that; I was thinking more about people building their own PCs. It is also weird that laptop manufacturers and OEMs prefer Intel so much. Maybe efficiency is the biggest factor - I know AMD’s CPUs tend to be more power hungry.

      • nlgranger@lemmy.world · +2 · 4 months ago

        They are bad at writing software, and firmware support is sketchy. That second point is technically the motherboard vendors’ fault, but it could be due to confusing design and documentation on the AMD side. Hardware-wise they are great, AFAIK.

        • saigot@lemmy.ca · +1 · 4 months ago

          AMD has always run really lean in terms of employees, which hurts their quality IMO. In 2016 (a year before Ryzen 1 came out, AMD’s lowest point quality-wise), Intel had ~100k employees; at the same time, AMD had a little over 8,000 and supported a wider portfolio of products. Today AMD is up to about 30k, and it shows (although until last week, Intel was also up to 130k).

  • kamen@lemmy.world · +21 · 4 months ago

    Don’t be a fan of one or the other, just get what’s more appropriate at the time of buying.

  • w2tpmf@lemmy.world · +19/-2 · 4 months ago

    This keeps getting slightly misrepresented.

    There is no fix for CPUs that are already damaged.

    There is a fix now to prevent it from happening to a good CPU.

    • exanime@lemmy.world · +15 · edited · 4 months ago

      But isn’t the fix basically underclocking those CPUs?

      Meaning the “solution” (not even out yet) is crippling those units before the flaw cripples them?

      • Kazumara@discuss.tchncs.de · +10 · edited · 4 months ago

        They said the cause was a bug in the microcode making the CPU request unsafe voltages:

        Our analysis of returned processors confirms that the elevated operating voltage is stemming from a microcode algorithm resulting in incorrect voltage requests to the processor.

        If the buggy behaviour of the voltage contributed to higher boosts, then the fix will cost some performance. But if the clocks were steered separately from power, and the boost clock is still achieved without the overly high voltage, then it might be performance neutral.

        I think we will know for sure soon, multiple reviewers announced they were planning to test the impact.

      • w2tpmf@lemmy.world · +7 · 4 months ago

        That was the first “Intel Baseline Profile” they rolled out to mobo manufacturers earlier in the year. They’ve rolled out a new fix now.

          • w2tpmf@lemmy.world · +1 · 4 months ago

            I have an i7-13700K that’s been sitting in the box since I got a deal on it last month. I was pondering returning it and spending the extra couple hundred to get an AMD setup.

            I’ve been following all this then checked on the Asus site for my board and saw the BIOS updates…

            Updated with microcode 0x125 to ensure eTVB operates within Intel specifications…

            And this week there’s a beta release…

            The new BIOS includes Intel microcode 0x129…

    • scrion@lemmy.world · +22 · edited · 4 months ago

      For years, Intel’s compiler, their math library (MKL), and their profiler (VTune) really only worked well with their own CPUs. There was in fact code that decreased performance if a non-Intel CPU was detected:

      https://www.agner.org/optimize/blog/read.php?i=49&v=f

      That later became part of a larger lawsuit, but since Intel discriminates not against AMD specifically but against all non-Intel CPUs, the result of the lawsuit was underwhelming. In fact, it’s still a problem today:

      https://www.extremetech.com/computing/302650-how-to-bypass-matlab-cripple-amd-ryzen-threadripper-cpus

      https://medium.com/codex/fixing-intel-compilers-unfair-cpu-dispatcher-part-1-2-4a4a367c8919

      Given that the MKL is a widely used library, people also indirectly suffer from this if they buy an AMD CPU and utilize software that links against that library.

      As someone working in low-level optimization, I can say that was (and is) a shitty situation. I still bought an AMD CPU after Intel’s latest fiasco a couple of weeks ago.
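      The dispatcher pattern Agner Fog documented boils down to branching on the CPUID vendor string instead of on the feature flags the CPU actually reports. A rough sketch of the difference in Python (function and kernel names are made up for illustration; this is not Intel's actual code):

```python
def select_kernel(vendor: str, has_avx2: bool) -> str:
    """Vendor-gated dispatch: the fast SIMD path is reserved for
    'GenuineIntel', even when another CPU reports the same features."""
    if vendor == "GenuineIntel" and has_avx2:
        return "avx2_kernel"
    return "generic_kernel"  # slow scalar fallback

def select_kernel_by_feature(has_avx2: bool) -> str:
    """Feature-based dispatch: trusts the CPUID feature bits alone."""
    return "avx2_kernel" if has_avx2 else "generic_kernel"

# An AMD chip that supports AVX2 still gets the slow path under vendor gating:
print(select_kernel("AuthenticAMD", has_avx2=True))   # generic_kernel
print(select_kernel_by_feature(has_avx2=True))        # avx2_kernel
```

      The workarounds in the linked articles essentially force the library to behave like the second function.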

    • Scrubbles@poptalk.scrubbles.tech · +11/-1 · 4 months ago

      Honestly, even with GPUs now too. I was forced to team green for a few years because AMD was so far behind. Now though, unless you absolutely need a 4090 for some reason, you can get basically the same performance from AMD for 70% of the cost.

      • Cyborganism@lemmy.ca · +3 · 4 months ago

        I haven’t really been paying much attention to the latest GPU news, but can AMD cards do ray tracing and dlss and all that jazz that comes with RTX cards?

        • natebluehooves@pawb.social · +6 · 4 months ago

          DLSS is off the table, but you CAN ray trace. That being said, I don’t see the value of RT myself. It has the greatest performance impact of any graphical setting and often looks only marginally better than baked-in lighting.

          • Cyborganism@lemmy.ca · +4 · 4 months ago

            It depends greatly on the game. I’ve seen a huge difference in games like Control, which was built to showcase that… well… feature! You can see it in the quality of the lighting and the reflections. You also get better illumination in darker areas thanks to bounce lighting. It’s much more natural looking.

          • linkhidalgogato@lemmy.ml · +2 · 4 months ago

            DLSS is a brand name; both AMD and Intel have their own versions of the same thing, and they’re only a little worse, if at all.

        • Scrubbles@poptalk.scrubbles.tech · +5 · 4 months ago

          Yes, but by different names. They use FSR, which is basically the same thing - I haven’t noticed a difference in quality. Ray tracing too, just not branded as RTX.

        • vithigar@lemmy.ca · +2 · edited · 4 months ago

          There is analogous functionality for most of it, though it’s generally not quite as good across the board.

          FSR is AMD’s answer to DLSS, but the quality isn’t quite as good. However the implementation is hardware agnostic so everyone can use it, which is pretty nice. Even Nvidia’s users with older GPUs like a 1080 who are locked out of using DLSS can still use FSR in supported games. If you have an AMD card then you also get the option in the driver settings of enabling it globally for every game, whether it has support built in or not.

          Ray tracing is present and works just fine, though their performance is about a generation behind. It’s perfectly usable if you keep your expectations in line with that though. Especially in well optimized games like DOOM Eternal or light ray tracing like in Guardians of the Galaxy. Fully path traced lighting like in Cyberpunk 2077 is completely off the table though.

          Obviously AMD has hardware video encoders. People like to point out that their visual quality is lower than Nvidia’s, but I always found them perfectly serviceable. AMD’s background recording stuff is also built directly into their driver suite, no need to install anything extra.

          While they do have their own GPU-powered microphone noise removal, a la RTX Voice, AMD does lack the full set of tools found in Nvidia Broadcast, e.g. video background removal and whatnot. There is also no equivalent to RTX HDR.

          Finally, if you’re interested in locally running LLMs or diffusion models, they’re more of a pain to get working well on AMD, as the majority of implementations are CUDA-based.

      • anivia@lemmy.ml · +1/-1 · edited · 4 months ago

        I disagree. Processing power may be similar, but Nvidia still outperforms with ray tracing and, more importantly, DLSS.

        What’s the point of having the same processing power, when Nvidia still gets more than double the FPS in any game that supports DLSS?

        • reliv3@lemmy.world · +4 · 4 months ago

          FSR exists, and FSR 3 actually looks very good when compared with DLSS. These arguments about raytracing and DLSS are getting weaker and weaker.

          There are still strong arguments for nvidia GPUs in the prosumer market due to the usage of its CUDA cores with some software suites, but for gaming, Nvidia is just overcharging because they still hold the mindshare.

        • Scrubbles@poptalk.scrubbles.tech · +2 · 4 months ago

          I had the 3090 and then the 6900xtx. The differences were minimal, if even noticeable. AMD’s ray tracing is about a generation behind Nvidia’s, but they’re catching up.

          As the other commenter said, FSR is basically the same as DLSS. For me, I actually got a better frame rate with FSR playing Cyberpunk and Satisfactory than I did with DLSS!

      • sparkle@lemm.ee · +8 · 4 months ago

        Are you just posting this under every comment? This isn’t even a fraction as bad as the Intel CPU issue. Something tells me you have Intel hardware…

  • lolcatnip@reddthat.com · +14 · 4 months ago

    I switched to AMD largely for better battery performance, but this makes me feel like I dodged a bullet.

    • papalonian@lemmy.world · +1 · 4 months ago

      Just out of curiosity, when you say better battery performance, what kind of battery are we talking about? Is this in a laptop, or a desktop on some sort of remote/backup system?

        • papalonian@lemmy.world · +1 · 4 months ago

          I see, so is it a known thing that AMD CPU laptops generally have better battery life? I always see arguments for one CPU/GPU over another because of better power consumption, but I’ve never been in a position where I needed to worry much about it, so I’ve never looked much into the claims.

          • lolcatnip@reddthat.com · +2 · 4 months ago

            Seemed that way when I was shopping last, but that was over a year ago so I can’t cite sources. Supposedly their low mode uses less power and runs faster than Intel’s. I can’t confirm the faster part but it definitely lasts longer on battery power than any of the Intel laptops I’ve owned.

          • sparkle@lemm.ee · +1 · 4 months ago

            AMD CPUs indeed have better efficiency when it comes to energy used, or so I always hear.

  • angrystego@lemmy.world · +11 · 4 months ago

    I thought the point would be a depressed and self deprecating “I’m something of an Intel CPU myself”.

  • littletranspunk@lemmus.org · +10/-1 · edited · 4 months ago

    Glad my first self-built PC is full AMD (built about a year ago, with a 7700X).

    Screw Intel and Nvidia.

    • zaphodb2002@sh.itjust.works · +3 · 4 months ago

      I loved my FX CPU, but I lived in a desert, and in the summer the heat coming off that thing would make my room 100°F or more. It was the first machine I built a custom water loop for. That didn’t help with the heat in the room, but it did stop the machine from shutting down randomly, so I could continue to sit in the sweltering heat in my underpants and play video games until dawn. Better times.

      • rotopenguin@infosec.pub · +2 · 4 months ago

        You might want to go through the trouble of extending that radiator loop all the way out through a window.

      • Bytemeister@lemmy.world · +2 · 4 months ago

        I had the FX-8350 Black Edition, and that thing would keep my room at 70°F… in the winter… with a window open.

        Summer gaming was BSOD city. I miss it so much.

      • helpmepickaname@lemmy.world · +1/-1 · 4 months ago

        Of course it didn’t help the heat in the room; the heat from the CPU still has to go somewhere. Better coolers aren’t for the room, they’re for the CPU. In fact, a better cooler can make the room hotter, because it removes heat from the CPU at a higher rate and dumps it into the room.
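        The energy balance is easy to sanity-check: every watt the CPU draws ends up as heat in the room, cooler or no cooler. A back-of-the-envelope sketch (the wattage and hours are assumed for illustration, not measured):

```python
# A cooler only moves heat off the die faster; the room still receives
# all of it eventually. Rough numbers for a hot FX-era chip under load:
cpu_watts = 220            # assumed package draw under load
hours = 8                  # an evening of gaming
heat_kwh = cpu_watts * hours / 1000
print(f"{heat_kwh:.2f} kWh of heat dumped into the room")
```

        That’s on the order of a small space heater running for the same stretch, which matches the “sweltering room” experience above.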

  • gmtom@lemmy.world · +3/-1 · 4 months ago

    Can we talk about how utterly useless that default cooler is? For a relatively high-end gaming CPU, it really shouldn’t be legal to ship it with something so useless.