• MangoPenguin@lemmy.blahaj.zone
    link
    fedilink
    English
    arrow-up
    3
    arrow-down
    1
    ·
    8 hours ago

    AMD needs to fix their software.

    I had an AMD GPU last year for a couple of weeks, but their software barely worked. The overlay didn’t scale properly on a 4K screen and cut off half the info, and most of the time it wouldn’t show up at all. ‘ReLive’ with instant replay enabled caused a performance hit and stuttering in high-FPS games…

    Maybe they have it now, but I also couldn’t find a way to enable HDR in older games the way Nvidia does.

  • Omega_Jimes@lemmy.ca
    link
    fedilink
    English
    arrow-up
    15
    ·
    19 hours ago

    I know it’s not indicative of the industry as a whole, but the Steam hardware survey has Nvidia at 75%. So while they’re still selling strongly, as others have pointed out, I’m not confident the new cards are being used for gaming.

    • WolfLink@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      1
      ·
      6 hours ago

      Everyone and their manager wants to play with LLMs, and AMD and Intel still don’t have a real alternative to CUDA, so they’re much less popular for compute applications.

  • ssillyssadass@lemmy.world
    link
    fedilink
    English
    arrow-up
    1
    ·
    11 hours ago

    I think Nvidia has better marketing. I never really hear anything about AMD cards, whereas I constantly hear about Nvidia.

  • Luffy@lemmy.ml
    link
    fedilink
    English
    arrow-up
    28
    arrow-down
    2
    ·
    1 day ago

    “The market” is not a good measure. Hell, it’s not even a measure at all. No consumer can pull any useful info out of this article.

    It’s the equivalent of asking “how much money has been spent on products from company XY”, completely disregarding whether the products sold even compete with each other, let alone whether one company is even trying to sell at that scale.

    Now, regarding the article: they are not differentiating between enterprise and personal-grade products. Of course Intel is non-existent in enterprise GPU sales, because they don’t even sell fucking enterprise GPUs. Same with AMD.

    This is like comparing a local steelworking company with Weckerle Machines, who mostly make industrial make-up equipment (out of steel), and saying that Weckerle dominates the market.

    Or like saying “Gamers beware: pre-built PCs are dominating the market”, then showing a study about “computing devices” in which the two main sources are enterprises buying in bulk and NUCs, neither of which has anything to do with what the article is implying in the first place, since, and say this with me:

    • Enterprise devices are completely different from consumer devices, both in price and in volume, and if compared directly (in the middle of an economic crisis), of course enterprises are going to spend way more money in one category.
    • Evil_Shrubbery@thelemmy.club
      link
      fedilink
      English
      arrow-up
      5
      ·
      edit-2
      1 day ago

      Y’all have pre-built phones? And even laptops? Or car computers??
      Weirdos.

      /s

      But def, this type of info is at best for the investors (and even then just unstructured info about market shares), not consumers.

    • GaMEChld@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      14 hours ago

      The vast majority of consumers do not watch or read reviews. They walk into a Best Buy or whatever retailer and grab the box that says GeForce that has the biggest number within their budget. LTT even did a breakdown at some point that showed how even their most watched reviews have little to no impact on sales numbers. Nvidia has the mind share. In a lot of people’s minds GeForce = Graphics. And I say all that as someone who is currently on a Radeon 7900XTX. I’d be sad to see AMD and Intel quit the dGPU space, but I wouldn’t be surprised.

      • MystikIncarnate@lemmy.ca
        link
        fedilink
        English
        arrow-up
        2
        ·
        17 hours ago

        Microsoft.

        Microsoft is buying them for AI.

        From what I understand, ChatGPT is running on Azure servers.

        • 9488fcea02a9@sh.itjust.works
          link
          fedilink
          English
          arrow-up
          9
          arrow-down
          4
          ·
          1 day ago

          GPU mining hasn’t been profitable for many years now.

          People just keep parroting anti-crypto talking points for years without actually knowing what’s going on.

          To be clear, 99% of the crypto space is a scam. But blaming it for GPU shortages and high prices is just misinformation.

          • D06M4@lemmy.zip
            link
            fedilink
            English
            arrow-up
            2
            ·
            1 day ago

            Most people are buying Nvidia because that’s what’s commonly recommended on reviews. “Want to use AI? Buy Nvidia! Want the latest DX12+ support? Buy Nvidia! Want to develop videogames or encode video? Buy Nvidia! Want to upgrade to Windows 11? Buy Nvidia!” Nonstop Nvidia adverts everywhere, with tampered benchmarks and whatnot. Other brands’ selling points aren’t well known and the general notion is that if it’s not Nvidia it sucks.

          • Tinidril@midwest.social
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            2
            ·
            1 day ago

            Profitability of Bitcoin mining is dependent on the value of Bitcoin, which has more than doubled in the last 12 months. It’s true that large-scale miners have moved on from GPUs to purpose-designed hardware, but GPUs and mining hardware are mutually dependent on a lot of the same limited resources, including fabs.

            You are right that crypto doesn’t drive the GPU market like it did during the crypto boom, but I think you are underestimating the lingering impact. I would also not rule out a massive Bitcoin spike driven by actions of the Trump administration.

            • Taldan@lemmy.world
              link
              fedilink
              English
              arrow-up
              9
              ·
              edit-2
              1 day ago

              Profitability of Bitcoin mining is dependent on the value of Bitcoin

              No, it isn’t. It’s driven by the supply of miners and the demand for transactions. The value of Bitcoin is almost entirely independent.

              ASICs, which are used to mine Bitcoin, use very different chips from modern GPUs. Ethereum is the one that affected the GPU market, and mining is no longer a thing for Ethereum.

              A massive Bitcoin spike would not affect the GPU market in any appreciable way.

              Crypto mining is pretty dumb, but misinformation helps no one.

              • Tinidril@midwest.social
                link
                fedilink
                English
                arrow-up
                2
                ·
                19 hours ago

                ASICs and GPUs do share significant dependencies in the semiconductor supply chain. Building fabs fast enough to keep up with demand is difficult and resource-constrained, both in expertise and in high-quality materials.

                You are wrong about the market value of Bitcoin’s impact on the profitability of Bitcoin mining.

                https://www.investopedia.com/articles/forex/051115/bitcoin-mining-still-profitable.asp

                Another thing to consider is that many coins still use proof of work, and an ASIC designed for one might not work for others. Some miners (especially the most scammy ones) prefer the flexibility to switch coins at will. That doesn’t change the fact that ASICs now dominate, but GPUs do still have a share, especially for some of the newer scam coins.
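                To put a rough number on that, here’s a back-of-envelope sketch (all figures below are made-up placeholders, not real market data): holding network hashrate and electricity prices fixed, mining revenue scales linearly with the coin’s price, which is why a big price run-up lifts margins until difficulty catches up.

                ```python
                # Back-of-envelope mining profitability; every number used here is a placeholder.
                def daily_mining_profit_usd(
                    my_hashrate,       # miner's hashrate (same unit as network_hashrate)
                    network_hashrate,  # total network hashrate
                    block_reward_btc,  # BTC minted per block (transaction fees ignored)
                    btc_price_usd,     # BTC/USD price
                    power_kw,          # rig power draw in kW
                    usd_per_kwh,       # electricity price
                ):
                    blocks_per_day = 144  # ~one block every 10 minutes
                    revenue = (my_hashrate / network_hashrate) * blocks_per_day * block_reward_btc * btc_price_usd
                    cost = power_kw * 24 * usd_per_kwh
                    return revenue - cost

                # Doubling btc_price_usd doubles revenue while the power bill stays fixed,
                # so profit is very sensitive to price (until network hashrate rises to match).
                print(daily_mining_profit_usd(200, 700_000_000, 3.125, 60_000, 3.5, 0.10))
                ```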

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        11
        ·
        edit-2
        2 days ago

        Not as many as you’d think. The 5000 series is not great for AI because the cards have basically no VRAM relative to their price.

        4x3090 or 3060 homelabs are the standard, heh.

        • MystikIncarnate@lemmy.ca
          link
          fedilink
          English
          arrow-up
          1
          ·
          17 hours ago

          Who the fuck buys a consumer GPU for AI?

          If you’re not doing it in a home lab, you’ll need more juice than anything an RTX 3000/4000/5000/whatever000 series could have.

          • brucethemoose@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            16 hours ago

            Who the fuck buys a consumer GPU for AI?

            Plenty. Consumer GPU + CPU offloading is a pretty common way to run MoEs these days, and not everyone will drop $40K just to run Deepseek in CUDA instead of hitting an API or something.

            I can (just barely) run GLM-4.5 on a single 3090 desktop.
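            For anyone curious what that looks like in practice, here’s a minimal sketch using llama-cpp-python; the model path and layer split are placeholders you’d tune to your own card’s VRAM.

            ```python
            # Sketch: consumer-GPU + CPU offloading with llama-cpp-python
            # (pip install llama-cpp-python, built with CUDA/ROCm/Vulkan support).
            from llama_cpp import Llama

            llm = Llama(
                model_path="models/some-moe-model-Q4_K_M.gguf",  # placeholder GGUF file
                n_gpu_layers=30,  # layers offloaded to the GPU; the rest run from system RAM
                n_ctx=8192,       # context window
            )

            out = llm("Explain mixture-of-experts offloading in one sentence.", max_tokens=128)
            print(out["choices"][0]["text"])
            ```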

            • MystikIncarnate@lemmy.ca
              link
              fedilink
              English
              arrow-up
              1
              ·
              9 hours ago

              … Yeah, for yourself.

              I’m referring to anyone running an LLM for commercial purposes.

              Y’know, 80% of Nvidia’s business?

              • brucethemoose@lemmy.world
                link
                fedilink
                English
                arrow-up
                1
                ·
                7 hours ago

                I’ve kinda lost this thread, but what does that have to do with consumer GPU market share? The servers are a totally separate category.

                I guess my original point was agreement: the 5000 series is not great for ‘AI’, not like everyone makes it out to be, to the point where folks who can’t drop $10K on a GPU are picking up older cards instead. But if you look at download stats for these models, there is real interest in running stuff locally instead of on ChatGPT, just like people are interested in internet-free games, or Lemmy instead of Reddit.

                • MystikIncarnate@lemmy.ca
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  ·
                  52 minutes ago

                  The original post is about Nvidia’s domination of discrete GPUs, not consumer GPUs.

                  So I’m not limiting myself to people running an LLM on their personal desktop.

                  That’s what I was trying to get across.

                  And it’s right on point for the original material.

          • brucethemoose@lemmy.world
            link
            fedilink
            English
            arrow-up
            9
            arrow-down
            3
            ·
            2 days ago

            Yeah. What does that have to do with home setups? No one is putting an H200 or L40 in their homelab.

              • brucethemoose@lemmy.world
                link
                fedilink
                English
                arrow-up
                7
                ·
                edit-2
                2 days ago

                It mentions desktop GPUs, which are not part of this market cap survey.

                Basically I don’t see what the server market has to do with desktop dGPU market share. Why did you bring that up?

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      13
      arrow-down
      2
      ·
      2 days ago

      Nvidia is the only real option for AI work. Before Trump lifted the really restrictive ban on GPU exports to China, they had to smuggle in GPUs from the US, and if you’re Joe Schmo the only GPUs you can really buy are gaming ones. That’s why the 5090 has been selling so well despite costing $2k and not being all that much better than the 4090 in gaming.

      Also AMD has no high end GPUs, and Intel barely has a mid range GPU.

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        13
        ·
        2 days ago

        To be fair, AMD is trying as hard as they can to not be appealing there. They inexplicably participate in the VRAM cartel when… they have no incentive to.

          • brucethemoose@lemmy.world
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            6 hours ago

            Basically, consumer VRAM is dirt cheap, not too far from DDR5 in $/gigabyte. And high VRAM (especially 48GB+) cards are in high demand.

            But Nvidia charges through the nose for the privilege of adding more VRAM to cards. See this, which is almost the same silicon as the 5090: https://www.amazon.com/Blackwell-Professional-Workstation-Simulation-Engineering/dp/B0F7Y644FQ

            When the bill of materials is really only like $100-$200 more, at most. Nvidia can get away with this because everyone is clamoring for their top-end cards.


            AMD, meanwhile, is kind of a laughing stock in the prosumer GPU space. No one’s buying them for CAD. No one’s buying them for compute, for sure… And yet they do the same thing as Nvidia: https://www.amazon.com/AMD-Professional-Workstation-Rendering-DisplaPortTM/dp/B0C5DK4R3G/

            In other words, with a phone call to their OEMs like Asus and such, Lisa Su could lift the VRAM restrictions from their cards and say “you’re allowed to sell as much VRAM on a 7900 or 9000 series card as you can make fit.” They could pull the rug out from under Nvidia and charge a $100-$200 markup instead of a $3000-$7000 one.

            …Yet they don’t.

            It makes no sense. They’re maintaining an anticompetitive VRAM ‘cartel’ with Nvidia instead of trying to compete.

            Intel has more of an excuse here, as they literally don’t manufacture a GPU that can take more than 24GB of VRAM, but AMD has no excuse I can think of.

        • Diplomjodler@lemmy.world
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          1
          ·
          2 days ago

          My theory is that they’re just scared to annoy Nvidia too much. If they priced their GPUs so as to really increase their market share, Nvidia might retaliate. And Nvidia definitely has the deeper pockets. AMD has no chance to win a price war.

            • Holytimes@sh.itjust.works
              link
              fedilink
              English
              arrow-up
              2
              ·
              2 days ago

              It’s fear of failure, not success, because success isn’t an option.

              Because if they start to “succeed”, then they actually fail, since they will be crushed by Nvidia.

              Their options are to either hold the status quo, or lose more because they angered the green hulk in the room.

              • ganryuu@lemmy.ca
                link
                fedilink
                English
                arrow-up
                3
                ·
                1 day ago

                Wait wait wait… If I push your theory a bit, it means that Nvidia could crush AMD at any time, becoming a full-fledged monopoly (and being able to rake in much more profit), but they are… deciding not to? Out of the goodness of their hearts, maybe?

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          10
          arrow-down
          1
          ·
          2 days ago

          That article is a year old and is missing the latest generation of cards. Neither AMD nor Nvidia produces those GPUs anymore. AMD’s best GPU from their 9000 series competes with Nvidia’s 5070/5070 Ti. The 5090 and 5080 are unmatched.

          • BCsven@lemmy.ca
            link
            fedilink
            English
            arrow-up
            5
            ·
            edit-2
            2 days ago

            Kind of my point: these were high-end and are still usable by 95% of people. Everyone is chasing 1% gains for twice the price. I have a new RTX card via work equipment for rendering; I play games on the side, but that RTX doesn’t really make the gameplay that much better. It looks great with the shine on metal or water reflections, but when you’re totally immersed in gameplay that stuff is wasted.

            • brucethemoose@lemmy.world
              link
              fedilink
              English
              arrow-up
              3
              arrow-down
              2
              ·
              edit-2
              2 days ago

              Honestly, stuff like Unreal’s Lumen or Crytek’s SVOGI has made RTX obsolete. It looks freaking incredible and runs fast, and you can put the rendering budget toward literally anything else; who in their right mind would develop for RTX over that?

      • Marthirial@lemmy.world
        link
        fedilink
        English
        arrow-up
        5
        arrow-down
        1
        ·
        21 hours ago

        At the end of the day I think it is this simple. CUDA works and developers use it so users get a tangible benefit.

        If AMD comes up with a better version of CUDA, then you have the disruption needed to compete.

        • MangoPenguin@lemmy.blahaj.zone
          link
          fedilink
          English
          arrow-up
          3
          ·
          8 hours ago

          I’m not sure that would even help that much, since tools out there already support CUDA, and even if AMD had a better version it would still require everyone to update apps to support it.

    • lemonySplit@lemmy.ca
      link
      fedilink
      English
      arrow-up
      8
      arrow-down
      1
      ·
      2 days ago

      Meanwhile Framework’s new AMD offering has Nvidia slop in it. Just why? We want AMD. Give us AMD.

      • notthebees@reddthat.com
        link
        fedilink
        English
        arrow-up
        4
        ·
        2 days ago

        They did. There just aren’t any new AMD mobile GPUs. I think they only have about 100 watts of TDP (or max cooling) to work with, and the 7700S is the fastest AMD mobile GPU currently.

        If AMD makes a new mobile GPU, Framework will probably make it into a module.

    • warm@kbin.earth
      link
      fedilink
      arrow-up
      6
      arrow-down
      1
      ·
      2 days ago

      They need DLSS, otherwise the triple-A games they love so much won’t reach 30 fps!!

    • SoftestSapphic@lemmy.world
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      5
      ·
      edit-2
      1 day ago

      I will never get another AMD card after my first one just sucked ass and never worked right.

      I wanted to try an Intel card, but I wasn’t even sure if I could find Linux drivers for it, because they weren’t on the site for download and I couldn’t find anything specifying whether their newer cards even worked on Linux.

      So yeah, Nvidia is the only viable company for me to buy a graphics card from.

      • ganryuu@lemmy.ca
        link
        fedilink
        English
        arrow-up
        6
        arrow-down
        3
        ·
        1 day ago

        That kind of comment always feels a bit weird to me; are you basing AMD’s worth as a GPU manufacturer on that one bad experience? It could just as well have happened on an Nvidia chip; would you be pro-AMD in that case?

        On the Intel part, I’m not up to date but historically Intel has been very good about developing drivers for Linux, and most of the time they are actually included in the kernel (hence no download necessary).

        • njm1314@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          arrow-down
          4
          ·
          21 hours ago

          What else would a consumer base things on except their own experiences? Not like it’s a rare story either.

          • ganryuu@lemmy.ca
            link
            fedilink
            English
            arrow-up
            2
            arrow-down
            2
            ·
            edit-2
            9 hours ago

            I don’t know, real-world data maybe? Your one, or two, or even ten experiences are statistically insignificant. And of course it’s not a rare story: people who talk about a product online are usually people who had a bad experience and are complaining about it, which introduces a bias you have to correct for. So you go by things like failure rates, which you can find online.

            By the way, it’s almost never actually a fault of AMD or Nvidia themselves, but of the actual manufacturer of the card.

            Edit: Not that I care about Internet points, but downvoting without a rebuttal is… Not very convincing

            • njm1314@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              arrow-down
              1
              ·
              9 hours ago

              A person’s actual experience with a product isn’t real-world data? Fanboys for huge companies are so weird.

              • ganryuu@lemmy.ca
                link
                fedilink
                English
                arrow-up
                2
                arrow-down
                1
                ·
                7 hours ago

                Please read my entire comment; I also said that your experience as one person is statistically insignificant. As in, you cannot rely on one bad experience considering the volume of GPUs sold. Anybody can be unlucky with a purchase and get a defective product, no matter how good the manufacturer is.

                Also, please point out where I engaged in any fanboyism. I did not take any side in my comments. Bad-faith arguments are so weird.

                • njm1314@lemmy.world
                  link
                  fedilink
                  English
                  arrow-up
                  1
                  arrow-down
                  1
                  ·
                  7 hours ago

                  Sure buddy, we’re all idiots for not liking the product you simp for. Got it.

        • SoftestSapphic@lemmy.world
          link
          fedilink
          English
          arrow-up
          4
          arrow-down
          3
          ·
          edit-2
          1 day ago

          That kind of comment always feels a bit weird to me; are you basing AMD’s worth as a GPU manufacturer on that one bad experience?

          Absolutely. If a company I’m trying for the first time gives me a bad experience, I will not go back. That was me giving them a chance, and AMD fucked up that chance, and I couldn’t even get a refund for a roughly $200 card. Choosing to try a different option meant wasting time and money, and it kept my rig from working for half a year until I could afford a working card again, which really pissed me off.

          I didn’t know that about Intel cards; I’ll have to try one for my next upgrade if I can confirm on their site that they’re supported.

    • darkkite@lemmy.ml
      link
      fedilink
      English
      arrow-up
      4
      arrow-down
      3
      ·
      2 days ago

      I do local AI stuff and I get more support with Nvidia CUDA, and you usually get exclusive gaming features first on Nvidia, like DLSS, RTX, and RTX Voice.

      I wish they shipped with more VRAM though.

  • Brotha_Jaufrey@lemmy.world
    link
    fedilink
    English
    arrow-up
    60
    ·
    2 days ago

    IMO there’s zero reason to buy an Nvidia GPU if there’s a similarly performing AMD card, because the price will just be better.

    • Garry@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      22
      arrow-down
      2
      ·
      2 days ago

      AMD promised an MSRP of $600 for the 9070 XT; it rarely goes below $750. All AMD had to do was stick to their prices and have ample stock. AMD is satisfied with second place.

        • Garry@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          7
          ·
          1 day ago

          The 5070 Ti has come down to its normal price; you can find listings for $750. The 9070 XT is better than the 5070 Ti in some games in terms of raw raster. When it comes to upscaling, efficiency, and ray tracing, the 5070 Ti is better.

      • ne0phyte@feddit.org
        link
        fedilink
        English
        arrow-up
        10
        arrow-down
        1
        ·
        2 days ago

        What does that have to do with anything? Pretty much all monitors also support FreeSync which works just as well.

        • Rai@lemmy.dbzer0.com
          link
          fedilink
          English
          arrow-up
          4
          ·
          1 day ago

          I have a G-Sync monitor. It supports G-Sync. It is very nice, but unfortunately it does NOT support FreeSync.

        • ganryuu@lemmy.ca
          link
          fedilink
          English
          arrow-up
          1
          ·
          1 day ago

          From what I can find, even though a lot of FreeSync monitors at least partially support G-Sync, the opposite seems rather rare, since G-Sync is fully proprietary and hardware-based. I’ve found a couple of more modern monitors that officially support both, but they seem to be the exception rather than the norm.

  • network_switch@lemmy.ml
    link
    fedilink
    English
    arrow-up
    69
    arrow-down
    3
    ·
    2 days ago

    AMD seems to be doing fine; Nvidia is just doing finer. If this were 10 years ago it’d be a lot more concerning, but now AMD has a healthy home and server CPU business and a server GPU business, and they’re the standard for handheld PCs. Along with consoles, that’ll keep FSR relevant, and their server business will keep funding UDNA. Samsung uses AMD GPUs for their Exynos chips, and it sounds like that may make its flagship return with the next Galaxy phones. They’re not drowning.

    • brucethemoose@lemmy.world
      link
      fedilink
      English
      arrow-up
      14
      arrow-down
      2
      ·
      edit-2
      2 days ago

      Ehh, 5% market share is not fine.

      AMD’s server GPU division is not fine, either, so don’t bet on that saving them.

      AMD/Intel graphics divisions need R&D money from sales to keep up, and if this trend continues, they’re gonna drop out of dGPUs and stick to integrated graphics (which Intel is already at extremely severe risk of doing).

  • thisNotMyName@lemmy.world
    link
    fedilink
    English
    arrow-up
    23
    ·
    2 days ago

    Intel cards are awesome in a home server for media transcoding: super cheap, super capable, and power-efficient compared to other cards with the same features. And although Intel has become a shitty company, I’d really like to see more competition in the GPU market.
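    For example, a typical hardware transcode on an Intel card looks something like this (a sketch driving ffmpeg’s VAAPI path from Python; the render node and filenames are placeholders, and it assumes ffmpeg and the Intel media driver are installed):

    ```python
    # Sketch: offload HEVC encoding to an Intel GPU via VAAPI, driven from Python.
    import subprocess

    subprocess.run([
        "ffmpeg",
        "-hwaccel", "vaapi",
        "-hwaccel_device", "/dev/dri/renderD128",  # the card's render node (placeholder)
        "-hwaccel_output_format", "vaapi",
        "-i", "input.mkv",                         # placeholder input file
        "-c:v", "hevc_vaapi",                      # encode on the GPU
        "-c:a", "copy",                            # pass audio through untouched
        "output.mkv",
    ], check=True)
    ```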

  • brucethemoose@lemmy.world
    link
    fedilink
    English
    arrow-up
    17
    ·
    edit-2
    2 days ago

    I don’t get this.

    Well, if this includes laptops, I get that. Just try to find a dGPU laptop with AMD or Arc these days.


    …But in desktops, everyone seems to complain about Nvidia pricing, yet no one is touching Battlemage or the 9000 series? Why? For gaming specifically, they seem pretty great in their price brackets.

    Maybe prebuilts are overshadowing that too?

    • empireOfLove2@lemmy.dbzer0.com
      link
      fedilink
      English
      arrow-up
      14
      ·
      2 days ago

      But in desktops, everyone seems to complain about Nvidia pricing, yet no one is touching Battlemage or the 9000 series? Why?

      It’s always been this way: they want AMD and Intel to compete so Nvidia gets cheaper, not because they will ever buy AMD or Intel. Gamers seem to be the laziest, most easily influenced consumer sector ever.

      • notthebees@reddthat.com
        link
        fedilink
        English
        arrow-up
        3
        ·
        1 day ago

        People who say “buy Intel and AMD” probably either did or will when they upgrade, which is probably not anytime soon with the way everything seems to be going.

      • Alchalide@lemmy.world
        link
        fedilink
        English
        arrow-up
        2
        ·
        2 days ago

        That one stung XD. I went with an AMD GPU in 2023 after only owning Nvidia for decades. I went with AMD because I was not satisfied with the amount of VRAM Nvidia offers and I did not want burning power connectors. Overall it’s stable and works great. There are some bugs here and there, but zero regrets.

        • brucethemoose@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          23 hours ago

          No shame in that; AMD and Nvidia traded the ‘optimal buy’ spot forever. There were times when buying AMD was not the best idea, like when the Nvidia 900/1000 series was amazing while AMD Vega was very expensive.

          Other times it wasn’t obvious until later. The old AMD 7000 series was pricey at launch, for instance, but aged ridiculously well. A 7950 would still function all right these days.

          This market’s such a caricature now, though. AMD/Intel are offering these obviously great values, yet they’re being overlooked through pure ignorance; I can’t remember things ever being like this, not all the way back to Nvidia Fermi at least.

    • Pycorax@sh.itjust.works
      link
      fedilink
      English
      arrow-up
      7
      ·
      2 days ago

      We’ve come to a point where PC gaming is so mainstream that the average PC gamer likely doesn’t even know that AMD makes GPUs. They’ll just complain about the prices and then pay for Nvidia directly or indirectly via prebuilts.

      • brucethemoose@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        edit-2
        2 days ago

        I buy this.

        And I can’t really blame people for not diving into components and wanting stuff to just… Work.

        No one (on average) knows their graphics card off the top of their head.

    • brucethemoose@lemmy.world
      link
      fedilink
      English
      arrow-up
      3
      ·
      2 days ago

      They need dGPUs worth buying for HPC, other than servers that cost more than a house, so devs will actually target them.

      • notthebees@reddthat.com
        link
        fedilink
        English
        arrow-up
        1
        ·
        2 days ago

        They have the hardware for HPC with their Instinct cards. Software support is slowly growing. ROCm is fine; ZLUDA is pain and suffering on AMD cards. I have a 6800 XT, so old but decently supported, and even then it’s annoying.

        • brucethemoose@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          2 days ago

          What I mean is they need to sell reasonably priced high-VRAM cards that aren’t an MI325X, heh.

          There’s not really a motivation to target them over a 3090 or 4090 or whatever, but that would change with bigger VRAM pools.

          • notthebees@reddthat.com
            link
            fedilink
            English
            arrow-up
            1
            ·
            edit-2
            1 day ago

            The 7900 XTX is 24 GB, and the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes. There’s no point in shoving that much VRAM into a card if support is painful and makes it hard to develop for. I’m probably biased due to my 6800 XT, one of the earliest cards that’s still supported by ROCm, so there’s a bunch of stuff my GPU can’t do. ZLUDA is painful to get working (and I have it easier due to my 6800 XT); ROCm mostly works, but VRAM utilization is very inefficient for some reason and it’s Linux-only, which is fine, but I’d like more cross-platform options. Vulkan compute is deprecated within PyTorch. AMD HIP is annoying as well, but I don’t know how much of that was just my experience with ZLUDA.

            Intel actually has better cross-platform support with IPEX, but that’s just PyTorch. Again, fine.
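            For what it’s worth, once a backend is installed the PyTorch side is fairly uniform; here’s a small sketch (assuming a recent PyTorch, 2.4+ for the Intel ‘xpu’ device) of how the same code path covers all three vendors:

            ```python
            # Sketch: one code path for NVIDIA (CUDA), AMD (ROCm) and Intel (xpu) PyTorch builds.
            import torch

            if torch.cuda.is_available():
                # The ROCm wheel reuses the torch.cuda API, so an AMD card shows up here too;
                # torch.version.hip is set on ROCm builds and None on CUDA builds.
                device = torch.device("cuda")
                print(torch.cuda.get_device_name(0), "| ROCm:", torch.version.hip)
            elif hasattr(torch, "xpu") and torch.xpu.is_available():
                # Intel GPUs via the upstream 'xpu' backend (recent PyTorch;
                # previously only available through the separate IPEX package).
                device = torch.device("xpu")
            else:
                device = torch.device("cpu")

            x = torch.randn(1024, 1024, device=device)
            print((x @ x).sum().item())
            ```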

            • brucethemoose@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              edit-2
              1 day ago

              The 7900 XTX is 24 GB, and the 9700 Pro has 32 GB, as far as high-end consumer/prosumer goes.

              The AI Pro isn’t even available! And 32GB is not enough anyway.

              I think you underestimate how desperate ML (particularly LLM) tinkerers are for VRAM; they’re working with ancient MI50s and weird stuff like that. If AMD had sold the 7900 with 48GB for a small markup (instead of $4000), AMD would have grassroots support everywhere, because that’s what devs would spend their time making work. And these are the same projects that trickle up to the MI325X and newer.

              I was in this situation: I desperately wanted a non-Nvidia ML card a while back. I contribute little bugfixes and tweaks to backends all the time, but I ended up with a used 3090 because the 7900 XTX was just too expensive for ‘only’ 24GB plus all the fuss.

              There are lingering bits of AMD support everywhere: Vulkan backends to popular projects, unfixed ROCm bugs in projects, stuff that works with tweaks but isn’t optimized yet; the problem is AMD isn’t making it worth anyone’s while to maintain them when devs can (and do) just use 3090s or whatever.


              They kind of took a baby step in this direction with the AI 395 (effectively a 110GB-VRAM APU, albeit very compute-light compared to a 7900/9700), but it’s still $2K, effectively mini-PC only, and kinda too little, too late.

              • notthebees@reddthat.com
                link
                fedilink
                English
                arrow-up
                2
                ·
                edit-2
                22 hours ago

                I’m well aware. I’m one such tinkerer. It’s a catch-22: no good software support means no one really wants to use it, and since no one really wants to use it, AMD doesn’t make the stuff. Also, AMD is using much denser memory chips, so an easy doubling of VRAM capacity isn’t as possible.

                It took them a few months, IIRC, to get proper support for the 9070 in ROCm.

  • humanspiral@lemmy.ca
    link
    fedilink
    English
    arrow-up
    1
    ·
    2 days ago

    Just saw a “leak video” claiming the 9070 is outselling the 5070, so 5070 and 5080 Supers are being released very soon.