• skisnow@lemmy.ca · ↑16 ↓2 · 3 days ago

    The tweet was specifically talking about their $299 card that also has a 16GB version. OP is shitstirring for clicks.

    • recursive_recursion they/them@lemmy.ca (OP) · ↑3 ↓6 · edited · 3 days ago

      The tweet was specifically talking about their $299 card that also has a 16GB version. OP is shitstirring for clicks.

      This is one of the most uninformed comments I’ve read so far.


      I shared this vid to spread awareness that Frank Azor, AMD’s Chief Architect of Gaming Solutions and Gaming Marketing, made that needless bad-faith comment, because it holds back the advancement of gaming.

      AMD’s 9060 XT 16GB ($350) released recently, but we’ve yet to see whether they can actually supply it to consumers at AMD’s own stated MSRP of $350 USD; something they failed to do in their previous launch.


      For PC building, you walk around this show Computex and talk to case manufacturers and cooler manufacturers, and they’ll sync up their launches to Nvidia GPU launches cause they don’t sell things in between at the same velocity. And so if Nvidia launches a GPU and the interest falls off a cliff because people just feel like they either can’t get a card or they get screwed if they get a card, it I think actively damages the hobby.

      I remember even when the RTX 3070 came out and I gave it a positive review, I said it was a great value product, because by all the metrics and measurements we had at the time it was a good product. We had very few examples we could point to where 8 gigabytes wasn’t enough. Of course, we knew the upcoming competing card had a 16GB VRAM buffer, so that doesn’t necessarily make it a valid point on its own. Like, if you had a 32GB buffer on that product now, you’d go “well, it’s got enough VRAM” and it’s probably nothing.

      But we did see it; even looking at dedicated used VRAM, a lot of games were at like 7, 7 and a half GB [of VRAM usage, and increasing]. So you could see it creeping up over the years from like 4, 5, 6, 7; you could see where it was trending, right? Which is why I always find it funny when people [say] “Why are we using more than 8 now? Like, 8 should be enough”.

      • First quoted paragraph is from Steve Burke of Gamers Nexus; the second and third are from Steve Walton of Hardware Unboxed
      • Is Nvidia Damaging PC Gaming? feat. Gamers Nexus
      • If you replace Nvidia’s name in the quotes above with AMD, the point holds with the same force: AMD would be ruining the gaming landscape for their own benefit at the cost of literally everyone else.

      Clicks have literally no value to me; what I care about most is informing gamers so that people aren’t exploited by bad-faith actors, especially hardware manufacturers, since they dictate the limitations that game developers must work within. I’m not paid by or affiliated with any of the hardware manufacturers.

      shitstirring for clicks

      “Shitstirring for clicks” literally does nothing for me. It would actually be detrimental to my reputation if I did that.


      No one should get a free pass for behaving badly. If Nvidia acts poorly, they should be called out. Same for AMD, same for Intel. Zero exceptions.

  • Nate@programming.dev · ↑50 ↓2 · 4 days ago

    Sorry, but I don’t understand why this is a controversy. They have a 16GB model; if you need more than 8GB, go and buy that? They aren’t forcing your hand or limiting your options.

    • cows_are_underrated@feddit.org · ↑2 ↓5 · 4 days ago

      The problem is that games, especially the big new titles, are often badly optimised and quickly burn through your VRAM. So even if you could theoretically play a game with 8GB, you can’t, because the game is unoptimised.

    • Contramuffin@lemmy.world · ↑4 ↓16 · 4 days ago

      The big deal is that the vast majority of gamers aren’t techies. They don’t know to check VRAM. 8 GB is insufficient nowadays, and any company that sells an 8 GB card without making it obvious that it’s low-end is exploiting consumers’ lack of knowledge.

      • Nate@programming.dev · ↑17 · 4 days ago

        I run most games just fine with my 3070 8GB. While I would’ve preferred more when I bought it, it’s held up well.

        While the 9060 XT isn’t released yet, everything I’ve seen so far has made the difference between the variants pretty clear. I have no problem with offering a lesser SKU if the difference is clear. Not like Nvidia and their 1060 3GB and 6GB, where they also cut the cores and memory bandwidth. If these variants differ on release, my stance would be different.

        Also, gaming isn’t the only reason to have a GPU; I still use my 1060 in my server for transcoding and it works just fine. If I needed to replace it, or if I were building a new one from scratch, a 9060 XT 8GB or an Arc card would be a fine choice.

      • Diurnambule@jlai.lu · ↑8 · edited · 4 days ago

        What a load of crap… Only the first sentence is true. VRAM is usually printed in big letters on ad pages. Then, using alternative facts, you invent imaginary crimes…

  • who@feddit.org · ↑32 · 4 days ago

    Perhaps AMD could convince game developers/publishers to spend some time learning optimisation. Many of the popular games I’ve seen in the past 5 years have been embarrassingly bad at working with RAM and storage specs that were considered massive until relatively recently. The techniques for doing this well have existed for generations.

    • Detun3d@lemm.ee · ↑10 · 4 days ago

      Was about to comment on this. The problem isn’t a lack of beefier components but a lack of interest in using the available resources efficiently.

  • ryper@lemmy.ca · ↑30 ↓2 · edited · 4 days ago

    He said most people are playing at 1080p, and last month’s Steam survey had 55% of users with that as their primary display resolution, so he’s right about that. Setting aside what’s needed for the 4K monitors only 4.5% of users have as their primary display: is 8GB of VRAM really a problem at 1080p?

    • leave_it_blank@lemmy.world · ↑19 ↓2 · 4 days ago

      Absolutely. Why pay more if less is good enough?

      They are open about it, and give you the option to get more VRAM if you want it. Fine by me.

      No one with a 4K monitor will buy them anyway.

      • yeehaw@lemmy.ca · ↑3 ↓7 · 4 days ago

        Absolutely. Why pay more if less is good enough?

        Different problem, IMO.

    • AngryMob@lemmy.one · ↑9 · 4 days ago

      8GB of VRAM is definitely a problem even at 1080p. There are already benchmarks from various outlets showing this. Not in every game, of course, and it definitely hurts AAA titles more than others, but it will get worse with time, and people buy GPUs to last several years; a card shouldn’t already have a major issue on the day you buy it!

    • obsoleteacct@lemm.ee · ↑3 ↓3 · 4 days ago
      1. Why would they be buying a new card just to keep playing the way they’re already playing?
      2. What does the long-term trend line look like?

      You can confidently say that this is fine for most consumers today. There really isn’t a great argument that this will serve most consumers well for the next 3 to 5 years.

      It’s ok if well informed consumers are fine with a compromise for their use case.

      Misrepresenting the product category and misleading less informed consumers into believing it’s not a second-rate product in the current generation is deeply anti-consumer.

      • imecth@fedia.io · ↑5 · 4 days ago

        You can confidently say that this is fine for most consumers today. There really isn’t a great argument that this will serve most consumers well for the next 3 to 5 years.

        People have been saying that for years, and my 8GB card is chugging along just fine. The race to more VRAM that people were expecting just hasn’t happened. There’s little reason to move on from 1080p, and the 75+ million PS5s aren’t going anywhere anytime soon.

        • obsoleteacct@lemm.ee · ↑1 ↓1 · 3 days ago

          There’s no NEED to move on from 1080p, but there are a lot of reasons.

          I wouldn’t even object to his position on 1080p if he said “the RX 5700 is fine for most users, don’t waste your money and reduce e-waste”.

          He’s telling consumers that they should expect to pay a premium price to get a slightly-better-than-2019 experience until 2028 or 2030. There are gullible people who won’t understand that he’s selling snake oil.

          • imecth@fedia.io · ↑2 · 3 days ago

            He’s telling consumers that they should expect to pay a premium price

            What exactly do you expect him to say? Get the same GPU, but with more VRAM, at no extra cost?

            If you want more performance, you pay more; that’s the way it’s always worked. And he’s correct that 8GB is good enough for most people.

            • obsoleteacct@lemm.ee · ↑1 ↓1 · 3 days ago

              No, I expect him to gaslight naive consumers. Which is what he did. I just don’t get why others are defending it.

              In this case, at 1080p it’s barely more performance for a lot more money. And if we’re falling back on “good enough for most people” then an RX 5700 or 6600 is also “good enough for most people”.

              It’s a free market; he can sell a low-value product to suckers. That’s his right. You’re free to defend him and think it’s not scummy. But it is scummy, and hopefully most people who know better will call it out.

  • the_q@lemm.ee · ↑30 ↓8 · 4 days ago

    This really feels like AMD’s “don’t you guys have phones” moment.

    • Dnb@lemmy.dbzer0.com · ↑1 · 2 days ago

      Did you read the tweet? It said most people don’t need more, but that they offer a 16GB version if you do. That’s completely reasonable.

    • Lucy :3@feddit.org · ↑2 · 4 days ago

      A used 7800 XT for €500 for >60 Hz 4K, then

      (>60 Hz as in theoretically more than 60 Hz, but my monitor only supports 60 lol)

  • kemsat@lemmy.world · ↑2 · 4 days ago

    I’ve been playing Jedi Survivor off Game Pass, and the 12GB I have is getting maxed out. Wishing I’d had the extra money to get one of the 16GB GPUs.
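
    For anyone curious how to check this themselves, here is a minimal sketch of watching device-wide VRAM use from Python while a game is running; it assumes a PyTorch build that matches your GPU (CUDA for Nvidia, ROCm for AMD) and that the card is device 0.

    ```python
    # Polls free/total VRAM on GPU 0 and prints how much is currently in use.
    # Works with CUDA or ROCm builds of PyTorch; GPU index 0 is an assumption.
    import time
    import torch

    def log_vram(interval_s: float = 2.0) -> None:
        while True:
            free, total = torch.cuda.mem_get_info(0)  # bytes free / total on device 0
            used_gib = (total - free) / 2**30
            print(f"VRAM in use: {used_gib:.1f} / {total / 2**30:.1f} GiB")
            time.sleep(interval_s)

    if __name__ == "__main__":
        log_vram()
    ```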

  • Vik@lemmy.world · ↑4 ↓1 · 4 days ago

    I can agree that the tweet was completely unnecessary, and the naming is extremely unfair given both variants have the exact same brand name. Even their direct predecessor does not do this.

    The statement that AMD could easily sell the 16 GiB variant for 50 dollars less and that $300 gives “plenty of room” is wildly misleading, and from that I can tell they’ve not factored in the BOM (bill of materials) at all.

    They state flatly that GDDR6 is cheap, and I’m not sure how they figure that.

    • Dnb@lemmy.dbzer0.com · ↑1 · 2 days ago

      They’ve had the same SKU with different VRAM for a while: the RX 480 came in 4GB and 8GB.

      As long as the core counts and such are the same, it’s fine.

      If VRAM isn’t the bottleneck, performance should be equal.

      • Vik@lemmy.world · ↑1 · 2 days ago

        As some commenters have mentioned, that was mostly fine at the time of Ellesmere (2016-ish?), when games wouldn’t so frequently shoot past that limit. In today’s environment, a much higher proportion of games want more than 8 GiB of VRAM, even at lower resolutions.

        Notably, the most recent predecessor in this sort of segment (RX 7600 series) used the XT suffix to denote a different SKU to customers, though it’s worth mentioning that the XT was introduced quite a bit later in the RDNA3 product cycle.

  • Sheldan@lemmy.world · ↑2 · 4 days ago

    I agree, provided that means those options are cheaper. You can be a very active gamer and not need more; why pay more?

  • Red_October@lemmy.world · ↑4 ↓3 · 4 days ago

    At least we know Nvidia aren’t the only ones being shitty; they just lead the market in it.

  • IngeniousRocks (They/She) @lemmy.dbzer0.com · ↑5 ↓20 · 4 days ago

    AMD doesn’t know I self-host generative AI models.

    8GB is barely sufficient for my needs, and I often need multi-GPU modifications to deploy parts of my workflow to my smaller GPU, which lacks both the CUDA cores and the VRAM to make an appreciable difference. I’ve been searching for a used K80 in my price range to solve this problem.
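
    For a concrete idea of the kind of multi-GPU split I mean, here is a minimal sketch using Hugging Face transformers with accelerate to spread one model across a larger and a smaller card; the model ID and the per-card memory caps are placeholders, not my actual setup.

    ```python
    # Loads a causal language model sharded across two GPUs, capping how much
    # VRAM each card may hold. Model ID and memory limits are placeholder assumptions.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    model_id = "some/model-id"  # placeholder

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(
        model_id,
        torch_dtype=torch.float16,          # fp16 halves the VRAM footprint vs fp32
        device_map="auto",                  # let accelerate place layers across the GPUs
        max_memory={0: "7GiB", 1: "3GiB"},  # leave headroom on the 8GB card, spill the rest
    )

    inputs = tokenizer("Hello", return_tensors="pt").to(model.device)
    out = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(out[0], skip_special_tokens=True))
    ```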

      • IngeniousRocks (They/She) @lemmy.dbzer0.com · ↑5 ↓9 · 4 days ago

        I know I’m not, but that doesn’t mean gamers wouldn’t benefit from more VRAM as well.

        Just as an example, Nvidia’s MSAA implementation is borked if you’ve only got 8 gigs of VRAM, so all those new, super-pretty games need to have their graphics pipelines hijacked and the anti-aliasing replaced with older variants.

        Like, I’m not going to go around saying my use case is normal, but I also won’t delude myself into thinking the average gamer wouldn’t benefit from more VRAM as well.

        • KoalaUnknown@lemmy.world · ↑15 ↓2 · 4 days ago

          Sure, but if they want more VRAM, they can just buy the 16GB version. It doesn’t hurt to have a cheaper option.