It seems AMD might be shifting its focus towards the mid-range segment (once again), with the introduction of a new successor to the RDNA3 architecture. This move would bring AMD back to its roots, when the company prioritized the mid-range segment with RDNA1/Polaris.

  • Navarian@lemm.ee · 1 year ago

    This sounds like a win for the majority of consumers. The GPU market has lately been overly focused on the high end. AMD muscled their way back into the CPU market by providing quality, affordable units that rivalled anything competitors had on the market; they could absolutely do the same thing in the GPU market given the opportunity.

    • mustardmanOP · 1 year ago

      High-end cards are also in an arms race for performance at the expense of power consumption, whereas “mid-range” is still pretty powerful. The low-end RX 6600 from 2021 matches the NVIDIA GTX 1080 that came out five years before it, and it still performs well at FHD (1080p) in more modern games.

      In this article, AMD mentioned they didn’t make a competitor to the 4090 since it would have been a power and cost hog:

      “Technically, it is possible to develop a GPU with specs that compete with theirs (NVIDIA). However, a GPU developed that way would have come to market as a graphics card with a TDP (thermal design power) of 600W and a reference price of $1,600 (about 219,000 yen). Would general PC gaming fans accept that? After thinking about it, we chose not to adopt such a strategy.”

      When this article launched, the sentiment on Reddit was that it was “copium” from AMD, but it sounds like they are owning the fact that they couldn’t scale price and power far enough to compete with NVIDIA. Maybe AMD’s definition of “mid-range” will simply mean the modest generation-over-generation performance increases we used to see. The high-end GPU market reminds me of the frequency war Intel waged with the NetBurst architecture: high performance at the expense of high power consumption. That ended when Intel went back to (initially) slower clock speeds with the Core era.