Post by starprince on Dec 19, 2020 3:48:54 GMT
I've been having an issue with Gears of War 5 that leads me to believe there is untapped potential, or an issue, with the Vega64 card. This happens with Matt's August, November and December Red drivers, in addition to the recent official Apple + AMD 20.10 drivers.
Computer and drivers: 8 core iMac Pro with Vega64. Apple + AMD 20.10 drivers without the Radeon software installed.
Monitoring: Using MSI Afterburner and RivaTuner Statistics Server.
Settings: Ultra at 1440p. (More on the settings in a moment.)
The problem: This game will run flawlessly at Ultra settings for a minute or two, then experience sudden, momentary core clock drops below 200 MHz with FPS drops down to 10 fps. Jarring, to say the least. This can even happen on the home screen after a minute or two.
What's happening: When Gears 5 starts, the GPU core clock runs up to 674 MHz, the temp runs between 98C and 100C, and GPU usage sits at 97-98%. During the initial cut scene, the core clock understandably varies but generally remains high. The first few drops usually happen near the end of the first cut scene, even though nothing suggests the card is being taxed harder there than at any other point in the scene. Once the gameplay starts, the core clock tends to stay around 674 MHz with some dips to 633 MHz. After each of these momentary core clock and fps drops, the core clock climbs right back up to 674 MHz.
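For anyone else trying to pin down when these drops happen, Afterburner/RTSS can export a monitoring log, and a few lines of Python can flag the moments where the clock and frame rate both crater. This is just a sketch: it assumes the log has been exported (or massaged) into a simple CSV with time, core clock, and fps columns, since the real Afterburner .hml layout may differ.

```python
# Sketch: flag the momentary drops described above (core clock under
# 200 MHz together with fps under 30) in a monitoring log.
# Assumes a simple CSV layout -- the actual Afterburner/RTSS export
# format may need adapting before this parses it.
import csv
import io

# Stand-in for a real exported log file (hypothetical sample data).
SAMPLE_LOG = """\
time,core_mhz,fps
0.0,674,60
1.0,674,60
2.0,633,60
3.0,180,10
4.0,674,60
"""

def find_drops(rows, clock_floor=200.0, fps_floor=30.0):
    """Return (time, core_mhz, fps) tuples where both the core clock
    and the frame rate fall below the given floors at the same time."""
    drops = []
    for row in rows:
        mhz = float(row["core_mhz"])
        fps = float(row["fps"])
        if mhz < clock_floor and fps < fps_floor:
            drops.append((float(row["time"]), mhz, fps))
    return drops

rows = list(csv.DictReader(io.StringIO(SAMPLE_LOG)))
print(find_drops(rows))  # -> [(3.0, 180.0, 10.0)]
```

With a long enough capture, clustering the flagged timestamps would show whether the drops line up with cut-scene transitions or happen at fixed intervals, which might help tell a thermal/power-limit trip from a driver bug.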
What's weird: Changing the video settings doesn't make a difference. Normally if a game has fps drops, you can adjust the resolution, shadows, textures, etc. In this game, 1440p at Ultra or at Medium will result in the same core clock speeds (full screen with Vsync on) and high GPU temperatures.
The solution: This is where changing the video settings becomes weird. I'm not sure exactly what triggers the game to work properly, but switching multiple times between resolutions, presets (Low to Recommended, i.e. Ultra), Vsync on/off, and fps limits eventually forces the GPU core clock to settle around 599 MHz, with the temperature stable between 94C and 95C. Even if GPU usage climbs after that, the game stays stable.
Settings after stable core clock: Once the core clock has been triggered to run at 599 MHz, the game runs at a flawless 60 fps with settings at Ultra on 1440p, Ultra/High on 3200 x 1800, an Ultra/High/Med mix at 4K, and even a solid 30 fps at 5K with a High/Med/Low mix.
Questions: What is happening here? Why does the game suddenly work after several minutes of switching resolution, presets, Vsync, etc.? What causes this particular game to settle into a stable GPU core clock speed? Is this an issue that Apple and AMD, or you, Matt, can possibly address? I'm finding this year that games, when pushed hard, won't just drop a few fps; they massively drop below 30 fps and then recover, although adjusting shadows or resolution normally fixes these problems for most games. (Incidentally, Shadow of the Tomb Raider used to run at 3200x but has had issues at 60 fps since the May update.)
I'm left wondering if this is an issue with the Vega64 card, drivers not being optimized appropriately for this card, my card specifically, or something else. I'd be happy for any thoughts on this.