|
Post by john5 on Jan 11, 2019 21:05:37 GMT
I thought I'd try out a custom fan control setting to see how it affects gaming performance. The setup:
- Use Macs Fan Control to set max rpm (2500, I think) in macOS, then reboot to Windows 10
- Benchmark 20 minutes of a BO4 Zombies replay using the MSI Afterburner overlay
- Do the same without custom fan control settings

Results:
- With maxed-out fans, min/avg/max fps were 99/148/228. The 1% and 0.1% lows were 106/90.
- With default fan speeds, the results were 100/149/232 and 105/89.
So, effectively the same. The GPU temp was about 89 degrees with maxed-out fans and 92 with the default profile, so the custom fan speed ran only slightly cooler.
Disclaimer: in regular gameplay I have noticed temps as high as 97 degrees, which wasn't replicated in this test. Although, I assume when the GPU is that hot, the fans are maxed anyway.
My thoughts: there doesn't seem to be a performance benefit with custom controls, but there are costs - louder computer when not stressed, hassle of booting to MacOS first, and perhaps wear and tear on the fan? I'm sure under moderate load the custom controls reduce temps, but I'm not sure if that matters. The iMac Pro, as far as I know, has a good track record for reliability, and the 3 year warranty is pretty cheap compared to the cost of the machine.
|
|
|
Post by alkar on Jan 12, 2019 14:12:07 GMT
Thanks for this. The GPU temp warning in game is irritating me, and if it's > 85 it can't be disabled, which is really lame...
Also, do you know of any performance tricks for this game besides decreasing resolution? I get 60fps at 5K but with scaling at 60% (with a Vega 56); I wish I could increase scaling a little more without sacrificing details that much. Also, I think it's better to have the highest FSAA at this point, since it dramatically improves visual quality at lower resolutions.
|
|
|
Post by john5 on Jan 12, 2019 18:21:32 GMT
Yeah, the GPU temp warning bugs me too. Trying to get rid of it was part of my motivation for this test.
> Also do you know of any performance trick for this game except decreasing resolution
I think lowering shadows as much as possible helped the most for me. I'm using an external 1080p/240 monitor, but if I were using the builtin monitor I'd consider running 1440p (exactly 50% linear resolution).
|
|
|
Post by Mat HD on Jan 13, 2019 9:16:10 GMT
> Also do you know of any performance trick for this game except decreasing resolution ? 60fps in 5K but scaling to 60% (with Vega 56), wish i could increase scaling a little more without sacrificing details that much.

You might be better off using 4K resolution rather than 5K, disabling anti-aliasing and vsync, and instead setting an fps limit of 59 in the AMD settings. That should help max out the fps a bit better.
|
|
|
Post by mike on Jan 13, 2019 18:28:39 GMT
Well, whether you have a Vega Pro 64 or a Vega Pro 56 also plays a role here; I assume the Vega 64 runs hotter. Either way, with the more recent Adrenalin drivers (except for 18.12.3), the GPU temps have plummeted. I only wish we could overclock now, because there is A LOT of thermal headroom available.
Using the stock fan profile during gaming will create frequent, large temperature fluctuations. Running at a constant high temperature (within spec) is actually not a big deal, but lots of big temperature swings over time put considerably more stress on the system. IMO, to be on the safe side, just set the fan profile to a static 1800rpm in macOS (Macs Fan Control) if you're playing the newest games, and then boot into Windows. I never go above the mid-to-high 80s Celsius on the GPU, and at 1800rpm the fans are nearly inaudible with normal moderate sound volume from your speakers.
This is with a Vega 64 and the 8-core CPU btw; you can probably use a slightly lower static fan speed with a Vega 56.
|
|
|
Post by john5 on Jan 13, 2019 18:43:24 GMT
Is there a technical reason to use a 59 fps cap? I would have thought 60 on a 60hz monitor would provide a smoother experience.
At least in theory, a 59 fps cap on a 60hz monitor would result in one skipped frame per second. That is, every second, one rendered frame will display for 1/30s instead of 1/60s, causing stutter.
I’m discounting screen tearing, because hopefully with a 60fps cap, the machine is at least powerful enough to render each frame in, say, 1/70s, and display each frame without tearing.
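If it helps, the skipped-frame arithmetic is easy to check with a quick Python sketch (my own toy model of vsync, not anything measured on the iMac Pro): frames finish every 1/cap seconds, and each 60hz refresh shows a new frame only if one finished in time.

```python
# Toy vsync model: frames finish every 1/cap_fps seconds; each 60hz
# refresh displays the next finished frame, or repeats the old one.
def repeated_refreshes(cap_fps, refresh_hz=60, seconds=1):
    frame_times = [i / cap_fps for i in range(int(cap_fps * seconds))]
    next_frame = 0
    repeats = 0
    for tick in range(refresh_hz * seconds):
        deadline = tick / refresh_hz
        consumed = 0
        # consume every frame that finished by this refresh
        while next_frame < len(frame_times) and frame_times[next_frame] <= deadline:
            next_frame += 1
            consumed += 1
        if consumed == 0:
            repeats += 1
    return repeats

print(repeated_refreshes(59))  # 1 -> one repeated (stuttered) refresh per second
print(repeated_refreshes(60))  # 0 -> every refresh gets a fresh frame
```

So the model agrees with the back-of-envelope math: exactly one doubled frame per second at a 59 cap, none at 60.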
|
|
|
Post by john5 on Jan 13, 2019 19:02:08 GMT
I think the temperature will depend on whether, with a particular game and settings, performance is GPU bound, CPU bound, or neither (artificially capped). If GPU bound, the Vegas will run hot, which is the case with BO4 at uncapped frame rates.
I have no doubt that for games where the GPU isn't so stressed, forcing higher fan speeds will lower temps and be better for the GPU long term, and it's possible that Apple's profile optimizes for quiet fans over hardware longevity. But how much should I care? Is it worth the effort? Have there been any reports of GPU failures in iMac Pros?
I know Apple has a bad track record with GPUs in MacBooks, iMacs, and the trash can Mac Pro, but the iMac Pro seems different/better, so far.
Note that I’m not trying to convince anyone to burn out their GPU, but I thought it was an interesting topic to discuss, especially to see if anyone has any hard evidence to justify the extra effort.
|
|
|
Post by mike on Jan 14, 2019 19:49:06 GMT
> Is there a technical reason to use a 59 fps cap? I would have thought 60 on a 60hz monitor would provide a smoother experience.

A 59 fps cap on a 60hz monitor is supposed to give less lag when using triple-buffered vsync (which is a must IMO on a regular 60hz monitor). I remember experimenting with this 6-7 years ago, and it felt like the 59 fps cap gave less lag, so that's what I've been using ever since...
|
|
|
Post by Mat HD on Jan 15, 2019 7:46:12 GMT
> At least in theory, a 59 fps cap on a 60hz monitor would result in one skipped frame per second. That is, every second, one rendered frame will display for 1/30s instead of 1/60s, causing stutter.

One skipped frame isn't noticeable to the human eye, and it actually helps reduce screen tearing too when not using vsync.
|
|
|
Post by john5 on Jan 15, 2019 18:21:20 GMT
> 59 fps cap on 60hz monitor is supposed to give less lag when using triple buffered vsync (which is a must IMO on regular 60hz monitor). I remember experimenting with this 6-7 years ago, and it felt like 59 fps cap gave less lag, so that's what I've been using ever since...
Thanks, that's interesting. So from googling it seems there is a problem with DirectX wanting to display the _oldest_ buffered frame instead of discarding it when there is a newer frame available. I found some discussion that a 60 fps cap may work as well, but I suppose that is more risky since if you do wind up with an extra buffered frame, it may never clear out.
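To convince myself, I hacked together a toy model of that buffering behaviour (purely my own sketch; the buffer depth and "backlog" numbers are made up, not anything DirectX documents): if a couple of extra frames ever get queued, a 60 fps cap keeps them queued forever, while a 59 fps cap drains one frame per second until latency settles at a single refresh.

```python
def residual_latency(cap_fps, refresh_hz=60, depth=3, backlog=2, seconds=5):
    """Toy swap-chain model. `backlog` frames are already queued (say after a
    loading hitch); each refresh displays the oldest queued frame, and the
    renderer adds cap_fps/refresh_hz frames per refresh, up to `depth`."""
    queued = backlog
    produced = 0.0
    latency = 0.0
    for _ in range(refresh_hz * seconds):
        produced += cap_fps / refresh_hz
        while produced >= 1.0 and queued < depth:
            produced -= 1.0
            queued += 1
        if queued:
            latency = queued / refresh_hz  # oldest frame is ~`queued` refreshes old
            queued -= 1
    return latency

print(residual_latency(60))  # 0.05 -> stuck at 3 refreshes of lag, forever
print(residual_latency(59))  # ~0.0167 -> the backlog drains to 1 refresh of lag
```

Crude, but it matches the intuition: capping below the refresh rate guarantees any accumulated frames eventually clear out, while capping at exactly 60 just preserves whatever lag you picked up.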
> One skipped frame isn't noticeable to the human eye and it actually helps to reduce screen tearing too when not using Vsync.
IDK. With a 59 fps cap and a fast computer, latency will also vary from frame to frame (maximum latency right after the skipped frame, then slightly less latency on each successive frame until the next skip, once every second). I've noticed really bad stutter with a game that had a 70fps cap for this reason, even though no frames were totally missed at 60hz.
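The uneven pacing part can be shown with a couple of lines of integer arithmetic (again, just my own sketch): take the index of the newest finished frame at each 60hz refresh and look at the step sizes between refreshes.

```python
def content_steps(cap_fps, refresh_hz=60, seconds=1):
    # index of the newest finished frame at each refresh (exact integer math)
    shown = [(tick * cap_fps) // refresh_hz for tick in range(refresh_hz * seconds)]
    return [b - a for a, b in zip(shown, shown[1:])]

print(set(content_steps(60)))  # {1}    -> motion advances evenly every refresh
print(set(content_steps(59)))  # {0, 1} -> one repeated step per second (stutter)
print(set(content_steps(70)))  # {1, 2} -> alternating step sizes (judder)
```

That {1, 2} pattern at a 70 cap is the judder I was describing: no refresh misses a frame, but the animation advances unevenly.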
Re screen tearing, that makes sense since it will help the tear location shift, but I've found that without sync, uncapped is best for that because 1. the tear location becomes even more random, and 2. if the rendered fps is high, the difference in the image above and below the tear point will be smaller (e.g. maybe a 1/120s image difference rather than ~1/60s).
So, Mike's point was that 59 fps cap + vsync-60 is useful for latency, but it's not clear to me from this discussion that the cap is worth it _without_ vsync. (Mat HD's recommendation is to disable vsync, or did I misread that?)
|
|
|
Post by mike on Jan 15, 2019 18:44:34 GMT
An fps cap is very useful either way, because then the GPU doesn't have to work harder than necessary, i.e. producing more frames than the monitor can display. Anything over 60fps only causes unnecessary heat, not to mention possible coil whine in game menus, where the fps can run into the several hundreds.
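For anyone curious how a limiter like that works under the hood, it's basically a sleep filling out each frame's time budget. Here's a minimal Python sketch (my own illustration, not how the AMD driver actually implements it):

```python
import time

def run_capped(render_frame, cap_fps=59, frames=30):
    # Frame limiter sketch: after rendering, sleep away whatever is left of
    # the frame's time budget so the GPU idles instead of racing ahead.
    budget = 1.0 / cap_fps
    start = time.perf_counter()
    for _ in range(frames):
        frame_start = time.perf_counter()
        render_frame()  # stand-in for the real rendering work
        elapsed = time.perf_counter() - frame_start
        if elapsed < budget:
            time.sleep(budget - elapsed)
    total = time.perf_counter() - start
    return frames / total  # achieved fps

fps = run_capped(lambda: None)  # a trivially cheap "frame"
print(round(fps, 1))  # at or just below the 59 fps cap, never meaningfully above
```

Sleep granularity is the catch: on some systems `time.sleep` overshoots by a few milliseconds, which is why driver-level limiters tend to busy-wait near the deadline instead of sleeping right up to it.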
|
|
|
Post by john5 on Jan 15, 2019 18:58:51 GMT
> An fps cap is very useful either way, because then the GPU doesn't have to work harder than necessary, ie. producing more fps than what the monitor can display.

But from the discussion so far, it seems your way is the best way: a cap + vsync. Without vsync, there is at least _some_ justification for going uncapped (tearing).
|
|