NVIDIA FrameView download v0.9.4124.26691055

Videocards - Handy Utilities 86 Updated by Hilbert Hagedoorn

https://forums.guru3d.com/data/avatars/m/260/260048.jpg
I can bet the figures won't be biased towards Nvidia GPUs.
https://forums.guru3d.com/data/avatars/m/271/271684.jpg
cryohellinc:

I can bet the figures won't be biased towards Nvidia GPUs.
^This. I hope no serious reviewer even considers using this in their benchmarks.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
Vananovion:

^This. I hope no serious reviewer even considers using this in their benchmarks.
You're both clueless about what this tool actually is, and you haven't even tried it.
https://forums.guru3d.com/data/avatars/m/45/45709.jpg
Wait a minute - is this a wannabe replacement for Afterburner? 😕
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
DLD:

Wait a minute - is this a wannabe replacement for Afterburner? 😕
No. Remember how you used to use FRAPS and FCAT together to get frame measurements?
https://forums.guru3d.com/data/avatars/m/48/48799.jpg
More of an efficiency benchmark than a performance one. But really cool, definitely going to play around with it.
data/avatar/default/avatar36.webp
Power (watts) isn't working with my MSI GTX 1080 Ti Gaming X. What does the flickering "T" mean?
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
I've just done some testing on it, and it doesn't seem to be giving reliable results for my GPU. For a start it's not reading my GPU power right, but that's OK: I'm using a Gigabyte GTX 1070 vBIOS on a Zotac GTX 1070 card, and it reads power about 50W lower with the Gigabyte vBIOS. More importantly for the reliability of NVidia FrameView, it's displaying the average framerate accurately, but the 90th/95th/99th percentile framerates are well off and make no sense. I'll explain.

In BF1 I currently cap my framerate to 141 fps using the game engine's built-in config commands, and it's essentially always stable at 141 fps because I've lowered the graphics details. Yet FrameView shows the 90th percentile as 114 fps, the 95th as 109 fps, and the 99th as 98 fps, while also showing the average framerate as 141 fps. That's impossible if those percentiles are correct, because with 141 set as the maximum fps they would have dragged the average down. I tested with RTSS limiting the fps to 141 instead of the BF1 engine setting, and got the same strange results. I know it's stable at 141 fps in-game because the on-screen display shows 141 without fluctuation. However, the logs indicate max framerates in the region of 19157 fps, which is gobbledegook: this was all captured in a short session on an empty server, running around doing nothing intensive like shooting, while watching the displayed framerate to make sure it never deviated from the expected 141 fps. Then again, that might explain how it can still report a 141 fps average while the percentile framerates are all lower (e.g. 95th percentile at 109 fps).

I did wonder whether this application measures framerate so finely that it's picking up some kind of 'microstutter', but I don't see how that fits with those massively high max framerates when I have a G-Sync screen and a 141 fps limit. I think the program is bugged at the moment, unless using a different vBIOS has somehow confused the hell out of it! If I enable FrameView's on-screen overlay, it shows the same thing as the logs in real time: 90th, 95th, and 99th percentile framerates skewed below 141 fps, all while displaying the same 141 fps average. Must be bugged for me. By the way, the performance cost of monitoring is 0.5-1% lower CPU performance (on my 6700K @ 4.65GHz) and zero degradation in GPU performance, as measured by Firestrike results.

EDIT: reverting to my stock vBIOS and retesting. Same strange results on the stock vBIOS, so it's not related to that.
EDIT#2: here is the instruction manual for this software: https://www.nvidia.com/content/dam/en-zz/Solutions/GeForce/technologies/frameview/FrameView_Beta_User_Guide.pdf
EDIT#3: I did more testing with different settings in BF1, and in other games, to see whether this weird percentile fluctuation under a framerate lock appears elsewhere. It turns out some games fluctuate more than others: BF1 has the most, while Rise of the Tomb Raider, Far Cry 5, and Dirt Rally barely fluctuate at all. Remember, this is all with a locked and stable framerate according to 'normal' framerate monitoring tools. I'm not sure what to make of it; perhaps this software measures so fine-grained that it picks up the rapid frametime variance that the 'rolling average' framerate shown by most display tools would otherwise hide.

If that's the case, then perhaps BF1 is more prone to microstutter than, say, Rise of the Tomb Raider or Far Cry 5. I did more testing in BF1 to see whether removing the framerate cap reduced the variance: even standing still staring at a wall at a stable average of about 160 fps (with G-Sync off), there were still big variances in all the percentiles. The BF1 results still don't make sense to me on a mathematical and logical level, as I described above. We could do with some tech sites doing an in-depth investigation of this tool to see whether it's bugged or reporting reality.
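For what it's worth, the "impossible" combination described above (141 fps average, lower percentiles, absurd max) is mathematically possible if the average is computed as total frames over total wall time while percentiles come from the per-frame present times. A minimal synthetic sketch (not FrameView's actual code; the stutter pattern is invented for illustration) shows how split frame pairs, one near-instant present followed by one long one, leave the average untouched while dragging the 95th/99th percentiles down and producing a ~20000 fps "max":

```python
import numpy as np

# Synthetic present-to-present frametimes (ms) for a game capped at 141 fps.
# Most frames take 1000/141 ms, but every 10th pair is "split": one
# near-instant present (0.05 ms) followed by a long frame that absorbs the
# rest of the pair's time budget. Total time is unchanged, so the *average*
# fps still reads ~141 even though the high percentiles sag.
cap_ms = 1000.0 / 141.0
frames = []
for i in range(5000):
    if i % 10 == 0:
        frames += [0.05, 2 * cap_ms - 0.05]  # split pair, same total time
    else:
        frames.append(cap_ms)
ft = np.array(frames)

avg_fps = 1000.0 * len(ft) / ft.sum()   # frames divided by total seconds
max_fps = 1000.0 / ft.min()             # one tiny frametime -> huge "max fps"
p_fps = {p: 1000.0 / np.percentile(ft, p) for p in (90, 95, 99)}

print(f"avg: {avg_fps:.1f} fps, max: {max_fps:.0f} fps")
print({p: round(v, 1) for p, v in p_fps.items()})
```

In this contrived trace the average holds at exactly 141 fps and the max explodes to 20000 fps, while the 95th and 99th percentile framerates fall to around 71 fps, the same qualitative shape as the readings reported in the post.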
https://forums.guru3d.com/data/avatars/m/263/263710.jpg
Robbo9999:

I've just done some testing on it, and it doesn't seem to be giving reliable results for my GPU. [...] it's displaying the average framerate accurately, but the 90th/95th/99th percentile framerates are well off and make no sense. [...] Same strange results on the stock vBIOS, so it's not related to that.
To be honest, I think it is normal [in the above quoted message] because: 1. Modded: using a Gigabyte GTX 1070 vBIOS on a Zotac GTX 1070 card. 2. It's like having a swimming pool breeding fish while the statistics show zero Homo sapiens in it ;) ...so improvements are most welcome from NVIDIA (for example: temps, clock speeds, a frame-time graph/chart, etc.)
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
Caesar:

To be honest, I think it is normal [in the above quoted message] because: 1. Modded: using a Gigabyte GTX 1070 vBIOS on a Zotac GTX 1070 card. 2. It's like having a swimming pool breeding fish while the statistics show zero Homo sapiens in it ;)
I'm not with you, man; I don't understand what you mean. I made an edit to my post, as you can see (since you quoted it), and at the end of it you can see I get the same strange fps readings with the stock vBIOS, so it wasn't related to the vBIOS used.
https://forums.guru3d.com/data/avatars/m/263/263710.jpg
Robbo9999:

I'm not with you man
I'm with you!!! 🙂
https://forums.guru3d.com/data/avatars/m/56/56686.jpg
The power info just shows 0 for me, which was the only reason I downloaded this; I wanted to see how much power was actually being pulled.
https://forums.guru3d.com/data/avatars/m/245/245459.jpg
tsunami231:

the power info just show 0 for me, which was only reason i download this, I want to see how power was actual being pulled
If you're looking at the log files in a spreadsheet (rather than the on-screen overlay), then for me the column titles were out of whack with the data underneath: the titles had to be shifted over one or two columns to match the correct data. So perhaps power isn't reading zero for you; you might just need to "move the columns over". It reads power correctly for me in the on-screen overlay.

EDIT: ignore what I said there. Make sure you use only the comma as the separator when you load the logged data into the spreadsheet; that will align all the columns correctly.

EDIT#2: found the manual for this software (https://www.nvidia.com/content/dam/en-zz/Solutions/GeForce/technologies/frameview/FrameView_Beta_User_Guide.pdf). You might need to run the Install.bat inside the folder if you haven't already done so: the guide says Power Information might not be shown if you don't run Install.bat, and in fact it says to run it "again" if you still don't have Power Information.
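The column-drift problem above is what happens when a spreadsheet guesses the wrong separator. If you'd rather parse the log programmatically, a minimal sketch with an explicit comma delimiter avoids it entirely. The column names below are made up for the example; check the header row of your own log file:

```python
import csv
import io

# Stand-in for a FrameView-style CSV log. Real logs have different (and
# more) columns; these names are illustrative only.
sample_log = io.StringIO(
    "Application,TimeInSeconds,MsBetweenPresents,GPUPower(W)\n"
    "bf1.exe,0.007,7.09,155.2\n"
    "bf1.exe,0.014,7.10,156.0\n"
)

# DictReader with an explicit delimiter keys every value to its header,
# so columns cannot drift the way they do with a guessed separator.
with sample_log as f:
    rows = list(csv.DictReader(f, delimiter=","))

for row in rows:
    print(row["Application"], float(row["MsBetweenPresents"]),
          float(row["GPUPower(W)"]))
```

For a real file you would swap the `io.StringIO` for `open("frameview_log.csv", newline="")` and adjust the column names to match the header.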
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
DW75:

Nvidia will be sure to include their telemetry spyware too.
Only the thoroughly detached from reality think driver telemetry is spyware. But no, there is none in FrameView.
https://forums.guru3d.com/data/avatars/m/225/225084.jpg
When does the app write to the telemetry log? Is it written only when you use the benchmark tool, or is it constantly writing to the log?
data/avatar/default/avatar15.webp
Ah, just what we need: a GameWorks version of a benchmark utility. It even comes with the standard anti-AMD shiv.
https://forums.guru3d.com/data/avatars/m/273/273678.jpg
holler:

ah just what we need; a gameworks version of a benchmark utility. even comes with the standard anti-AMD shiv.
It works on AMD just fine. You didn't even look at it or its documentation; you came into this thread just to write some slag.
https://forums.guru3d.com/data/avatars/m/242/242573.jpg
Astyanax:

it works on amd just fine, you didn't even look at it or its documentation and come in this thread just to write some slag.
You pretty much hit the nail right on the head. It's like some people have nVidia derangement syndrome and just make crap up in order to feel better about having second rate hardware.
DW75:

Nvidia will be sure to include their telemetry spyware too.
You act as if this is an nVidia thing. https://www.amd.com/en/corporate/privacy