GPU Meter

Rohan Registered User regular
I recently got a Logitech G15 keyboard, and when I'm running a game that doesn't have a program to communicate between the game and the keyboard, I usually run the Logitech Performance Monitor. While playing Crysis recently, I noticed that both CPU cores were maxing out around 95-96%, while memory usage was relatively stable at around 73%. That's when I realized I've never seen a similar performance meter for a GPU, from any graphics card. Not that it's an essential tool; I only run the Logitech Performance Monitor on my G15 out of curiosity anyway. Why do you think we haven't seen a GPU meter, for example in benchmarking software such as SiSoft Sandra? It's odd that these programs will measure even hard drive performance, yet there's no similar measurement of the GPU.

...and I thought of how all those people died, and what a good death that is. That nobody can blame you for it, because everyone else died along with you, and it is the fault of none, save those who did the killing.

Nothing's forgotten, nothing is ever forgotten
Rohan on

Posts

  • Gihgehls Registered User regular
    edited March 2008
    Maybe GPU load is a harder thing to measure than CPU load?

    I found this while digging.
    The CPU percentage is the amount of a time interval (that is, the sampling interval) that the system's processes were found to be active on the CPU. If top (or task manager) reports that your program is taking 45% CPU, 45% of the samples taken by top found your process active on the CPU. The rest of the time your application was in a wait. (It is important to remember that a CPU is a discrete state machine. It really can be at only 100%, executing an instruction, or at 0%, waiting for something to do. There is no such thing as using 45% of a CPU. The CPU percentage is a function of time.)

    I'm no graphics programmer, but it seems to me that generally the GPU is going to be at 100% (by that metric) since it's such a specialized processor.

    Gihgehls on
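    The sampling scheme described in that quote can be sketched in a few lines of Python. This is a toy illustration, not anything top or Task Manager actually runs: the busy/idle probe is a simulated coin flip standing in for "was the process on the CPU at this instant?".

```python
import random

def sample_utilization(is_busy, samples=1000):
    """Estimate utilization the way top does: poll a discrete
    busy/idle state and report the fraction of samples that
    caught the processor executing (1) rather than waiting (0)."""
    hits = sum(1 for _ in range(samples) if is_busy())
    return 100.0 * hits / samples

# A fake process that is on the CPU about 45% of the time.
busy_45 = lambda: random.random() < 0.45

print(round(sample_utilization(busy_45, samples=100_000)))  # ~45
```

    At any single instant the processor really is at either 0% or 100%; the familiar percentage only falls out of averaging over the sampling interval, exactly as the quote says.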
  • Azio Registered User regular
    edited March 2008
    GPUs work on a different and more specialized category of calculations than CPUs, and it's not simply a matter of measuring how hard one is working. It's all very complicated.

    Azio on
  • ArcticMonkey Registered User regular
    edited March 2008
    When a GPU has rendered a frame it immediately starts rendering the next, so GPU load measured the way CPU load is measured would always read 100%, unless rendering is locked to vsync (so the frame rate cannot exceed the screen's refresh rate).

    The most meaningful measure of GPU performance is frames per second.

    ArcticMonkey on
    "You read it! You can't unread it!"
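    Measuring frames per second is straightforward to sketch. This is a hypothetical illustration: `render_frame` is a stand-in for a real draw call, here simulated with a short sleep.

```python
import time

def measure_fps(render_frame, duration=1.0):
    """Call render_frame() in a tight loop for `duration` seconds
    and report the average frame rate achieved."""
    frames = 0
    start = time.perf_counter()
    while time.perf_counter() - start < duration:
        render_frame()
        frames += 1
    return frames / (time.perf_counter() - start)

# Simulate ~4 ms of rendering work per frame; with vsync off the
# loop runs flat out, which is why the GPU itself sits at "100%".
fps = measure_fps(lambda: time.sleep(0.004), duration=0.5)
print(f"{fps:.0f} fps")
```

    With vsync on, the loop would instead block until the next screen refresh, capping the frame rate at the refresh rate, which is the one exception the post above calls out.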
  • darkgrue Registered User regular
    edited March 2008
    ArcticMonkey wrote: »
    The most meaningful measure of GPU performance is frames per second.

    Agreed. In which case, what the OP most likely wants is FRAPS, which has built-in support for outputting its statistics to a G15 display.

    darkgrue on
  • Rook Registered User regular
    edited March 2008
    I would have thought temperature would be a better measure of how hard it's working. But essentially, the GPU is best thought of as a ridiculously multi-core processor, and exactly how useful would a CPU utilization chart be for a 128-core processor?

    Rook on
  • darkgrue Registered User regular
    edited March 2008
    Rook wrote: »
    I would have thought temperature would be a better measure of how hard it's working.

    Temperature doesn't correlate tightly enough with utilization. There are basically three or four temperature states on a GPU:
    1. Off (no power)
    2. Idle (reading Penny Arcade)
    3. 3D (playing WoW or Crysis)
    4. Overdrive meltdown (running some nonsensical synthetic benchmark designed to fully load the chip)

    Trying to divide it any more finely than that will drop below the noise floor of the cheap thermal diodes and A/D converters used to measure temperatures in commodity computing devices. And even with an extremely high-resolution temperature reading, you still wouldn't get much more resolution than that, since processor load changes much faster than the chip can shed or gain heat.
    Rook wrote: »
    But essentially, the GPU is best thought of as a ridiculously multi-core processor. And exactly how useful would a CPU utilization chart be for a 128-core processor?

    Pipelines != cores (CPUs have pipelines too), so your metaphor fails (although it really fails on account of not being a metaphor, let alone one involving automobiles).

    However, you are correct that GPU utilization would really need to be reported at the pipeline level, and the chips just don't support it, mostly because doing so would eat die space that's needed to actually do the things people buy video cards for, all to support a relatively useless feature that merely reports how busy the card is.

    darkgrue on