Greater than 60 fps?

ravager

New member
Jul 20, 2012
100
0
I know the difference between refresh rate and actual rendered frames, bro. :) I was talking about the frames per second. "hz" just means frequency per second, sorry if my unit of choice confused you.

"Frequency per second" is a little redundant, since frequency is already cycles per second, represented by Hz.

But more importantly, what is the story behind that hat?
 

Sumez

New member
Nov 19, 2012
985
0
A frequency can be measured per hour, day, year, millennium, anything you want. I'm pretty sure hz is always per second. :) But we're getting sidetracked here anyway, it's not an important detail. :)
 

Deathtickle

New member
Dec 10, 2012
28
0
Having two 60 Hz (32"), one 120 Hz (42"), and one 240 Hz (55") set all in regular use, I must say that when watching fast-paced sports like hockey and football the 240 Hz set looks much smoother. Of course, factors other than refresh rate come into play in the viewing experience. I had a 55" plasma that I gave to my sister after I had kids, and sports looked smooth on that as well.
 

Timelord

Member
Oct 29, 2012
543
0
60 FPS is adequate for smooth motion viewing. This is because roughly 1/30 of a second is the point at which the human nervous system blends sequential still images into smooth motion. No motion blur is required; that is an artifact of the camera. Perfectly focused stills will look wonderfully fluid at 60 fps. This is also why 48 FPS on progressive-scan devices performs adequately. This is great for watching movies and the like, but there's a lot more going on with video games.

With video games there are two possibilities:
1) the game is synced to the display frame rate, or
2) the underlying game engine runs independently from the display rate, and the display routine is called at specified time intervals (see the rough sketch below).
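
Roughly what I mean by option 2, as a generic fixed-timestep loop in C++. This is just a sketch of mine, not PA's code; the names (running, updateSimulation, renderFrame) and the 120 Hz step are placeholders I made up for illustration.

Code:
#include <chrono>
#include <cstdio>

// Placeholder hooks -- not from any real engine, just here so the sketch compiles.
static int framesLeft = 300;                         // pretend the game ends after 300 frames
static bool running()                   { return framesLeft-- > 0; }
static void updateSimulation(double dt) { (void)dt; /* advance physics/AI by dt seconds */ }
static void renderFrame()               { /* draw current state; vsync may block here */ }

int main()
{
    using clock = std::chrono::steady_clock;
    const double dt = 1.0 / 120.0;    // fixed simulation step, independent of the display
    double accumulator = 0.0;
    auto previous = clock::now();

    while (running())
    {
        auto now = clock::now();
        accumulator += std::chrono::duration<double>(now - previous).count();
        previous = now;

        // Advance the engine in fixed steps; the display rate never dictates game time.
        while (accumulator >= dt)
        {
            updateSimulation(dt);
            accumulator -= dt;
        }

        // Render as often as we can; vsync (if enabled) caps this at the monitor refresh.
        renderFrame();
    }
    std::printf("done\n");
    return 0;
}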

I very much doubt that PA is synced to the display, but I can't be certain without the source code. Some of the games I programmed were hardware-synced (notably arcade machines), but most were not. I would be very surprised if any modern non-networked game still uses that type of synchronization.


If the game engine is synced to the display rate, then even at 60 fps user input is only sampled once per frame interval (1/60 s, about 16.7 ms), so user-action latency is at best a frame or two and in practice several frames (a quick back-of-the-envelope is below). The only reason one needed 60 fps at all was for interlaced displays, in order to update the even and odd fields, which were rasterized separately. 30 fps on an interlaced display would still blur adequately, but would cause eye fatigue in some individuals.
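
Quick back-of-the-envelope on that latency, with numbers I'm assuming rather than measuring from PA (60 Hz refresh, input sampled on one refresh and displayed on the next):

Code:
#include <cstdio>

// Rough latency arithmetic for a display-synced engine. These are my own
// assumptions, not measurements: 60 Hz refresh, input sampled on one
// refresh and the result displayed on the next (best case, two frames).
int main()
{
    const double refreshHz      = 60.0;
    const double frameMs        = 1000.0 / refreshHz;    // ~16.7 ms per refresh
    const int    pipelineFrames = 2;                      // sample + display, best case

    std::printf("frame interval: %.1f ms\n", frameMs);
    std::printf("best-case input latency: %.1f ms\n", pipelineFrames * frameMs);
    return 0;
}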


If a non-synchronized game falls below the display refresh rate, then it is CPU (or GPU) starved, which will increase user-interface latency and ruin the illusion of immersion, a cardinal sin in game programming. (A rough way to spot this at runtime is sketched below.)
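
And here's roughly how you'd spot that starvation at runtime. Again, illustrative C++ of mine, not PA's code; the 60 Hz target and the 1.5x threshold are arbitrary assumptions.

Code:
#include <chrono>
#include <cstdio>

// Crude starvation check: if the measured frame time keeps exceeding the
// display's refresh interval, frames are being dropped and input latency grows.
int main()
{
    using clock = std::chrono::steady_clock;
    const double refreshInterval = 1.0 / 60.0;   // ~16.7 ms per display refresh (assumed)
    auto last = clock::now();

    for (int frame = 0; frame < 600; ++frame)
    {
        // ... update + render would happen here ...

        auto now = clock::now();
        double frameTime = std::chrono::duration<double>(now - last).count();
        last = now;

        if (frameTime > refreshInterval * 1.5)
            std::printf("frame %d took %.1f ms -- CPU/GPU starved, latency rising\n",
                        frame, frameTime * 1000.0);
    }
    return 0;
}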

Timelord ...
 

ravager

New member
Jul 20, 2012
100
0
A frequency can be measured per hour, day, year, millennium, anything you want. I'm pretty sure hz is always per second. :) But we're getting sidetracked here anyway, it's not an important detail. :)

Yes, but we were referring to Hz (cycles/sec)! "Hz per second" makes no sense in the first place. You are right about frequency; it can be defined over any temporal interval that the cycle lives in (e.g. the lunar cycle, the menstrual cycle, the annual mean temperature cycle in the northern hemisphere, etc.).

But you are right, we are just getting pedantic. I guess the real point is that the 60 fps rendering rate is independent of the 60 Hz refresh rate.
 

brakel

New member
Apr 27, 2012
2,305
1
I know the difference between refresh rate and actual rendered frames, bro. :) I was talking about the frames per second. "hz" just means frequency per second, sorry if my unit of choice confused you.

I was not confused. The way you phrased your post, I could not tell whether you knew the difference or not, and it did not matter. The two are easily confused, and I thought it would be helpful to this discussion to mention the difference. Bro. :)
 

brakel

New member
Apr 27, 2012
2,305
1
To answer the OP's question, it is a combination of human perception, technical conventions, and hardware limitations, although the technical conventions were themselves based on human perception within hardware limitations. The television specifications in the US were decided back in the 1940s and weren't changed until the HD specs were added in the 1990s, except for the adoption of the color TV specs in the late 1950s. They settled back in the '40s on a frame rate of just under 30 Hz, but we call it 30 Hz for convenience. All other specs followed that lead. It wasn't until the rise of the personal computer that most people ever looked at a CRT that wasn't 480i running at 30 Hz. It was so much a convention that most of us in the US didn't know we were watching 480i at 30 Hz! It was just TV.
 

Metalzoic

New member
Jun 8, 2012
907
0
I'm just praying for a return to the 60 Hz standard this generation of consoles; at least on the PS4 this should be possible.

A return to 60 as the standard? 60 was never a standard for consoles; the majority of console games have always run at about 30 fps (or less), and it was mostly shooters and fighting games that aspired to 60. I do agree with you, though, that I would like the next gen to make 60 the standard they aim for from now on.
 

Sumez

New member
Nov 19, 2012
985
0
Sure, 60 was a standard, but it's been a long while, and it's mostly something you saw in Japanese games. Western developers didn't give frame rate as high a priority, and still don't.
Except id. Rage is a beautiful 60 frames per second, and the difference is obvious!
 

ravager

New member
Jul 20, 2012
100
0
Sure, 60 was a standard, but it's been a long while, and it's mostly something you saw in Japanese games. Western developers didn't give frame rate as high a priority, and still don't.
Except id. Rage is a beautiful 60 frames per second, and the difference is obvious!

Id has always done things right. I remember just how good that Quake engine looked when it first came out. Rage looks fabulous.
 

Timelord

Member
Oct 29, 2012
543
0
To answer the OP's question, it is a combination of human perception, technical conventions, and hardware limitations, although the technical conventions were themselves based on human perception within hardware limitations. The television specifications in the US were decided back in the 1940s and weren't changed until the HD specs were added in the 1990s, except for the adoption of the color TV specs in the late 1950s. They settled back in the '40s on a frame rate of just under 30 Hz, but we call it 30 Hz for convenience. All other specs followed that lead. It wasn't until the rise of the personal computer that most people ever looked at a CRT that wasn't 480i running at 30 Hz. It was so much a convention that most of us in the US didn't know we were watching 480i at 30 Hz! It was just TV.

The reason this worked so well with CRTs was a phenomenon known as phosphor persistence. When the phosphor lining the inside of the CRT was excited by the electron beam, the glow persisted until the next time it was refreshed; as the phosphor faded, it did not just "go black all at once". Combined with the fact that the refresh was actually 60 times a second (one pass for the even fields, one for the odd fields), this made the combined signal move smoothly without visible flicker. The original spec was 60 FPS, with the "F" meaning fields.

We also had far lower expectations of TV back in those days. It was wonderful enough not to have to constantly adjust the vertical and horizontal hold knobs when solid state took over from the all-tube TV sets. There were many incremental improvements as TV broadcasts improved, but you are correct that the actual rasterization of NTSC remained untouched for decades.

When one considers "motion pictures", or movies as we all know them now, a different set of specs, 24 or 30 FPS, was used, with the "F" meaning frames. This definition applies to "film" movies only, as digital movies are "rasterized" and would follow the definition used for TV and computers.

I'm not claiming that one definition or the other is more correct, just pointing out how similar terms can be confused during a discussion. Until the development of progressive-scan display devices, the term "fields per second" was probably more appropriate, but all meanings tend to shift as time passes.

Both sets of specs, however, are not absolutes but rather conventions that were considered "practical". What is considered practical shifts over time, along with the associated meanings, as technology evolves.

Timelord ...
 
