PlanetSide Universe - Tweet hints at FPS numbers
2012-04-27, 11:47 PM   #13
kaffis
Contributor
Major
 
Re: Tweet hints at FPS numbers


Originally Posted by CViolet
I can see the difference as well. Interestingly, though, I can't really see any choppiness in movies that were recorded at 24 fps (the old standard). My guess is that since video game frame rates are never constant, they conflict with your monitor's refresh rate and thus drop a frame here and there.

Interestingly, Nvidia's new cards advertise the ability to slow down frame rates when the hardware is overkill for rendering certain games. I wonder if this would make for a smoother-looking 30 fps if all frames were delivered at precisely that rate?
It's very simple: video games don't have motion blur. The camera that movies are recorded with captures motion blur naturally, and motion without the appropriate amount of blur looks "choppy" to the human eye.

This is because your eye accepts input continuously, but your brain doesn't process it continuously. Instead, the brain perceives small slices of time, and it expects objects in motion to have made an impression on the nerves in the retina at all points between the start and end of each slice.

When a game renders crisp still images and puts them up on the screen one at a time, objects in "motion" don't move through the intervening space between their position in one frame and the next, no matter how fast the framerate is. Instead, all we can do is speed up the framerate to cram multiple frames into each slice of time the brain actually processes, creating the illusion of motion.
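To put rough numbers on that (made-up figures, purely for illustration -- an object crossing a 1920-pixel screen in one second):

Code:
# Made-up example: an object crossing a 1920-pixel screen in one second
# "teleports" this many pixels between consecutive crisp frames, with
# nothing drawn in the space in between.
speed_px_per_s = 1920.0

for fps in (24, 30, 60, 120):
    jump = speed_px_per_s / fps   # gap between successive still images
    print(f"{fps:3d} fps -> {jump:6.1f} px jump per frame")

Even at 120 fps the object still skips 16 pixels at a time; the jumps just get small and frequent enough to pass for motion.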

Shot footage, on the other hand, is recorded in such a way that each frame of a movie or video (whether it's digital or film) has that whole 1/24th of a second (1/30th if it's shot for TV) captured in the same frame. This is why pausing TV or movies, even on DVD, looks blurry if it's an action shot or a fast pan. It actually IS blurry; it's just blurry in a way that your brain expects when it's being shown one frame after another.
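To make that concrete, here's a toy sketch in Python (the numbers and function names are entirely mine, not how any real camera or engine is implemented): instead of grabbing a single instant, you average everything the object did while the shutter was open, and the result is exactly the kind of smear you see on a paused action frame.

Code:
import numpy as np

WIDTH = 64            # 1-D "screen" width in pixels
FPS = 24              # film frame rate
EXPOSURE = 1.0 / FPS  # assume the shutter stays open for the whole frame
SUBSAMPLES = 16       # how finely we integrate the exposure window

def render_instant(x):
    """Crisp image of a 1-pixel-wide object at position x (no blur at all)."""
    frame = np.zeros(WIDTH)
    frame[int(x) % WIDTH] = 1.0
    return frame

def render_exposed(x0, speed):
    """Average many instants across the exposure window, like an open shutter."""
    acc = np.zeros(WIDTH)
    for i in range(SUBSAMPLES):
        t = (i / SUBSAMPLES) * EXPOSURE
        acc += render_instant(x0 + speed * t)
    return acc / SUBSAMPLES

# An object moving 240 px/s smears across ~10 pixels within one 1/24 s frame.
blurred = render_exposed(x0=5, speed=240.0)
print(np.nonzero(blurred)[0])  # the streak of pixels the object touched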


So, yeah. If game engines were written with shaders or something that could apply motion blur to individual frames, the quest for maximum framerate would be a lot less of a problem, and we could get away with movie/TV-level framerates (and lower! individuals who notice or are bothered by even 20 fps when it's properly motion blurred are *rare* -- most people's perception limit is in the 15-17 fps range). The exciting thing is, I think GPU hardware is catching up to the processing requirements to make that a practical engine feature for real-time rendering, so we may actually see engines do this in the near future.
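For what it's worth, here's a rough CPU sketch in Python (names and numbers all mine, just to show the idea, not any particular engine's method) of the kind of per-frame motion blur I mean: smear each pixel along its screen-space velocity, the sort of thing a pixel shader would do once per frame.

Code:
import numpy as np

def motion_blur_pass(color, velocity, samples=8):
    """
    Toy CPU version of a per-pixel motion-blur post-process:
    each output pixel averages the color sampled along that pixel's
    screen-space velocity vector.
    color:    (H, W, 3) float array, the crisp rendered frame
    velocity: (H, W, 2) float array, per-pixel motion in pixels/frame
    """
    h, w, _ = color.shape
    out = np.zeros_like(color)
    ys, xs = np.mgrid[0:h, 0:w]
    for i in range(samples):
        t = i / max(samples - 1, 1) - 0.5   # sample from -0.5 to +0.5 of the motion
        sx = np.clip((xs + velocity[..., 0] * t).astype(int), 0, w - 1)
        sy = np.clip((ys + velocity[..., 1] * t).astype(int), 0, h - 1)
        out += color[sy, sx]
    return out / samples

# Tiny usage example: a bright dot moving 6 px/frame to the right gets smeared.
frame = np.zeros((8, 16, 3)); frame[4, 4] = 1.0
vel = np.zeros((8, 16, 2));   vel[..., 0] = 6.0
print(motion_blur_pass(frame, vel)[4, :, 0].round(2))

A real engine would do this in a pixel shader using a per-pixel velocity buffer written during the main render pass; the loop above is just to show the idea.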