Can someone explain to me, without trolling, why more than 30 FPS is so desirable?
I have looked at this comparison -
- and I can tell the difference but I don't put much weight on that because both examples made me sick anyway. I don't usually play the sorts of games that would require frantic whipping back and forth.
In videos, 24 FPS looks more natural and 48 looks like a cartoon. Why should this be different for video games?
And if I don't notice a big difference in performance, why should I pay a big difference in price?
So, because I can, I have started lowering the fps in my friends' games on their computers to see if they notice (just the ones who make a big deal out of this).
After 3 months, a grand total of 0 have noticed.
If anything, it's what people are used to.
But if you've ever played at both, you'd find 60 FPS is a lot more enjoyable.
No, really: assuming you have a gaming computer, or even The Last of Us for the PS4,
start playing at 30 FPS, and then switch to 60 FPS.
After that, 30 feels sluggish.
spoilers: the problem was you all along, OP
Medium rare is the best, prove me wrong.
I'll ignore the bait in your post, but riddle me this:
what happens when your game chugs?
at a base of 30fps, you drop down to some horrendously unplayable framerate.
at 60 fps, you get a noticeable downgrade, but it's still playable.
and if you're one of those idiots who pretends not to notice any difference above 60, then at 90 fps you won't notice any chugging whatsoever.
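to put rough numbers on the chugging point, here's a quick sketch. The fixed 10 ms of extra per-frame work is a made-up figure just for illustration:

```python
# How the same fixed 10 ms per-frame spike lands at different base framerates.
for base_fps in (30, 60, 90):
    frame_time_ms = 1000 / base_fps        # normal frame budget
    spiked_ms = frame_time_ms + 10         # identical extra workload in every case
    print(f"{base_fps} fps base -> {1000 / spiked_ms:.0f} fps during the spike")

# 30 fps base -> 23 fps during the spike
# 60 fps base -> 37 fps during the spike
# 90 fps base -> 47 fps during the spike
```

same hitch, but only the 30 fps case falls into slideshow territory.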
60 FPS makes everything faster and smoother.
In a fast-paced game, it also gives you more frames to follow the action and react.
Otherwise, it just looks nice.
It'll probably look weird the first few times you use it, as you're used to lower frame rates, but after you've put some time into gaming at 60 FPS, you'll hate going without it.
The only extra expense is hardware that can handle it.
When devs lock a game at 30 FPS for the PC release, they're not doing it because it would cost them more to allow higher framerates; they're doing it because they don't want console gamers to feel like they're buying an inferior version.
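On the "more frames to react" point above, here's a toy model. It assumes input is sampled once per frame (real engines vary), so the worst case adds one full frame time of lag:

```python
# Worst-case extra input lag when input is only sampled once per frame.
for fps in (30, 60):
    print(f"{fps} fps -> up to {1000 / fps:.1f} ms of added input lag")

# 30 fps -> up to 33.3 ms of added input lag
# 60 fps -> up to 16.7 ms of added input lag
```

Halving the frame time halves that worst case, which is part of why 60 feels snappier even before it looks smoother.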
Great, now it's that time of the night.
The reason 24fps film looks smooth is exposure time. Each frame is exposed for a significant amount of time, so when there is movement the image is not crisp but blurry: there is essentially a time element included in the picture.
Your brain does a lot of cool computation to turn that series of blurry still images into what appears to be crisp motion.
Video games, on the other hand, render only a single instant per frame. You could actually make a 24fps video game look perfectly fine if you rendered at, say, 240fps and then blended each run of 10 frames together. There's a guy on YouTube who makes CSGO videos doing something like this, except he combines them down to 60fps; it looks ridiculously smooth.
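The frame-stacking trick is just averaging: render far more frames than you output and blend each group into one. A minimal sketch, assuming frames arrive as H x W x 3 uint8 numpy arrays (the names here are hypothetical):

```python
import numpy as np

def blend_frames(frames):
    """Average a group of rendered frames into one motion-blurred frame."""
    return np.mean(np.stack(frames), axis=0).astype(np.uint8)

# e.g. 240 rendered fps -> 24 output fps with built-in motion blur,
# given `rendered`, a list of frames captured at 240 fps:
# blurred = [blend_frames(rendered[i:i + 10]) for i in range(0, len(rendered), 10)]
```

Each output frame then smears 1/24th of a second of motion, which is exactly the time element that film exposure gives you for free.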
I've got nothing better to do so I'll take the bait.
Movies and TV shows have the advantage of capturing light to make footage, so motion blur is captured as well. That motion blur is enough to smooth things out when a movie plays at 24fps. (Incidentally, a 24fps movie also has a smaller reel than a 48fps one, which makes it easier to transport.)
Video games, however, render in real time and do not get natural motion blur for free (artificial motion blur is dogshit and should be ignored). Without a natural way to smooth out movement, video games have to rely on higher framerates to smooth out the "footage." A higher framerate is always preferable to a lower one when dealing with real-time renders. 60fps just happens to be the sweet spot for most people, as many monitors have a refresh rate of 60Hz (the screen refreshes 60 times in a single second).
That's the basic gist of it.
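The 60Hz point has a concrete consequence: framerates that divide evenly into the refresh rate pace cleanly, others judder. A little sketch, assuming a plain vsync'd 60Hz display:

```python
# How many 60 Hz refresh cycles each game frame stays on screen.
def refreshes_per_frame(fps, hz=60, n_frames=8):
    shown, last = [], 0
    for i in range(1, n_frames + 1):
        tick = round(i * hz / fps)   # refresh tick on which frame i is replaced
        shown.append(tick - last)
        last = tick
    return shown

print(refreshes_per_frame(30))  # [2, 2, 2, 2, 2, 2, 2, 2] -> even pacing
print(refreshes_per_frame(60))  # [1, 1, 1, 1, 1, 1, 1, 1] -> even pacing
print(refreshes_per_frame(45))  # [1, 2, 1, 1, 2, 1, 1, 2] -> uneven hold times, judder
```

That's also why a locked 30 on a 60Hz panel looks steadier than an uncapped framerate bouncing around 45.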
Movie and TV cameras capture blur in each frame; rendered frames do not, and can't in real time.
That said, I think 30fps is tolerable and better for consoles. Otherwise games these days would be held back even more by weak console specs.
>The only extra expense is hardware that can handle it.
Is this not a significant expense, or would you say that most entry-level PCs would be able to run games at 60 FPS nowadays?
>In videos, 24 FPS looks more natural and 48 looks like a cartoon. Why should this be different for video games?
Movies have motion blur, which smears movement across each frame and helps smooth it out; that's what makes them look cinematic.
Video Games don't have motion blurring, so nothing gets smoothed out, and at lower framerates everything is fucking jittery.
If anyone tells you that 24/30 fps is supposed to be cinematic in a video game, they are trolling you or have no idea what they're talking about.
>Video Games don't have motion blurring
I found it weird when devs started adding motion blur to their games. Then I realized it was to help consoles and their low framerates. That's why I turn it off every time I have the option.
I know this is baity as fuck, but try driving a car in GTA IV on console, and then do so on PC. Console handling feels much more sluggish due to the lower frame rate.
>inb4 terrible car physics anyway.
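No idea what GTA IV actually does internally, but the generic version of that sluggishness is simulation code that isn't scaled by frame time. A sketch with a made-up acceleration value:

```python
# If per-frame physics isn't multiplied by the real frame time (dt),
# the simulation literally behaves differently at different framerates.
def final_speed(fps, seconds=1.0, accel=10.0, use_dt=True):
    dt = 1.0 / fps
    v = 0.0
    for _ in range(int(seconds * fps)):
        v += accel * dt if use_dt else accel * (1.0 / 60.0)  # bug: 60 fps hardcoded
    return v

print(round(final_speed(60), 3))                # 10.0 -> the tuned behaviour
print(round(final_speed(30), 3))                # 10.0 -> dt-scaled, same at any fps
print(round(final_speed(30, use_dt=False), 3))  # 5.0  -> half the acceleration at 30 fps
```

Even when the physics is framerate-independent, you still see and respond to the car's state half as often at 30, so it feels heavier either way.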
Depends on the game. Grandma's facebook machine could handle old games at 1080p 60 FPS.
If it's new and you want to run at 1080p 60 FPS, you'll probably want a decent CPU and a proper non-integrated video card, which is really what separates a pleb PC from a mustard PC.
60 FPS is great, but 30 FPS is tolerable. Out of all of the boons that come out of powerful PC hardware, FPS is probably the least important in most cases.
If I had to make a decision between good anti-aliasing @ 30 FPS and no anti-aliasing @ 60 FPS, it wouldn't be a tough choice. AA > 60 FPS any day.
Over the past 6 years I've slowly begun to consciously notice individual frames in movies, making it impossible to enjoy one without seeing a series of still images blending into one another. It's not that I don't get the sensation of motion; it's just that nothing is smooth enough to keep it from breaking my immersion. Does anyone else have this issue? Is it a side effect of constant high-FPS gaming?
I never asked for this
OP I am gonna assume you are trolling but...
I agree that fps isn't as important as visual clarity, but you should be aiming for 60 fps at 1080p.
IMO, having a stable fps > having a high resolution > having a high fps
Hello hello, bait speaking.
try it yourself: start up a game you can frame-limit, whip the camera around, and take a screenshot mid-motion
do this at both 30 fps and 60 fps
or go there
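if you'd rather script the comparison, a frame limiter is just a sleep loop. A bare-bones sketch, where render_frame is a stand-in for whatever draws your scene:

```python
import time

def run_capped(target_fps, render_frame, seconds=5.0):
    """Render in a loop, sleeping off whatever is left of each frame's budget."""
    budget = 1.0 / target_fps
    deadline = time.perf_counter() + seconds
    while time.perf_counter() < deadline:
        start = time.perf_counter()
        render_frame()
        leftover = budget - (time.perf_counter() - start)
        if leftover > 0:
            time.sleep(leftover)  # caps the loop at roughly target_fps
```

time.sleep isn't precise enough for a real limiter (those spin-wait the last millisecond or so), but it's enough to feel the difference between a 30 cap and a 60 cap.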
Medium rare is so perfect the other choices don't even feel like choices. Whenever I hear that someone prefers their meat cooked longer or shorter than medium rare it just seems like they aren't even from this planet.
I suppose it could work, but it's lazy as fuck. Why not have a gif or webm of actual gameplay showing both framerates off? Does anybody have the one where it switches from 30 to 60 mid-game? I think that one showed it pretty well.
>he doesn't eat it blue rare
>he doesn't live in the northern woods and beat caribou to death with his fists
>he hasn't asserted his dominance over nature
I can't stand jaggies. They make the game look fucking horrible.
Jaggies are why I consider consoles to be a prison for games. When dealing with that sort of gutless hardware, there's just no escape from jaggies.
>Can someone explain to me, without trolling, why more than 30 FPS is so desirable?
Literally no one can do this, webm related
24fps is a limitation created by the cost of producing and distributing film, not to mention that editing was done frame by frame and was expensive. 22fps was originally going to be the standard, as it is the minimum framerate for flicker fusion, the phenomenon by which the common denominator of mankind can perceive motion without stroking the fuck out. 24fps is a holdover from 100+ years ago that needs to be replaced.
>The only extra expense is hardware that can handle it.
Or you just tone down effects like anti-aliasing, ambient occlusion, texture resolution, shaders, etc. The PS4 and Xbone could be running games at a solid 1080p 60fps if devs didn't pile shit onto them to make them look "better" at the cost of performance.