At what point does a resolution become ideal, meaning it's detailed enough to show no pixelation, blurriness, or visual artifacts? The same goes for framerate: at what value or range does everything become extremely fast and smooth, with virtually no lag or stuttering? And when does it become "overkill", where anything beyond a certain point shows little to no difference to our eyes and is just a waste of performance, hardware, and money? I'll share my thoughts below on what I think is the best, worst, and average for both.
I think the ideal resolution actually depends on a variety of factors: screen size, pixel density, the medium (game, video, etc.), viewing distance, and the software and hardware involved. Let's be honest here, you won't be able to notice the difference between 720p, 1080p, 1440p, and 4K on your smartphone, but it is very noticeable on a laptop or TV screen. Likewise, on a laptop display, anything beyond 1440p is essentially a waste of the hardware used to render it. On a TV, you'll really only be able to tell resolutions apart when you're sitting unusually close to it, so tough luck there too.
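To put some rough numbers on the screen-size point, here's a quick Python sketch that computes pixel density in pixels per inch; the device sizes and resolutions are just assumptions I picked for illustration. The higher the PPI, the less an extra resolution bump actually buys you at a normal viewing distance.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a screen with the given resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# Hypothetical example devices, purely for illustration
devices = [
    ("6-inch phone, 1080p",     1920, 1080, 6.0),
    ("15.6-inch laptop, 1080p", 1920, 1080, 15.6),
    ("15.6-inch laptop, 1440p", 2560, 1440, 15.6),
    ("55-inch TV, 1080p",       1920, 1080, 55.0),
    ("55-inch TV, 4K",          3840, 2160, 55.0),
]

for name, w, h, d in devices:
    print(f"{name}: ~{ppi(w, h, d):.0f} PPI")
```

Run it and the phone lands around 370 PPI at mere 1080p, the laptop around 140-190 PPI, and the 55-inch TV only around 40 PPI at 1080p and 80 PPI at 4K, which is exactly why the phone gains nothing from more pixels while the big screens are where resolution actually shows.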
As for framerate, I personally consider 45-60 fps to be normal, 25-40 fps to be very bad, anything below 25 to be flat-out unplayable, 60-80 fps to be great, 90-120 fps to be amazing, and anything beyond 120 fps to be nothing but extra heat and strain on the video card for no visible payoff. Most people honestly can't perceive framerate differences beyond 60 fps anyway, since 60 Hz is the refresh rate of most monitors, so the extra frames never even reach the screen. Even monitors meant for gaming, with higher refresh rates, typically only go up to 120 or 144 Hz, so anything beyond that is also a waste. Eventually we have to set a limit somewhere instead of chasing ever higher numbers.
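To see what those brackets actually mean in terms of how long each frame sits on screen, here's a tiny sketch of the basic frame-time arithmetic; nothing here is specific to any game or engine.

```python
def frame_time_ms(fps):
    """How long each frame is displayed, in milliseconds, at a steady framerate."""
    return 1000.0 / fps

for fps in (25, 30, 45, 60, 90, 120, 144, 240):
    print(f"{fps:>3} fps -> {frame_time_ms(fps):5.1f} ms per frame")
```

Going from 30 to 60 fps cuts each frame from about 33 ms to 17 ms, a difference you can easily feel, whereas going from 120 to 240 fps only shaves it from roughly 8 ms to 4 ms, which is why the returns diminish so quickly at the high end.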
I've done some research, and apparently, while we humans can supposedly perceive up to 1000 fps, most people can't notice the difference beyond about 150 fps even when trained. Military pilots, who go through even more training, can reportedly only make out images rendered at up to around 220 fps, whereas the common house fly perceives reality at up to 240 fps. Based on all of that, the ideal framerate would be 50-75 fps for normal games and 100-120 fps for shooters, fighting games, and racing games. Honestly, you don't need such ridiculously high fps unless you're playing competitively.
Yes, I know that for games that don't require fast reflexes or high-speed action, lower framerates are perfectly suitable and even what's recommended. But times and technology have changed, so we can't stay at 20-30 fps forever, right? Granted, I don't really mind if a game like, say, The Sims 3 averages 30-50 fps, but once it drops below a certain number (for me it's 40) it starts to feel laggy and choppy. Obviously, there isn't much point in pushing fps beyond the refresh rate of the monitor, but it does make the game feel smoother and reduces input lag.
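On that last point, here's a rough sketch of why rendering faster than the refresh rate can still trim input lag: when the screen refreshes, the newest finished frame is simply younger on average. This is a deliberately simplified model (no vsync, no render pipeline, perfectly steady frame pacing), just to show the arithmetic.

```python
def avg_frame_age_ms(fps):
    """Average age of the newest finished frame at the moment the display refreshes.
    Simplified: frames finish at a steady rate and the display grabs the latest one,
    so on average you wait half a frame time."""
    return (1000.0 / fps) / 2

for fps in (60, 120, 240):
    print(f"{fps} fps on a 60 Hz screen -> newest frame is ~{avg_frame_age_ms(fps):.1f} ms old")
```

By this crude estimate, running at 240 fps on a 60 Hz panel shows you frames that are on average about 6 ms fresher than at 60 fps, which is small but real, and it's part of why competitive players uncap their framerate.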
With resolution, I personally find that anything below 768p is just way too pixelated and blurry, which disgusts me. For years I had to deal with playing on the Nintendo Wii (sorry, my parents never bought me a PS3 and I couldn't get a job), and I hated playing in standard definition because it looked ugly even from far away. 720p I can barely stand either, and I don't even know why people consider it high definition. 900p is actually decent and looks quite good, whereas 1080p is the ideal, sexy resolution that I and many others strive for; it's also the most common resolution among Steam gamers.
Now, depending on the other factors, the ideal resolution can change drastically, since 1080p actually starts to look a little soft on the 20-plus-inch monitors used for PC gaming. I've seen UHD resolutions before, and honestly, 4K isn't even necessary unless you're using a monitor or TV beyond forty-something inches (I forget what the credible sources claimed). So much like how anything beyond 720p on a phone is useless, 4K and beyond, or the newer 8K resolution, is basically wasteful. My personal preference is that 1440p is the perfect resolution for PC monitors larger than 20 inches.
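A common rule of thumb (and it is only a rule of thumb, so treat the exact numbers as an assumption) is that 20/20 vision resolves about one arcminute of detail. Under that assumption, you can estimate the farthest distance at which individual pixels are still distinguishable; the screen sizes below are just examples I picked.

```python
import math

ARCMINUTE_RAD = math.radians(1 / 60)  # ~0.00029 rad, rough limit of 20/20 vision

def pixel_limit_distance_ft(width_px, height_px, diagonal_in):
    """Farthest distance (feet) at which a single pixel still spans one arcminute,
    i.e. roughly where individual pixels stop being distinguishable."""
    pixel_size_in = diagonal_in / math.hypot(width_px, height_px)
    return pixel_size_in / math.tan(ARCMINUTE_RAD) / 12

screens = [
    ("27-inch 1080p monitor", 1920, 1080, 27),
    ("27-inch 1440p monitor", 2560, 1440, 27),
    ("55-inch 1080p TV",      1920, 1080, 55),
    ("55-inch 4K TV",         3840, 2160, 55),
]

for name, w, h, d in screens:
    print(f"{name}: pixels blend together beyond ~{pixel_limit_distance_ft(w, h, d):.1f} ft")
```

By that estimate, a 55-inch 1080p TV already looks pixel-free from about seven feet away, so a 4K panel of the same size only pays off if you sit closer than that (or go much bigger), and on a 27-inch monitor at desk distance the jump from 1080p to 1440p sits right around the edge of what the rule says you can resolve.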
Okay, depending on the software and hardware, the detail and clarity of what's shown on screen definitely changes a lot. For example, on my ultrabook with Intel HD Graphics, the games I played still looked a bit pixelated at native 1080p; however, on my current laptop with AMD Radeon R5 graphics, even 768p wasn't as pixelated (plus the graphics looked better overall). At 900p and 1080p, the games looked far better than anything my ultrabook could ever hope to render, even with anti-aliasing... well, it couldn't actually handle anti-aliasing or ultra settings, sadly.
Also, newer games tend to have higher polygon counts and higher-resolution textures, with better geometry and lighting, so naturally they're visually superior. Grand Theft Auto V at 720p on low settings, for example, will look better in terms of pixelation and blur than, say, Garry's Mod at 1080p on high settings. So although my current laptop's display is only native 768p, its better video card means the graphics quality is vastly superior to what my older computer managed at native 1080p with its crappy integrated potato card (okay, the R5 is also integrated, but at least it's part of an APU).