Master Racier


The master race has some snazzy clothing. Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget

Renegade-pizza:
The master race has some snazzy clothing. Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget

Yes, it can. But, does the human care in the first place? ;)

That was great!

Renegade-pizza:
The master race has some snazzy clothing. Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget

It depends on the person in question: some people can't tell the difference between 30 and 60, others tend to max out around 70; the average, last I heard, was about 72 FPS. Some people can identify up to 120 FPS, but that's apparently about the maximum. It depends on both the eye health and the effective reaction speed of the person in question.

I literally just got a 2k monitor last night. Damn, just made the cut! That was a close one!

120fps... well, I can run Morrowind Overhaul at 150fps. Does that count? >.>

Renegade-pizza:
The master race has some snazzy clothing. Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget

I remember reading some study that said air force pilots stop noticing differences after around 250 fps.

I do have my old CRT monitor somewhere, I've been meaning to get it out and test it sometime for myself.

Renegade-pizza:
The master race has some snazzy clothing. Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget

At that point it's harder to notice and harder still to care, certainly less than the difference between 30 and 60, but the difference is still obvious to many. I have a 144Hz monitor and the difference between 144 and 60 can be really striking to me depending on the game.

Although interestingly, I notice it most on the desktop with mouse cursor movement. Since the cursor flies across the screen at high speed, the extra frames are very noticeable.

P.S. Thanks

Been happily using my 1440p monitor for a long time now. And got a 4K TV hooked up via HDMI too. I play Secret of Mana (SNES, 1993) on that!

Aw man, this is exactly why we can't elect all video game reviewers to the board that decides the entry metrics...

Kenjitsuka:
Been happily using my 1440p monitor for a long time now. And got a 4K TV hooked up via HDMI too. I play Secret of Mana (SNES, 1993) on that!

I didn't know there were hipster versions of the "master race" :p

....Man...How the hell do they get up to 120 FPS? Most Developers don't even go that high O_O

Diablo1099:
....Man...How the hell do they get up to 120 FPS? Most Developers don't even go that high O_O

Crossfire (multiple video cards), usually. Or just keep everything else in the game at really crappy settings if you have a decent card.

Diablo1099:
....Man...How the hell do they get up to 120 FPS? Most Developers don't even go that high O_O

G-Sync and FreeSync are designed to go that high (well, up to 144 Hz). Both are designed to get rid of screen tearing and reduce input lag.

Nvidia explains it pretty well on their website, but the short version is that the monitor's refresh rate matches the GPU's frame rate at all times, rather than the GPU trying to hit the monitor's maximum refresh rate all the time.
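For anyone curious what "the monitor matches the GPU" buys you in practice, here's a toy timing model in plain Python. All the numbers are made up for illustration, and this is a sketch of the scheduling idea, not of the actual display protocol: with a fixed 60 Hz refresh, a finished frame has to wait for the next refresh tick, while with adaptive sync the panel refreshes the moment the frame is ready.

```python
import math

# Toy model of fixed-refresh vsync vs. variable refresh (G-Sync/FreeSync style).
# Frame timings below are invented for illustration.

REFRESH_HZ = 60
REFRESH_INTERVAL = 1000 / REFRESH_HZ  # ms between fixed refresh ticks (~16.7 ms)

# Times (ms) at which the GPU finishes rendering successive frames.
frame_ready = [10.0, 24.0, 43.0, 55.0, 80.0]

def fixed_vsync_display(ready_times, interval):
    """With a fixed refresh, a finished frame waits for the next refresh tick."""
    return [math.ceil(t / interval) * interval for t in ready_times]

def adaptive_sync_display(ready_times):
    """With adaptive sync, the panel refreshes as soon as a frame is ready."""
    return list(ready_times)

for ready, shown in zip(frame_ready, fixed_vsync_display(frame_ready, REFRESH_INTERVAL)):
    print(f"frame ready at {ready:5.1f} ms -> shown at {shown:5.1f} ms "
          f"(+{shown - ready:4.1f} ms waiting); adaptive sync shows it immediately")
```

The waiting time in the fixed-refresh case is the latency (and, when a frame misses a tick, the stutter) that adaptive sync is meant to remove.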

This may be all fun and games for gamers who are really committed, but how much can the human eye take when it comes to higher and higher frame rates, resolutions and VR tech? Don't our eyes have limits?

Just me, or is that PS4 guy in the first panel scavenging what looks to be an 80 mm case fan? What does a PS4 need with PC parts?

(As someone who's been doing a lot of computer repair these are the things I notice)

Technically, no. Human eyes don't have a limit.

The processing time of our neurons might start lagging at some point, making further increases unnoticeable.

That said; I personally can occasionally notice the difference between 30 and 60 fps, but have yet to have a reason to care. This will save me a fortune in hardware costs over the coming years.

First they came for Nintendo, and I did nothing because lol Nintendo is for babby.

Then they came for Microsoft, and I did nothing because XBox is for dudebro douchebags.

Then they came for Sony, and I did nothing because they were console peasants.

Then they came for me, and nobody did anything because this parody is forced and labored.

Renegade-pizza:
Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget

I have good news for you: the answer is well over 200 for most average human beings, so there is a long way to go before we reach the fps cap. If you spend a lot of time using screens for detail- and motion-oriented work (like, say... gaming) it can go far, far over the 220fps average.

However, around 50-60fps is where human beings stop perceiving the flickering between images. This is why movies are projected at 72 flashes per second (each of the 24 frames flashed three times), because actually projecting 24 images per second makes films look horrific, whilst most TV is broadcast at 60fps (30x2) or 50fps (25x2).

If you watch a film on a projector at an actual 24fps you can see the black spaces between images, it's awful. On games at least the previous image stays in place until the new one is rendered, so flicker at 30fps isn't as much of an issue.

Don't worry, there is plenty more time for increasing frames per second and resolutions to keep those master race wallets empty.
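The flash-rate arithmetic in that post is easy to check. A quick sketch — the 50 Hz cutoff below is an illustrative assumption taken from the low end of the 50-60 range mentioned above, not a measured constant; real flicker fusion thresholds vary by person and viewing conditions:

```python
# Effective flash rate seen by the eye for the projection/broadcast schemes
# described above. The 50 Hz cutoff is an illustrative assumption.

FLICKER_CUTOFF_HZ = 50

def flash_rate(frames_per_second, flashes_per_frame):
    """Effective flash rate the eye sees."""
    return frames_per_second * flashes_per_frame

for name, fps, flashes in [
    ("film, frame shown once", 24, 1),      # raw 24fps projection
    ("film, triple-bladed shutter", 24, 3), # each frame flashed three times
    ("60Hz TV (30x2)", 30, 2),
    ("50Hz TV (25x2)", 25, 2),
]:
    rate = flash_rate(fps, flashes)
    verdict = "flicker-free" if rate >= FLICKER_CUTOFF_HZ else "visible flicker"
    print(f"{name}: {rate} Hz -> {verdict}")
```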

The "Wooten tag" greeting killed me. Love the artwork on this one. Nicely done, Cory.

To be fair, mechanical keyboards.

<drool begins>

<drool never stops>

[I'm not doing this right am I]

Renegade-pizza:
The master race has some snazzy clothing. Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget

This is a nice short text about the topic.
It explains quite a few things about fps and how we see/don't see things.

http://www.100fps.com/how_many_frames_can_humans_see.htm

Frankly, the people who claim that they can enjoy anything below 120fps, 1440p are just lying to themselves. Anything below that is objectively not fun

fix-the-spade:

However, around 50-60fps is where human beings stop perceiving the flickering between images. This is why movies are projected at 72 flashes per second (each of the 24 frames flashed three times), because actually projecting 24 images per second makes films look horrific, whilst most TV is broadcast at 60fps (30x2) or 50fps (25x2).

If you watch a film on a projector at an actual 24fps you can see the black spaces between images, it's awful. On games at least the previous image stays in place until the new one is rendered, so flicker at 30fps isn't as much of an issue.

Don't worry, there is plenty more time for increasing frames per second and resolutions to keep those master race wallets empty.

I've heard a number of people involved in graphics saying that motion blur at 30fps looks better than no blur at 60fps. Can't say I have enough experience to evaluate it one way or the other, but it deals with the issue of gaps between objects from one frame to the next, which is a large part of what our brains don't like about low frame rates.

Hah, I've pretty much always been a PC peasant! It makes for a humorous comic, and while some PC gamers like to rag on console gamers, and there can certainly be a dick-measuring contest, turning on fellow PC gamers for being under-spec'd generally isn't a thing (from what I've seen at least)... unless maybe someone is trying to game on a laptop.

About the only time it ever may have been a big deal was in ye olden days while 56K modems were still around and you had the dreaded 'lag' users in multiplayer games.

Renegade-pizza:
The master race has some snazzy clothing. Also, can the human eye notice the difference between 60 and 120 fps? I know there's a cap before it becomes redundant to increase fps, but I forget

It's untested; the only honest-to-god blinded experiment (measuring performance rather than survey answers) for video games only checked below-30, 30, and 60 fps on Quake 3. (And given how the theoretical model works, that it has to do not only with watching but with needing to rapidly respond, some games may not see improvement at the same rates either.)

The thought of 120fps honestly makes me giddy... but I know that as a peasant, I won't be able to afford the prerequisites for a long time. I do happen to own a 1440p monitor though. Shit's cash... but the color sucks. Maybe cuz it's old.

Sure we can't go to 1080p, but give the PS4 credit. It runs MGS:V at 60.

I'm in the gaming middle class.

That sure is a juicy carrot in front of me though...

The Almighty Aardvark:
I've heard a number of people involved in graphics saying that motion blur at 30fps looks better than no blur at 60fps. Can't say I have enough experience to evaluate it one way or the other, but it deals with the issue of gaps between objects from one frame to the next, which is a large part of what our brains don't like about low frame rates.

I think that's code for at 60fps everyone can see the bondo and duct tape.

In movies I'm inclined to agree, motion blur is part of the shot but should be used sparingly. One of the things that struck me about The Hobbit's high frame rate in the cinema was how much more obvious the sets and effects looked compared to the same movie at the normal frame rate.

In games the opposite applies, the more frames the better.

Hey, I'm fine with being a peasant.

There are three Pillars to the PC Master Race, they are FPS, Resolution, Modifications, and Backwards Compatibility.

Four, there are four Pillars to the PC Master Race, they are FPS, Resolution, Modifications, Backwards Compatibility, and Free Multiplayer.

Five, there are five Pillars to...

fix-the-spade:

I think that's code for at 60fps everyone can see the bondo and duct tape.

In movies I'm inclined to agree, motion blur is part of the shot but should be used sparingly. One of the things that struck me about The Hobbit's high frame rate in the cinema was how much more obvious the sets and effects looked compared to the same movie at the normal frame rate.

In games the opposite applies, the more frames the better.

Motion blur definitely gives you a lot more freedom to cut corners in animation

One thing worth noting: motion blur in movies is usually better implemented than in games. Movies don't have to work within realtime constraints, so the blur can be a lot more accurate. Games tend to use approximations, so the reason motion blur doesn't work as well in games could very well be that it actually doesn't look as good.
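The "approximation" point can be illustrated with a toy accumulation buffer — one of the crudest realtime blur tricks, blending each new frame with a decayed copy of the previous output. This is a sketch for illustration only, not how any particular engine does it; real engines typically blur along per-pixel velocity vectors instead.

```python
# Toy "accumulation buffer" motion blur on a 1-D, five-pixel "image".
# Deliberately crude: it smears the whole history, not just fast motion.

def accumulate(prev_output, new_frame, decay=0.5):
    """Blend the new frame with the previous blurred output."""
    return [decay * p + (1 - decay) * n for p, n in zip(prev_output, new_frame)]

# A bright dot moving one pixel per frame.
frames = [[1.0 if i == t else 0.0 for i in range(5)] for t in range(5)]

output = frames[0]
for frame in frames[1:]:
    output = accumulate(output, frame)

# The dot now trails fading intensity behind its current position.
print([round(v, 4) for v in output])
```

The cheapness is visible in the result: the trail decays geometrically regardless of how fast anything moved, which is exactly the kind of shortcut true multi-sample film blur doesn't take.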

ravenshrike:

*snip*

I'm sorry, but Monty Python references are dealt with severely on this site. Someone fetch me the comfy chair

Reinterpretation: Erin is not really a member of the PC Master Race, but an infiltrator from the Console peasants to mess with the PCMR from the inside, and those two dolts were too stupid to know any better.

ravenshrike:

There are three Pillars to the PC Master Race, they are FPS, Resolution, Modifications, and Backwards Compatibility.

Four, there are four Pillars to the PC Master Race, they are FPS, Resolution, Modifications, Backwards Compatibility, and Free Multiplayer.

Five, there are five Pillars to...

Blimey! I wasn't expecting a PC Inquisition!

So... I'm a PC Master Peasant?

No problem for me, still master race here.

