I can't imagine people like that really exist?!
What the fuck, are they just brainwashed console faggots or literary autists?

youtube.com/watch?v=PLhPvS0hZSs

>30 is fine
>I agree, 30fps is about all our eyes can handle... even 15 isn't bad, its barely choppy like a ps2 game
>ur a joke... if that's the case then 60fps sucks dick and u should go spend ur life savings on somthing that plays 120... ur telling me ur eyes can comprehend 60 pictures per second?

Other urls found in this thread:

guru3d.com/articles-pages/msi-nx-7800-gtx-review,8.html
youtu.be/hjWSRTYV8e0
youtube.com/watch?v=XfVgZm3r_Bo
youtu.be/XYGxWGduCCQ
youtu.be/IkOsvs0MB3M
twitter.com/SFWRedditGifs

Just the average console gamer

Well, I guess if you've never played at a solid 60 it's hard to comprehend; they won't notice the actual difference from just the video, yeah.

>can still see the stuttering at 60fps

Either this webm was made like shit, or my eyes are just too accustomed to more frames.

>60 jumping around slightly out of pattern
Whoever made this webm is a faggot

Yeah, that webm is quite shit, but it gets the point across.

...

Not me, but the thread is not about the webm though

60 isn't perfect, but it's a decent compromise between cost and quality

I've never seen more than 120Hz monitors, but I assume that ~200 frames should be the true cap, where it's nearly impossible to notice a difference.

30 literally is fine though.

60 is unequivocally better but solid 30 is perfectly playable.

The real question is if it's better to have shinier grafix at 30 or slightly less shiny grafix at 60, to which the answer is usually the latter.

That doesn't have anything to do with the quality of the image.

>literary autists

But this isn't 2007, you don't have to pick between shitty graphics at 60 or less shitty at 30.
60 has been the norm for ages now; it's again consoles that lag behind, though at least the new generation utilises 60 FPS more.

There's a difference but oftentimes smoother isn't better
Movies and television look like utter shit at higher frame rates; something under your direct control feels better, however.
It's all about context, not every game needs 60 fps

>not every game needs 60 fps

>headaches
>lack of control
>input lag

60 is so easy to achieve, what's the problem with it?

>Movies and television look like utter shit at higher frame rates
This is only because you associate 24fps with feature films and higher framerates with daytime television with low budgets.

>input lag
30fps has 0.033s "input lag" you fucking autist, something you can't even notice you fucking autist

>There's a difference but oftentimes smoother isn't better
what the fuck are you talking about you retard
>Movies and television look like utter shit at higher frame rates, something under your direct control feels better, however.
that's because they use motion interpolation, which isn't the same thing at all
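
Not the same anon, but for what it's worth: the cheapest form of that TV "smoothing" literally just synthesizes a fake in-between frame from the two real frames around it (real sets use motion-vector warping rather than a plain blend), and it has to buffer a future frame to do it, which is where the added lag and the soap-opera look come from. Rough sketch, Python, purely illustrative:

    # Crude sketch of naive TV "motion smoothing": synthesize an in-between
    # frame from the two real frames around it. Real TVs do motion-vector
    # warping rather than a plain blend; this only shows the idea.
    # Frames are assumed to be numpy arrays of equal shape.
    import numpy as np

    def interpolate_frame(prev_frame, next_frame, t=0.5):
        """Fake frame at time t (0..1) between two captured frames."""
        return ((1.0 - t) * prev_frame + t * next_frame).astype(prev_frame.dtype)

    # A game at 60fps renders 60 real frames a second; a TV "smoothing" 24fps
    # content still only has 24 real frames and must hold back at least one
    # future frame to blend with, which is where the extra latency comes from.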

That is always something that needs to be chosen between though.

Newer, better hardware doesn't magically make it any easier to do 60FPS. Developers can always use the extra resources that put them from 30 to 60 to make the graphics better instead.

>has no idea what input lag is

If you actually played at 60, especially on a monitor right in front of you, with a mouse, you'd feel that 30 is unplayable. Maybe if you start at 30 and stay there you'll accept it, but if you're asked to play at 30 right after 60, it feels like utter shit.

Not refresh rate or frame time, input lag, you dip.

>FPS affects input lag
That's not how input lag works you fucking aspie.
33ms input lag isn't insignificant though. It's very minor, but it's a thing.

Is there a problem with that? This isn't an "opinions" thing, television objectively looks like shit at higher fps. The only thing it's good for is sports
>everyone is me because I have a narrow worldview
I switch between 60 and 30 all the time and have zero problems with it. Is it noticable? Yes. Does it matter? No.

It's also very important to mention that many console games TARGET 30 frames, but many times fail to do so.

Slowdowns to 20 and 15 frames are common.

You move your mouse at the beginning of a frame, and at 30 FPS it takes 33ms to render that frame and get your input onto the screen, vs only 16ms at 60 FPS.

That's why adaptive sync is even better: your input does not have to wait for the monitor refresh.
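
To put rough numbers on both of those posts (back-of-the-envelope Python; it ignores engine pipelining, scanout and the monitor's own processing, so real click-to-photon latency is higher than this at any framerate):

    # Frame period: the minimum delay between your input being sampled and
    # the frame that reflects it being finished.
    for fps in (30, 60, 120, 144):
        print(f"{fps:>3} fps -> {1000 / fps:5.1f} ms per frame")
    # 30 -> 33.3 ms, 60 -> 16.7 ms, 120 -> 8.3 ms, 144 -> 6.9 ms

    # On top of that, with a fixed 60 Hz refresh + vsync, a finished frame
    # that just misses a refresh tick sits in a buffer until the next one;
    # with adaptive sync the panel refreshes when the frame is ready, so
    # that extra wait goes away.
    REFRESH_PERIOD_MS = 1000 / 60

    def vsync_wait_ms(frame_ready_ms):
        """Extra time a finished frame waits for the next fixed refresh tick."""
        return (-frame_ready_ms) % REFRESH_PERIOD_MS

    print(vsync_wait_ms(17.0))  # just missed the ~16.7 ms tick -> ~16.3 ms extra
    print(vsync_wait_ms(16.0))  # made the tick comfortably -> ~0.7 ms extra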

If it doesn't matter and it's all subjective, why are you getting so bent out of shape?

>I switch between 60 and 30 all the time and have zero problems with it. Is it noticable? Yes. Does it matter? No.
But it does matter. The only way you could be defending it is that you love console gaming and feel personally offended that developers must choose 30FPS because of the limited resources available.

Do me a fav and add 144 next time

I've regularly played at 60 all my life, on a monitor right in front of me, and 30 feels fine. Not /as good/, but absolutely fine.

If you're switching directly between 60 and 30 you'll always notice it and it'll be awful but 30 is completely and utterly playable if you don't have a frame-rate counter forcing you to focus on it.

>literary
Don't make us nuke you again, Japan.

>the only way you could be advocating that something doesn't matter is that you have a personal investment in the opposing side
Literal autism

>people say 60FPS is unnatural while playing a game set within a virtual reality

Side note, what games actually feature 60FPS cinematics? Metro 2033 is the only recent one that really comes to mind that both takes away the camera from the player and keeps the framerate up.

most places don't stream that high

The only real reason someone would even defend 30 FPS is that they can't play at 60 for whatever reason, this isn't a family van VS sports car thing, this is actual comfort and undeniable logic.

The higher the framerate, the smoother the picture.

Am I?
You're jumping to a lot of conclusions there, Mr. Wright.

Yes, literary, as in real fucking medically autistic people. Not just trolls.

If you notice it then it literally matters. Why do you notice whether the person approaching you is white or black? Because it matters.

If you have the choice to play a game at 30fps or 60fps, which would you choose?

If you say 30, you are either full of shit or stupid.

For this reason alone 60fps is superior, and arguing against it being the standard is extremely hypocritical. Screaming at the top of your lungs that it doesn't matter is in the end just your ignorant fucking opinion, and opinions like that are stalling progress with hardware adoption.

Why does 96 and 24 seem the most stable, or are my eyes fucking with me?

48 and 30 seem to be jolting around everywhere and it's a bit nauseating. 60 is somewhat tolerable.

15 is a cartoon.

>You're jumping to a lot of conclusions there, Mr. Wright.
How? Everybody agrees that 60 is better but 30 is OK, even in this thread, but those faggots say that 60 is actually worse.

>30 is completely and utterly playable if you don't have a frame-rate counter forcing you to focus on it.
incorrect, you can easily notice frame dips to 30 once you're used to 60+ constant, even without a counter explicitly telling you

The webm is shit.

That's the stupidest fucking thing I've ever read on this board

Let's not get carried away.

Of course people can actually tell the difference between 60 and 120; 30 vs 60 is noticeable right away.

Rock steady 30 fps in games where timely inputs aren't too important doesn't bother me. 30 fps with drops is just fucking laughable.
>solid 60 > solid 30 > 60 with drops >>>>>>> 30 fps with drops
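
If anyone wants to see why that ranking works out: the average hides the stutter, it's the frame-time spikes you actually feel. Quick sketch with made-up frame-time traces (Python, numbers invented purely for illustration):

    # Hypothetical frame-time traces in ms. "60 with drops" averages better
    # than locked 30, but the spikes are what you feel as stutter.
    from statistics import mean

    traces = {
        "locked 60":     [16.7] * 8,
        "locked 30":     [33.3] * 8,
        "60 with drops": [16.7, 16.7, 50.0, 16.7, 16.7, 66.7, 16.7, 16.7],
        "30 with drops": [33.3, 33.3, 66.7, 33.3, 100.0, 33.3, 66.7, 33.3],
    }

    for name, frame_times in traces.items():
        avg_fps = 1000 / mean(frame_times)
        print(f"{name:13}  avg {avg_fps:5.1f} fps, worst frame {max(frame_times):5.1f} ms")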

Many games where the cinematics are in-engine run at the same framerate as the rest of the game. So many I don't even know where to start. Like, all of them.

If the cinematics are pre-rendered, then it's a valid point.

This. It's not input lag that's the problem, it's that the displayed gamestate is 33ms behind the actual gamestate. It SIGNIFICANTLY impacts your aim.

Neck yourself, cuck autist.

OP, 24fps and 50Hz are king. RE4 was made for 50 (fifty) Hz.

I love you...

60, obviously.

But saying that it doesn't matter isn't stalling jack shit. The reason their games are at 30 instead of 60 is because the developers put more weight on more taxing graphics. The same hardware improvements that let them get better graphics are the same hardware improvements that would let them target 60 fps more easily. Consoles don't hold shit back because of some asinine fetish for framerates, they hold shit back because they're only updated once every ~7 years.

My focusing vision is shit. Everything's blurry all of the time. Oddly enough though this does not impact my ability to very quickly tell the difference between 60fps and 120fps. My peripheral range is also stellar.

>you can notice changes in framerate
No shit you can. SOLID 30, however, is perfectly fine.

This, I have measured my time from mouse click to reaction and it's 18ms with my setup; 33ms is almost twice that.

>screaming at the top of your lungs
Show me where in this thread this is happening

You are stupid. 60fps was possible for all consoles, always. A lot of games were made at 30 because people took shortcuts and nobody fucking complained. Now it's a disease and you're condoning it by saying it doesn't matter.

>the developers put more weight on more taxing graphics
That's because the consoles are underpowered.
My PC from 2012 can easily run games at locked 60FPS at 1080p, even games like Witcher 3 at Ultra, no problem.

It's a problem in the industry and people don't care, so the developers and console companies get away with it, nobody taking their gayming seriously would consider 30 FPS or a console.

Hivemind

>my pc from 2012
Lmfao. Post specs retard. This is where you go complete radio silence and never reply again. A 2012 PC can't play modern games at 60fps 1080p without drops. Don't even fucking kid yourself. Top of the line card was what, the 780Ti? Or was it the OG Titan? Shit's weak as hell lmfao. Keep on lying PCuck, I know 60fps is the only thing that makes your life worth living to you lmaooo

60fps is great, my pc is 7 years old so I can only get it in some games like Overwatch and DiRT on medium, and indie games.

30fps is tolerable, as long as it's not for racing or fighting

15fps drives me nuts, but GTAV ends up running at 15 any time I'm anywhere near the city, though I turned the population density and variety way down and that helped.

>60 is fantastic
>30 is tolerable in most cases
>15 is literally only okay for Sonic Adventure on the DC and Shadow of the Colossus on PS2
>60 with drops to fifteen = unplayable

30 fps significantly impacts my enjoyment of a game. If I'm using a controller, it's less noticeable because the controls already aren't precise. But any game where I'm rotating the camera with a mouse makes 30fps incredibly unappealing.

>What the developers choose to prioritize is dependent on the consoles being underpowered
It's not. Fancy grafix are valued by the plebs more than high framerate. Give them a less underpowered console and they'll make better looking graphics, not turn the framerate up.

>My PC from 2012 can easily run games locked at 1080/60, even games like Witcher 3 at Ultra.
Yes, because developers who make games dedicated for PC need to dial back their max settings so that even people with mediocre PCs can put the settings on "max" and have it run fine. If they don't, people complain about it being "poorly optimized", no matter how good it looks or whether it's perfectly capable of running at high/medium and looking great.

What the fuck are you talking about? Are you 15? Do you even remember 2012?


Two 7970s in CFX with an FX-8350, full-blown AMD box.

You're an imbecile.

That has slightly evolved over time. The old 60 fps standard was used in testing and benchmarking. A setup that averaged 60 fps would generally be assured of not dropping below 30 under the heaviest loads during a game.

30 fps was always the floor a game should not go under.

The only reason you silly faggots have become obsessed with 60 fps is because you don't have a 150lb HD CRT monitor that supports a 85-100hz refresh rate and you're all stuck using shitty monitors with a native resolution.

>From 2005
>guru3d.com/articles-pages/msi-nx-7800-gtx-review,8.html

Frames per Second (FPS)

Now what you need to observe is simple, the numbers versus the screen resolution. The higher the better. The numbers represent what we call FPS, this means Frames per second. A game's Frames per second is a measured average of a series of tests. That test often is a timedemo, a recorded part of the game which is a 1:1 representation of the actual game(play). After forcing the same image quality settings this timedemo is then used for all graphics cards so that the actual measuring is as objective as can be for all graphics cards.

If a card reaches <30 FPS then the card is barely able to play the game. With 30 FPS up to roughly 40 FPS you'll be very able to play the game, with perhaps a tiny stutter at certain parts that are intensive on the graphics card.

When a graphics card is doing 60 FPS at average or higher then you can rest assured that the game will likely play extremely smooth at every point in the game.

You are always aiming for the highest possible FPS versus the highest resolution versus the highest image quality.
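
Not part of the article, but for anyone wondering: that "average" is literally just total frames rendered divided by how long the timedemo took, which is exactly why an average around 60 can still hide dips and why reviewers also report minimum FPS. Toy example, numbers invented:

    # A timedemo "average FPS" is just total frames / total seconds.
    # The per-second counts below are made up, only to show how an average
    # of ~64 can still hide drops into the low 30s.
    per_second = [72, 75, 68, 31, 29, 70, 74, 77, 73, 71]  # hypothetical samples

    print("average:", sum(per_second) / len(per_second), "fps")  # 64.0
    print("minimum:", min(per_second), "fps")                    # 29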

That's where you're wrong. It's a matter of habit.
Seeing games play at 60 fps makes me uncomfortable

Kek, 7970 is weak as fuck. I'm getting meme'd on apparently. Retard thinks dual 7970s can 1080/60! LMFAO enjoy your medium settings and variable framerate, kid. I know your eyes light up whenever you see your counter hit 60 while playing minecraft lmfaoo

Not him but that's crazy. I have a 7950 and an 8350 and it still handles games well enough. Hadn't thought about how old it is til now, because the only upgrades I've bought recently were more/bigger SSDs and HDDs, and nicer peripherals.
I blame 8th gen consoles for being so shit. It's been years and they still have so little going for them.

>30fps is fine
youtu.be/hjWSRTYV8e0
F console gamers.

I think 30fps is fine for some types of games, as long as it never dips below that.
It looks bad when compared side by side with footage at double the frame rate, but it doesn't quite break my immersion, so I stop caring after a while.

I played a ton of Demon's and Dark Souls on consoles and the only times I cared about the frame rate was when it got butchered in some areas or during backstabs.

>Seeing games play at 60 fps makes me uncomfortable

>A setup that averaged 60 fps would generally be assured of not dropping below 30 under the heaviest loads during a game.

You used VSync because there was no adaptive sync; you didn't run at a variable frame rate because it caused tearing.

PCfags, everyone

I know autism gets thrown around a lot here but this time it just fits.

Your autistic.

Random video from YouTube, Witcher 3 at Ultra not going below 30 with a single card.

youtube.com/watch?v=XfVgZm3r_Bo

I know you're just baiting, but other people might actually think you're serious; no need to spread hate because of your own personal problems.

I think it was 2 years ago when Carmack said that even 60 fps is trash and it's supposed to be beyond 75, and these people are still defending drops below 30 in 2016

>Your

You can always tell who has won an argument because they're always called autistic.

How do people stand playing anything lower than 120fps?

I think he was talking about acceptable frame rates for VR.

youtu.be/XYGxWGduCCQ

Seeing games play at 60 fps makes me uncomfortable.

Try reading slower if that helps.

I don't even play anymore. I just lucid dream after watching a trailer and looking at some bullshots.

Which wasn't actually an issue with monitors that supported 100hz+ refresh rates like every CRT I ever owned.

Your pretty new, eh?
>won an argument
Lmao, if that's how your little world works.

>he cant get stable 60
PCucks on suicide watch.

But your 2012 PC can 1080/60 though! Lmao you fucking retards are liars and you slurp shit for breakfast lmfao. Keep on lying, PCuck. Your fucking 7970 is trash just like you are. Wake up and grab a 1080 if you want to 1080, but even then, you won't get stable 60 lmfaoo. Dumbass pcucks lying up and down lmao

...

>Movies and television look like utter shit at higher frame rates
Bullshit, maybe if you arnt used to it, it can feel a little weird, but once you are it's almost impossible to go back. It's like these old ppl saying how their TV from a decade ago looks better than a Samsung series 9 or an OLED. You arnt used to it so you don't like it. I've downloaded a few shows in higher frame rates than usual and they look fucking gorgeous.

Yeah. You lost. You have shit opinions. It's okay; a lot of people do. Go to a support group or something because you won't get any sympathy for your 30fps dick smoking here.

(You)

You OK dude?

0/10

I am and have been convinced, for the past few years this meme has been going on, that people are getting paid to shit on 60fps and say 30fps is superior, because modern consoles aren't powerful enough to handle 60fps given what the market demands in terms of graphics. Or the developers just suck at optimization.

Just because you're so desperate for attention.

Don't drink the bleach too quickly, PNiggers. Savor your deaths as you see your fps counter tick "60" one last time before your fucking power inefficient bricks explode and cause a house fire lmao. A fucking 7970 doing 1080/60, what a goddamn lie. Retards. If a 380x OC can't do 1080/60 how the fuck is a 7970 going to? Hahaha fucking idiots.

Stay cucked and keep on lying. Your tears are delicious.

They might just be developers of consoles and console games.

If you take all those people together, there will be a few thousand of them, enough to spread that shit on the internet.

>Its like these old ppl saying how their TV from a decade ago looks better than a
You know, except for when they're watching anything in less than 720p or 1080p which is being upscaled by the cheap bullshit chips built into the TV, in which case they are almost certainly correct.

Monitor/TV upscaling is kinda shit. Sony is launching a whole new PS4 just to include a custom hardware upscaling solution to go with their overpriced TV's because 1080p upscaled by a 4k TV looks like hot garbage.

Now imagine watching the Andy Griffith Show and Columbo upscaled from 240p to 1080p.

Firstly
>arnt
Secondly, HFR television looks like hot garbage
We've had, what, 6-8 years (however long it's been since 120Hz televisions started being heavily advertised) to ""get used to"" it, and the majority of people still turn that shit off, same with most post-processing garbage shipped with televisions these days.

>Y-you're just dumb if you disagree with me
lel

Mommy didn't buy you that graphics card?

But your opinion is the shit one

youtu.be/IkOsvs0MB3M
:^)
PCs can do 4k@60fps

You're dumb for a host of reasons; many of which you're probably too dumb to understand.

He's baiting, ignore him.
Only going to derail the thread.

Stable 30 fps isn't that bad for me unless I have a direct comparison to 60 fps. Of course, when possible I'd rather go for the higher framerate.

A constantly dropping framerate, or sub-30, is where I really start to notice it

Kek, 1080s in dual SLI. Stay mad and poor lmao. Keep on lying pcuck.

>t.Mad as fuck 7970 owners readying their nooses for ritual suicide lmfaooo

You must be 18+ to post here.

Pick a different game. Adjust settings. Use Virtu MVP software. I can run Freelancer at 1080p at nearly 200 fps with my old 7770.

The problem is that when watching a movie it looks like you're watching people make a movie, or a play, or a home movie, or something like that. It doesn't feel like a film.

Are you really that poor that you have to tell everybody what graphics you are running to show off because of your own personal insecurities?

>this shit I'm eating every time I'm watching a movie is what I'm used to so why would I ever change it?

>food analogy
IT'S BEEN A WEEK SINCE I'VE SEEN ONE GODDAMMIT!
REEEEE

Dude, I live in Australia where normal TV is 480p with like 3 720p channels; yes, it looks like shit. That's no fault of the TV tech since everything is built around 1080p or 2160p now. You don't buy some high-end sports car and then use the cheapest petrol you can find in it.

>the majority of people still turn that shit off
The majority of people don't even touch the settings once they unpack it for the first time. I change my settings depending on what I'm watching.
>most post-processing garbage shipped with televisions these days
depends on the engines they use and what you are watching. Sure, I'm gonna turn all that shit off for games, but I turn most of it back on and fiddle with it for movies to get it as perfect as can be

Again, because you're used to 24fps

The fact that you think literal human shit is food is very telling.

>implying
>eating shit
>you eat food
Threadly reminder analogies aren't arguments.

It looks awful, plain and simple
When I watch television and movies, I want the director's vision. Floaty soap opera crap isn't part of that vision (I would hope).

Man, I have a 144Hz monitor. These things are way more common than 120Hz monitors.

Hourly reminder: analogies are created to help stupid people understand what they initially failed to. If you find yourself constantly at the business end of analogies, you're probably wrong about everything you think. People are trying to help you understand by giving you the opportunity to see things from a different perspective, but your vehement refusal to do so is part of the reason why you have such shit opinions in the first place. If you were open minded to begin with nobody would feel compelled to teach you like they would a four year old.

tl;dr: pbbbbbpttbpbtpbtbtpbtpbtbt

>I want the director's vision
which is exactly what you don't get when watching most movies, because the execs get involved, cut a shitload of stuff, move scenes around, and generally fuck up the original vision

I'm used to both and I think it's a better experience with the lower framerate of 24. There's a reason why most movies are still at that framerate.

And I'd want to fuck it up further with that bullshit?

This webm is making my screen tear to shit, so I can't accurately compare them.

>He fell for the SLI meme
Congrats on spending double for a minuscule benefit.

>There's a reason why most movies are still at that framerate
Yes, it's the exact same reason console devs spout that 30fps is the best: most people are used to it and don't know any better, so they can't be bothered changing it. When they do try it (The Hobbit) they get beat down by all the normies who think it looks weird

Do not reply to this poster. He is not serious and is only here to bait.

This is one of the rare instances normies are right

do you even have hair on your nuts yet?

Movies and television have natural motion blur captured at recording, because it's a recording of actual events instead of perfectly rendered frames. This is why motion blur in games exists, but it's still artificial, which is why it's more often than not bad, unlike the natural blur in movies.
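
To put numbers on the "natural blur" part: film exposes each frame for a fraction of the frame period (the shutter angle), so motion smears across that exposure window, while a game frame is a single instant and any blur has to be synthesized, e.g. by blurring along motion vectors or averaging several sub-frame renders. Rough sketch assuming 24fps and the classic 180-degree shutter; render_at_ms is a hypothetical callback, not any real API:

    # Film: each frame integrates light over (shutter_angle / 360) of the
    # frame period, so motion smears naturally across that exposure window.
    # A game frame is an instantaneous sample, so blur must be synthesized,
    # e.g. by averaging several sub-frame renders (brute-force accumulation).
    fps = 24
    shutter_angle = 180  # degrees, the classic film look

    frame_period_ms = 1000 / fps                          # ~41.7 ms
    exposure_ms = frame_period_ms * shutter_angle / 360   # ~20.8 ms of real blur
    print(f"each film frame integrates ~{exposure_ms:.1f} ms of motion")

    def accumulate_blur(render_at_ms, start_ms, samples=8):
        """Average `samples` instantaneous renders spread across the exposure
        window. `render_at_ms` is a hypothetical callback returning a frame
        (e.g. a numpy array) for a given point in time."""
        frames = [render_at_ms(start_ms + i * exposure_ms / samples)
                  for i in range(samples)]
        return sum(frames) / samples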

There is simply no need for anything above 30.

Waste of resources and money that could be better spent elsewhere.

I stop being able to tell the difference (but the motion blur is still obvious) at 48.
Can't test 96 vs 60

>literary autists