>using more than x4 anti aliasing
Using more than x4 anti aliasing
>using SMAA
>MSAA
Are we back in the 90s?
I like the jaggy look honestly and too much blur sometimes makes me nauseous. What's that effect all the modern games use for explosions nowadays? That shit absolutely slays me.
>Graphics settings
>Anti-Aliasing: On/Off
>not using DSR
What even are those?
It's not blur unless it's something like FXAA.
>tfw forcing 32x aa
There's like 4 different kinds of AA and I have no idea what any of them mean so I just press what's lowest because I assume that means it's the best.
>not just forcing SSAAx4
>mfw using downsampling to make an old game look much better
>mfw console-only plebeians will never know this feel
MSAA is not a viable anti-aliasing solution in 2016 because you need to store at least 3 screen-sized frame buffers for deferred lighting (used by just about every game in the 2010s) which means your VRAM gets BTFO if you multiply your screen size by 2x, 4x, etc. (which is practically what MSAA does). So yes, current year.
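Rough back-of-envelope in Python if anyone cares (the four-target G-buffer layout here is just an assumption, every engine packs it differently):

# toy estimate of G-buffer memory at 1080p when the whole thing has to be multisampled
width, height = 1920, 1080
bytes_per_pixel = 4            # one 32-bit render target
gbuffer_targets = 4            # e.g. albedo, normals, material params, depth

for msaa in (1, 2, 4, 8):
    size = width * height * bytes_per_pixel * gbuffer_targets * msaa
    print(f"{msaa}x MSAA: {size / 2**20:.0f} MiB just for the G-buffer")

# prints roughly 32 / 63 / 127 / 253 MiB, and that's before textures,
# shadow maps, post-processing buffers, etc.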
MSAA: works well, medium taxing
FXAA: vaseline filter, low taxing
TXAA: smart vaseline filter, low taxing
Supersampling: strongest form, high taxing
Then you're retarded
TAA is usually the lowest and """""""""""""""best"""""""""""""" but most of the time it just makes your game run at a lower resolution and upscales it, causing everything to look blurry as fuck
There is also DLAA, MLAA, etc. but they don't get talked about much. IIRC DLAA was used in the latest Killzone.
Oh damn, that explains perfectly why MSAA destroyed FPS in Mankind Divided.
Yes, I don't know why they even gave the option. It just causes people who don't understand it to complain. I guess they fell for the >anti-aliasing: on/off complaints meme.
Yeah it was pretty annoying people were bitching so much about framerates and all the benchmarks had MSAA cranked up, people obviously have no idea how demanding it is.
Depth of Field
Motion Blur
Or sometimes it's just bundled in under Effects
>smaa
>xfaa
>msaa
>fafa
>nasa
>fuckmyasa
WHY
AT LEAST HAVE A FUCKING TOOLTIP EXPLAINING WHAT THEY ARE
I DONT KNOW THE FUCKING DIFFERENCE
why don't they just make graphics without jaggies?
>I DONT KNOW THE FUCKING DIFFERENCE
They are all buzzwords used by either Nvidia or AMD to sell their graphic cards. Just like 4K and VR.
>not just rendering at double ppi
Because it's stupid to limit games.
Vega10 is already confirmed for 16GB, it's probable that the 1080Ti will be 12/24GB, and that's only the coming generation of cards.
When I replay the game in 3-4 years with a 64GB HBM3 RX-890, I don't think I'll care if it's using a little more VRAM.
I agree most people are stupid though, and it was evident when idiots tried to use it today.
Anti aliasing has never looked good.
Why are you fucking up fine detail to smear fat on your monitor.
>motion blur
>using AA ever
>not enjoying that crisp look
>liking smudgy cuckshit
>playing GTA5 without AA
>babies who have only ever used FXAA have joined the thread
...
Why do people not understand what good AA does?
>motion blur is a setting in 90% of games now that is enabled by default
what retards think this looks good
For console games I can understand it, to give the illusion of smoother gameplay.
>not using a HD7950 in 2016
>playing FPS
>ok I'm going to turn a bit
>ok now I'll turn fast-
>MOTION BLUR ACTIVATE
>screen goes all smudgy to simulate me having bad eyesight or having low framerate or a shitty monitor or something
>good AA
That's an oxygen mormon if I ever heard one
So what's the best form of AA in this current age?
Off is pretty good
What happened to CSAA?
On
Just take it with a grain assault, please.
Someone just tell me what the best AA is.
I always just use the highest setting but I don't know if it's actually the best.
4K
Motion blur has good uses, it just gets abused by effect happy devs - much like bloom, DOF, filters. etc.
Use 1 is to make sub-30 fps motion seem more fluid.
Use 2 is to prevent headaches from gaming at 120+ fps
I liked the motion blur in Crysis. It was annoying when cranked up to the max but it can be nice if it's subtle enough.
What's the AA that Valve uses? Why can't they use that everywhere? That AA's great.
>using anti aliasing
>Depth of field
Whoever invented this is a fucking retard. I know depth of field exists in real life but that's because you're not actually focusing on the parts that are fucking blurry. It shouldn't be in vidya.
How are they buzzwords? Picking the right AA method can help you avoid performance hits AND improve image quality a lot.
Cranking up DPI via downscaling or just monster resolution >> MSAA > SMAA > vaseline filters
I really don't understand the need for TAA from Nvidia or even FXAA when SMAA does what they do (hitting the most annoying and easiest-to-remove edges for little performance cost) better and without the vaseline.
SMAA does a very good job at getting rid of aliased edges (like a treeline against the sky) and it doesn't really have any noticeable impact on frame rate.
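For the curious, the first stage of SMAA (and FXAA) is basically just luma edge detection; the clever part comes after. Rough numpy sketch of that first stage only, with a made-up threshold (real SMAA then classifies edge patterns and computes blend weights, which is why it doesn't just smear everything):

import numpy as np

def luma_edges(rgb, threshold=0.1):
    # rgb: float array of shape (H, W, 3) in [0, 1]; returns left/top edge masks
    luma = rgb @ np.array([0.2126, 0.7152, 0.0722])    # Rec. 709 luma approximation
    d_left = np.abs(luma - np.roll(luma, 1, axis=1))   # difference vs left neighbour
    d_top = np.abs(luma - np.roll(luma, 1, axis=0))    # difference vs top neighbour
    return d_left > threshold, d_top > threshold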
Is this all one person?
This. I can look at more than just the middle of my screen in an FPS, I'm not an idiot.
This.
>works in most if not all games out of the box without fucky incompatibilities in every second game just because it's nothing more than a simple resolution change
>impacts performance less than MSAA while looking just as good
If there are downsides, I haven't found them. There are some games with non-scaling UI, but it just means the devs are retarded.
Same thing with motion blur. Things moving fast are harder to see anyway, no need to coat my screen with vaseline.
SMAA is taxing. FXAA is FAST APPROXIMATE AA. So it is fast, but does not look as good.
There are people who actually believe this.
I mean, I do prefer no AA over some shitty vaseline-like FXAA, but come on.
The difference is too small to matter. Even with heavy SMAA you're looking at like a 5% FPS impact. Having all that blur just isn't worth it.
>chromatic aberration
Yeah, that shit's fucking disgusting.
>not 8k downsampling
It exists to hide LOD, especially in console games
I'm blind enough in real life I don't need a fucking "lost my glasses" simulator when I play a game too
>Ambient Occlusion
For such a measly effect, shit takes up a lot of performance.
are all these retarded effects to make it look like I'm playing a movie or something?
Fucking grinds my gears. Why is this allowed?
An uncompressed 1080p 32-bit buffer is only about 8MB; count one for color, one for normals, plus the ones for grading, static shit and other post-processing like blur, and you're still only putting out somewhere around 50-150MB of frame buffers. That's not that much considering the average VRAM is now 6GB this gen.
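For the doubters, the arithmetic (Python, assuming plain 32-bit targets; real engines mix in fatter HDR formats, so pad these numbers a bit):

width, height = 1920, 1080
target_mib = width * height * 4 / 2**20        # one 32-bit 1080p render target
print(f"one 1080p target: {target_mib:.1f} MiB")             # ~7.9 MiB
buffers = 12                                   # generous guess: G-buffer + HDR + blur + SSAO + ...
print(f"{buffers} of them: {buffers * target_mib:.0f} MiB")  # ~95 MiB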
But AO makes a huge difference. You gotta actually worry about lighting beyond shadows on/off if you want to reach photo-realism or even any other artstyle. Even artists shade their stuff.
What if I'm playing at 720p?
>Dying Light
>Can only turn it off through third party program
>FPS improves greatly when it's off
Why do devs use this shitty effect?
>1024x768
>4x MSAA
>anti-aliasing:
>off/FXAA/SSAA
I think it depends on a games style whether this looks good or not. Bloodborne is the only game I can think of that it looks decent.
They kind of make a good point. If you wanted good AA, you'd just downsample
One downside I found is that, if you're using a high refresh rate monitor, the maximum refresh rate will lower at higher resolutions.
For example, I can have 144Hz at 1080p but only 100Hz at 1440p and 60Hz at 1800p.
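Assuming those are actual output resolutions, that's almost certainly just the link bandwidth cap of the cable/port, not the monitor being weird. Quick sanity check in Python, ignoring blanking overhead and taking 1800p to mean 3200x1800 at 24 bits per pixel:

modes = [(1920, 1080, 144), (2560, 1440, 100), (3200, 1800, 60)]
for w, h, hz in modes:
    gbps = w * h * hz * 24 / 1e9
    print(f"{w}x{h} @ {hz}Hz ~ {gbps:.1f} Gbit/s")

# all three land around 7-9 Gbit/s, i.e. right at the ceiling of an older
# HDMI / dual-link DVI class connection; a newer DisplayPort link raises it.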
It's the modern form of shakycam, i.e. everything is filmed through a GoPro.
>ambient occlusion
>it's the shit cheap kind of ambient occlusion that causes white halos around everything
>Motionblur
[LOUD VOMITING]
I can imagine games set in CURRENT YEAR / near-future using the effect, but a Victorian-influenced game has no business using CA.
Where were you when Nintendo invented AA for consoles in 1996 that actually worked?
I am actually. 60fps in everything I throw at it, unless it's bound by my i7 960. Buying new cards is a meme
>Deferred lighting means we have to go back 15 years to when SSAA was the only usable non-post-process AA method
Right around the time Majora's Mask ran at around 20 FPS constant.
>constant
I'm not so sure.
>post process AA apologists
>GTX970
>8GB RAM
>i5 4670K
>can't play games with AA on
I should have bought a PS4
It's the price for smooth jaggy free graphics, user.
Don't you like high image quality?
SMAA is a godsend.
>I don't understand the technical difference between different AA algorithms so it must be buzzwords
No excuse being this ignorant in 2016.
>impacts performance less than MSAA
It's fucking supersampling, the heaviest method of anti-aliasing. MSAA was literally developed as a lightweight alternative you moooooroooooon.
What kinda games are we talking about
How do I get dsr to work in games? Do I need that nvidia geforce experience program?
>enable AA
>everything becomes blurry
no thanks.
SMAA doesn't mask aliasing very well.
Alien isolation is another example.
>SMAA
Nothing that blurs out text/ui edges is a godsend.
>Turn on DSR
>Go in options menu in-game
>Select desired resolution
>buzzwords
Nah. They're different techniques used to eliminate aliasing (the jagged edges you see especially in motion).
Fuck user, you want an explanation? I don't mind typing one out at the moment but I think you'll get a faster result just googling "anti aliasing techniques".
Good thing it's applied before any UI element is rendered, then.
Then explain to me why a few recent games with even x2 MSAA run worse for me than if I run them in 1440p via DSR
Looks good for racing games to give you that sense of speed, but it's fucking stupid in just about everything else. FPS games with motion blur just give me a headache. Why the fuck devs spend so many resources on shit that blurs their otherwise impressive-looking games is beyond me. Thank God PC games let you turn that shit off most of the time.
I never realized how important AA was until I played 3D games on portables.
Deferred rendering is going out of vogue. Clustered forward rendering is where it's at in 2016 and you can use MSAA just fine.
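For anyone wondering what "clustered forward" actually means: chop the view volume into a grid of clusters, build a per-cluster list of the lights that touch it, then shade forward as normal reading only that short list, so you keep lots of dynamic lights AND ordinary MSAA since lighting still happens in the geometry pass. Toy CPU-side sketch of just the light-culling step, all data made up:

import numpy as np

def cull_lights(cluster_bounds, lights):
    # cluster_bounds: list of (box_min, box_max) view-space AABBs
    # lights: list of (center, radius) point lights in view space
    # returns, per cluster, the indices of the lights overlapping it
    per_cluster = []
    for box_min, box_max in cluster_bounds:
        hits = []
        for i, (center, radius) in enumerate(lights):
            closest = np.clip(center, box_min, box_max)       # closest point in the box
            if np.sum((closest - center) ** 2) <= radius ** 2:
                hits.append(i)
        per_cluster.append(hits)
    return per_cluster

# 2x2x2 grid of unit clusters and two lights, just to show the output shape
clusters = [(np.array([x, y, z], float), np.array([x + 1, y + 1, z + 1], float))
            for x in range(2) for y in range(2) for z in range(2)]
lights = [(np.array([0.5, 0.5, 0.5]), 0.6), (np.array([1.8, 1.8, 0.2]), 0.3)]
print(cull_lights(clusters, lights))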
Then you would be playing games on medium/high with no AA on.
Devs are just using Temporal AA now anyway
Why not both?
AA doesn't get applied to Scaleform UI elements
Apparently they added a toggle for CA in a recent-ish update. Can anyone confirm that shit?
MSAA + TAA = TXAA, at least Nvidia's implementation.
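The TAA half of that, stripped to the bone: jitter the camera a fraction of a pixel every frame and blend each new frame into a running history, so over time each pixel is effectively an average of many sample positions (poor man's supersampling spread over time). Toy version on a static, fake "render", leaving out the reprojection and history clamping real TAA needs to avoid ghosting on moving stuff:

import numpy as np

def render(jitter_x, jitter_y, w=64, h=64):
    # stand-in renderer: point-sample a hard diagonal edge at a sub-pixel offset
    ys, xs = np.mgrid[0:h, 0:w]
    return ((xs + jitter_x) > (ys + jitter_y) * 1.7).astype(float)

history = render(0.0, 0.0)
rng = np.random.default_rng(0)
for _ in range(32):
    jx, jy = rng.random(2) - 0.5                      # sub-pixel jitter in [-0.5, 0.5)
    history = 0.9 * history + 0.1 * render(jx, jy)    # exponential accumulation

# 'history' now holds smooth fractional coverage along the edge instead of hard
# 0/1 steps; that's the anti-aliasing, and also where the blur complaints start
# once the scene moves and the history has to be reprojected.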
If you guys are curious how the N64 did anti-aliasing, here it is.
I think I have to set windowed mode?
Unless your desktop resolution is the DSR-only resolution you want to use, no.
>Mouse acceleration is on by default
>you have to configure the settings file to turn it off
Fucking kf2
>144Hz at 1080p
>60Hz at 1080p
Which one user?
Also, AA is shit, all of them.
>tfw have no idea what any of this is
>tfw just want to play gaem
I said 144Hz at 1080p and 60Hz at 1800p. It wasn't a typo.
>Also, AA is shit, all of them.
No.
>all of them
>what is MSAA/FSAA
Why did you come to the fucking thread?
Make your choice
Is there a sharpening filter or does it really look that bad?
...
Second. No choice really.
It's the in game sharpening filter
Damn. That makes the game extremely edgy.
>using sharpen post-processing
Antialiasing obviously helps, but why that? You just made the comparison inaccurate.
Why the fuck do they do that?????? SO many games look like trash. AA wouldn't even be needed if they fixed this filter.
I think the people who don't like AA just have blurry vision.
To be fair here's no aa and no sharpening
It eludes me why anyone would use sharpening this heavy. Instead of bringing out details like it's meant to it actually ends up crushing them.
>applying a filter that creates more jaggies
Why would anyone do this?
Unfortunately the only really good solution is downsampling. Most AA solutions today that aren't post-process don't cover everything and stop working if there are any alpha effects going on. Also I still haven't seen a solution for specular aliasing which is really bad in games like witcher or dark souls with lots of shiny metal.
I hated the TXAA in Fallout 4 since it blurred the whole screen when you moved the mouse but I found that the solution to that was to add LumaSharpen in the enb/sweetfx I had and it eliminated the blur almost completely and made the image a lot sharper without murdering my eyes like that witcher filter.
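LumaSharpen (and most in-game sharpen sliders) is basically unsharp masking: subtract a blurred copy of the image from itself and add the difference back, scaled. Crude numpy sketch of the idea, not the actual SweetFX shader, strength picked out of thin air; push it too far and you get exactly that crushed, ringing Witcher-filter look:

import numpy as np

def unsharp_mask(img, strength=0.5):
    # img: float array (H, W) in [0, 1]
    # cheap 3x3 box blur built from shifted copies (edges wrap around)
    blurred = sum(np.roll(np.roll(img, dy, axis=0), dx, axis=1)
                  for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9.0
    detail = img - blurred                    # the high-frequency part
    return np.clip(img + strength * detail, 0.0, 1.0)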
Yep.
Film has severe limitations that companies like Kodak worked for decades to improve.
Now that we can generate perfect images, complete fucking idiots are deliberately programming in the flaws back in.
>Film Grain.
May the Fuckwit who first thought of this be dragged backwards through a thorn bush.
>Played Outlast, horrible effects made me nauseous, was about to get refund but found a post which explained how to remove effects.
>Dumb fucking Devs almost coded themselves out of my money
>Buys Nvidia product expects good results
i like the effect in the kane and lynch sequel, but everything else eats shit.
It's still not particularly mainstream (though it probably will be in the majority of games within a year or two), and even if you use it you likely have a bunch of extra screen buffers, like DOOM still has two extra regular buffers and one HDR buffer in addition to the color buffer.
If the alternative is being unable to have more than 8 dynamic lights in a scene it's a good tradeoff.
>check for motion blur
>turn cross hair slightly
>game turns into a fucking unfinished painting for 2 seconds
>FOV: Regular/Expanded
>Graphics: On/Off
I got a 1070, with 2x msaa I'd get 44 fps, with it off 78. It's fine though, with a 1440p screen the jaggies are nowhere as bad as my old 1080p screen.
>Keybinds: On/Off
>FXAA
>everything looks blurry
>normal AA
>smooth edges
Why can't any form of AA do what downsampling does without effort? Why doesn't someone just make a type of AA that has the same effect on edges?
Downsampling is literally rendering shit at a higher resolution. Of course it would need a lot of power.
Hi friends,
I have a 1680x1050 display. If I run DSR for most games, would that look better than running 1080p at about the same DPI?
Do you have any idea what you're talking about?
There are so many things wrong with your question I don't even know where to start
But if it has to be shrunk back down to size anyway, why bother with rendering it at such a high resolution at all, is the point. Why can't someone just make a form of AA that gives the same effect without a higher sample at a lower resolution?
>15 years ago
>"we give you AA options if you have to play at low resolution so your game will look better, no need to enable AA on higher resolutions, it will just look blurry"
>now
>"ENABLE 500XMULTIHYPERSUPEREXTRA AA ON 4K GAMING IF YOU DONT IT WILL LOOK LIKE SHIT"
What the fucking hell happened. I swear to fuck.
The high resolution is what kills the jaggies. I don't think you're understanding.
Play 1080p with no AA and any retard can see jaggies everywhere.
But downsampling not only gets rid of jaggies, it gives overall better image quality, so if you've got the power why not go for it. AA is for poor people.
>get a 1440p monitor
>don't have to use AA in most games
ah yes
I like TAA. It gets a lot of shit for being blurry, but at least it actually anti-aliases, unlike FXAA and SMAA which make things look blurrier and still jagged somehow.
Maybe if you are a dumb kid and put your face 5cm from the damn screen.
Technology goes forward, clean your crypt sometimes, geezer.
The edges look rougher but within objects the textures look sharper.
That's really the kind of trade off you take when using AA.
So does AA. But the way that AA obscures the "jaggies" doesn't hold a candle to what just downsampling from a higher resolution does. For all intents and purposes, both standard AA and downsampling are forms of getting rid of jaggies.
If you're playing at 1080p, no matter whether you use traditional AA or downsampling, the image you're getting is always 1080p.
If both are literally the same size on screen yet downsampling looks much better for some reason, there should be some method of implementing an AA algorithm that has the same effect.
HD5850 here
retard
>If you're playing at 1080p, no matter whether you use traditional AA or downsampling, the image you're getting is always 1080p.
Not true. You'll notice in some games UI is still scaled to high resolutions.
Anyway, the only way to come close to replicating downscaling is forcing x64 MSAA, but good luck playing something modern at a decent framerate.
There are tons of AA types that give variable benefits for variable performance hits. But the best option will always be to just render the game at higher resolution so the game automatically smoothes stair-stepping.
Companies are constantly working on new techniques that give the benefit of a bigger resolution for less performance hit.
>Use sweet DSR.
>HUD is not adjusted proportionally.
fucking why
>technology goes forward
How the fuck is something that was for old shitty technology "going forward"?
I don't get people these days. For fuck's sake, at least try to think a little bit before showing your stupidity.
Not my fault modern game devs can't even make high resolution look good.
Zoom into that pic and see how things are literally drawn as straight lines that then jump over a ton of pixels all at once, which is what makes it look jaggy. Jesus fuck, learn to smooth shit out and high resolution will take over from there.
>15 years ago
>use aa or look shit
>now
>better aa or downsampling with more powerful hardware not to look shitty
shut the fuck up grampa your grave is calling.
What the fuck are you on about.
You want to implement a form of AA that has all the benefits and does the exact same thing as SSAA but doesn't use as much processing power through some "algorithm"?
This is like Karl Pilkington talking about graphics. Why can't they do a algorithm what does the same thing but doesn't slow ye down.
what is depth of field exactly? should i take it off?
>Not true. You'll notice in some games UI is still scaled to high resolutions.
Makes 0 difference. The size of your monitor did not change and it did not magically grow pixels out of nowhere. The image on your screen is still 1920x1080 yet downsampling looks much better meaning it's doing something to a 1080p image that conventional AA cannot.
Hell, it's probably being rendered at low resolution and upscaled, that's why it looks like shit.
LEARN TO READ, KID.
15 years ago, if YOU USED LOW RESOLUTION (640x480) YOU USED AA, if you didn't, you didn't fucking bother.
Jesus fuck, kids. GO BACK TO SCHOOL AND LEARN BASIC SHIT OR SHUT YOUR FUCKING MOUTHS AND LET PEOPLE WHO ARE A LOT OLDER AND WISER THAN YOU TELL YOU HOW SHIT IS.
>tfw playing at 720x576
>tfw downscaling from 1024x576
Poorfag life ain't easy.
Two idiots in this thread:
>durr why can't they just make a better AA algorithm that's perfect?
>hurr all AA is bad and ruins the image
It's not a 1080p image. It's a 4K image squeezed onto a 1080p monitor. You can see the benefits of a high res photo without needing a 4800x3600 monitor, can't you?
It tries to mimic a camera. If a camera is focused on something in the foreground, the background will be blurred and vice versa. It's a preference really, but personally I turn it off because I don't particularly like most post-processing techniques.
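For reference, the camera math games are approximating: a thin lens with focal length f and aperture diameter A focused at distance S1 turns a point at distance S2 into a blur circle of diameter roughly A * |S2 - S1| / S2 * f / (S1 - f). DOF shaders just drive a screen-space blur radius with that kind of curve. Quick Python check with made-up lens numbers:

f_len = 0.050    # 50 mm lens
A     = 0.025    # f/2 aperture diameter, in metres
S1    = 2.0      # focused at 2 m
for S2 in (1.0, 2.0, 4.0, 10.0):
    coc = A * abs(S2 - S1) / S2 * f_len / (S1 - f_len)
    print(f"object at {S2} m -> blur circle ~{coc * 1000:.2f} mm on the sensor")
# the in-focus distance gives 0, everything nearer or farther gets blurrier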
Seeing a blade of grass like that in the far-off distance really makes people that mad?
>oh no! there's some jaggies on a blade of grass! COMPLETELY UNPLAYABLE!!
I think DOF is alright when applied sparingly, like in Crysis 2 when you customize your gun or in DOOM when pulling up the weapon wheel (though no one uses that anyway). Having some kind of always on DOF is apex retardation.
Downsampling, aka supersampling, gives a smoother image because each pixel on your screen is the result of blending multiple pixels from a bigger "screen". That is the algorithm. How are you gonna blend the pixels from the bigger screen without rendering the bigger screen?
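Which in code is about this dumb. A minimal sketch assuming plain ordered-grid SSAA with a box filter (drivers use nicer filters and sample patterns, but the point stands: you can't blend samples you never rendered):

import numpy as np

def ssaa_downsample(big, factor=2):
    # big: float array (H*factor, W*factor, 3) rendered at the higher resolution
    # averages every factor x factor block into one output pixel
    h, w, c = big.shape
    blocks = big.reshape(h // factor, factor, w // factor, factor, c)
    return blocks.mean(axis=(1, 3))

big = np.random.rand(2160, 3840, 3)     # stand-in for a 4K frame
small = ssaa_downsample(big, 2)         # -> (1080, 1920, 3), i.e. the 1080p output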
It doesn't make people mad, but if you can improve image quality, why wouldn't you?
...
You mad, gramps, that time is passing you by?
15 years ago your "high" resolutions still looked shit, you just didn't know better.