Turns out Nvidia drivers downgrade lighting and shader effects in order to gain a few FPS

I guess it's to be expected from the company that brought us 3.5GB; I only wonder what else they're skimping out on at this point.

...

Post the fucking link, you're a lying faggot.

youtube.com/watch?v=aIbSeMcsOqA

>people still unironically buy nvidia

I noticed this in TW3 too, but everyone kept saying I was crazy. So we started comparing screenshots, and then people started accusing me of trolling by turning down the settings. Nvidia cheats, man.

Gotta run that level editor better

Fucking LMAO

Here's another one

...

This is not the sort of thing Nvidia would/could do at a hardware or driver level. The only realistic thing they could do is use lower precision for lighting, but missing effects is 100% down to the developer.

The likely scenario is the devs remove certain effects on certain cards to get a good framerate, but a GPU has literally no idea how to "remove le water effect from halo".. it just processes shader instructions.

Source: I'm a dev
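
To illustrate the point: here's a rough sketch (assuming an OpenGL game; the config struct and effect names are made up for illustration) of the kind of app-side branching that actually removes effects on specific hardware. The driver just executes whatever shaders the game submits; it's the game that can look at the vendor string and pick a cheaper preset.

```cpp
// Rough sketch only: a game (not the driver) choosing cheaper effects per GPU vendor.
// EffectConfig and its fields are hypothetical names; glGetString(GL_VENDOR) is a
// real OpenGL query (requires a current GL context).
#include <string>
#include <GL/gl.h>

struct EffectConfig {
    bool volumetric_lighting = true;   // hypothetical quality toggle
    int  shadow_map_size     = 2048;   // hypothetical quality knob
};

EffectConfig pick_effects_for_gpu() {
    EffectConfig cfg;
    const char* vendor = reinterpret_cast<const char*>(glGetString(GL_VENDOR));

    // App-side decision: ship a cheaper preset on a given vendor/card to hit a
    // framerate target. The GPU itself just runs whatever shaders it is given.
    if (vendor && std::string(vendor).find("NVIDIA") != std::string::npos) {
        cfg.volumetric_lighting = false;
        cfg.shadow_map_size     = 1024;
    }
    return cfg;
}
```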

...

Here's another video, this time in the new Doom, but the same issues.

youtube.com/watch?v=FVb25eomcrI

>pay more to downgrade graphics
lol

Now you're just shitposting because I have the same GPU and it doesn't do that on DX12

It's a crazy faggot

Source: I'm a dev

That's great OP but check this out

rt.com/uk/361566-horny-sex-spiders-homes/

>Last year, two schools in East London were forced to evacuate hundreds of students after an infestation of the spiders was discovered.

Don't fuck with spiders

>"issue"
Funny.
If this was AMD pulling this shit, we would have another name for it.

neat

That's cool and all, but that's not even the same angle.

>GTX980-TI
>DX12
>Blurry screenshot
No.

Interior screen shot on a Radeon 390x.

Prioritizing FPS over quality seems like the right move. Should be like that for every game.

What the fuck nvidia

Now compare to how interior screenshots look with a Geforce 980ti. Shading goes to shit, all the colors get flattened by the ambient lighting, and it looks like a cartoon in interior scenes on the Nvidia card. This did not happen with the Radeon.

Yeah the FPS goes from ~40 to ~80 but it can't maintain the graphical fidelity.

That's clearly a case of textures not loading in time. No need to lie to make a point.

Could you at least put in more effort and take a screenshot that is remotely similar to the one you posted before?

watch the doom video

Guess we'll see how my 1070 holds up. Should be arriving shortly. Pretty sure there'd be no reason for 'cheating' extra frames out when you are already getting 100+.

Of course not. I don't have the Radeon any more and I didn't get the Nvidia card for doing comparisons. Even if I did, I can't just hot swap video cards.

Yeah, I'm not saying nvidia is any better than amd, but that Hitman picture is just plain old shitposting.

Sorry bud, it's all nvidia cards that do this.

You won't notice anything until you get a chance to play the same game on the same settings with both an AMD and an Nvidia card - that's part of the reason why people never really noticed until now.

It's a fairly small difference so for the most part people thought they were just seeing things. Now we're starting to get more and more actual comparisons.

Coincidentally Nvidia "optimized" their drivers for Ashes of the singularity and it improved performance but introduced lighting issues, kind of like what's being seen in the thread. Then the devs redid their rendering for the game and the lighting "issues" went away, but so did the performance improvement for the nvidia cards.

nVidia have been doing this since 2000...

It's not like they ever stopped...

Both Nvidia and AMD are horrible. It's like choosing between two piles of shit. You just pick one that tastes better for you.

What a waste of time watching that, kys

Why does nvidia keep pulling jew shit like that?
You'd think they would have learned their lesson when the latest consoles all use AMD hardware, but no, let's fuck our customers some more.

That's why I buy nvidia instead. You small petite AMD faggots CONTINUE TO CRY LMAO

It's so they can win at those graphics card comparisons because all they look at is the fps counter. Therefore they'll still cheat to get an extra 10 frames, even at 100+ fps.

I wonder if the ASA would have anything to say about this. Probably not as the tests are all independent.

This was exactly my experience, to a T, when switching from Radeon to Geforce.

>my nvidia card just melted

what should i do?

kek... so that's nvidya's secret to getting more frames.

This is just conspiracy theory tier bullshit. NO ONE has any actual conclusive evidence to prove Nvidia is rendering anything any differently than AMD is. It's literally just shitposters lying and messing around with contrast and gamma settings.

New Amd cards when?

>swims in the pool full of shit instead of water just to "impress" the ladies

my sides, user

Sometime between now and March 2017

>58 fps vs 45
>53 fps vs 37
I'll take my gimped lighting effects as long as they come with a playable frame rate. The differences probably wouldn't even be noticeable while actually playing a game.

To be honest, recent NVIDIA drivers have been junk.
But AMD is still shit

>benchmarking with the lower memory edition
Lel.

Why not just lower your graphics settings to get higher FPS?

What's the point in Nvidia doing it for you and hiding it? All that accomplishes is now you have less control over your graphics.

I mean, if you take that to the logical extreme, why not have the nvidia card render everything like an old PlayStation 2 game? So long as you get the FPS, right? Who cares if you want more graphical fidelity; nvidia knows better than you do, so enjoy your PS2 graphics at 120fps while the AMD poors are playing on Crysis settings at 70fps.

The difference is so minuscule that it's not important. And don't hit me with that slippery slope bullshit, I'm not going to fall for it.

>mfw I didn't even buy my 1070 and still have better fps than any other AMShit ware.

The difference is not minuscule at all, and if you are going to ignore past and present controversies nVidia has been involved in just for the sake of avoiding an argument, why even post at all?

I'm an Nvidia user too, have been all my PC life, but the other user does have a point. It's pretty scummy to make these alterations behind the scenes without giving the user the ability to control them, especially on a platform that prides itself on being open. The difference is minuscule for the moment, but this sort of thing really should be stopped before they make it a habit and move on to even worse moves that limit the user.

The 1060 has a better looking blur effect though. Look at the pipe on the left and at the character's jacket, the blur effect is pronounced on both cards but Nvidia's looks better.

>avoiding an argument
I'm not trying to argue anything, I'm just stating my opinion that the differences in lighting effects shown in this thread are not important enough to me to sacrifice 15 fps.

The performance difference from these changes is probably more like 5fps, not 15fps

You do have the ability to control it. Nvidia defaults to Quality instead of High Quality because most people won't notice the difference but they will notice the better FPS.

It is a bit unfair, but it's not like AMD can't do the exact same thing.

AMD control panel does the same thing.

You can see in the images that it's higher than 5 fps.

Oh wow

Except it defaults to "let application decide"
Also it literally only changes AF and AA settings

Anyone notice the guy who posted it point out that rolling back the drivers fixed the issues?

In this particular case though, does it actually result in better, or at least identical, shader and texture quality? Because if so, then no harm done. If not, then the settings you shared probably only affect other things like AF and anti-aliasing.

Don't mess with this. It barely does anything, mostly relating to AF, which is largely irrelevant to performance, but many games assume default settings and can produce strange results, like certain textures failing to load, or only extremely low quality textures loading.

Big deal. Just turn it back up? Odds are when it becomes a problem you won't even connect that it is the culprit.

Problem between keyboard and chair.

>37.5 fps

>37.5 FPS

Now that is just embarrassing

>people constantly complain that game is unoptimized if they can't play game max settings 60 fps
>but are alright with drivers forcing settings to low quality since it gives more frames
Then just lower the settings yourself, it's exactly the same shit. You think you're playing on ultra, but it's actually low-medium.

So you're saying don't update my drivers?

>medium settings performing better than max settings
Geez, I'm shocked.

>nvidia fucks up a driver update, disables something they didn't mean to
>hey guys look at this they're turning graphics settings down manually
You're assuming they're doing it on purpose. Give it a week or so before you jump to conclusions; it could very well be them being retarded and making a mistake. It isn't uncommon for new drivers to introduce new issues like this.

The GTX 1060 isn't a perfectly direct 1:1 counterpart to the RX 480, bud; they have differing levels of performance depending on the game and the API.

I agree, it's just a conspiracy theory

I can play with graphics settings too mom...

The left looks better though. And the W10 version looks better than them both.

What the fuck is going on in this thread?