Video game graphics

>graphics are improving slower than ever
Welcome to the law of diminishing returns

E.g. compare the jump from 2006-2011 graphics with the jump from 2011-2016 graphics.

It feels like graphics haven't changed much since 2013. We won't have photorealistic graphics in our generation.

Pic related

Did you play Silent Hills P.T.?

>E.g. compare the jump from 2006-2011 console graphics with the jump from 2011-2016 console graphics.

>It feels like console graphics haven't changed much since 2013. We won't have photorealistic console graphics in our generation.

Fixed that for ya. Silly OP.

What's causing them diminishing returns? GPU power is still increasing every year, and game models are good but not "cinematic cutscene" good yet, so there's room for improvement.

We've reached the limits of silicon processors; real-time graphics won't get much better until we overcome that, because all we do now is find more ways to parallelize processing, and that has diminishing returns as well.

The difference between console graphics and PC graphics is so overstated by Cred Forums it's ridiculous, and this is coming from someone who hates consoles.

>What's causing them diminishing returns?

Consoles.

Remember how good The Witcher looked at E3? Then it looked like shit, comparatively speaking, at retail. Our GPUs aren't powerful enough.

If you want an idea of how graphics will look in 5 years, watch E3 demos.

The downgrade was strictly for consoles; most of the downgraded effects cost nothing for PC.

Consoles are holding PC back. And I'm enjoying drinking PCucks' tears.

The initial burst in graphics can be explained with this picture. Polygon count doesn't matter so much anymore. Newer technologies that help are slowly coming into being. Take Hairworks from The Witcher 3, or some sort of special ambient occlusion used in the new Tomb Raider. There is also a future lighting model I heard rumored at some point, that hardware needs to catch up to in order to be used properly... I think it was called "Ray tracing".

GPU power only increases linearly.
To see dramatic improvements in graphics, like the jump from 2D to 3D or to shaded 3D, it needs to increase exponentially, like GPUs that are 10 or 100 times as fast.

>if a game doesn't have good graphics we won't play it!

Good, fuck off. Leave video games to the real men. If you can't figure out vanilla DF, you're shit.

That picture is complete bullshit; the 60,000-tri model is just the 6,000-tri model upscaled with a smoothing filter. Someone even made a picture that debunks it.

I think hair and fabric physics will be the next big jump in quality we see. Fluids and particles are already starting to show vast improvements, and I think in the next decade or two we'll see hair and fabric that looks like it was pre-rendered for a movie or something.

The Witcher 3 can't even be maxed out at 1080p on a 970 without huge framerate drops. It's not just consoles. Most PCs can barely handle the downgraded version.

The Division
E3 2013 demo vs 2016 retail on PC
youtu.be/a80oClgNFJo
Our current GPUs simply can't handle it

debunk this

ps1

supergamestrader.com/wp-content/uploads/2015/11/metal-gear-solid-4.jpg

ps2

i.kinja-img.com/gawker-media/image/upload/s--bjei_wVo--/c_fit,fl_progressive,q_80,w_636/18j1gx4o1nbuijpg.jpg

ps3

totalgaming.co.uk/wp-content/uploads/2012/11/mgs4_snake-e1354188065319.png

ps4

images.pushsquare.com/news/2015/09/guide_sneaking_to_success_in_metal_gear_solid_v_the_phantom_pain_on_ps4/attachment/5/original.jpg
The last one is amazing, but the growth is DEFINITELY diminishing.

Real graphical jumps come with new console generations because they sell like candy. F2P games always look worse so that as many people as possible can play, and more of them might then pay money through microtransactions.

Yeah, it is diminishing; I guess the picture just triggered me.

1. Developers don't care about their work anymore. You can see this in JRPGs (PS2 games had more visual love than PS4 games), you can see this in shooters (linear levels with little going on instead of vast worlds with a lot going on), and you can see this in action games (they don't even have good textures these days).

2. Instead of focusing on good art, the focus is on expensive shading which usually isn't appreciated. Compare the performance and visuals of both Mirror's Edge games running at 4K to have a good laugh.

3. The new consoles are dog shit. This is the main reason graphics haven't really changed in the past decade. The new consoles can essentially only render last-gen PC ports at 1080p. Crysis 1-3, also last-gen console games, running on PC are just as demanding and impressive as any next-gen game, if not more so.

>most of the downgraded effects cost nothing for PC.
The pre-downgrade footage was running real-time on PC.

Ubisoft downgrades nearly every game they make. We don't know why, but it has nothing to do with GPU specs.

For starters, that isn't a PS2 game or a PS4 game; it's a 3DS game and a PS3 game ported to PS4.

>console
There's nothing to debunk

PS4 was a mistake. Wait for the PS4 Pro MGS.

>What's causing them diminishing returns?
Money. More photorealistic detail and more accurate physics cost way more to make and give back less and less.
Also, most people don't give a fuck that the third-person shooter avatar you see from behind the whole game has 200,000 separate facial hairs moving with 98% accuracy according to physics.

>wait for ps4 pro mgs

Games get downgraded because E3 builds aren't the final game and aren't running all the systems the real game needs to run, so they can make the graphics look prettier.

Nice theory but that doesn't make sense.

So what? The graphics are still improving slower than ever before. E3 graphics won't be achieved without graphics mods for another 5 years at least.

>a certain demographic has companies focused on nose-hair quality rather than gameplay, and gameplay quality has plummeted, but they're still worried about earwax textures

okay then

>-cry games

/thread

It's not a theory; it actually happened with The Witcher, and it was even explained by the developers.

How does it not make sense? I'm a game developer, and I've done it myself. When your game isn't finished and you're running a demo build, you can do things you would never be able to do in the final.

You should have allowed E3 graphics for people with Quad SLI.

The article you're talking about was damage control made for retards like you. You ate it up.

1. They didn't address the gameplay footage AT ALL; they only talked about a CGI spoof from the pre-alpha stage. The 2014 E3 footage was real-time and playable, and even though it looked nice it wasn't a feat of fidelity by any stretch. It looked like a PC game should.

2. Most effects they downgraded cost -nothing- on PC.

3. PC games have graphical settings. There is no reason to remove anything, let alone so much.

4. The game was delayed to downgrade the visuals for console parity according to the CEO.

That isn't what you said. You said E3 builds are lacking the final code that makes them look better. It's the opposite: that code gets removed for the final version.

Well, you misunderstood what I said:
a demo doesn't have all the systems a full game needs to have, so it can devote more of its resources to graphical effects that the final product wouldn't be able to have.

PC ports are an afterthought; console developers don't really care how it looks on PC.

>someone even made a picture that debunks it
Not that guy. But are you talking about this one?
All this picture really does in practice is illustrate the initial point about diminishing returns even more clearly.
People get super worked up over it all the time, for whatever reason, while both pictures illustrate the same point.

I'm talking about PC games.

Do PC games do the E3 demo bait-and-switch? I mean actual PC games, not console games that are on PC for convenience's sake?

>It feels like graphics haven't changed much since 2013

LMAO

Man, even the PS4 Pro can't beat a FUCKING 970. Holy shit.

>a demo game doesnt have all the systems a full game needs to have
In most cases they do.

Yeah I always found this image to be kind of funny.
>No guys! It's totally not diminishing returns!
>Look, I'll prove it!

>Posts a really detailed and elaborate case of diminishing returns

>See, no diminishing returns guys!

...

man Todd really pushed Gamebryo to its limit amirite guys

I don't know; I haven't played a video game since Payday 2. I just keep up with modern gaming through YouTube videos, e.g. demo trailers, showcases, and maybe Let's Plays.

I'm the OP.

dude, diminishing returns on these rocks

HOLY SHIT, even Oblivion blows most games out of the water.

How do you know that? All you see is what they show you. It doesn't even necessarily have to be a gameplay system; you could just have half the map you can't see not loaded. People build demos specifically for E3 trying to make them look as good as possible, regardless of what the final version of the game can handle.

youtu.be/CrLizA9izdE

Graphics like these won't be retail for another 30 years.

Okay....?

Graphics like these could be completely retail right now, if you want to make games set entirely in small, completely static rooms.

You mean "path tracing", an 'unbiased' offshoot of ray tracing used in architectural/concept renderers like V-Ray or Octane.
We won't see that implemented in games for at least 5-10 years, because even a single 980 struggles to run the Brigade engine at sub-720p resolutions.
youtube.com/watch?v=fQEYVbZETVM&feature=youtu.be
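The ray budget behind that claim is easy to sketch. All the figures below are ballpark assumptions for illustration, not benchmarks of Brigade or any particular GPU:

```python
# Rough ray budget for real-time path tracing at 720p.
# Every figure here is an assumed ballpark, not a measurement.
width, height = 1280, 720
fps = 30
samples_per_pixel = 64   # still quite noisy; film renders use thousands
bounces = 3              # rays traced per sample path

rays_per_sec = width * height * fps * samples_per_pixel * bounces
print(f"{rays_per_sec / 1e9:.1f} billion rays/sec needed")
# 5.3 billion rays/sec needed
```

Since a mid-2010s GPU manages something on the order of a fraction of a billion to a billion rays per second, a real-time path tracer has to drop to a few samples per pixel and lean on denoising, which is why even 720p struggles.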

>prebaked

PC Sperma?

It's real-time.

But when will computers be able to rig and animate this in real-time?

realtime rendering, prebaked lighting

We both know I'm not talking about a game where you just walk around a lifelike house.

It's not pre-baked lighting; it's demanding. Games with pre-baked lighting run at 400 fps.

I think this has a lot more to do with the technology developers are using. I don't think those rocks or the terrain in MGS are bad because GPUs can't handle the extra polygons or HD textures. I think they are bad because of a focus on objectives other than detail. I know Battlefront maps are pretty small, but even if you slash graphics quality to handle a larger game, environments built with photogrammetry will still look more authentic than open-world games of this or the previous generation.

If you made a Crysis-sized first-person shooter today, you could afford stunning detail. You could make a game that looks as amazing as Battlefront with as much detail as E3 The Division. But that's not what people want to play; they want to play open-world games with their friends with tons of shit in them. They don't want a Half-Life that looks like reality.

youtube.com/watch?v=1Egy0nVombc

The games are getting bigger almost as fast as the technology is advancing

>The last one is amazing, but the growth is DEFINITELY diminishing,
Come back and say that in another 5 years when we have 4K 120fps and 8K 60fps.

There are still no games that look better than Crysis. I'm including physics and shit in here.

Yeah, maybe if it's Quake 1. It still needs to run complex shaders to look that good, even with pre-baked lighting. If it were real time (which it's not; the whole scene has really high-quality global illumination) things would obviously be moving.

That opening cutscene was the first time in a long time I thought "Wow" at a game's graphics

rate my crib guyz

Why does your house have chromatic aberration?

>5 years
>8k 60fps
hahahahahah, maybe in concept presentations

Such a shame Battlefront's gameplay was so bad, because I mucked around with the trial and the graphics were amazing.

The rocks are bad because developers don't care. There's nothing inherently wrong with that but it's a fact.

The focus on open world(especially in MGSV's case) is a bad thing because that takes away from quality fun. Off-topic but damn you for defending that.

Crysis practically has better graphics than BF, it just doesn't look realistic like BF does.

We've had 4K monitors since 2012; it's really not that much of a stretch to assume we'll have 8K by 201x.

Camera must be acting up. Well, gotta go back to watching Iron Man 3 with my girlfriend.

REMEMBER WHEN THEY GOT SHIT LOOKING LIKE THIS DOE?

See, there it is again. All over your shelves.

i'll have the maid clean it up

And how much were they in 2012? You might as well say we have 8K monitors today, since they're actually available for the right price.

Plz fix those jaggies on your shelves and get something to watch outside your living room; are you poor? Don't even get me started on the bloom.

You'll never have an 8K monitor; you get actual diminishing returns on displays that small.

So ?

do you think we should all be using 1024x768 monitors?

>>Cred Forums

REEEEE, get out of Cred Forums, you yuppies and hipsters.

>what is DPI
the lack of aliasing would be amazing

I wasn't defending MGSV. I've never played it. What I was saying is that it doesn't look worse than Crysis because of shortcomings in the technology of the rigs running it, but because of shortcomings of the developers. They didn't invest time or money into making amazing-looking terrain. DICE did with Battlefront, primarily because they had all of these real locations to reproduce 1:1 on computers with new technology... and because the game was devoid of any depth beyond that.

Yes

>what is DPI
Something susceptible to diminishing returns. Notice how Apple has been making those "Retina" displays for years now?
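The diminishing-returns claim about displays can be put in numbers. A rough sketch, assuming the commonly cited ~60 pixels-per-degree figure for 20/20 acuity and a 27-inch 16:9 monitor viewed from 60 cm (both figures are assumptions, not measurements):

```python
import math

def pixels_per_degree(horizontal_px, diagonal_in, distance_cm, aspect=(16, 9)):
    """Angular pixel density for a flat panel viewed head-on."""
    # panel width from its diagonal and aspect ratio
    w_in = diagonal_in * aspect[0] / math.hypot(*aspect)
    px_per_cm = horizontal_px / (w_in * 2.54)
    # how wide one degree of visual angle is at the screen, in cm
    cm_per_degree = 2 * distance_cm * math.tan(math.radians(0.5))
    return px_per_cm * cm_per_degree

for name, px in [("1080p", 1920), ("4K", 3840), ("8K", 7680)]:
    print(f"{name}: {pixels_per_degree(px, 27, 60):.0f} px/deg")
# 1080p: 34 px/deg
# 4K: 67 px/deg
# 8K: 135 px/deg
```

Under these assumptions, 4K at desk distance already sits above the ~60 px/deg acuity figure, so most of 8K's extra pixels would only be visible with your face near the screen — which is the diminishing-returns point being made here.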

...

It means that if money is at your disposal, you can hire a team to develop your own 16K monitor if you wish.

But mainstream 8K monitors won't be available in 5 years. Dreamer.

>It feels like graphics haven't changed much since 2013.
>2013
>new consoles are released
>hmmm no change since

How could it be?

There is nothing wrong with having higher resolutions, user.

The only reason I haven't gone to 4K yet is because there is no single card that can run every game at 60-100+ fps yet.

Sure, technically the Titan X Pascal can if you overclock the shit out of it and turn down settings, but it's really not quite ready yet.

I'm guessing we'll have to wait for the 490 and 590 with 16 and 32 GB of HBM2+ VRAM.
They should be available this year.

>But mainstream 8K monitors won't be available in 5 years. Dreamer.
2020 is the date they said they would start selling them, though. Like 4 years ago.

I agree, but a monitor will never get 8K resolution. That's reserved for much larger displays and VR.

>What's causing them diminishing returns?
Resolution bloat and costs.

Devs can't make even very cool tech (think soft-particle physics and so on) that would work genuinely, instead going for various (ultimately inferior-looking) tricks, because consoles won't support it and developing PC-specific tech is not cost-effective today.

Instead, they decide to make super-high-resolution assets that give a difference in quality at 4K, but below that only serve to push high-end cards. A great example is Rise of the Tomb Raider: at 1080p you simply SHOULDN'T play on ultra/very high textures because they're 4K-specific, but all benchmarks are done with them, so you get a picture where cards without 8 GB of VRAM perform like shit. Most people don't own 4K displays for obvious reasons. The advantage of these assets is that to make them work on consoles you can just downscale them.

This leads to a situation where a quite powerful GPU is tasked with rendering the image at ridiculously high resolutions instead of doing something actually technological. You get some of this from GPU manufacturers too, but it doesn't take a genius to figure out that the tech is optimised for high-end GPUs (although AMD isn't as scummy when it comes to it) and tanks performance on everything else.

And before you tell me that physics is CPU-bound: it's not 2004 anymore.
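The VRAM side of that argument is simple arithmetic. A sketch, assuming uncompressed RGBA8 (4 bytes per pixel) versus BC7-style block compression (1 byte per pixel), with a full mip chain adding roughly a third on top of the base level:

```python
def texture_mb(side_px, bytes_per_px, mips=True):
    """Approximate GPU memory for one square texture.

    A full mip chain is ~4/3 of the base level's size
    (1 + 1/4 + 1/16 + ... converges to 4/3).
    """
    base = side_px * side_px * bytes_per_px
    total = base * 4 / 3 if mips else base
    return total / (1024 ** 2)

for side in (1024, 2048, 4096):
    raw = texture_mb(side, 4)   # uncompressed RGBA8
    bc7 = texture_mb(side, 1)   # BC7: 8 bits per pixel
    print(f"{side}px: {raw:5.1f} MB raw, {bc7:5.1f} MB BC7")
# 1024px:   5.3 MB raw,   1.3 MB BC7
# 2048px:  21.3 MB raw,   5.3 MB BC7
# 4096px:  85.3 MB raw,  21.3 MB BC7
```

Even compressed, a few hundred 4096px materials saturate a 4 GB card, which is why those "4K-specific" ultra presets punish anything without 8 GB of VRAM while looking near-identical at 1080p.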

so consoles are really holding gaming back in literally every regard?

Who gives a shit about polygons and shaders? Give me fucking dynamic weather effects.

>Posting that shitty image again

Technologically - sort of.

Financially they allow PC-centric devs to have breathing periods; the second half of the '90s tanked so many good devs who couldn't keep up with progress that it's not even funny. Quake looked great in 1996, but two years later Unreal took it to another level. Two years after that you had Max Payne or NOLF or Giants, which again were a huge jump in quality and scale. In between, numerous developers died out because suits wanted them to pursue new and hip graphics, sometimes in mid-development (Ultima IX...), AND release games regularly, which ended in disaster for many of them.

The biggest problem is gameplay-related, though. AAA games from some point onwards were always made with controllers and play on a big screen in mind (influencing interface design), so no wonder everybody has felt a decline in quality.

physics is CPU-bound

Also did you guys hear Ronald Reagan DIED!!?!???

I'm not an ameriturd.

I never get tired of looking at how well they made Heather, and on the "inferior" hardware.

Do you think she is a virgin guise? I hope not

Why does this style of character model look so much more natural than The Last of Us or most triple-A games nowadays?

Think about it as reaching perfection: almost perfect can't get more perfect; you're reaching the top of what's possible to create.

So at some point there's no need for devs anymore, and we can just sell Game Creators where you can generate characters, cities, and plots, change genre instantly, and choose from any of 10 engines.
It won't take years to make a game; you'll be able to do it for free on a daily basis and play it online with friends immediately.

I've been told it's literally not fair to compare any game, then or now, to Silent Hill 3's wizard-like dev tech.

They got Heather absolutely perfect, not only in that girl-down-the-street vibe but also in her characterisation. Her voice acting quality is top tier; I was really taken aback the first time she loses her temper and starts yelling at someone. I totally believed she was mad as hell.

Watch this for an interesting perspective on what made her great.

Muh dick/10

>muh prebaked lighting
Who cares? It looks gorgeous. Unless you are playing a game with a day/night cycle, I don't know why you would want only realtime global illumination. Prebaked lighting will run many times smoother than realtime, and will usually look better anyway because it was artistically crafted to match the level.

And there's nothing that says a game with prebaked GI for its main light source can't have smaller light sources that use realtime GI.
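The cost asymmetry being argued here can be shown in miniature. This is a toy sketch, not engine code; every name in it is illustrative:

```python
import random

def expensive_gi_estimate(texel, samples=256):
    """Stand-in for a Monte Carlo bounce-lighting integral (the costly part)."""
    return sum(random.random() for _ in range(samples)) / samples

# Offline bake: pay the full GI cost once per lightmap texel, ahead of time.
lightmap = {t: expensive_gi_estimate(t) for t in range(64 * 64)}

def shade_baked(texel):
    # Per-frame cost: a single table lookup.
    return lightmap[texel]

def shade_realtime(texel):
    # Per-frame cost: the full integral, every frame, for every texel.
    return expensive_gi_estimate(texel)
```

The bake pays the integral once per texel; after that each frame does 256x less work per texel than the realtime path — which is exactly why baked scenes "run at 400fps" but fall apart the moment anything lit by the bake has to move.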

Oops! Forgot link again! Here:

youtu.be/PsHa7pR9nVY

Game devs are fucking downgrading because of consoles. It's fucking obvious at this point, and only texture mods can save the games.

>Game devs are fucking downgrading because of consoles.
Go look at Steam statistics: how many members of the 'master race' actually have anything that's not a toaster or a 5-year-old 1337 ALIENWARE RIG?
If you build the BEST RIG EVER that could run 4 instances of Crysis in 4K@60FPS on 4 monitors, that doesn't mean devs will cater to you and a few others instead of targeting the millions of toaster jockeys.

That's complete BS. I've got a 980 Ti and can run it past ultra with graphics mods.

>980ti
>original poster said 970
holy fuck you're stupid

ayy lmao found the peasant

You know what didn't improve since the 90s? Gameplay. Welcome to diminishing returns.

>implying this isn't the laziest generation of gaming.

Some games look great, but the majority are such piss-poor ports to PC (even though x86 was supposed to make it easier; reminds me of the promises they made for the Vita), and the rest are cheap cash grabs and indies.

Most devs (or greedy pubs) aren't really trying like they used to. And when they do, the game is ass. Like EA Battlefront.

nobody really cares about PC.

Steam has no quality control, not even they care.

>can't argue with facts
>pretends to be retarded
It's ok, anonymous, it's ok.
Someone will slay the BAD CONSOLES any day now, and then all games will look photorealistic and run at 8K 120FPS without a hitch.

and they wonder why they say this feels like the worst generation.

steam has so much shovelware from "indies" that it makes it difficult to find the good ones when they're on sale.

I don't mind it. Even average graphics nowadays are serviceable. A slow drip is fine by me, I don't mind what we have now.

Prebaked lighting at that quality doesn't work on anything that moves. You want those cushions to react when you touch them? You want a player that doesn't seem ridiculously out of place? Not going to happen with graphics like that.

True. Most graphics are out of the ugly, blocky, jaggy phase.

But then interacting with stuff is out of the question.

So you're implying that consoles are OK in the industry?

What a fucking shame that the dev team put so much effort into the details of the game.

Imagine all those hours of hard work only for it to be shit on.

Only to be ruined by bad gameplay and poor design choices.

He is implying that even if consoles didn't exist, most people would have toasters instead of mustard PCs; thus devs would lower the graphics to appeal to a wider demographic.

>GPU power only increases linearly
No. Every GPU generation (~18 months) is 30-40% faster than the previous one. At least this has held true for the last few generations.
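Taking that 30-40% figure at face value, "linear vs exponential" is the wrong framing: a fixed percentage gain per generation is exponential growth, just with a long doubling time. A rough sketch, assuming 35% every 18 months (both numbers are assumptions from the posts above):

```python
import math

gain_per_gen = 1.35    # assumed mid-point of the 30-40% per-generation claim
months_per_gen = 18

for target in (10, 100):
    # generations needed: gain_per_gen ** gens == target
    gens = math.log(target) / math.log(gain_per_gen)
    years = gens * months_per_gen / 12
    print(f"{target}x faster: ~{gens:.1f} generations (~{years:.0f} years)")
# 10x faster: ~7.7 generations (~12 years)
# 100x faster: ~15.3 generations (~23 years)
```

So even on this optimistic trend, the "10 or 100 times as fast" hardware the earlier post asks for is roughly a decade to two decades out, which is consistent with the thread's diminishing-returns feel.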

Look at that beautiful detail.

This is exactly why a cohesive and engaging art style is preferable to muh photorealism

Also why arguments about "best graphics" are retarded.

I don't like my video games looking like real life.

And the 1080 Ti will be close to the new Titans in power for 800-900ish bucks.

This right here says everything that needs to be said about the modern game consumer.

Place them in a static world where nothing at all changes and nothing at all moves except the player, and they'll praise it so long as the visuals are high-fidelity.

You wanna know why corridor shooter bullshit is all the rage? Nothing in the world has to move, it just needs to receive bullet decals

>graphics are improving slower than ever

Are you fucking retarded? Look at Battlefield 1; it looks amazing.

This. PC allows for optimisation, but how many would just have toasters? You think the serious PC gaming community is enough to support the industry alone?

People here always act super shocked that THE MAJORITY would prefer a box set up and ready to go, with no messing around, and that they really are just going out and buying the latest AAA stuff in droves.

I just wanna plug in my comfy PS2, put on my comfy Sennheiser headphones, and play some comfy Resident Evil or comfy Silent Hill.

I may be called a faggot, but I'll be a happy comfy faggot.

Quite the opposite. You can develop a photorealistic game, however

1) nobody except a very small minority can actually run it
2) the cost of development outweighs the potential profit

Most PCs are not even gaming PCs, and the specs of more than 50% of gaming PCs are laughable, easily comparable to consoles. Thus logically to make a profit you either need to make a multiplayer game that runs on a wooden PC for optimal experience of everyone involved, or a graphic-heavy single player game.

In the case of the latter, it would be impossible to see a profit from such a game on PC alone. Developers need the console market to push high production values, and the game also needs to sell, so a big part of the budget goes to marketing. The PC community can then adopt the game and increase its fidelity through the open nature of the platform. But make no mistake, PC is just second fiddle to consoles.

>Decreased lighting and shadow contrast
Why does this even happen? Surely this can't cost any performance. So why don't games have sharp contrast? It's one of the things The Witcher 3 did right.

If you adjust for the increased price, that would make it 30-40% faster than the 980 Ti, which will be 19 months old by the time the 1080 Ti is released.

From Sega's naked Virtua Racing polygons to today's games, 3D graphics have never wowed me, maybe with the exception of the DC port of SoulCalibur. The reason is that 3D progress moves so creepingly slow, especially when you play games every day.
There is also this cold, unimpressive quality that goes along with textured polygons, no matter how many pretty lights you shine on them.

Compare BF4 to BF1 (God, that is poorly named): there's definitely an improvement. Of course it's not comparable to the advances that were being made in the '90s.

...

The last game that impressed me graphically.

>Its one of the things Witcher 3 did right.
It didn't even have interior lighting for the most part. Are you drunk?

>it didn't even have interior lighting for the most part
Oh come on, it's not that bad. Are you thinking of F4?

This looks like absolute garbage

delete this right now

that game had the exact same problem.

Compare The Last of Us and Uncharted 4, you faggot.

Answer me this: what would a publisher and/or investor rather fund, a game that sticks to what works, or a highly ambitious game with lots of R&D and innovation that may not work, or that people may not even like?

Sweet Jesus, is that in-engine?

Yes, it needs to be more like 1000% to see significant improvements in graphics.

Rise had some beautiful fucking graphics. Kinda glad I bought a second gpu now. Constant 60fps maxed.

this does not even look real

PC GPUs are still garbage regardless of how much better they are than a console. They can barely get 60fps at 4K on a game originally made for console hardware. It doesn't help that the biggest company refuses to use HBM or more than 6 GB and always overprices everything. Yeah, it's cool to have a $3000 PC, but game companies aren't going to target that minority. They're going to go for the people who can actually run the game, for the most profit, and that means lowering the requirements.

shit's expensive to develop, yo