Games are why computers are so powerful today

>games are why computers are so powerful today
Is there any merit to this claim?


name one person who ever said that

You can't prove that something "isn't".
Burden of proof lies on the person who claims this theory.

>youtubers video editing are why computers are so powerful today

Is there any merit to this claim?

What else would the average consumer do with a high-end machine? Sure there'd still be powerful computers for government and large company uses, but they'd be expensive as fuck. The more people buy something, the cheaper and more widespread the manufacturing can become.
I've never heard anyone say that before, but it's easy to see how the logic follows.

They are probably why GPUs are so powerful today. CPUs, not so much.

Uh, every game developer ever?

>>games are why computers are so powerful today
games are why computational power is so affordable today

you didn't name anyone
case still open, provide source for one such claim.

>youtube is older than gaming

What popularized computers to begin with? Games did.

Check and mate, softwarefags

You can be certain that without millions of people needing high end GPUs they would have never developed this fast.

Games are why powerful computers are cheap.

It's likely. The gaming industry is a huge money machine for the hardware manufacturers, and in turn they can spend billions in R&D (which is what a new architecture actually costs)

Yes. You don't need an i7 processor to browse the Internet or use Word.

You don't need a $1000 graphics card to watch HD videos and family pictures on your computer. Rendering and 3D modeling software needs more power, but the real reasons are games and poor coding optimization.

Correlation doesn't imply causation

>tv shows are why tvs are so advanced today

Is there any merit to this claim?

It's the 4th biggest market in the world, of course it has helped.
Not gonna have as much effect on trillion dollar NSA servers, but standard desktops are far stronger thanks to them.

There's no way that graphics cards would be anywhere near where they are today without games. CPUs and disks? They'd probably still be close enough with the demands coming from servers and video storage.

>Correlation doesn't imply causation
Of course not, which is why I would never make OP's supposed claim. It's just easy to see how it could be possible.

I honestly would like an answer to my first question. What else is as widespread and needs as much power as gaming that could cause such an evolution of personal computers?

>poor coding optimization
The 90s would like a word with you

Name one other good reason to keep people buying new hardware and hardware producers afloat aside from old breaking and getting a replacement. Exactly.

Intel probably gets a larger amount of money from business sales.

Gaming certainly pushed graphics cards, and we probably wouldn't have graphics cards doing compute now without it, but I think the CPU side would have progressed more or less the same.

>Not gonna have as much effect on trillion dollar NSA servers
Actually, it does. Government buys whatever's out there. Things like the Manhattan Project where we actually push technology forward are rare. If 70's era tech is what everyone has, 70's era tech is what they'll be using.

Are you one of those retards who bought a Quadro for gaming?

I think the server and workstation markets are what has driven CPU development. Graphics cards have been driven by games, there's no doubt about that.
We'd probably have ended up with dedicated coprocessors like the Xeon Phi being much more prevalent. GPUs would have been pretty basic things, with full colour limited to niche workstations like those for the video and picture industries.

>What else is as widespread and needs as much power as gaming that could cause such an evolution of personal computers?

Capitalism.
Because you need to improve things to sell them to people over and over again.
Think about my TV example. There were rarely any 1080p tv shows when HD tv became a thing. 4k TV is becoming a thing now and yet there are no 4k TV shows.

Who said that?

No. Gamers make certain technologies cheaper, but they aren't even responsible for that as a whole.

They didn't bring "powerful" concepts like virtualization, parallelism, or memory protection that big business already had since the '60s to the consumer market.

But you can allege that they pushed things like 3D acceleration into the mainstream. You can reasonably assume that they are still a significant driving force in the advancement of games-related technologies, although plenty of professional uses for a lot of them require as much or more power.

Not nearly as much as the professional market. And that shit's pure profit, even if the on-paper sales aren't always as impressive.

Gamers are a cheap bunch, big business will pay whatever you write on the blank check.

>Because you need to improve things to sell them to people over and over again.
All you have to do is advertise it as better, not actually improve anything

>We'd probably have ended up with dedicated coprocessors like the Xeon Phi being much more prevalent.

Yeah, that sounds pretty likely. Coprocessors taking up the slack for compute while CPUs would be relatively the same.

>What else is as widespread and needs as much power as gaming that could cause such an evolution of personal computers?
CAD
Media production
"Scientific" computing and simulation
Development
Software intelligence

Games are probably the least demanding things you can run on high-end hardware, really.

How many people do you know with a 4k tv? Everything improves with time, but without a legitimate need for an upgrade, the new products would likely not succeed. What if tv shows decided to never ever make a 4k program? Do you think they'll start selling 8k TVs any time soon?

That's only the case if you have a monopoly. Otherwise you have to at least improve on your competitors' offering.

Hey everyone, look at this guy talking out of his ass

>Instead of the clusterfuck that is graphic card architecture for massive parallel programming we could have nicely designed 2048 core special cards with a memory architecture that makes sense by now
Thank you gaylords

>CAD
>Media production
>"Scientific" computing and simulation
>Development
>Software intelligence
Literally all of your examples are industry-based. All of them combined probably don't outweigh the number of people using personal computers for gaming.

>How many people do you know with a 4k tv?
Anecdotal evidence doesn't matter. People are buying 4k TVs even though there is no 4k content available.

Source: press.ihs.com/press-release/tv-shipments-decline-even-4k-tv-continues-strong-growth-ihs-says

And where would CAD be today if it wasn't for Carmack? WELL?

ITT enterprisefags say the darndest things

And the number of users who do nothing more than use computers for web browsing vastly outnumbers those, so does that mean they have no influence either?

It's absolutely ludicrous to discount those industry-based uses based on head count alone, especially when they're the ones actually paying $40,000 a seat and delivering the most profit while gamers consider a piece of $60 software to be some form of highway robbery.

The gaming industry is bigger than Hollywood. They are a powerful industry. It makes sense that they drive a large portion of video card sales. And they are the reason desktop PCs still exist.

>they are the reason desktop PCs still exist.
That's taking it a bit too far, PCs have plenty of uses outside of gaming.

Nvidia's revenue is $5 billion.
Intel's revenue is $55 billion.

Actually, it was modular kits built around Intel's early commercially available microprocessors that popularised computers.

If by computers you mean GPUs then yes

So what? People spend more on video cards than on games. And intel makes chips for everything.

Not the guy you're responding to

The revenue of WoW alone is $1 billion.

That's ONE fucking game from a single studio. Think about that for just a fucking moment

The gaming industry is 100 billion.

venturebeat.com/2016/04/21/video-games-will-become-a-99-6b-industry-this-year-as-mobile-overtakes-consoles-and-pcs/

Of course 4k TVs are going to sell more than others. There are so many fucking TVs everywhere these days that if you're not going to buy something top of the line, you may as well buy a used one. Also, the entire idea of buying a 4k TV is to be prepared for 4k content; if no 4k content were ever made, no one would buy 8k TVs to prepare for the growth, only a handful of richfags who want one just to be superior. You gotta think in terms of future development.
No, of course it has influence. Web browsing activity sparked the need for simple cheap shit like Chromebooks. It has nothing to do with what we're talking about, but sure, it's still there.

Gamers may think software prices are highway robbery, but they'll sure shit out the cash for hardware, which is the main point of discussion here.

This feels closer to the truth.
Computers most likely would have kept advancing regardless but I believe it's gaming that has led us to the point where there is a wide spectrum of performance options all for affordable prices.

Without gaming, consumer level hardware would probably be much more limited, even if we had the same level of performance.

Nope.
The movie industry is mainly why GPUs are being advanced and other shit like facial recognition which is accelerated by them (and for which the government doesn't mind throwing a lot of money).

As for CPUs, servers, supercomputers, etc.: if anybody gave a shit about gayming in the CPU industry, it wouldn't be core counts that keep increasing, but all the other traits, plus the implementation of graphene.
Gayming benefits most from fewer cores with more power.

>All of them combined probably don't outweigh the number of people using personal computers for gaming.
But they do in the number of devices.
Those industries use hundreds or thousands of devices as farms.

Wow, you don't know shit about history.

What do you think popularized CGI in the first place? Ever watched an 80s or early 90s movie before?

Kids these days...

>It has nothing to do with what we're talking about
You argued that "industry" use cases were less relevant because there are more gamers, I raised you a market that outweighs gamers.

>but they'll sure shit out the cash for hardware, which is the main point of discussion here
Indeed, but while you're being wrung out of $600 for a flagship GeForce card, enterprise customers are shelling out $6,000 for the Quadro equivalent.

>I think the server and workstation markets are what has driven CPU development
Not really. Smaller market, fewer options. Big iron is dead because it couldn't keep up. Without PC competition they would've been happy to sit there pushing expensive, gold-plated, incremental upgrades.
>Two reels for twice the speed!
Workstation/server customers reason like Linux and Failfox fags: They think if it's clunky and poorly designed it must be above consumer-grade trash, so you can keep selling them garbage year after year. It's only when they have an obvious comparison that they see they're being taken for fools.

The revenue of WoW alone is so much because it doesn't require any dedicated GPU to run.
Other games that have made more than $1 billion in revenue:
>League of Legends
>CrossFire
>Clash of Clans
All they have in common is that they don't require dedicated GPUs.

Your link says mobile overtakes consoles and PCs. AMD and Nvidia don't have any market there.

The 60s were the start of CGI, m8, when props and special effects started being combined with what would now be considered primitive visual and computer technology. Don't act like you know something when you don't.

creative art stuff in general is what helped push technology
that and NASA

>Smaller market, fewer options. Big iron is dead because it couldn't keep up.
Workstations, servers and even mainframes are still very much alive, far more profitable and used by customers with far higher power demands than a gamer whose use case hardly abuses anything aside from the GPU (if that)

>The 60s were the start of CGI
Clips or it didn't happen

Games are directly responsible for GPUs being so powerful and affordable.

Games contribute to lower hardware costs due to higher demand.

Games are partially responsible for advances in the x86 architecture. Therefore, games are indirectly responsible for some of the processing power typical in a desktop CPU, but are overall not the largest contributor and far from the majority in that category.

Except for consumer-oriented CPUs that have integrated graphics - games drive a significant portion of that (the rest being traditional media e.g. videos).

Download any documentary on the creation process of the original Star Trek series jimbo. There's plenty of them.

>Server market is smaller than the gaming market
You are aware that every website you interact with is hosted on a server, yes?

The Apollo 11 guidance computer was a 2.048 MHz, 2048-word magnetic-core-memory beast that weighed 30 kilos.

So much for NASA and their power requirements

Not just that, almost every modern game connects to a server to store multiplayer data.

>Workstations, servers and even mainframes are still very much alive
Using desktop parts.

That's actually a fair point. My only counterargument would be something along the lines of a few companies buying shit in bulk would be cheap for them, but still expensive and mostly unavailable for average people. Which in turn may stagnate the growth of such products due to having a limited market.
Both the industry and gamer markets are affecting hardware development in terms of power. Just because the web-browsing-only market is larger, that doesn't mean it has the same kind of effect on the development of hardware; i.e., it prompted simple computers like Chromebooks.
Your second point is actually fair and basically falls along the lines of my response to the first user.

the Nintendo 3DS runs on a shitty outdated dual-core ARM chip clocked at like 800 MHz

gaymers are at the cutting edge of performance how again???

the defense industry is the reason 3d graphics exists. marketing it to home users was just a cost-recovery afterthought.

What other programs need all the GPU power and RAM that games require?

As someone who is in the defense industry with a focus on graphics, I can say that this is no longer true. The consumer market leads by a mile in real-time graphics.

>Using desktop parts.
What else would they use... phone parts?

We seem to be agreeing at this point.

You're kinda new, aren't you?

This guy is

What does the instruction set or hardware implementation have to do with anything? Are shitty dual-threaded AAA titles selling fibre channel SSD arrays, 20-core Xeons and octa-socket servers?

They pushed the design of all the hardware. The rest is just scaling it.
The TOP500 does not run on custom iron, it runs on thousands of desktop CPUs and gaming GPUs.

>we can now have a single computer that can accomplish what two or more used to, with one or more graphics cards, thanks to the demand for multiple graphics cards for playing vidya
>(almost) no need for clusters of computers to do rendering of 3D models, animations, or even video
I believe it

Using server and workstation parts. The desktops are the ones using their parts.

>We seem to be agreeing at this point.
Just about. Like I said to the first user, I still think the market would likely stagnate if only a handful of large companies were the ones buying high end equipment. But then again, I'm not even a gamer and never had a dog in this race, so I'd rather not keep arguing it.

>They pushed the design of all the hardware
How? Other than maybe the occasional trivial instruction set extension, x86 is fuck all driven by gaming. None of the major progress made in the high end, such as RAS features, hardware virtualization, multi-core and multi-threaded processing, 64-bit addressing, multi-level caches, superscalar execution, practically fucking everything, has been implemented with gaming even distantly in mind.

>The TOP500 does not run on custom iron
Why does it have to be custom? It doesn't work that way anymore because it's wasteful, and Intel has been making leaps and bounds to court enterprise customers for decades. Do you think the Itanic was made for gamers, too?

Besides,
>it runs on thousands of desktop CPUs and gaming GPUs
Xeons and Teslas are hardly "desktop" products even though their underlying architecture is implemented in desktop products, it's like calling a Nintendo 64 a workstation because the CPU is based on the same R4000s used in the SGI Indigo line, a total crock of shit.

Nope. Desktop took over server.

In some ways yeah, the GPGPU trend is an example.

We can now use GPUs for finance analysis, physics simulations and general number crunching at tiny fractions of the cost of doing the same on CPUs.
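Not anyone's actual benchmark, just a minimal sketch of the kind of offload being described, assuming an Nvidia card and the CuPy library (sizes and names are purely illustrative):

import numpy as np
import cupy as cp  # GPU array library with a NumPy-like API; assumes a CUDA-capable card

N = 4096
a = np.random.rand(N, N).astype(np.float32)
b = np.random.rand(N, N).astype(np.float32)

# CPU path: the matrix multiply runs on the host through NumPy/BLAS
c_cpu = a @ b

# GPU path: same math, but the arrays live in GPU memory and the multiply runs as a kernel
a_gpu = cp.asarray(a)
b_gpu = cp.asarray(b)
c_gpu = a_gpu @ b_gpu
cp.cuda.Stream.null.synchronize()  # kernels launch asynchronously, so wait before reading

# copy the result back to host memory and check both paths agree
print(np.allclose(c_cpu, cp.asnumpy(c_gpu), rtol=1e-3))

Same code shape either way; the pitch is that the second path is far cheaper per unit of throughput once the arrays get big.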

And in many ways a modern GPU is a more "complex" piece of technology than a CPU is.

Nvidia is already making more money selling GPUs to financial companies than making consumer grade cards.

Most of those architectures were used in desktops too.

>Games are the reason GPU makers are investing in improving their ability
>Games are the reason CPU makers have to improve multitasking
>Games are the reason phone makers and normal monitor makers are increasingly improving and making more pixel-dense displays

Cred Forums has an illogical hatred for games, when they're a large part of the reason technology is advancing at the rate it is.

No, some were used in ultra expensive custom workstations. A market which was taken over by beefed up gaming PCs.

>Photo editing software
>Video editing software
>Various modelling/manufacturing/design/animations/visual effects/engineering software
>various oil/gas software
>servers/CCTV/control centres
>various financial software

All require more powerful computers than a gaming one.

If you think quants in some hedge fund don't need the top of the line of Intel Xeons designed to run 24/365, compared to your i7 red diode Alienware with a League of Legends logo, you're retarded. Games did aid personal GPU development, but they are far from the only reason why "computers are so powerful these days".

Cred Forums hates gamers because they are slaves to marketing and half of them don't give half a fuck to learn about the hardware they're using or the software it runs on and just expect everything to be handed to them on a plate.

It's like asking why the muscle car club that spends all day in the engine bays of their cars hates the kids who just swipe a credit card and buy the GT-R.

No, server parts are trickling down to the desktop market. X86 is not inherently a gaming architecture.
>>Games are the reason CPU makers have to improve multitasking
Heh, no. Multithreading has been a massive thing in servers and workstations since forever. Games taking advantage of more than two cores is relatively recent.

>No, some were used in ultra expensive custom workstations.
The fuck are you smoking? You practically buy that shit off the shelf, especially a more mainstream platform like Sun or SGI. It wasn't even that expensive in the entry level.

>A market which was taken over by beefed up gaming PCs
Commodity shit was eating into that market long before the Quadro and FireGL took on their current forms, and graphics were just one of the many nails in the RISC/UNIX coffin.

>>Photo editing software
>>Video editing software
>>Various modelling/manufacturing/design/animations/visual effects/engineering software
>>various oil/gas software
>>servers/CCTV/control centres
>>various financial software
Half of that only has to be better than the competition, and the rest will make do with whatever it has. They're smaller markets, and with smaller markets you have higher margins and are competing against fewer players. There's no killer instinct when everyone is making $90,000 profit on a $100,000 sale. If the other guy's a little bigger, that's no reason to get into a price war and kill the goose that's laying the golden eggs.

It was custom for workstations, moron.

>No, server parts are trickling down to the desktop market
Is your desktop running an Alpha? Thought not.

It was a fucking mass produced, general-purpose product, do you even know what "custom" means?

Shit, outside of the ultra-proprietary SGI realm, pretty much all workstations were commodity through and through except for the processor and glue chips by the end.

Games are why computers are so good at vector processing.

No, it's running a descendant of the workstation and server-oriented Pentium Pro with 64-bit extensions introduced with the workstation and server-oriented AMD Opteron.

Is your desktop running something strange and unusual that was designed for the consumer market? Every architecture has roots in the server and workstation markets. Acorn RISC Machines, more commonly known as ARM, designed their CPUs to fit workstations for the BBC, for example.

The BBC micro was about as far from a workstation as you could get. Really most of the modern big names in CPUs weren't really "enterprise" at birth, but they are the way they are because of heavy influence from their higher end counterparts. Most of the fundamentals of the shit we do on personal computers today was already done on the mainframe 50 years ago.

>Every architecture has roots in the server and workstation markets
How many architectures are omnipresent in the desktop one?

Desktop beat out server.

It didn't, desktop became server by absorbing all of the good parts. There was ultimately no real distinction between the two anyway.

Everything good went into desktop due to its need to satisfy gaming requirements. That's how it took over server.
Thank you for proving my point.

>due to its need to satisfy gaming requirements.
Yeah, that totally explains why memory bandwidth improved long before games ever needed it. Totally explains the shift to multi-core systems in ~2006 when games are only taking advantage of them *now*. Totally explains the increased PCIe bandwidth when games still don't saturate x8 2.0. Face it, games are not driving the development of CPUs.

4k 60fps porn is the reason

>they would have never developed this fast.
Yes, they would. Gamer faggots aren't driving the development, business is. Gamer faggots just foot the bill.

>but they'll sure shit out the cash for hardware, which is the main point of discussion here.
Oh wow a fucking 1000 dollar computer.

Businesses are spending $5-40k+ on workstations to do the shit they need to do.

And yet on the ground in more recent times they literally kept SGI alive prior to the buyout. Those O3ks in their datacenters weren't just sitting around.

Nvidia wouldn't be 10% of its current size if not for games.
The money put into R&D would be orders of magnitude smaller if not for games.

Believe it or not, a product which has no giant consumer base will get less funding from a company than one which does.

And besides gaming there would be very few people actually buying high end GPUs, because the other applications are all related to specific jobs.
Consumers are in fact the reason most technologies continue to develop, because there is more money in an industry that caters to millions than in one that caters to a few thousand.

There is some similarity to drug pricing: drugs that treat diseases very few people have receive, for obvious reasons, less funding and cost a lot more than drugs for common diseases.

Games are the reason why some consumer hardware dedicated to real time 3D rendering is so powerful. The rest is the utility of being able to do math quickly. You might make the argument that games are the reason extreme parallel processing is affordable to consumers, but the more useful GPU offloading becomes, the less its existence depends on gaming.

Most people with PCs that I know do not play video games on them.

normie get out

reeeeeee

Most people who buy GPUs play video games though.
But then again the GPU market is a lot smaller than the computer market.

>Most people with PCs that I know
Stopped reading there

Probably true.
Linux autists love their third-hand ThinkPads, and offices tend to stick with old hardware and software until shit breaks beyond repair.
Most people see the biggest performance improvement by putting an SSD in their old PCs and laptops.

I dare say gamers are the only reliable hardware consumers these days.

>Cherry picking level designs
Now throw Borderlands or Fallout 3 into your illustration. Games have diversified; the fact that there are games with a more linear experience is not a slight against modern games.

You would, because you're just a gamer faggot

A large market of consumers who spend their money could never possibly help push technology in that sector, are you retarded?

All the jobs made creating and developing games are also ridiculous; obviously they don't have an impact either, and there are none of them, ever, anywhere, that might use enterprise level products either... Obviously.

A market that supports physics simulations and increases demand for higher performance would never have even a slight responsibility for the creation of that product.

And don't even think for a minute increasing the demand for lowered latency, ping, and creating another huge market would ever aid the development of servers in any way, there's obviously no money coming from the gaming industry for that development.

fucking cat videos are the reason

games are software, you idiot. You think Macs are not PCs, don't you?

>tourismo

:s/youtube/Cred Forums porn.webm threads/

>rendering longcats.

What the fuck?

>Everything good went into desktop due to its need to satisfy gaming requirements.
What game in 1985 prompted the development of Virtual 8086 mode?
What game in 1989 required on-chip caching and an integrated FPU?
What game in 1993 needed a superscalar processor?
What game in 1995 forced Intel to introduce x86's modern hybrid core design?
What game in 2003 needed 64-bit registers and memory addressing?

You're fucking retarded.

john carmack

The Diablo Rogue didn't have a cape or a bare midriff. OP is a FAGGOT.

War does. War to get your money out of your pockets. If we get to a war with Skynet, we'll either have super strong unhackable computers or revert back to the stone age.

ITT: Mouthbreathing, acne-riddled gamers confuse demand for a product with the groundwork necessary to create that product.

Playing your stupid Doom and Mario was not a significant factor in the decision to fund the research that made it possible to create and market the type of things you have today.

In fact, I would argue that the opposite is true. Ever since games have become commodified to the point that they are now, to the point where they are actual engineering targets for Fortune 500 corporations, innovation has stagnated.

Gamers typify everything that's wrong with this generation, and you should feel ashamed to call yourself one.

The most powerful processors we have today are GPUs by a long shot; in fact, all of Nvidia's supercomputer units are based on almost identical architecture to their GPUs, and the two have evolved hand in hand for the past decade.

I think it's more accurate to say that gaming and hardware speed have had a hand-in-hand symbiotic relationship: neither can improve without the improvement of the other, because people only invest in making hardware mass-producible if there's a market to use it, and people only develop software/games for hardware if there's someone to make it. The entire system of constant R&D is supported by people constantly upgrading in rapid succession, and most of these people are gamers.

Most of the business applications for processing power are not real time simulations; the real heavy lifting done in the business world is done with time-independent code, which means they simply let bigger jobs run for longer (for example offline 3D rendering in today's 3D movies, or CAD rendering). Otherwise these uses of tech end up simply being scaled by going more parallel (i.e. it's cheaper to buy twice as many CPUs than to invest in R&D to make the same CPU twice as fast).

Only real time usage and simulations require the immediate power of upgraded devices, and the primary use of that across the world is games, undeniably.
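For what it's worth, here's a crude sketch of that distinction, with hypothetical stand-in functions rather than any real engine or render farm:

import time

FRAME_BUDGET = 1.0 / 60  # a 60 fps game gets roughly 16.7 ms to simulate and draw each frame

def realtime_frame(simulate_and_render):
    # Real-time work either fits inside the budget or the frame is late.
    # The only fixes are faster hardware or doing less per frame.
    start = time.perf_counter()
    simulate_and_render()
    return (time.perf_counter() - start) <= FRAME_BUDGET  # True if the deadline was met

def offline_batch(jobs, render_one):
    # Time-independent work has no per-item deadline: a slower machine just
    # finishes later, and adding more machines or cores finishes sooner.
    for job in jobs:
        render_one(job)

The per-frame deadline is what keeps pushing single-machine performance; the batch loop just scales out.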

Games don't really need strong CPUs. Grand Theft Auto V, maybe the most complex game ever programmed, runs perfectly well on 1.6GHz AMDs and last gen consoles. Games have CPU dependence today only for high framerates, or because they're sloppily coded as a result of 24/7 crunch time and/or untalented programming teams.

>contradicts himself and spaghettis the fuck out of his sentence
Without money there isn't any development.
While gaymen might not be the main reason, it definitely is a good contributor towards PC-related development.

You faggots realise that one server serves a shitload of people on consumer systems right?

I don't know or care really, but I would honestly trade the last 10-20 years of technological development for no gamers.

>It's like asking why the muscle car club that spends all day in the engine bays of their cars hates the kids who just swipe a credit card and buy the GT-R.
Sounds like insecurity to me; honestly, how can you hate on something that's the reason your hardware is as "cheap" as it is now? It's like hating on normies because you don't have to pay $1000 for a 2kg mobile phone.

Oh wow, businesses buy 2-4 $10k workstations every 2 years, meanwhile a 2k gaymen pc is sold every few minutes to some mom. You have to be completely fucking retarded if you think the money is to be made in any other market than the consumer market. You need volume to make it big not some niche shit.

No my friend, it's you that's retarded, thinking that all the money that gets poured into the market doesn't fuel development. You might in fact be retarded; need doesn't fuel anything, money does.

This isn't even slightly accurate for the majority of office computers. I'm an IT Director and sign off on the cost of our upgrades, and office computers typically have a 5-year EoL and cost about £600 for a decent i5 with 8GB RAM and a bit of HDD space.

The last video card I bought was more expensive than an entire PC we use at work, most of the work in offices is no more harsh than browsing the web and using the office suite.

Actually the margins on bulk low/mid kit are very small; the really large margins are on brand new kit like brand new video cards, and that's where the bulk of the R&D cost is made back.

I'm talking about the faggots who draw in CAD, hence the 2-4 and not a few dozen/hundred depending on the size of the company.
The consumer market makes a shitload more than the plebs who sell systems for niche industry applications.

Right. I forgot that Von Neumann architectures, Turing machines, Backus-Naur form, microprocessors, matrix multiplication, radios, coordinate systems, A Treatise on Electricity and Magnetism, Philosophiæ Naturalis Principia Mathematica, elliptic curve cryptography, and so on were all driven by consumer demand!

How foolish of me.

We should all pat ourselves on the backs, crack a few beers, and fire up the xbox! Who needs hard questions when we have so much money?

Idiot.

Oh yes all these great leaps improving already existing technology, oh wait.

Moron

If I wanted a comeback, I would have wiped it off your face, Billy Bob.

Shouldn't you be off pretending to read Mises or Hayek or frogposting somewhere?

>meanwhile a 2k gaymen pc is sold every few minutes to some mom
Who won't upgrade it in the next five years minimum, probably even longer given the spiraling decline of PC sales

>You need volume to make it big not some niche shit.
Hence why businesses are the money makers. A gamertard buys one $800-$1200 low-margin system every four years with a base warranty, a typical large business will upgrade a fleet of thousands of systems plus support contracts plus software one or more times in that timeframe.

Multiply this by even just 50 businesses a year (hypothetical minimum) and that's an absolute shitload of pure profit your low-margin niche gamer trash won't hold a candle to.

I think you overestimate the value of systems sold to businesses and how long they last until replaced.
The only places that keep up to date are niche markets; meanwhile consumers keep buying that shit at a much higher rate.

You do know that the CAD machines basically have slightly specialized variants of gaming GPUs in them with more memory? And they have a huge price premium because no one else needs that much vRAM on a GPU. It's powered by the exact same cycle of gamers/GPU upgrades, the same way that consoles get cut-down mid-to-low-range GPUs with no investment in R&D, because the R&D is paid for by PC gamers paying premiums on gaming cards, and more or less the same thing goes for Nvidia's compute units used for supercomputers.

If the gaming market on the PC collapsed then CAD workstations would collapse and so would modern consoles and probably even the compute cards.

Those CAD machines aren't 10k anyway, they're like 2-3k with a pair of CAD video cards in, and they're bought in tiny numbers compared to office desktops.

Remember that these markets are secondary, riding off the back of PC gaming. Sony and MS absolutely DO NOT pay the R&D costs for a modern GPU; if there were no PC market they'd have to fund all the R&D to go from a PS3 to a PS4 in that massive 7-8 year gap, which is not what happens. The interim is filled with 3-4 PC generations all paid for by PC gamers, and the consoles simply piggyback off whatever the latest generation is and maybe slap a bit of extra high speed memory on the card.

If the PC gaming market didn't exist, the same hike in GPU power from one console gen to the next would make the console units cost about £5,000 a piece, because you'd be paying for every penny of R&D that now has to be paid for by Sony/MS directly.

GPU and CPU progress has always been iterative design.

Consoles run on aged desktop hardware with minor modifications so these jumps wouldn't even be possible.

It's the volume that makes it affordable and gamers/desktops pay a large part of the tab.

This is what pisses me off about console fanboys who slam the PC. Hurr durr PC is crap and console is better, they just want PC gaming to die without knowing their hardware is heavily subsidized by PC gamers, consoles would cost a fortune if the GPUs had to be designed from scratch every generation, you wouldn't have cheap hardware that costs a few hundred bux.

>I think you overestimate the value of systems sold to businesses and how long they last until replaced.
Everywhere I've ever worked, or worked with, has been on a very regular upgrade cycle.
>meanwhile consumers keep buying that shit at a much higher rate.
The consumer market isn't only yuppies on Cred Forums with more disposable income than sense. Anecdotes aside, declining PC sales clearly demonstrate that consumers are NOT buying new systems at a much higher rate than large corporations which are quite often on upgrade cycles of a few years for most systems.

Declining sales just mean we've passed the peak, which makes sense. Performance has reached a point where the goy doesn't need more, and now it's just getting smaller and more efficient, seeing how the average mobile phone costs more than the average office PC.

Every place where a lot of people work, there will be a good deal fewer systems than people working there. Not every job is some desk job or Cred Forumseek shit. It might be perceived by you as such because your job is located in such an environment.

>You need volume to make it big not some niche shit.
the reason you "need volume" in the consumer market is because it's a fucking race to the bottom, profit margins are so incredibly low that you need to sell millions of devices to make any reasonable amount of money, hence why that market is dominated by a pool of 2 or 3 massive conglomerates while smaller companies fight over their leftovers

"niche shit" on the other hand is pure profit, for every $5,000 quadro nV sells they make the same amount of profit as they would if they sold several 980s/1080s, same goes for high-end servers and workstations, which is why IBM withdrew from the PC market entirely yet is still just fine

I'll tell you that the EBITDA for this consumer "junk" is a shitload higher than that for the quadros that are overpriced niche garbage.
The market is just too small for it to even stand a chance at reaching numbers like those of a consumer product.

and once again, only for the few large conglomerates that can push the required volume

This.

And to be honest the declining PC sales are largely down to abuse of the PC market by developers making shitty ported games that cater to the lowest common denominator; the PC gaming industry was much healthier when developers targeted PC exclusives that utilized the power of the platform.

As consoles slowly become more like PCs we're seeing that shift; you only need to look at their trend towards essentially becoming PCs: hefty OSes that use a large amount of resources, constant updates, more crashes and bugs because patching is now a thing, DLC and downloadable games over the internet following Steam, and very probably modular hardware in the near future.

Niche remains high profit; look at the very high end CPUs and GPUs, the price to performance ratio is always absolutely terrible in the high end kit because the price ramps up exponentially. You start paying twice the price for 10% more speed and people pay it because they want the best. I just dropped £650 on a 1080 and will likely get a 2nd before the year is out, and then next gen do it all over again in 18-24 months. Each of those cards has a huge profit associated with it.

Word processors and small databases used to require a fuckload of computing power (relatively). Programmers maintaining and compiling large scale code bases after debugging took a lot of man-hours, so application support was costly. Video rendering and editing required tons of time waiting for the program. Faster processors cut down on man-hours, which gave companies more products in less time.

Engineering software like CAD designs and 3D modeling and graphic rendering drove graphics processing. Gaming did play a role but wasn't everything. Gaming has always been a proof of concept, and PC gaming wasn't a viable alternative (to console systems) for a long time.

The NES, Atari and Sega Genesis were what made gaming. Computers were viable and used at large scale way before these toys.

If gaming drove the computing industry you'd have console systems. So the next time you faggots on Cred Forums think you're doing something so great for the computing industry, think of the PlayStation.

>Declining sales just mean we've passed the peak, which makes sense. Performance has reached a point where the goy doesn't need more, and now it's just getting smaller and more efficient, seeing how the average mobile phone costs more than the average office PC.
Exactly, meaning consumers aren't going to consume at a much higher rate than businesses regularly upgrading infrastructure, hardware and software to meet demand.

That's only true for Graphics cards.
Gamers used to be the main buyers for these things before we started using them for GPGPU.

Yes and these few conglomerates are making the most money, thus it's where the money is at.

Implying businesses won't slack off or adapt to different platforms either, considering there isn't a reason to spend the money. Desktops are a dying breed, and this isn't just in the consumer market.

>Engineering software like CAD designs and 3D modeling and graphic rendering drove graphics processing. Gaming did play a role but wasn't everything

Go back over the last 20 years and look at the hardware in CAD machines, you'll find that it's all gaming hardware with more vRAM tagged on, it always has been and always will be because it's a tiny niche market that only survives off the back of the PC gaming market.

Specifically the Nvidia Quadro and the AMD FirePro cards use exactly the same architecture as the gaming counterparts, and you'll find that in the high-end Macs and high-end Dell CAD prebuilts as well.

Nvidia's deep learning and supercomputer variants are also built off the back of their Tesla/Pascal GPU lineup.

Research this stuff first.

>The movie industry is mainly why GPUs are being advanced
Bull fucking shit m8.
The majority of FX rendering is still being done with software renderers.
Most of these renderers were written before CUDA and other GPU programming libraries became a thing.
The reason for not using APIs like DirectX and OpenGL is that they can produce slightly different images on different graphics cards because of different implementations.

It's only recently that they've started utilizing GPUs for rendering using CUDA and OpenCL

>other shit like facial recognition which is accelerated by them
Again, it's only in recent years that this has gained popularity. A few research groups buying GPUs vs millions of gamers buying GPUs worldwide.

same

Nvidia made ~$800 million in enterprise product sales alone in 2015, and almost all of that is profit. That moves up to around $1 billion when you factor in the just under $300 million of HPC product sales (also profit, and growing by 53 fucking percent from 2014) and ~$200 million of automotive sales.

While it may still only be a little over half of the GeForce line's total sales, you can bet your ass those sales didn't even yield nearly the same amount of profit.

>thus it's where the money is at.
jesus fuck you're dumb
if you're basing the viability of a business solely on "where the money is at" then you should be shilling for big banks on /biz/ not gayming garbage on Cred Forums

>and almost all of that is profit
Yes because development is free of charge

GeForce and Quadros are derived from the same common architecture, so that point is pretty...useless when comparing the development costs between the two.

This was true before Nvidia's Fermi, back when Nvidia's focus was solely on graphics cards for videogames. When Fermi came out they started to bin the shit out of them, and the best chipsets went to workstation cards where they had the biggest profit margins, not to mention by this time Nvidia had almost one year of CUDA development in their pockets and it would be suicide to abandon it. People don't really seem to grasp the huge risk CUDA was for Nvidia back then.

Since then Nvidia's focus has been primarily workstations, with GeForce second.

>multi-billion dollar engineering companies
>a niche market
hahahahah

>Specifically the Nvidia Quadro and the AMD FirePro cards use exactly the same architecture as the gaming counterparts
because these cards were built for 3D engines and making big corporate partners happy, then sold to enthusiasts for "proof-of-concept" applications (i.e. gaymes).

>Nvidia's deep learning and supercomputer variants are also built off the back of their Tesla/Pascal GPU lineup.

which can be used in gaming but has multiple applications.

if you seriously think gaymen is the only thing keeping this industry running or "It would die without games" you are deluded.

if all gaming left computers tomorrow, the only market that would take a massive hit is graphics, but even that would still be viable.

> it's a tiny niche market that only survives off the back of the PC gaming market

So people who are accomplishing nothing are supporting those who are?

Fortunately, this couldn't be farther from the truth. CAD cards have carefully verified drivers to assure visual integrity of the displayed models. Companies (people making money) will pay whatever they need to pay for this. If the entire gaming market blew one too many loads in their sock drawer tomorrow and died of dehydration, all the CAD users would just pony up 10K for a low end card if that was the cost of doing business and keeping food on their families' tables.

Smartphones were a mistake so that's a poor comparison.

this entire post is fucking hilarious, as if the inevitable unification of visualization products somehow means that gamers are responsible for everything that went into it

>Better hardware comes out
>Gaymers blow their budget on more asset artists

>>multi-billion dollar engineering companies
>>a niche market
>hahahahah
Not what I said, I said that CAD was a niche market, it exists off the back of the gaming market and has done for decades.

>because these cards were built for 3D engines and making big corporate partners happy, then sold to enthusiasts for "proof-of-concept" applications (i.e. gaymes).
They were built for 3D rendering and so they're suitable for CAD, except that they typically need more memory. My point is that CAD users get this hardware relatively cheap because designing brand new CAD hardware from the ground up, paid for purely by CAD users, wouldn't fund even 1% of the R&D needed to produce modern hardware.

The R&D goes into the product first, the gamers get hold of the GPUs first, the medium and high end GPUs pay off the R&D costs by having a premium on top, and the CAD markets and GPGPU markets are secondary. We know they're secondary because their hardware is based off gaming cards and not vice versa, and the gaming cards have been around longer than both CAD and GPGPU variants. (this is necessarily true)

>which can be used in gaming but has multiple applications.
Right but is based on GPUs designed for gaming.

3 decades ago there were no GPUs.
Then the first gaming GPUs came about.
The profit from those powered the R&D for the next gen.
Repeat this cycle every 18 months for 3 decades.

That's the total sum of R&D that's gone into GPUs because of gaming.

Now rewind 30 years and suppose you take gaming out of the picture. Fast forward back to today: there are no gaming GPUs, no CAD cards and no GPGPU. If you wanted those things you'd have to pay for 30 years' worth of R&D.

This is not a hard mental exercise to do.

Yeah, except it wouldn't be 10k for a low end card, it would be millions of dollars, because who is going to pay for the R&D and constant development cycle you'd need to mature a technology like that?

I've seen it argued on this shitty board tons of times. Most "gamers" are fucking braindead and seem to lack critical thinking skills. Then again that's true for Cred Forums in general.

yes, but gnu/poonix guys get new bloated KDE and Gnome every 8 years to force them to upgrade their bash calculators.

>CAD was a niche market
Not him but stop. You don't know what you're talking about. What the fuck do you think aerospace engineers use? How about the automotive industry? Engineers? Fuck, even interior designers use CAD these days.

>it exists off the back of the gaming market and has done for decades.
CAD existed before video games. Do you even know what "CAD" stands for?

>this entire post is fucking hilarious, as if the inevitable unification of visualization products somehow means that gamers are responsible for everything that went into it

Right but who paid for the R&D for decades before it was unified? Sure wasn't the CAD guys, sure wasn't the GPGPU guys, they're not paying a million dollars per unit to make up for the decades of research that preceded their hardware, they simply get the latest and greatest for cost of the products in the current cycle. This ignores the decades of product cycles that came before and all the engineering effort that went into that.

yes

if it weren't for games, you wouldn't even need an i7 processor at 3.2 GHz and 32 GB of RAM. you wouldn't need these really expensive graphics cards. integrated graphics would suffice

I want
to leave.

CAD existed before 3D accelerated CAD existed just the same way games existed before 3D accelerated graphics cards.

This doesn't invalidate the point that we've had decades of GPU progress funded by gamers, of which CAD 3D-accelerated hardware is a secondary market; that's just a historical fact.

There's no line of continual development of CAD-specific hardware that GPUs are made off the back of, but the reverse does exist.

What does this tell you? Hint: use your head.

>I said that CAD was a niche market
stop posting any time, gamertard hubris is fucking incredible

>and the gaming cards have been around longer than both CAD and GPGPU variants. (this is necessarily true)
because the gaming cards were too limp-dick before for those jobs. It wasn't until the first GeForce and Radeon cards that nV and ATI could seriously compete in the (low) end of those markets. When PC workstations started becoming viable at the entry level around 1996, with the Pentium Pro and NT coming of age, most of those systems either used 3Dlabs chips (GLINT, Permedia 2 et al.) or custom OEM solutions (like HP's Kayak XW lineup, Intergraph's Z stations and SGI's Visual Workstations)

>Right but who paid for the R&D for decades before it was unified?
SGI? Intergraph? Sun? HP? IBM? DEC? The myriad of big-name vendors that dominated the market...? Do you think people were designing products on a fucking VIC hooked up to their mom's TV?

>they're not paying a million dollars per unit to make up for the decades of research that preceded their hardware
uh, yeah they were. it wasn't uncommon to pay upwards of $50,000 a system, not counting software licensing costs, for a decent workstation, let alone a specialized CAD or editing station

R&D wasn't even as expensive as you think, especially back then, just skim the history of ARM chips on Wikipedia for fuck's sake

>they simply get the latest and greatest for cost of the products in the current cycle. This ignores the decades of product cycles that came before and all the engineering effort that went into that.
what the fuck are you even rambling on about at this point

>This doesn't invalidate the point that we've had decades of GPU progress funded by gamers, of which CAD 3D-accelerated hardware is a secondary market; that's just a historical fact.
jesus christ you're fucking retarded
CAD users had full 2D/3D acceleration before gamers were even getting shitty 2.5D shooters

Absolute shittest pair of legs and hips I've ever seen.