Intel
>We just pioneered the world's fastest file transfer system, giving us SSDs with speeds up to 2000 MB/s on both desktop and mobile, and we also continually improve the single-core performance of our processors despite it becoming more and more difficult to do

Nvidia
>We just used our GPUs' deep-learning capabilities to create a self-driving car that drives almost perfectly like a human, and made a desktop GPU that provides high-end performance at a midrange price

AMD
>WE GIVE A GPU 1TB OF VRAM
>WE GIVE A CPU 64 CORES
>MOAR VRAM
>MOAR CORES

Guess which one this technology board defends as the "good guy"

Other urls found in this thread:

anandtech.com/show/9815/amd-moves-pre-gcn-gpus-to-legacy
en.wikipedia.org/wiki/Comparison_of_instruction_set_architectures

amd is god
>intel mustard race

The SSD on the GPU is not VRAM.
Server CPUs legitimately need more cores.

Why is it that shitposters here are always genuinely tech illiterate retards?

AMD is truly the gaymer retard brand
>Whoa kids, check this shit out, this CPU's got over 6 gigahertz of power on 8 cores and has liquid cooling! We've also got over 16 GIGS of VRAM on that graphics card, can you believe it???

just this moment i was looking to buy a wx4100

fucking pajeets paper launched these business cards months ago.

they use up all the paper on their launches, so they have to use their hands to clean their asses instead

REEEEEEEEEEEEEEEEE

they all came during the surge of '13

for whatever reason the traffic increased and most of the new posters were unironic retards. coincided perfectly with the iUsers.

AMD will live as long as consoles do.

So until this generation of consoles ends

Nintendo already ditched them to get a powerful and quiet Nvidia Tegra for the NX

Best part is it'll outperform the PS4 and won't be a 200W housefire

>implying we don't know it's because of clover

>>WE GIVE A GPU 1TB OF VRAM
>>WE GIVE A CPU 64 CORES
If this is real, I'll take huge RAM and faster CPUs over SSD speeds that CPUs can't keep up with and self-driving cars that need more memory.

Oh shit what are you doing?

The point is AMD innovates nothing and just goes for retarded advertising gimmicks
>look! we've got lots of VRAM and cores that nothing will ever make use of! But hey, it's a big number!

It's not VRAM. It's a local persistent storage pool that isn't limited by PCIe transfers or CPU overhead. This appeals to me directly because I work with AutoCAD, and some projects take 40+ minutes just to fucking open in the morning.
That file, if loaded onto the SSG, would open within seconds.
There are plenty of people out there dealing with even bigger workloads that take longer. The same benefits apply to video editing as well; huge files could be scanned through in real time once loaded onto the GPU.

That is a major change for tons of professionals. I've already basically begged my employer to buy one of these for my workstation. I get paid a commission when completing something the company takes on as consultant work, so that means more money going into my pocket. If I'm getting an extra hour or two of work in a day, it affects the company as a whole and me personally.

>MOAR VRAM

love how AyyMD just dropped the news on the 16GB HBM2 cards, then a week later Nvidia hit back with
>MUH 32 GB GDDR6

>200 watts of power constitutes a housefire

Nvidiots everyone.

AYYMD IS FINISHED & BANKRUPT

AYYMDPOORFAGS CONFIRMED ON SUICIDE WATCH

Intel

>We managed to conveniently hit a brick wall when it comes to CPU development, so we stay ahead of the competition, but not by much. Also, we can make incremental 5-10% improvements yearly, but can't quite make a technological leap, even though this trend has persisted for the last 5 years.

Nvidia

>We pay off most devs to just use our technology, which we decided needs to be closed off.


I don't believe AyyMD would be any better if the positions were reversed, but rooting for Intel/Nvidia is goddamn anti-consumer at this point.

It's a legit argument that AMD can't afford to be evil; they have to try to be the nice-guy underdog. As far as businesses go, anyway.
A company will do anything it can get away with in the name of profitability. AMD's foibles, PR failings, and horrid marketing come down, I feel, to them still having management fucking things up.
A larger company wouldn't make so many little mistakes. A larger company would make larger, intentionally misleading mistakes, and follow up with the PR to try and hide or justify them.

>but rooting for Intel/Nvidia is goddamn anti-consumer at this point.

this.

indeed

...what the fuck is that supposed to measure?

That's why I'm hoping Zen will be good. Not break the mold, not be godly, just a fucking decent 6-8 core in the $400 range. I bought my i3 5 years ago, and I'm glad to see that spec-wise, besides instructions and cache, the rest of the relevant specs, core count & core speed, are the same.
My only upgrades are literally an i5 or i7; I can't upgrade to a newer i3 because mobo+CPU would be like a 10-20% increase in performance for rebuying my old parts v6.0.
/rant

you make it sound like Intel is somehow evil for doing it that way
it's shitty, sure, but it's only reasonable given the cost of getting more performance and AMD's nonexistent competition, the latter being the one and only reason they can do it

It's measuring the performance of the cards on the launch and 1.3 versions of Fallout.
Positive values mean the 1.3 version is faster than the launch version, negative the contrary.
You can see pretty much every AMD card being quite a bit faster on the newer version, while Nvidia cards are slower, i.e. Nvidia isn't optimizing their drivers for the newer version.
>tl;dr it shows how Nvidia gimps their cards

>muh gimmicks

The Zen 64-core CPUs are server/supercomputer CPUs.
The 1TB SSD-cache GPU is for editing and it MASSIVELY increases performance.

Kill yourself.

>Intel
>>We just pioneered the world's fastest file transfer system, giving us SSDs with speeds up to 2000 MB/s on both desktop and mobile
Which no desktop consumer can take advantage of, since normies don't have servers to utilize that speed. They're just wasting time making it faster for no reason when they could be focusing on making things cheaper.

>>we also continually improve the single-core performance of our processors despite it becoming more and more difficult to do
>implying they don't already have technology that far surpasses what we have and are just dripping it out like a faucet for maximum profit

>Nvidia
>>We just used our GPUs' deep-learning capabilities to create a self-driving car that drives almost perfectly like a human
Which is pointless since people will still do it better in the end. At least people can be blamed for being drunk while the machines are just going on random killing sprees.

>>and made a desktop GPU that provides high-end performance at a midrange price
>still performs like crap at 4k
>high end
topkek bro. Stop living in the past. Top end years ago is garbage by today's standards.

They're not evil, they're anti-consumer. And I'm a fucking consumer.
I did say I don't think AyyMD would be any different in their position.
It's a good decision money-wise. They aren't hit by the anti-monopoly laws, but they do hold a practical monopoly outside of budget/low-to-mid-end builds. Once you leave the $150 zone you don't look for AMD's CPUs; that's why Intel got away with pricing their 10-core at $1.7k

Cred Forums is contrarian, if the tables were turned and Nvidia was the underdog they'd defend Nvidia instead

Consumers should defend anything in their best interests you fucking retard

Nvidia was in deep shit during Fermi, and everyone but a few tripfags was shitting on them

Fucking kys nvidishill.
I run Intel/Nvidishit atm because the fucking 1060 was $70 cheaper than the 480 ($330 vs $400), otherwise I would've gone red.

Still posting this 1.3 beta lie that has been thoroughly DEBUNKED, so sad & pathetic AYYMDPOORFAGS

>$330 vs $400
hahaha wtf, in what kind of shithole do you need to live to get ripped off this badly

>adhom means it's debunked
You're not getting paid for this Pajeet

Stop lying, AYYMDPOORFAGS

Only butthurt AYYMDPOORFAGS are mad about Fermi

Fermi was faster than AYYMD and still has support, while 5000/6000 users are now thrown into the garbage can with NO DRIVER support

anandtech.com/show/9815/amd-moves-pre-gcn-gpus-to-legacy

>BUTTMAD AYYMDPOOJECTFAG DETECTED

YOUR GIMPING LIES HAVE BEEN THOROUGHLY DEBUNKED OVER AND OVER AGAIN

Yup, not getting paid at all for this shift
If you post a link to debunk that, then you might get paid HALF of the shift

AMD gimping is real with no driver support

Nvidia, rock solid driver support you can depend on

Yuropoor.

Everyone gets ripped off. What we gain in Internet we lose in hardware.

AYYMDPOORFAGS, forever mad their lies keep being DEBUNKED with real facts while their DESIGNATED SHITTING STREETS company goes bankrupt

I thought only AMD hardware was inflated to hell in yuroland, still noice internet is worth it

Price also includes tax, which is paid after shipping, so we get to pay tax on the shipping, which is great. i3+mobo+16GB RAM is $450...

where can i buy a wx5100? it's been months since they announced it

Aftermarket 1060s and 480s are the same price in my country. Which is the better card to upgrade to if there is no price difference at all?

I do not care about fanboyism, but I like AMD better.

Depends on the games you want to run imho. Benches go pretty much 50/50. Imho the 480 is better because of Vulkan. On DX benches the 1060 does better afaik. You can also CrossFire the 480, but if you plan on doing that I'd just save up and buy a 1070 or Vega, though I feel we'll witness the sun dying out before then.
Bottom line: current and older titles: 1060, newer titles: AyyMD

Samefagging to add: if you plan on going higher than 1080p, save up and buy a 1070, or wait for the 1170/Vega launch and see benches

Historically, AMD pioneered a lot of what makes things better for us. Intel/Nvidia just have a habit of slowing down its acceptance by making their own proprietary crap and paying people to use only it.

Just imagine how much better off we would be if all software could actually use tons of inexpensive CPU cores + GPU compute cores.

Imagine a world where Itanium became the standard and GPU-specific memory was never developed.

It would be a fucking tragedy.

please tell me you just change your router every day and the mods are doing their job. how can posters this retarded be allowed to keep posting every day?

I have a 1440p monitor right now, but I'm thinking of trading in my monitors for a single 40-inch 4K monitor. Don't think I'm going to be playing games at native 4K though, especially on high settings.

The problem is that 8GB 480s are more expensive than some 1060s. The 4GB 480 seems gimped.

reset your router*

4GB is gimped. 3GB is the bare minimum and not even enough for some games, so I'd go with the 8GB model.

Between the 1060/480 it's a toss-up, as I said before. For playing at 1440p or any res above 1080p I'd go with the 480 because muh VRAM, which actually helps at high res since there's less chance of being bottlenecked by memory when rendering frames.

My 670 2GB still seems to chug along nicely though, even at native. But I agree with what you're saying, I'll keep an eye on the pricing.

>low-rent console going to continue being the last-place product by using a phone chip that already failed.

If you play without AA/supersampling that might be why. Or just low textures.
But I'd be wary of the 3/4GB versions of the cards anyway. The 1060 3GB also has a shittier clock afaik; might be the same with the 4GB 480, or maybe shittier parts.

are you retarded? just because some technology can't be used by the general plebs doesn't mean it's wasted time

tell AMD to get their shit together and stop being pushovers who don't compete

>pointless pushing into objectively superior technology
yeah sure, right now the random moron behind a wheel is "better". you are delusional

sure, I'll give you that 4K is currently universally unachievable

I cannot even find a 3GB 1060 available

the cheapest 480 8GB is 20 euros more expensive than the cheapest 1060 6GB. Shit should be the other way around, but fuck the market I guess.

AMD didn't pioneer jack shit

Intel developed x86

JEDEC members develop memory standards, not AMD

>powerful and quiet Nvidia Tegra

You're probably posting from a PC with an AMD64 CPU, but yeah, I'm sure they didn't pioneer anything.

>I don't even know what JEDEC is
>I'm going to just shitpost and prove I'm a tech illiterate retarded child faggot

Why haven't you killed yourself yet? You won't accomplish anything in life.
Just end it.

Elaborating on that: if Intel were put in charge of making a 64-bit extension to x86 (that is, NOT Itanium), we'd still be stuck with 8 registers everywhere and using the x87 FPU for floating point

Shit, I forgot to add: we'd still be using segments as well, even though nobody really uses that shit on 32-bit

>drives perfectly
>like a human

Pick one.

Without Intel, you wouldn't even have x86. x86-64 is nothing but an ugly AMD hack which still doesn't have 32 registers like most other CPU architectures, so STFU

Itanium is a clean 64-bit implementation that is superior to AMD's ugly 16-register hack

Intel
>we backdoored every CPU we sell with firmware-level, undetectable remote backdoor access, even if it's turned off and/or doesn't have an active connection to the internet

>doesn't innovate
Which is why Intel is allowed to use AMD's x64 instruction set, even if they fucked up with Itanium.

And yet Itanium didn't take off. Would have been nice though.

Compare it with Intel's "protected mode" hack. The earliest revision only had 64KB segments and the only way to leave protected mode was to reset the CPU... and on top of that, IBM had to add a workaround to mask out the A20 line for backwards compatibility, and they abused the keyboard controller to handle it.

>32 registers like most other CPU architectures
Don't you mean 16?

I know MIPS has 32 regs, and I *think* PowerPC has 32 regs. I struggle to believe "most", unless you're limiting your scope to expensive servers.

ARM: 16 registers
68000: 16 registers
SuperH: 16 registers

And as the 80386 was a thing in 1987, I'll even list these which were still being used in game consoles and shit:

Z80: 6 16-bit registers, ignoring shadow regs
6502: 4 8-bit registers

>Itanium
>superior
>ignores the fact that it failed to take off

>x86-64 is an ugly AMD hack
>but surprisingly it works everywhere

kys

The sooner AMD stops making consumer CPUs the better for everyone.

Intel can just sit around pretending to be "competitive" with AMD and producing tiny upgrades each time.

It's like a heavyweight taking on a drunk featherweight with cerebral palsy. Sure, he's there in the ring, but he has zero chance of landing a punch, let alone a knockdown.

>being this comfy that AMD makes nothing good
>being this monopolistic

Let me guess, you assume Comcrap being the sole ISP for a lot of places in 'Murica is also fair game, right?

You think there can't be other competitors?

You think that America is the only country on earth?

The reason you have the likes of Comcast and Time Warner is because you're all "land of the free, one day I'll be that rich". Monopolies are easily broken up in Europe, and if someone gets too big they get split up if they aren't providing fair service.

>Server CPUs legitimately need more cores.

Except AMD's "cores" aren't actually cores; they just copied Intel's Hyper-Threading and brand a quad-core chip as "8 cores".

Also, AMD have been a joke for server chips for nearly 10 years now; they just can't compete on power consumption and thermal performance.

>You think there can't be other competitors?
there literally can't be
AMD, Intel and VIA are the only companies legally allowed to produce x86 CPUs
enjoy your patents
And I doubt this is different here in Europe, and even if it were, nobody would develop a CPU from scratch if they can't sell it in America and some other countries.
>x86-64 is an ugly AMD hack
kek, it's leagues ahead of x86, and I wonder who's responsible for that one (the original x86)? hint: not AMD
>muh itanium
completely different architecture, has nothing to do with x86
x86-64 is very similar to x86, as is required for legacy reasons
en.wikipedia.org/wiki/Comparison_of_instruction_set_architectures
has register counts for a lot of architectures

CMT is literally nothing like SMT.
Stay in Cred Forums where you belong, you obviously underage kid.

>Without Intel, you wouldn't even have x86
yeah, we could have had a much saner architecture.
instead x86 became dominant.

Are you seriously trying to say that there can't be any competitors on the microprocessor market? Do you even know how much money and how much R&D you need? Not to mention royalties and permission-asking from AMD and/or Intel, unless you come up with your own architecture, instruction set AND the OS/software to work with and compete with those running on x86 or x64

Except gaming is literally the only moderately demanding workload I can think of that doesn't use the cores. Most professional workloads are very well multithreaded and whatnot.
Additionally, it's not like AMD could magically produce more IPC if they had lower clocks or fewer cores, so that's what they have to do to compete. Meanwhile, because Intel's IPC IS miles ahead, they just put the extra cores behind a paywall.
I mean, I'm gonna clock my CPU as aggressively as I can anyway.
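
Rough sketch of that multithreading point, in Python (my own toy example, not from the thread; the workload and chunk sizes are made up and only there to show the scaling): a CPU-bound job split across one worker per core speeds up with core count, which is exactly where extra cores pay off and single-core clocks don't.

# Toy CPU-bound job split across one process per core.
# The busywork stands in for rendering/encoding/simulation;
# the chunk sizes are arbitrary.
import math
import os
import time
from multiprocessing import Pool

def crunch(n):
    # arbitrary CPU-bound busywork
    return sum(math.sqrt(i) for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 16

    t0 = time.perf_counter()
    serial = [crunch(n) for n in chunks]
    t_serial = time.perf_counter() - t0

    t0 = time.perf_counter()
    with Pool(os.cpu_count()) as pool:  # one worker per core
        parallel = pool.map(crunch, chunks)
    t_parallel = time.perf_counter() - t0

    print(f"serial:   {t_serial:.2f}s")
    print(f"parallel: {t_parallel:.2f}s across {os.cpu_count()} cores")

On a quad core the parallel run lands somewhere around a quarter of the serial time (minus process startup overhead); a single-threaded game would see none of that benefit.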

>WE GIVE A GPU 1TB OF VRAM
it's not VRAM, and this is useful for professional applications where there can be immense lag trying to manipulate several hundred gigs of data with a GPU bound function.
>WE GIVE A CPU 64 CORES
a heavily parallelized application can benefit from this much more than better single core performance.
>MOAR VRAM
this can be useful depending on resolution, and it "futureproofs" cards better. 6 years ago 1GiB of VRAM was enough; now you need at least 4GiB (rough numbers sketched after this post).
>MOAR CORES
nice meme.
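
Quick back-of-the-envelope on the VRAM point above (illustrative numbers of mine, not from the thread): the raw colour buffers alone roughly quadruple going from 1080p to 4K, and the textures and intermediate render targets that actually dominate VRAM use scale up with resolution too.

# Illustrative only: raw framebuffer sizes at common resolutions, RGBA8.
# Real VRAM use is dominated by textures and render targets, which also
# grow with resolution; this just shows the scaling.
BYTES_PER_PIXEL = 4  # RGBA8

def buffers_mib(width, height, count=3):
    # e.g. front + back + one intermediate target
    return width * height * BYTES_PER_PIXEL * count / 2**20

for name, (w, h) in [("1080p", (1920, 1080)),
                     ("1440p", (2560, 1440)),
                     ("4K", (3840, 2160))]:
    print(f"{name}: {buffers_mib(w, h):.0f} MiB for 3 colour buffers")

That prints roughly 24, 42 and 95 MiB respectively, which is tiny next to an 8GB card; it's the high-res texture sets and render pipelines, not the framebuffer itself, that eat the VRAM.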

*can be any

Damn autocorrect.

Also, one thing I forgot to note, it would cost billions to even have a chance against AMD, let alone Intel.

>drives almost perfectly like a human
Humans are shit drivers, by and large. I hope self-driving cars are nothing like humans.

>implying nobody can buy AMD

Checked, dubs confirm

The only countries on this planet where people drive decently are Germany and Japan.

Worst? Any GCC country, India, Indonesia and US. I'd put Africa there but they can't afford a lot of cars, and in South Africa an exception is made because you can get robbed while driving.

user was talking about other competitors, buying AMD does not mean new competition magically appears.

Seconding this notion

ARM is an improvement, but nothing beats glorious MIPS

If AMD goes under you can take Intel and Nvidia prices and multiply them by 2.
If instead Zen and their GPUs are good, it will be better for everyone buying a PC.

>powerful and quiet Nvidia Tegra

Tegra is so shit that only Nvidia uses them.

>Tegra
>powerful
>quiet

This fucking guy, my sides

>Guess which one this technology board defends as the "good guy"

Well, let's see:

Intel: Buzzwords
Nvidia: Buzzwords
AMD: Tangible performance enhancements.

>Humans
>Perfect drivers
lel no.

If you're enough of a dumbass to call the SSG "shit", then clearly you only play games.
That 1TB of memory allows massive workloads to be brought straight to the GPU, meaning no CPU power is wasted passing data back and forth to main storage, as well as shortening the time it then takes to get at those files.

Even at launch, the SSG was shown running 8K video and cleanup at 90fps. Current top-end workstation GPUs can only manage roughly 19fps at that. It's a huge boost for the video industry.

>i-iit's debunked
>source: my ass
Sure

or... you have Nvidia kids buying gimped DX12 software...

Hyper-Threading and AMD's core design are completely different, you blundering retard.
Bulldozer-style AMD chips do have 8 integer cores; they just share 1 FPU between every 2 cores.

This is what Cred Forums has become

>I heard buzzwords from others
>I shall not do any research of my own
>I shall spout buzzwords until people agree with me

Oh the little Jewlet.

Who cares? I just want a graphics card to render my waifu games.

I can't wait to get my 700 buck graphics card so I can play Monster Girl Quest in 8K.

It's because of clover and funnyjunk fags

here's a non-shill actual truth version for AMD
>we keep scraping by with comparatively tiny R&D budgets and delivering acceptable and at times superior alternatives
>we have never resorted to anti-competitive measures despite our competitors constantly doing so
>we supply drivers and rendering technology as libre software
>we build GPUs that perform well in every task and actually render scenes properly, rather than bogged down pieces of shit that only perform well in old games and only deliver part of the advertised spec

...

>nvidia
>self driving cars
useless, why the fuck would you want the botnet driving for you

besides, in terms of innovation and moving the industry forward, AMD does more of that than Nvidia or Intel