'6Gb is enough for any game' Nvidia said

digitaltrends.com/computing/amd-vega-graphics-chip-launch-windows-confirmed/

16GB HBM2 on Vega 10.


Fuck me. The only thing that makes me feel okay with my Fury X is that the Vega 10 is also supposedly going to only have 4096 stream processors. If that's true, I don't imagine the jump from a Fury X will be worth it.
Also to consider is that I've got a custom cooling loop, and getting a new GPU would mean also needing a new block and back plate.

games are made with lowest common denominator in mind. if consoles and nvidia cards don't have more than 6 or 8 then that's the useful limit

Owning 2 AMD cards, I must admit it is becoming a meme. Instead of making the GPU more powerful, let's just throw more VRAM at it, because bigger numbers are better, right? AMD needs to sort their shit out. Vega had better deliver, because you know that as soon as it drops, a few weeks later (or less) Nvidia will drop the 1080 Ti and wipe the floor with it.

>video cards have 16gb of memory now
>i just upgraded my computer to 4gb ram

More vram won't do shit.

Is HBM2 something to look forward to, or just a meme?
I skipped this generation to get HBM2 memory in mid/low-end GPUs.

computerbase.de/2016-09/grafikkarten-speicher-vram-test/

tl;dr

More VRAM is better.
More FAST VRAM is even better still.

Wipe the floor with a $600+ GPU? How many 1080 builds have you seen? Yeah, I thought so: only a few. Most of the PC builds created with PCPartPicker are under $1000. The market for the 1080 Ti will be tiny.

...

Giving that card only 8 GB HBM2 would not even make the card $5 cheaper.

HBM2 is dirt cheap to produce compared to GDDR5 or HBM1.

Only AMD fags would fall for this specsmanship

MORE CORES
MORE VRAM

finally maybe I will get a GPU that won't get bottlenecked by its VRAM. the 290x is still a great card, but it gets bottlenecked by only having 4 GB of VRAM

Turn the texture quality down.

That is literally how GPU design works.
GPUs are massively parallel processors.

TL;DR for those that don't speak German and don't want to Google translate:

3 GB is not enough for 1080p for ultra and not even high anymore. Don't buy a 3 GB card if you plan to use it for today's games. Bad idea.

4 GB is the same way, but it's a fair bit better in the tests. Less stuttering is evident. 6 GB is already used by games optimizing for the 390/480 with ease. The 6 GB 1060 is a bad long term investment for a card, even at 1080p. Don't bother with it at 1440p.

As a side note, you can at least see and understand the frametime graphs without the German. Strangely, Nvidia's cards all seem to exhibit severe hitching. Obviously, the cards with less/slower memory are worse about it, but let's take a look at Black Ops 3.

See the latency spikes on the Nvidia cards, how the valleys and hills are much more pronounced? To see it more closely, flip between the Extra quality settings. AMD has superior frame pacing here.

In Mirror's Edge, the 4 GB of the 480 was enough for Hyper textures. The 3 GB 1060 is unplayable. Hell, the 1060 can't even do Ultra.

ROTTR is mostly a tie.

Deus Ex shows the 470 destroying the 1060. It's an AMD-optimized game and Nvidia's drivers seem busted or something because the frametimes are not ideal, even on the 6 GB 1060.

This unexpectedly turned into a wall of text. I need to see more frametime analyses but it looks like Nvidia is losing on the frametime (ie smoothness) front.
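The "smoothness" point above can be put in numbers rather than eyeballed from graphs. Here's a minimal sketch of how a frametime capture could be summarized; the percentile choice and the sample numbers are my own illustration, not the article's methodology:

```python
from statistics import mean

def frametime_stats(frametimes_ms):
    """Summarize a frametime capture: average FPS and the 99th-percentile
    frametime. A p99 far above the typical frametime indicates hitching."""
    times = sorted(frametimes_ms)
    avg_fps = 1000 / mean(times)
    p99 = times[min(len(times) - 1, int(len(times) * 0.99))]
    return avg_fps, p99

# Hypothetical capture: mostly steady 16.7 ms frames with one 50 ms spike.
# Average FPS still looks fine, but the 50 ms p99 exposes the stutter.
fps, p99 = frametime_stats([16.7] * 99 + [50.0])
```

Two cards with identical average FPS can have very different p99 numbers, which is exactly the difference those frametime graphs show.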

>First half of 2017
fuck it lads
why is AMD waiting so long just to release a 1080/1070 competitor? Nvidia will have the 1080 Ti to fuck them over with by then. Didn't they learn from the Fury X?

you're forgetting the architectural improvements present on the 400 series gpus, the 480 has 32 rops and is on par with a 390, which has 64 rops

now let's say Vega has the same 4096 SPs, 64 ROPs, and 256 TMUs as the Fury X. I think that, while compute will only be 10 TFLOPS, things like geometry processing and other good stuff will be much higher

How the fuck are you using that much VRAM?
The only games that use a lot are Max Payne 3 and GTAV in 4K

Rise of the Tomb Raider regularly uses over 6GB on Ultra.

Except for when you wanna play Stalker with mods

Oh god yeah, the geometry will be a lot better, certainly, but worth jumping from a Fury X for?
I can't justify another stupidly expensive purchase so soon.
I mean, sure, it'll have likely been around two years when the new successor comes out (I got my Fury X on day 0) but I only got my OLC some time around... June?
I'm looking at changing out some of the tubes for hardline as it is. Adding in a new GPU, block & backplate would just be too much.

Once the 590, or perhaps even 690 comes out, then will be the right/best time to upgrade.

Only .5GB more VRAM and 970 users wouldn't get this screen

How high res are these textures man

It doesn't need to if it had 8gb (390x)

Deus Ex is also really shittily programmed, they managed to make dx12 LESS efficient than dx11

>falling for the 16GB meme
:^)

Nixxes are supposedly very good developers. They have got RotTR DX12 working quite well with my RX 480 on an Nvidia-sponsored game. Give it time; it's still in beta.

VRAM is cheap; there's no reason NVIDIA or AMD shouldn't include more by default.
NVIDIA in this case is just trying to defend why it refuses to give customers more, when it would barely affect their production costs. The reality is NVIDIA has always been into planned obsolescence with drivers, VRAM, etc., and they never will budge.

>Give it time it's still beta.
Oh? I haven't followed it. Why are people quoting a benchmark that's still in-dev?

If VRAM was cheap AMD would've given the RX480 HBM and made it like a Nano.

>Mirror's Edge, the 4 GB of the 480 was enough for Hyper textures.
barely. it's just that 3rd and 4th gen GCN have texture compression and 1st and 2nd don't

>I can't justify another stupidly expensive purchase so soon.

but you actually bought a Fury X, and you're being a bigger retard by putting it in your OLC desu

>1080Ti
imgur.com/gallery/Bk4NO
wccftech.com/nvidia-gtx-1080-ti-gp102-specs-leaked/
I am really hoping for a 1060 Ti and/or 1050 Ti now.

>Only uses 50w more than the rx480 but has 5x the performance
fucking KEK

>Bigger retard putting it in an OLC
Actually, its temps dropped by 20c under load.
Going from 61C to 41C at absolute maximum is nothing short of substantial.
That lone 120mm radiator, no matter how thick, was not enough. I mean, sure, it was enough if you kick the fan on to a higher speed, but then it sounded like a fucking jet engine.

Was the temperature throttling it? No. But having it drop like that says that there was definitely a lot more room for it to be cooler given the chance.

>1060 ti
never going to happen
>1050 ti
already out, it's called the 1060 3GB

*100 watts
and 50-60% more performance at best

>never going to happen

had the 480 been faster than the 1060 it would have. there's ~10 device ids for gp104 variants but so far we've only gotten 5 (1070, 1080, mobile variants and tesla p4)

a 1070 with 2 more disabled SMs and 4gb of vram would have been a great value.

>'6Gb is enough for any game' Nvidia said.

There is no card with 6Gb of memory on the market currently.

IIRC Mirror's Edge actually overrides your texture setting if you have too little VRAM unless you untick a box, which led to a 4GB Nvidia card beating the 8GB AMD card when the box was ticked, but the other way around when it wasn't, in Digital Foundry's tests on Hyper. I don't speak nazi but it wouldn't surprise me if they didn't test that game properly.

these things will be worthless when unlimited detail engine is released anyway

GTX1060 6GB
Fucking moron.

the 380 was faster than the 960 and there was never a 960 ti, the 280 was faster than the 760 and there was never a 760 ti

Gb =/= GB, moron. 6Gb is 768MB.
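For anyone keeping score in the b-vs-B fight: lowercase b is bits, uppercase B is bytes, 8 bits to a byte. A quick sketch of the conversion, assuming binary (1024-based) units:

```python
def gigabits_to_megabytes(gigabits: float) -> float:
    """Convert gigabits (Gb) to megabytes (MB): 8 bits per byte, 1024 MB per GB."""
    gigabytes = gigabits / 8       # bits -> bytes
    return gigabytes * 1024        # GB -> MB

print(gigabits_to_megabytes(6))    # -> 768.0, so "6Gb" taken literally really is 768 MB
```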

Oh don't play semantics you autistic retard. Yes, there is a difference, but did you HONESTLY think that the OP cared whether he put a big or little b?
End of the day, anyone who isn't mentally handicapped will understand it as 6GB.

>the 380 was faster than the 960

not in real world usage (i.e not pairing the card with an overclocked i7)

>and there was never a 960 ti,

970 was the 960ti. it had a discounted price relative to past x70 cards and had more disabled SMs and gimped memory bus like previous x60ti cards.

>the 280 was faster than the 760 and there was never a 760 ti

760 was a rebrand of the 660 ti, and there was a 760 ti with 1344 cuda cores.

en.wikipedia.org/wiki/GeForce_700_series

>'3.5GB is enough for any game' Nvidia said.
Fixed.

>End of the day, anyone who is a tech illiterate AMD fanboy will read it as 6GB.

Fixed that for you. OP very clearly meant Gb, otherwise they would have typed GB.

>6Gb is enough for any game Nvidia said.
This year yes, but next year, they'll say it's 8GB

I got this with my 3GB 780: all games performed brilliantly, but the moment Maxwell dropped with 4GB, suddenly 3GB wasn't enough. At least getting a card with 8GB minimum will future-proof it for a few more years.

Uh huh. Of course they did. By the way, did you forget your medication today, Billy?

the high vram meme needs to end.

6gb is more than enough. autists just like seeing the word "ultra" in their settings when the compressed "high" textures look identical.

even the 970's 3.5gb is still enough. i've had no problem playing new games on high settings and maintaining 60 fps with an OCed 970

It's more AMD trying to find ways to market, so fanboys can have something to justify it over getting an Nvidia card.

same with their processors having higher clock speeds; gaymers love that shit.

Sure it does pal

youtube.com/watch?v=SUEXhu3tm3M

Wouldn't ticking the box turn down the graphics settings for the Nvidia card, meaning it would have inherently less work to do and would therefore perform better?

>16GB HBM2
I can finally play Fallout 3 on ultra high!

Yeah that's the point. If the box is ticked the game limits you from setting the options too high, but the UI doesn't really tell you that. The settings menu will say the textures are on hyper quality but the game will actually use lower quality textures.

It's a smart way of making people think their cards are better than they are and/or preventing people from setting the game to settings way out of their league and then complaining about "muh optimizashun" like Cred Forums loves to do.

Does that replace Crysis as the new meme?

did you not see me use the word "texture" retard?

also this is how much vram witcher 3 uses.

high vram is a shitty marketing meme.

Are your GPUs shit?
Add more Vram because bigger is better!

Here's a quick lesson in GPU scaling. In the past, when everyone was gaming at 1080p and only a few nerdy types were gaming at 1600p, what was called the 'Ultra' setting quickly became the 'High' setting, and the 'High' setting became the new 'Medium' setting with the following GPU generation.

This was all well and good.

However, we have a new dynamic being added to the mix. The number of 1440p gamers has increased rather substantially, and the proliferation of 4K TVs has pushed us into a corner with GPU performance. Gamers with cash to splash are demanding more from their GPUs' architecture for a similar amount of money. Unfortunately, we are not quite there yet. We have yet to reach the point where a single GPU can play at 4K 60 fps adequately and affordably, especially on higher settings.

This is a big transitional period and a very bad time to be upgrading your GPU. Next year we will see 4K 144Hz monitors hitting the shops, pushing the demands on GPU architecture even more.

You know that big blockbuster movie with all the fancy CGI in it? Most of it is rendered at 2K and upscaled to 4K projection, and even then it takes large render farms a day to render maybe a few frames.

>AMD’s chief technology officer, Mark Papermaster
>Mark Papermaster
yfw papermaster confirms another paper launch kek

Are your cpus shit?
Overclock them!

>You know that big blockbuster movie with all the fancy CGI in it? Most of it is rendered at 2K and upscaled to 4K projection and even thne it takes large render farms a day to render maybe a few frames.
not anymore with Radeon Pro SSG

Can't wait for it to be 5% better than a 1080 at twice the power usage and temps

I hope so. Avatar was only rendered at 2K. BTW I have yet to find any info on what resolution the movie Antz was rendered at. Most likely the same res as Toy Story was before it got re-rendered for HD.

why do enthusiasts act like the average consumer gives a shit about 4k? GPUs aren't going to suddenly get faster because of 144hz 4k monitors.

going 4k now basically means you're an early adopter so you have to deal with all the headaches that comes with that.

1440p is the way to go right now, 4k will remain meme status for years.

The Steam survey said fewer than 4% of gamers were using above 1080p

Haven't you seen the benchmarks? Nvidia wins every time. AMD users on suicide watch

AMD doesnt do paper launches, Buttcoin miners just steal everything before gaymers can buy

The Fury X is actually faster than a stock 980 Ti in a lot of games now due to driver improvements. AMD seems to be stepping up their driver game, and it's working.

I could run Fallout 3 on a 256 MB DDR3 8600GT back in '07...where the fuck has the time gone??

DX12 is only less efficient than DX11 if you have a Nvidya card.

Instead of forcing hardware manufacturers to work overtime and create more expensive bullshit, why don't video game developers just optimize their engines better, so better-looking games can run on less intensive hardware?

Too much effort, doesn't bring in the money.
No, I am not joking, this is literally the reason why they don't do it.

>GTAV in 4k
Even @ 1080p GTAV uses over 4GB of vram

5x more performance?

No it doesn't you hobo.

I have a 27'' 1440p ips 120hz and a 32'' 4k ips 60hz

And I cannot wait til we get 4k,120hz.

Fuck 1440p. 4K on a 32'' is gorgeous. I'm running Windows with 200% scaling and it's a dream to use.
Everything from text to icons looks so much better.

The only downside is the 60Hz, which isn't as smooth as 120Hz, but I'm just gonna wait.

Tell that to my multiple monitors.

But you can't have 8gb hbm1.

...

Yeah, they usually leave out the part that it's in beta. It's pretty fucking stupid how all of these review sites are acting as though it's some kind of final product.

>not in real world usage (i.e not pairing the card with an overclocked i7)
An i5 is enough for the 380 to surpass the 960

>970 was the 960ti
Nope, 970 was the 970

>760 was a rebrand of the 660 ti, and there was a 760 ti with 1344 cuda cores.
So the 760 was a 760

Rise of the tomb raider uses 7.8GB on my Gtx 1080 at 1080p with every setting maxed out.

Newer games are Vram hogs

>AMD reaffirms when its upcoming 'Polaris' graphics cards will hit the market
Don't they mean Vega?

> It will be offered in two variants: Vega 10 for the enthusiast market and Vega 11 for everyone else.
Unsubstantiated rumors, though I bet there will be two cards. Doubtful that even the cut-down chip will be "mainstream", considering it will be $350+ at minimum

Does the author even know what he's fucking talking about?

>Dean McCarron, principal analyst at Mercury Research, told the IDG News Service that Nvidia’s market share decline was due to it “de-emphasizing” the sale of graphics chips in large volumes through mainstream OEMs.
Kek'd, can't compete with AMD's console volumes / pricing structure

You can definitely play games with a 2GB card still (not sure how long this will last at 1080p) but most games will come with Hyper settings that use more than 4GB soon. Mirror's Edge Catalyst and RotTR come to mind. You might care if you're PC mustard race.

Having said that, consoles have 8GB of shared memory. Some gets reserved for OS, so 4GB will get you good textures.

It'll be good on mid range GPUs, especially with texture compression

MORE LONGEVITY
MORE NVIDIA BUTTHURT

Honestly the 290x isn't really powerful enough for 8GB. The 390 and 480 aren't really either. Consider that Mirror's Edge Catalyst takes a shit when Hyper is enabled on anything but a 1070 or 1080

>I can't justify another stupidly expensive purchase so soon.
I really don't know how you could in the first place

I know you're meming, but my 4GB 470 gets the same warning. Notice that it says "more than 4GB" which clearly means exactly what it says, MORE THAN 4, not equal to 4.

Agreed with lack of VRAM thing

That 1080Ti will actually be amazing for 4K. That means we mainstream plebs will have to wait two gens for affordable 4K gaming.

>>the 380 was faster than the 960
>not in real world usage
Yes it is in games except FO4 and GTAV, aka CPU heavy titles. On AVERAGE, it is.

>An i5 is enough for the 380 surpass the 960

no it isn't. all AMD GPUs lose substantial performance unless you're using an overclocked i7.

pclab.pl/art60000-21.html

I think they could have released a version of Vega this year, but the rumor of that and the release of the 480 forced Nvidia's hand, and they dropped the 10xx series a little early.

AMD saw that and for whatever reason decided to postpone the launch. My guess is they had some improvements they saw they could make to Vega chips so they decided to push that through now rather than on later Vega chips.

It doesn't look like they are rushing Vega OR Zen, which gives me a sense that they might actually have something really good, or, really shit. I hope they have something good, there needs to be more competition in the market.

Vega was always planned for 2017. HBM2 isn't even being mass produced yet, and the Vega chips don't have PHY for GDDR5 or GDDR5X.

>AMD’s discrete GPU market share climbing from 26.9 percent to 34.2 percent in the second quarter of 2016 compared to last year. Nvidia dropped from 73.1 percent to 65.8 percent in the same quarter.
PREPARE FOR
T A K E D O W N
A
K
E
D
O
W
N

It was always planned officially for 2017 you are right, but earlier this year there were credible rumors they could release Vega in Nov-Dec, and this spooked some people at Nvidia I'm sure.

Who knows what they got on the chip though? HBM2 can be produced in significant numbers now I'm sure. If AMD asked Samsung and Hynix they would do it. Will the card come with it? I don't know, would be a nice surprise if it did.

Who the fuck builds a new PC every time the GPU lines are refreshed? I'm still using my Sabertooth X58 w/ i7 950 build from 2010; the only thing that has been upgraded is my GPU (ATI 5850 > GTX 770 > GTX 1070) and it's still maxing out games no problem. The need for the latest and greatest everything is a meme, granted the stuff you start off with is good in the first place.

Jesus. I've had my 760 for years and it plays everything I want on ultra settings. These cards are

so

unnecessary

I had a 760. Sold it for a 970. Huge difference, but I play things like GTAV and Witcher 3

>AMD board meeting

Nvidia
>Our sales are down 2.5%. Let's release a new series of GPUs that gives the illusion of good DX11 performance but has gimped support for DX12 due to a lack of async-capable hardware; that way we can give users a reason to "upgrade" to next year's line of cards. Let's also make sure to slowly kill performance through driver updates.

>it's a by the time AMD catches up with Nvidia 1 year old card Nvdia already has a new series/Ti version available that beats it episode
>it's always this fucking episode

Don't worry, it'll be $50 cheaper than the Nvidia equivalent one so its better!

best part is all that money's made up by your power bill :^)

>my 4GB 470

kek

AMD
>Our sales are down 85%. Let's release a new series of GPUs that could power an African village but is capable of starting housefires; that way we can give poor people a reason to "update" to next year's line of cards. Let's also make sure to rely only on consoles and bitcoin miners for revenue. We have to try something to get our stock up

>175w is enough to power an entire African village
I'm so sorry you can't even afford to light your own home. That's like 2 light bulbs right there.

It's amazing how Nvidiots claim to be richfags but always bring up "muh power billz!"

i think the fire the gpu creates is what really powers the village

luckily for AMD that actually works for GPUs

yeah but a fury x still costs $500
they should drop the nano/fury to $300 and the fury x to $330
I'd buy a Nano for $300

this guy gets it

The only GPU to actually catch fire has been an Nvidia :^)
newfags don't remember Fermi

no the rx480 fried a bunch of people's mobos but keep spewing "LOL NEWFAGS DON'T REMEMBER FERMI" even though literally everyone on Cred Forums remembers fermi

If you find 1 example of an rx480 blowing up a motherboard from someone that's not a bitcoin miner I'll blow you.

>citation needed

>putting a fury x in a custom loop

google is your friend
>gpu bitcoin mining in 2016
yeah, no one is that dumb

You're the fucking retard making the claim (guess you couldn't find any). HOLY SHIT, just kill yourself; no one is going to miss you.

>WHY WON'T YOU SPOONFEED ME
lmgtfy.com/?q=rx480 pcie slot

Notice how nobody complains about rx480 issues except nvidiots perpetuating a meme from July

believe it or not, DXMD has an INSANE amount of stuff it renders, despite how underwhelming it looks.

there's parallax mapping EVERYWHERE for instance.

>HBM2 is dirt cheap to produce compared to GDDR5 or HBM1.

no
there's a reason why we're getting GDDR6. HBM2 is still expensive as fuck and has to be built into the interposer.

same with fermi though
you're doing the exact same thing but think you're memetastic for it

GTX480 actually caught fire
meanwhile nvidiots freak out over 200 watts of power

This is wrong for several reasons.

a) Nobody mines Bitcoin with graphics cards these days, ASICs took over back in 2013.
b) People do mine other cryptocoins with graphics cards (Ethereum and coins like it) and while this activity historically was dominated by AMD it's no longer the case. Newer NVidia cards have a far better hash/watt ratio than AMD cards for the vast majority of cryptocurrencies. This has been the case for a while.
c) The IMF and the World Bank now control the Afghanistan economy and opium production has increased by 900% since the US invaded and the Taliban lost control. The US are now trying to do the same to Syria using their ISIS proxy army.
d) The graphics card demand from miners probably strips Nvidia's supply a whole lot more than AMD's, so if AMD has released a card on the market and it's not available, then that's just poor planning by AMD.
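For what it's worth, the hash/watt ratio mentioned in (b) is just hashrate divided by board power. A sketch of how the comparison works; the card names and figures below are made-up placeholders, not real benchmarks:

```python
def hash_per_watt(hashrate_mhs: float, power_w: float) -> float:
    """Mining efficiency: megahashes per second per watt drawn."""
    return hashrate_mhs / power_w

# Placeholder figures (hashrate in MH/s, board power in W) purely for illustration.
cards = {"card_a": (25.0, 150.0), "card_b": (22.0, 110.0)}
most_efficient = max(cards, key=lambda name: hash_per_watt(*cards[name]))
```

Miners buy on this ratio rather than raw hashrate, since electricity is the ongoing cost.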

MORE AMADA

how can you mine with Nvidia if it doesn't compute?

I prefer 3D girls like Kwon Yuri.

so did the motherboard
i don't know how you're attacking one and defending the other when they're both horrible mistakes

yeah but that aint technolo/g/

>Nvidia doesn't compute

I hate this meme. Compute is easiest on Nvidia cards, with CUDA, the libraries like cuDNN and extensive documentation compared to OpenCL.

As someone who was mostly happy with my 960 2GB and then upgraded to a 1070, I seriously question if you need this much VRAM.
Sounds a bit like having a giant petrol tank on a Prius.

we do need more Amada
AMD hasnt used her since the Fury

>does a car need fuel
What a shitty comparison wow. You should've stuck with the food analogies.

This. I'm on a 380 and the only game I've ever had any real trouble with was No Man's Sky and that ran like shit on everything including 1080s.

Exactly. All the Nvidia cards have compute, and capability version 2.0, 3.5, or any later version works just fine.

As for CUDA vs OpenCL, that doesn't even matter since you can mine cryptocoins with both interfaces if you have a nvidia card. The performance difference between those isn't really measurable.

>can't find a single 480
>tells you to google it

Fuck off faggot

Nvidia at its best, for you newfags
youtube.com/watch?v=sRo-1VFMcbc

I'm hoping Vega is a big jump.
The Fury X was supposed to take on the 980 Ti, so giving their new flagship "1080 performance" would mean a fucking 10% boost after 2 years of waiting.

Im just interested in the price.

350 dollars?
400?

Feels like AMD and Nvidia are both pretty even right now, except for AMD's lack of competition for the 1070, 1080, and Titan X. I guess it's possible that the Fury X could compete with the 1070, I doubt that though. It's available for less than $400 now.

I'm still considering the 480, and I don't know how anyone can defend AMD. They lied about the power consumption of their GPU and, more importantly, they lied about the price. There is still no $200 480 to be found. I'm glad their market share has improved, but they are still on my shit list for being liars. It's almost as bad as the 970 fiasco, although nothing can top that.

Fuck both of them, no sense in being a fanboy, just get what's the best deal for you.

Poolaris is a dissappointment because its true purpose was consoles. R9 290 performance on the PS4K and all.

>3.5gb is all you need
>8gb is overkill

LOL

Is that a boi or a girl?

Is Vega gonna be better than say a 1070? I'm wondering if I should wait for Vega.

Why don't you look into the crystal ball for AMD and see what the Vega specs are?

What's the point? You're waiting until 2017 for a cheap "1070/1080 competitor" when Nvidia will have something better by then.

AMD is the most retarded company in tech.

Because Fermi actually caused a housefire that killed someone...

AMD broke the PCIe spec, oh no.

Honestly, I am quite surprised people still use lossy formats like DDS for textures in current year. I mean, you wouldn't do such a thing for meshes, so why lose texture quality when video cards have more than enough memory even for uncompressed textures?
Personally, I can't fully enjoy the 4K 144Hz experience because of those imperfections in textures.

My 512mb AMD 4500 series was good enough to run Skyrim at ultra high settings, are cards these days just poorly made or something?

I'm not a smart guy. I thought there would be a big difference with AMD using HBM2

this screen shows up on 8gb cards

I clearly remember playing Skyrim on my ATI 4670 w/ 1GB of VRAM and barely being able to run it on medium at 25FPS

desu I had anti-aliasing turned off

I never used anti aliasing on that setup.

Weird man. I can't remember what FPS I was running at but I think it was a bit over 30. I would occasionally get lag when the laptop got over 70°C though

Any information about prices?
Can it be expected to be in the 1070 segment?

Thinking of getting a 1060. Is the 3gb fine or would I be better off saving for the 6gb version?

nobody buys nvidia cards for ethereum mining, their price/perf is fucking terrible

980ti

I'm getting the same FPS in muh Dota 2 with an RX 470 as popular streamers with a GTX 980/980 Ti. Pretty funny.

get the 6GB version
it's also about 10% faster and you'll likely need at least 4 GB of VRAM for upcoming games - since it seems the industry settles on 4 GB as the standard

>no one ever talks about nvidia cards on mining sites
>it must be good at it too
Stop making shit up.

That's it. I'm fucking done with being a waitfag.

If AMD hasn't released both Vega and Zen by 16/05/2017, I'm pulling the trigger on a new rig. Best of the best, no brand loyalty.

She was at the Polaris reveal event.

What would you do with your old rig?

no shit you tard, it's a CPU heavy game, I gain like 30fps going from minimum to max settings which means nothing when I'm getting ~200 anyways.

go play Witcher 3 or RotTR and see how well your 470 fares compared to those with a 980 Ti

Vega better be on par with 1080ti or else this is a flop.

This is the last time I've given them a chance.

>dota2
>CPU heavy game
Hahaha, you're funny, kid. If that was the case, why am I getting the same FPS with my old i5 as they do with their new i7s?
>go play Witcher 3 or RotTR and see how well your 470 fares compared to those with a 980 Ti
No shit, idiot. I was talking about Dota 2. Why so butthurt though? Make sure not to update your Novidia drivers, or my 470 will catch your 980 in no time ;)

You can never have too much VRAM. In the past GPUs couldn't do much with it, so it went to waste. Now, with GPU compute, there's so much you could do with the GPU, but they don't have much RAM, so no one is really exploiting it because they don't want to exclude 2GB cards.

16GB is about where things start to get really interesting; beyond that, on-GPU SSD (rather than 32GB or 64GB of VRAM) is probably good enough.

But nothing will happen until 8GB cards are as common as 2GB cards are now.

SHOW PROFITS PLS

Me too

Windows vista struggled with 2gb ram

You replied last! You win!

>Hahaha, youre funny, kid. If thats was the case why am i getting the same FPS with my old i5 compared to them with their new i7's?

because the game runs entirely on a single thread like every other source engine based game in the last 15 years

no. probably $600 for slightly slower than 1080 perf.

There will be a lower Vega 490/Rage competing with the 1070 and a Fury Vega competing with the 1080. As for prices, the Rage will be $380-400 and the Fury $550-600.

[citation needed]

a large die with the full 4 stacks of hbm2 would literally be sold at a loss at a $400 price point.

3.5 LUL Nvidia shill

don't even bother trying to respond to them, OP. they're the wizards of smarts who a month ago were proclaiming 3GB ought to be enough for everybody and you would never exceed 2GB at 1080p, let alone 4GB.

Dude, 4GB is perfectly fine for a mid/low-mid card. Fucking Dota 2 on ultra uses 1GB.

when 1GB was fine to run anything and 2GB was high end
You need 24GB now
What happened?

You don't NEED more than 4GB of VRAM. The thing is, 4GB will provide you with HIGH textures for a while, which is what matters most. Look at fucking Mirror's Edge: you can set HYPER settings, but it won't look any better than fucking Ultra and will consume much more VRAM at that. Fucking marketing bullshit.

>everything I want
How's dorf fort and LoL?

>rx480 fried a bunch of people's mobos
>a bunch
No

6/10, I pick up girls hotter than her all the time, and I'm not a Chad

Undoubtedly the 1080Ti will be performance king

The 470 beats the 970, take that nvidiot

What's wrong with that?

Nothing. The best value card right now. Thats is if you get non-shit model.

Thanks for the ridiculously quick response, that was the card I was considering getting when prices come down.

what price are you getting it for?

>The 470 beats the 970, take that nvidiot

lol no. the 470 is both more expensive than a second hand 970 would cost and is ~10% slower. and that's just against a stock 970.

>16GB
wow video cards are finally catching up to Android smartphones

>ADD MOAR MEMEORY

>tfw waited

FUCK OFF
NEW RUBY DEMO WHEN

>16GB

Unnecessary as fuck.

The only thing I need all that VRAM for is modded STALKER, Skyrim, OpenMW, and Fallout games.

>unless you're using an overclocked i7
considering most games don't even use hyperthreading, you're saying an overclocked i5 is enough, and it is. But you don't even need to overclock it to get better results with the 380
youtu.be/C70BubcRcTQ

>are cards these days just poorly made or something?
maybe things changed after 5 years? maybe we're in another console generation?

I run 4K with 980 SLI so 4 GB of vram. Recent games I just bump the textures down to "High" and it works just fine with zero stuttering. I can even bump it up a notch sometimes with very minor stuttering.

All devs have started doing is including mostly uncompressed or higher-resolution textures as an option for Ultra settings. Their High setting usually sits just below 4GB for me at 4K, so that's somewhere in the 3.5GB range for 1080p.
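As a rough back-of-the-envelope for why "mostly uncompressed" Ultra texture packs eat VRAM so quickly: an uncompressed RGBA8 texture costs width × height × 4 bytes, a full mip chain adds about a third on top, and BC/DXT-style block compression divides the base cost by roughly 4-8x. This sketch uses those approximations; exact ratios vary by format:

```python
def texture_mb(width, height, bytes_per_pixel=4, mipmaps=True, compression_ratio=1.0):
    """Estimate VRAM for one texture in MB. compression_ratio=1.0 means
    uncompressed RGBA8; ~4.0 approximates BC1/DXT1-style block compression."""
    size = width * height * bytes_per_pixel / compression_ratio
    if mipmaps:
        size *= 4 / 3              # full mip chain adds roughly one third
    return size / (1024 ** 2)

uncompressed = texture_mb(4096, 4096)                       # ~85 MB each
compressed = texture_mb(4096, 4096, compression_ratio=4)    # ~21 MB each
```

A few dozen uncompressed 4096x4096 textures resident at once is already gigabytes, which is why a compressed "High" tier can fit in 3.5-4GB while an uncompressed "Ultra"/"Hyper" tier cannot.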

Review sites are all incompetent. Most of them don't even benchmark properly.

They are only useful for rough metrics. None of them are scientifically rigorous enough to be worth considering accurate.

>But you don't even need to overclock it to get better results with the 380

960 performed better or equal in all those games except for the AMD-sponsored one (Crysis 3).

it's pretty abysmal that even the 380, which was a factory OCed 285, is struggling to match a stock 960, which is widely regarded as a shit card even by hardcore nvidia fanboys.

>Better in Crysis
>Worse in Project Cars (the most Nvidia favoured game ever)
>Equal in TW3 (Nvidia game)
>Equal in GTA V (Nvidia game)
>Better in TR

I'd say it's overall better, plus the bigger bus size and DX12 advantage. Your point was that you need a high-end CPU to get the AMD advantage, and you're clearly wrong

>I'd say it's overall better, plus the bigger bus size and DX12 advantage.

except it isn't. it's a more expensive, OCed card struggling to perform on par with a reference card.

>Your point is you need a high end CPU to get the ADM advantage and you're clearly wrong

the system used in the review you linked had a 4690k, that's a high end CPU and it's more than $100 more expensive than both the GPUs tested. most gaymers building systems with 960s and 285/380s are skimping on the CPU and getting shit like i3 6100 and i5 6400.

>except it isn't. it's a more expensive, OCed card struggling to perform on par with a reference card.
Who said it's reference? Also you could say it's an older card "struggling" to get the same performance or better than a newer one
>the system used in the review you linked had a 4690k, that's a high end CPU and it's more than $100 more expensive than both the GPUs tested. most gaymers building systems with 960s and 285/380s are skimping on the CPU and getting shit like i3 6100 and i5 6400.
It's operating at stock clocks, which is far from a "heavily OC'd i7" like you originally said. Now you're just backpedaling and making assumptions

So we all agree the cheapest 1070 is the best performance/price ratio with a modicum of future-proofing right now, no?

I agree.
Poorfag = 470 4GB.
Richfag = 1070.
The two best value cards right now.

I see 1070s at the 380 price point all the time and 480s at 220. Is it really worth the 160 bucks?
I'm thinking I should buy a 1060 or 480 and then upgrade in 1 generation or 2.

1070 is around 35-40% faster. Do the math.
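Since the thread keeps arguing price vs. performance, here's the math spelled out, using the thread's own rough figures ($220 for a 480, $380 for a 1070, 1070 ~35-40% faster; these are the posters' numbers, not benchmarks):

```python
def perf_per_dollar(relative_perf, price_usd):
    # Higher is better: how much relative performance each dollar buys.
    return relative_perf / price_usd

rx480 = perf_per_dollar(1.0, 220)      # baseline card at the thread's $220
gtx1070 = perf_per_dollar(1.375, 380)  # midpoint of "35-40% faster" at $380

print(f"RX 480:   {rx480:.4f} perf/$")
print(f"GTX 1070: {gtx1070:.4f} perf/$")
```

By these numbers the 480 wins on perf per dollar and the 1070 wins on absolute performance, which is the whole disagreement in one line.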

If you have a smaller budget, buy the 8GB RX 480 over any 1060. Every new driver is increasing 480 performance by a good amount. Recently the 480 passed the 1060 in Witcher 3, not to mention the DX12 games.

The recent driver hotfix gave a lot of AMD cards a 10% boost in DX11.

If you got cash burning your pocket go for the 1070.

where are you seeing 1070s for $380? everywhere I look they're $430 at best for a non-blower cooler, and all the FEs I've seen are still at $450.

>These factors are likely what sent AMD’s discrete GPU market share climbing from 26.9 percent to 34.2 percent in the second quarter of 2016 compared to last year. Nvidia dropped from 73.1 percent to 65.8 percent in the same quarter

well shit AMD isn't dead

Not that I disagree with you, but what century are you living in? Incandescent lightbulbs?

175W is 35 x 5W LEDs. Enough to turn a house into a lighthouse.

>they should drop the nano/fury to $300

The regular Fury is 300 yurocuckcurrency over there right now, it's happening.

>Implying

>nvidia users actually believe this

HBM2 is faster, in a sense.
Also yeah, more.

Faster RAM means faster CPU/FPU operations, since the processor has to wait on memory before it can actually do anything. In any case, decreasing the travel time between the processor and RAM is pretty much the best thing you can do to increase performance nowadays.

What century are you living in that LED lights are considered cheap, and not a long-term investment?

Fury has been between $275-350 (~$310 typical) for the past couple of weeks. Prices seem to have stabilized at $300 ±$10. Fury X has been on sale for as low as $350 and is normally $375-400 now.

>Nvidia will drop the 1080Ti and wipe the floor with it.

nvidia did the same with the 980 ti much to the dismay of people who bought a titan at launch 3 months before 980 ti dropped

it's a cat and mouse game and the only losers are the customers that get in too early

I'm willing to pay within $40 of reference MSRP for a non reference card. So $240 CAD. Right now it is $270.

literally no game uses more than 3.5gb of vram

You know user.

>implying anyone cares about the present when the future exists.

Stop playing console games on PC so often

err no, the 1070 is the poorfag's 1080.
If you got a 1070 it literally means you can't afford a 1080

Just bought a Sapphire Nitro+ 8GB for 216.
Feels fucking good man.

The 21st?

This picture triggers my autism because Mercury Research's results aren't what people think they are. Namely, Mercury Research normalises their data, which Jon Peddie doesn't. That's why Mercury Research's data hides lulls and spikes, which makes it rather poor data to use: it inherently ignores the sudden spike Polaris gave AMD in the few weeks it was on sale in Q1 (or Q2, I forget).

JPR don't normalise, so this spike is evident.

tl;dr that chart doesn't say what you think it does.

They make 8GB 290xs

Question:
Is it realistic to think the 1070 will last and be useful for 7 years?
In my view, and given the prices, it makes more sense to buy a 1060 now and upgrade in 3.5 years' time. You end up paying more or less the same, but you trade top-notch performance today (and barely keeping up later) for OK performance across the whole 7-year span, given that you can upgrade midway. Thoughts?
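The upgrade-cadence question above is just amortization. A quick sketch using this thread's rough prices (the mid-cycle replacement price is a guess that assumes mid-range prices stay flat):

```python
YEARS = 7

cost_1070_once = 450         # one 1070 kept the full 7 years
cost_1060_twice = 280 + 280  # 1060 now + a similar-priced card at ~3.5 years (guess)

print(f"1070 plan:   ${cost_1070_once} total, ${cost_1070_once / YEARS:.0f}/year")
print(f"1060x2 plan: ${cost_1060_twice} total, ${cost_1060_twice / YEARS:.0f}/year")
```

Totals land in the same ballpark, but the two-card route gets a newer architecture at the halfway point, which is the argument being made.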

The first run of 8gb 290x cards were rare as fuck because only sapphire produced them and only on the vapor-x. It wasn't until the last two or three months before the 390/x were released did the 290x 8gb really exist in numbers and from other manufacturers.

>Is it realistic to think the 1070 will last and be useful for 7 years?

How is the 8800 Ultra working out?

1060/1070 life depends on what you set your targets to. If you think in 2 years it'll still be running games on Ultra at a decent fps, you are very mistaken

That is the point.
I went 8800 GTS -> 560 Ti -> now considering 1060 or 1070.

Those cards lasted me 3.5 years on average, and every time I have to upgrade I ask myself whether it would not be more sensible to get a better model that will last longer. But given the prices, it would have to last twice as long, which is not very realistic. In the end it seems my strategy of upgrading every 3-4 years is better and cheaper than going for better models. I guess I am looking for reassurance from other people that I am doing the most cost-efficient thing.

1080 Ti with no Vega in sight
GDDR5 and not GDDR5X
lel
no reason to release this card yet as Nvidia already dominates the high-end market. maybe next year?

The difference this time is that the Titan XP is going to be on the shelves uncontested for quite a while now. On the other hand, it's already a cut-down chip, so they could re-release it as the 1080 Ti and make an uncut GP102 the Titan XP Black Edition so they have something to remain at ~$1000. Unless the 1080 Ti is a further-cut GP102, but they really don't need that unless AMD gets something to compete with or beat the vanilla 1080.

Witcher 3, I've noticed, doesn't use much VRAM at all, a credit to its developers. But I think larger frame buffers in cards will let developers get lazy: Deus Ex: Mankind Divided at times can use all 8GB of VRAM at 3440x1440, and desu it isn't really an open-world game so there is no excuse.

Yeah but the Witcher 3 can bottleneck an i5-4460 & a 1080.

I run a similar system to you, always waiting 3-5 years for an upgrade. At that point it always seemed to me that with a careful eye you can get a cut-down top-tier card for cheap, and those always seem to be a stone's throw from the top cards whilst being a fair chunk cheaper (had better luck with AMD ***Pro cards than *70 cards).

So it was an X800 Pro for ~£200 near launch, an X1950 Pro for similar, a 5850 at launch for about the same before they bumped the price and slotted in the 5830, and recently a 290 for £215 nearly two years ago.

But with Nvidia it was the x70-tier cards always being that much more expensive for about the same performance, ignoring the x80 and xxx XT cards as being way too far into diminishing returns for the cost.

Vega needs to compete with Volta to be really competitive.
If it's just a 1080 replacement, then AMD will have already lost.

Given we are seeing fiji compete against the bigger pascal chips (generally in DX12) I really don't think we need to worry about vega being anything less than monstrous.

Well, at the same 4096 core count as Fiji, adding on a ~30% core speed bump, my guess is it's hitting 1080-tier performance. This is assuming it uses an arch similar to Polaris though, as I don't believe that arch has a per-core improvement outside of margin of error. All the visible improvement is from the process and its resulting core speed increase in the majority of cases.
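The core count × core speed ballpark used above is just the peak FP32 throughput formula for GCN (each shader retires one FMA, i.e. 2 FLOPs, per cycle). A sketch; the Vega clock here is purely a guess from this thread, not a spec:

```python
def gcn_peak_tflops(shaders, clock_ghz):
    # shaders * 2 FLOPs (fused multiply-add) per cycle * cycles per second
    return shaders * 2 * clock_ghz / 1000.0

fury_x = gcn_peak_tflops(4096, 1.05)     # known spec: 4096 SPs at 1050 MHz
vega_guess = gcn_peak_tflops(4096, 1.3)  # assumed: same SP count, Polaris-like clock

print(f"Fury X: {fury_x:.1f} TFLOPS, Vega guess: {vega_guess:.1f} TFLOPS")
```

Peak TFLOPS ignores ROP and bandwidth bottlenecks, which is exactly the caveat raised in this thread about Fiji.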

I have a single Fury X too and works fine for all my 4k needs as is now.

Not sure anything coming out that I want to play will require a jump.

The 480 is actually a really powerful card when you consider it is basically half Hawaii on most fronts and is just as fast. Hell, if Fiji were simply given 96 ROPs (no small task) it would crush the 1080 today. Would be amusing to see AMD go mental and make a 128-ROP chip, as the wider you go the better GCN gets - it's thermals and voltage that hold it back (GCN is typically much denser than its equivalent Nvidia card).

I think Fiji was designed for 64 Polaris-style ROPs at 1.3GHz, which I assume is what Vega will deliver. I was just using core count × core speed as a ballpark figure. I just don't think Vega will compete with GP102 even with its bottlenecks lessened. There's always a full GP102 somewhere as well.

Ok, about to buy a Gainward Phoenix 1060 GS for €299. The cheapest 1060 I can get is €279 and it is the Gainward basic model. So €20 for a factory overclock and dual BIOS does not seem expensive. Should I do it?

6 gigabits isn't that much desu

I'm just fascinated by how easy it is to push marketing gimmicks down idiots' throats. It's been well over a year since I started seeing complete fucking idiots everywhere thinking that you measure a GPU's power by its VRAM amount, including here.

>b-b-but X game needs more vram!

No it does not. The only thing they are doing is caching resources they aren't actively using, and that's it. It won't introduce any stutter or frame-pacing issues whatsoever if you go with a lower amount. In fact, the only game I've seen that stutters on cards below 4GB is the new Tomb Raider.

But I guess sticking 8GB of VRAM on cards that aren't even close to being as powerful as they'd need to be to fully utilize it wasn't enough; gotta take the meme to a whole new level.

Do remember that to get more bandwidth you need more chips for GDDR5. Sure you can crank clockspeed but that has rather severe limitations unless you go full fat on the bus width (which, again, sucks down power).
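The bus width vs. clockspeed trade-off above comes straight from the bandwidth formula: bytes per second = (bus width in bits / 8) × per-pin data rate. A quick sketch with two known cards:

```python
def bandwidth_gb_s(bus_width_bits, data_rate_gbps):
    # Each pin transfers data_rate_gbps gigabits per second;
    # divide total bits by 8 to get gigabytes per second.
    return bus_width_bits / 8 * data_rate_gbps

print(bandwidth_gb_s(256, 8))   # RX 480: 256-bit GDDR5 at 8 Gbps -> 256.0 GB/s
print(bandwidth_gb_s(4096, 1))  # Fury X: 4096-bit HBM1 at 1 Gbps -> 512.0 GB/s
```

This is why HBM gets away with slow clocks: the bus is an order of magnitude wider, which is also where the power savings come from.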

Well fuck, at the last moment I got a Gainward Phoenix 1070 for €450. I am just way too monetarily conservative. Fuck it, I want to enjoy games for once in my life without having to worry about not maxing settings. YOLO.