>>Intermittent mouse cursor corruption may be experienced on some Radeon RX 400 Series graphics products.
Why is this still a problem with their drivers? I even got that sometimes on linux as well with a nine-year-old card. Can't they into hardware cursors?
>>56717667
I don't think AMD or ATi ever got past analog technology. You'll notice this issue only happens with digital connections. If you're VGA to VGA it doesn't happen.
it was an odd bug this time. it only appeared to happen after resuming from sleep and was fixed by moving the mouse to the corner of the screen and back again.
>Why is this still a problem with their drivers?
Well it's not if it's under fixed issues. I last experienced it under Linux back when I had a 4870.
Call me when they update drivers for my hd6970.
Get a new GPU already. Besides, AMD did everything they could for that gpu
Oh shit they fixed video playback in Firefox?
I'm definitely going to buy AMD I mean, you can watch video now!
The driver updates are real!
Dumb tripfag go back to /reddit/
>HDMI® Audio may be lost after resuming from standby on Windows 10 Anniversary Edition.
Let's hope it fixes that. It's annoying, as I have to put the computer to sleep to make it work again, or restart if that doesn't do it.
>all these "may happen"s
AMD quality right there
Are you retarded? Literally every bug list from every company uses the same "may" "possibly" "occasionally" language, because bugs often only affect certain configurations. Even in situations where a huge number of people are experiencing issues, companies play it down with language to make it sound like it was a smaller number.
I guess I shouldn't expect the average retard on Cred Forums 2016 to have ever read any patch notes before.
kys tripfag
No they didn't.
6970 was just as capable as a 7870, until AMD decided to gimp it and force users to upgrade.
The HD 6970 is a 6 year old, 40 nm GPU at this point, with the old TeraScale VLIW architecture instead of GCN (SIMT).
Even a RX 470 stomps it in every regard, and a RX 460 isn't even all that far behind it in raw specs.
Maybe it's time to give it up, friend.
i think even the 460 would be faster than it.
rx470 stomps a gtx 760, doesn't mean you can't still use it.
Funny how now even a 650 is faster than a 6970 thanks to driver gimping
Can I adjust gamma yet?
As the owner of a 3 year old Kepler, your whining about a 6 year old card getting second-class driver support is fucking hilarious.
GTX 680 fag here, I have no complaints, still getting monthly Game Ready drivers. Still playing most games on high or ultra.
Kepler cards still get enough support to run new games, but even the 670/680 ($400/$500 cards at launch) get completely shit on by a 960, which is a completely inferior piece of silicon except for a much beefier tessellation unit.
If Nvidia fucked Fermi/Kepler users, it was far more due to pushing HairWorks type nonsense the moment Maxwell came out with their TWIMTBP.
>amd
Abandonware
My mate has a 960, it traded blows with my 680, which is normal because it's 2 generations newer.
x960 tier performing on par with previous gen x70/x80 is perfectly normal.
well the 960 was essentially a re-branded 760 in terms of performance, just with a lower tdp.
>anandtech.com
a 760 was a re-branded 670 in virtually all regards.
>anandtech.com
the 960 offered stronger tessellation performance vs the 760 so in cases with heavy, nonsensical tessellation usage the 960 could easily come out ahead of a 680.
overall though the moment a new series comes out from nvidia, nvidia does make the new series top priority. so all performance, bug fixes, and overall focus is first on the new series. the older series are secondary. only critical bugs on older generations will get equal focus as the new generation.
so nvidia does optimize older generations that are still covered under the current, main driver branch, they just come in slower compared to the new shiny generation.
this shouldn't come as a surprise though. the new shiny generation makes them new sales, not deprecated products that have been superseded and are no longer in production.
to be honest i'm somewhat surprised nvidia still offers support for fermi at all. kepler through pascal share a similar architecture, while fermi was a completely different one, closer to gcn. all that old code base is mostly irrelevant for kepler and above. and seeing how nvidia didn't bother keeping their word on releasing dx12 support for fermi, fermi will most likely get dropped down to legacy driver support soon.
>well the 960 was essentially a re-branded 760
You are literally fucking retarded if you believe this.
well, the links i posted show it
and again, the only thing the 960 offered over the 760 was the lower tdp and increased tessellation performance.
but hey you will probably claim
>anandtech - into the trash it goes
Retard, only Nvidia gimps their cards
> GK104 - 3.5B transistors, 294 mm^2, 32 GP/s, 129 GT/s, 192 GB/s, 3 TFLOPS
> GM206 - 2.9B transistors, 227 mm^2, 36 GP/s, 72 GT/s, 112 GB/s, 2.3 TFLOPS
The 960 was a distinctly weaker (and more importantly cheaper) chip, and the memory compression in Maxwell didn't come close to making up for the raw bandwidth cut.
Maxwell was a big scam where Nvidia sold cheaper chips with one beefed up feature (tessellation) and then encouraged the majority of AAA games to rely heavily on that feature and bottleneck every older card or card from their competition.
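The quoted TFLOPS figures can be sanity-checked with the standard peak-throughput formula: 2 FLOPs per core per cycle (fused multiply-add) times shader cores times clock. The core counts and reference clocks below are my own assumptions for illustration, not from the post:

```python
# Peak FP32 throughput: 2 FLOPs (FMA) x shader cores x core clock.
# Core counts / reference boost clocks are assumed, not from the thread.
def peak_tflops(cores, clock_mhz):
    return 2 * cores * clock_mhz * 1e6 / 1e12

gtx680 = peak_tflops(1536, 1006)  # GK104: 1536 CUDA cores @ ~1006 MHz
gtx960 = peak_tflops(1024, 1127)  # GM206: 1024 CUDA cores @ ~1127 MHz

print(f"GTX 680: {gtx680:.2f} TFLOPS")  # ~3.09, close to the ~3 quoted
print(f"GTX 960: {gtx960:.2f} TFLOPS")  # ~2.31, close to the ~2.3 quoted
print(f"960 as a fraction of 680: {gtx960 / gtx680:.0%}")
```

On paper the 960 lands around three quarters of the 680's raw compute, which is why per-clock efficiency gains and game-side tessellation load mattered so much in the real-world comparisons.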
>architecture has no defining effect on performance!
>raw specs are all that matter!
it's very likely that kepler was underutilized in many workloads, or that the gated-off fp64 units were counted in those transistor/core specs.
>Maxwell was a big scam where Nvidia sold cheaper chips with one beefed up feature (tessellation)
wtf are you talking about? maxwell was not some shitty rehash of kepler. it was a big jump in per-core throughput and rasterization performance.
nvidia even implemented that pseudo tile-based rendering technique that's documented here: realworldtech.com
Do they gimp them, or just not optimise for them anymore? I have a 1070 that I got kind of cheap and worry about how it'll hold up with DX 12.
They just don't really optimize them. This can still cause decreases in performance though which is why people believe they're actively being gimped.
>pre-GCN
There's your answer as to why 6000 series isn't supported anymore.
Dropped support =/= gimping btw.
Adding onto that, the performance decreases you might get will be fairly marginal (about 2-3 FPS usually) and vary heavily per game. ArmA III runs worse on kepler w/ newer drivers for example, or so I recall.
>>>v
stfu kid -_-
I got my 1070 for a miracle price of £330, which over here is dirt cheap for new, but I still worry about DX 12; then again, if I'd gotten an rx 480 I'd probably worry about comparatively shit performance in DX 11.
IIRC AMD just pushed out a driver recently that massively reduced DX11 driver overhead. I saw a benchmark that showed the RX480 overtaking the 980 in TW3.
But as for your 1070, that's an alright value. What model? I'd worry more about the GDDR5 memory as that might fall behind drastically w/ HBM and GDDR5X on the rise. You can see massive benefits w/ GDDR5X on higher resolutions with the 1080.
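The GDDR5 vs GDDR5X gap is just bus width times per-pin data rate. A minimal sketch, assuming the reference 256-bit buses and effective rates for these cards (my assumptions, not from the post):

```python
# Peak memory bandwidth = (bus width in bits / 8) x effective data rate
# per pin. Bus widths and data rates below are assumed reference values.
def bandwidth_gbs(bus_bits, gbps_per_pin):
    return bus_bits / 8 * gbps_per_pin

gtx1070 = bandwidth_gbs(256, 8)   # GDDR5  @ 8 Gbps  -> 256 GB/s
gtx1080 = bandwidth_gbs(256, 10)  # GDDR5X @ 10 Gbps -> 320 GB/s

print(f"GTX 1070 (GDDR5):  {gtx1070:.0f} GB/s")
print(f"GTX 1080 (GDDR5X): {gtx1080:.0f} GB/s")
```

Same bus width, so the 1080's GDDR5X advantage comes entirely from the higher per-pin rate, which is where the benefit at higher resolutions shows up.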
I got the Zotac AMP, I'm more of a refresh rate frequency person and have a 1080p 144hz monitor coming.
I have an offer to sell this tomorrow for £370 since prices are all up and down. May go for a cheap 470 since all I'm playing is DE:MD and god eater.