Wrong, I have an AMD R9 290, and although it is cheap ($300) it can run BF4 at maxed-out settings, etc. It is rated to run at 95°C for a reason.
If you want a card that is cheap and powerful, and you also have a big PSU and a case with good airflow, AMD is the way to go. (Lucky me, my PC has all of that.)
Turned out I needed signed Intel drivers from the laptop manufacturer, then AMD drivers also signed by the laptop manufacturer. The problem was that the first few times I installed them, the AMD drivers couldn't detect my GPU; the Intel drivers had no problem though. I don't even remember what exactly made them work. I restarted like crazy, turned both GPUs off then on again multiple times, even reinstalled the OS. Now it works, and the card is pretty powerful for the money I spent.
I had the same problem with my Packard Bell EasyNote LS11HR with an AMD Radeon HD 6650.
Holy **** it was a pain in the ass, so I just gave up on it. I considered using it as it was, since some games would detect it. I just got an ASUS ROG with a GTX 860M and so far no problems at all, and I don't think I will ever go back to AMD.
I think the issue is that laptops always try to use the iGPU by default when you're not gaming, so the AMD software ends up detecting the Intel chip instead of the M265. I had the same issues with my dad's Nvidia GTX 860M.
Only if they're configured to. You can set the laptop to use the dedicated GPU even on the desktop, or at least I can, but it didn't help at all.
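If anyone wants to check what the OS is actually exposing, here's a minimal sketch (assuming Windows, with the old wmic tool still available) that just lists every display adapter and its driver version, so you can see whether the dedicated GPU shows up at all:

```python
# Minimal sketch, assuming Windows and that wmic is available:
# list every display adapter the OS exposes, plus its driver version,
# to check whether the dedicated GPU is visible or only the Intel iGPU.
import subprocess

result = subprocess.run(
    ["wmic", "path", "win32_VideoController", "get", "Name,DriverVersion"],
    capture_output=True, text=True, check=True,
)
print(result.stdout)
```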
Always contact the chip manufacturer; they're also the ones that make the drivers and all the other software.
Acer just buys the hardware, sticks it in their configurations, and installs the bloatware and basic-af drivers.
Yeah, but AMD doesn't really give a **** about notebook GPUs; it even says on their page that unless stated otherwise, only the notebook manufacturer's drivers will work.
U w0t m8? A few years ago, maybe? Have you seen the TDP difference between the Maxwell cards and similarly performing R9 2xx cards?? It's literally double the amount of heat for the same performance.
For the sake of argument, a 290X and a 780 Ti, since they're from the same 'generation' relative to each other, drew the same amount of power and produced the same amount of heat.
The 980 is a different story, because it's a cut-down Titan X, but the Titan X draws as much power and produces as much heat as a 290X.
The joke is old. A couple of years ago, when Nvidia had their "Fermi" architecture, they were the ones with literally burning cards. AMD had a lot of driver and support issues back then.
I roll with a 550 Ti and can't afford to build a new PC (I have had this PC for at least 4 years), and it hasn't failed me once. It has also survived some of the worst heat waves to hit the state of Virginia when we had no working AC. (The heat waves went to like 110°F, plus humidity and the added heat from an insulated house that is supposed to stay warm in the cold.)
>implying Nvidia cards are still hot
Maybe like 2 or 3 years ago, but the 9xx series has good cooling; my 970 runs a lot cooler than the 7970 I had before it.
I think it's more because of the reputation they have gotten; their cards used to get so hot.
For example, the 400 series was known for massively hot cards; I think one even caused a house fire? Nvidia made the switch and focused more R&D on efficiency, which is why we see the super cool but fast cards Nvidia has these days, while AMD needs a water cooler as the stock cooling on one of its top-end cards.
tfw I have a GTX 480 with a semi-broken cooler, so I have to underclock it for it to work
That is not why the 295X2 has water cooling. It is because otherwise they would have had to clock it down, exactly like the Titan Z. I own a 295X2 and I can confirm that it runs really cool, and I like that AMD went for a different solution. I can overclock it heavily and it still runs nicely.
The Titan Z, on the other hand, is like a 480 with a lower thermal cap. That is the reason the 295X2 overshadows the ****** Titan Z.
No, it's not that; it's something about the 980 running hot because of bad factory ******** if I remember correctly, it's a new thing. My card runs cold as **** compared to older gens though, my 760 vs a 560 for example.
Also a lot of AMD dick riding; it's almost as if people finally noticed AMD actually makes good cards, like Nvidia does.
Yeah, it's two 120s or 100s, I couldn't tell you for sure without taking it out. They're completely silent, actually, even under full load; my chassis fans are louder.
It's one of the best stock cooling solutions I've encountered, to be fair, outside of the AMD centrifuge system, but that was loud as ****.
>tfw all these cool tower dudes and their GPU cards
>tfw you guys will never know the pain of a laptop Nvidia GPU that can never be removed and runs at 126°F
As someone with an Nvidia GTX 980 and a Titan X, I had to go caseless and get PCIe extension cables just so the damn things didn't melt themselves or the board.
I am extremely happy with Nvidia. They just updated all the drivers and somehow made it possible to run 4K at 60Hz with the older cards that still have HDMI 1.4 (though the chroma is not 4:4:4).
No noise that I can tell, but it gets extremely hot when I play stuff at 4K resolution xD
My old AMD card bluescreened a lot (heat damage, I guess; I live in a tropical country) and was really noisy.
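For anyone wondering why that HDMI 1.4 trick can't do full 4:4:4 at 4K60, here's a rough back-of-the-envelope sketch (active pixels only, blanking ignored, 8-bit color assumed):

```python
# Rough back-of-the-envelope numbers (active pixels only, blanking ignored)
# showing why 4K60 over HDMI 1.4 needs 4:2:0 chroma subsampling.
width, height, fps = 3840, 2160, 60
hdmi14_data_rate = 8.16e9      # usable bits/s after 8b/10b encoding overhead

for name, bits_per_pixel in (("4:4:4", 24), ("4:2:0", 12)):
    rate = width * height * fps * bits_per_pixel
    verdict = "fits" if rate <= hdmi14_data_rate else "does not fit"
    print(f"{name}: {rate / 1e9:.1f} Gbit/s -> {verdict} in HDMI 1.4's ~8.2 Gbit/s")
```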
OP of the thread here. When I was a poorfag and couldn't afford modern GPUs, my old Nvidia GPUs used to burn, like... 1-3 GPUs, probably 1-2, can't remember. And now I have a neat CPU but a ****** AMD GPU because the PC is prebuilt, and god damn, I keep getting BSoDs at least 1-5 times a month.
For the record, Intel has the fewest graphics driver problems
and never glitches out. Keep working hard, Intel.
You will grow up to be as big and strong as your brothers someday.
One of Nvidia's higher-end cards, the GTX 970, has 4GB worth of memory on board. However, due to the way they're made, 500 megs of that memory is incredibly slow compared to the rest of it, so much so that it's effectively unusable. Nvidia never made this fact public, and when it was discovered many months later, several owners felt that they had been cheated and sold a defective or incomplete product.
To be fair, this problem is only really apparent if you are running two 970s in SLI; that's actually one of the reasons they were able to find it. It still works great as a single GPU.
Yes and no. It is a problem mainly if you use a lot of VRAM. The reason it was trouble for SLI users is that they hoped to use a mid-range card to get high-end performance, thinking that the GTX 970 had 4GB of VRAM. When you get the power of two 970s, you need as much VRAM as possible, and the 970 did not deliver.
I currently have 4GB (x2) of VRAM on my GPUs and it is the bare minimum; 3.5GB is just not enough.
Some games attempt to use that ****** 512MB of VRAM, and the result is heavy stutters and a ******** of problems.
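If you want to see when that happens, here's a minimal sketch (assuming an Nvidia card with nvidia-smi on your PATH; the 3584 MiB threshold is just the 970's advertised fast segment) that polls VRAM usage and flags when a game spills past the fast 3.5GB pool:

```python
# Minimal sketch, assuming an Nvidia card and nvidia-smi on PATH:
# poll VRAM usage and flag when it spills past the GTX 970's fast 3.5GB segment.
import subprocess
import time

FAST_SEGMENT_MIB = 3584  # 3.5GB; anything above this lands in the slow 512MB pool

while True:
    result = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used,memory.total",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    )
    first_gpu = result.stdout.strip().splitlines()[0]  # only looks at GPU 0
    used, total = (int(v) for v in first_gpu.split(", "))
    note = "  <-- past the fast 3.5GB pool" if used > FAST_SEGMENT_MIB else ""
    print(f"{used} / {total} MiB{note}")
    time.sleep(2)
```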
You shouldn't really have an issue unless you are trying to go to 4K resolutions. The card is still the best damn thing on the market for its price range but it is downright insulting that these design choices were not made public until after they were discovered by users.
The 970 will be a great card for the next 2-3 years, but it will lose its edge much more quickly than what Nvidia was hyping it as.
Either way, there is nothing AMD has that even competes at anywhere close to the price, but they may have tricks up their sleeve... who knows.
For further insight, here's my copypasta for whenever someone says the GTX 970 doesn't have 4GB:
Except the 970 does have 4 GB of VRAM. It is separated into two separate buses, or pools, that cannot be accessed at the same time. Specifically, one bus has 3.5 GB and the other has 0.5 GB, thus a total of 4 GB as was stated. The thing is that, because they are unevenly distributed and not connected, you cannot access both buses simultaneously and you don't get the ~224 GB/s you'd see from a normal card. The GPU has to actively switch between the two buses, which is done at an extremely fast rate so it is virtually instant. However, a consequence is that the 0.5 GB bus has only 1/8 of the bandwidth, a still-fast ~28 GB/s, but much slower than the ~200 GB/s on the main bus.
Frankly it doesn't matter as much as people seem to think. Ultimately, when you decide what card you're going to buy, you look at the benchmarks, and the benchmarks haven't magically changed between before and after this design nuance was discovered. So the 970 still remains a perfectly solid purchase for the money.
Truly the issue with this isn't that the design was like that, but rather that Nvidia never once bothered to mention it, which is deceptive in that any hardware-savvy individual would assume one bus, because that's how cards are typically designed. Not to mention that they mistakenly or otherwise incorrectly described the hardware specs for the 970: it was stated as having the same number of ROPs and the same-sized L2 cache as the 980, which it doesn't.
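For what it's worth, here's a quick sketch of where those bandwidth figures come from, assuming the 970's published memory config (a 256-bit bus with 7 Gbps GDDR5, and the slow 0.5 GB segment hanging off a single 32-bit memory controller):

```python
# Back-of-the-envelope check of the bandwidth figures quoted above.
# Assumes a 256-bit bus, 7 Gbps effective GDDR5, and the 0.5 GB segment
# sitting on a single 32-bit memory controller.
bus_width_bits = 256
gddr5_gbps_per_pin = 7

full_bw = bus_width_bits / 8 * gddr5_gbps_per_pin   # GB/s if it were one pool
slow_bw = 32 / 8 * gddr5_gbps_per_pin               # the 0.5 GB segment's 32-bit path
fast_bw = full_bw - slow_bw                         # what the 3.5 GB segment gets

print(f"whole card:     {full_bw:.0f} GB/s")  # ~224 GB/s
print(f"3.5 GB segment: {fast_bw:.0f} GB/s")  # ~196 GB/s (the '~200' above)
print(f"0.5 GB segment: {slow_bw:.0f} GB/s")  # ~28 GB/s, 1/8 of the total
```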
What is this trying to imply, that Intel is completely reliable, a blast to be around, and perhaps the most perfect thing humanity has ever made?
If you want a company that is anti net neutrality, pro "diversity" in tech, and pro censorship, be my guest; I'll personally stick with AMD, where their employees' ideologies aren't shoved down my throat.
AMD FX-9590 @ 5.09 GHz, watercooled
Nvidia GTX 970 (it was cheap, so I figured what the heck)
ASUS Sabertooth 990FX R2.0 motherboard
120GB SSD for OS and programs and 1TB HDD for storage
Chassis: Corsair Obsidian 650
Windows 10 Technical Preview
Wired Ethernet
16GB DDR3 RAM
21-inch 1080p monitor
It's been a couple of years and I haven't really changed anything but the peripherals on my rig.
I'm wondering if I should actually bother getting another 7870 and running two in CrossFire, or if I should just upgrade to a newer card. What do you think, FJ?
Not... really... but I do a lot of video editing, and a couple of times I've had to work with 4K, which is a really big pain in the ass to render.
I've been having a bunch of bluescreens ever since I built my comp, but I'm 94% sure that it's my RAM and not actually the GPU.
Pic unrelated, I just thought it was a cool setup.
The writer sucks at drawing, so he makes crude sketches, which the art guy uses to draw the rest. Occasionally they leave Saitama in the writer's style and do everything else in the other guy's style.
To elaborate a little bit more: the original author AND artist, ONE, has drawn One-Punch Man for a while. Murata, through some circumstance I don't know about, ended up drawing the "official" manga later, in his own super-detailed style. ONE still draws the original webcomic and is several chapters ahead of Murata as such.
As a matter of fact, several people enjoy ONE's depictions more than Murata's.
And yeah, Saitama is usually drawn crudely simply because he (Saitama, that is, not Murata) doesn't give a ****, which is why when he gets "serious", the art gets serious.
It's usually played the other way around but yeah, when Saitama gets serious the art does too. He even has moves for the situation. Usually, because he is strong enough to beat someone in a single punch, he has moves like "Normal Chain Punch" or something, but in Murata's latest chapter Saitama used "Serious Consecutive Side Hops" to essentially create an army of residual images of himself.
He just rarely gets serious is all. That's kind of the joke.
I go green, but meh, it's just what I'm used to; I might switch to AMD if they get PhysX, tessellation, and a better control panel, like what Nvidia already has.
No hate, I just prefer the green cards rather than the red ones, due to software.
AMD was years ahead of Nvidia with tessellation.
The control panel is good.
PhysX is so poorly supported it is practically nonexistent, and you can still run it on the CPU. (Some have hacked it onto the GPU as well.)
I would much rather have the pros of AMD, like the ACE units and actual double-precision performance. Also, let's be honest, Nvidia's high-end pricing is just ****. I literally get 2x the power if I go with AMD compared to Nvidia when spending big money ($1000 to $1500 USD).
Not saying that Nvidia is bad, just overrated. I have used both extensively, and so far my best experiences have been with AMD.
AMD already has tessellation, they have for a long time, and they have their own special effects like TressFX and whatever else.
To be honest, my Nvidia GPU has given me way more problems than my AMD GPU, and I have two PCs, one with each. RadeonPro can give you some more options as far as the control panel goes.
It all comes down to a matter of preference, of course, but I prefer the red team.