Duke4.net Forums: Radeon vs NVidia. - Duke4.net Forums


Radeon vs NVidia.  "Favorite GPU brand?"

Poll: Radeon vs NVidia. (37 member(s) have cast votes)

Which is your favorite video card brand?

  1. Radeon (8 votes [21.62%])

    Percentage of vote: 21.62%

  2. NVidia. (29 votes [78.38%])

    Percentage of vote: 78.38%

Guests cannot vote
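The percentages above are just each option's votes over the 37 total; a quick sketch to check the arithmetic (tallies taken from the poll):

```python
# Poll tallies from the thread: 37 voters total
votes = {"Radeon": 8, "NVidia": 29}
total = sum(votes.values())

for brand, n in votes.items():
    print(f"{brand}: {n} votes ({n / total * 100:.2f}%)")
# Radeon: 8 votes (21.62%)
# NVidia: 29 votes (78.38%)
```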

#1

NVidia has my vote, b/c their drivers are so much better than the shittier Radeon drivers. Radeon has buggy drivers that sometimes don't detect its own video cards properly. How about you?

This post has been edited by DustFalcon85: 10 May 2013 - 07:38 AM

Person of Color

  • Senior Unpaid Intern at Viceland

#2

I don't have a favorite brand. At the moment, I'm siding with AMD.

I hate the 600 series cards. Many are overpriced, the GPGPU performance flat out sucks, and the memory buses are too narrow. The high end ones either have too much or not enough VRAM.

Also, AMD has gotten their driver shit together. I've had my card for over a year and a half now with no issues except for Polymost stutters, but on average nVidia has given me more trouble with older software.
The Commander

  • I used to be a Brown Fuzzy Fruit, but I've changed bro...

#3

I cannot vote...

Price-wise, for performance and power usage, I would go with ATI.
Or if I were rich, then sure, I would pick Nvidia.

http://www.3dmark.co...4/3dm11/5223792 - HD5750 in current PC vs my old set up.

http://www.3dmark.co...3/3dm11/6018138 - HD5750 vs GTX 570 in my current set up.

EDIT: I am not sure why, but the core clock of my i7 3770 reads as 1.6 GHz in a couple of the tests when it should be 3.4 GHz.

This post has been edited by NZRage: 11 May 2013 - 12:38 PM


#4

Nvidia is better at delivering anti-aliasing support/performance, as well as very well done Vsync.

I would go for Radeon HD cards because I'm poor :lol:

EDIT: Also, Radeon HD cards and Nvidia cards are pretty even in performance since they leapfrog each other, so there is no right or wrong decision there.

This post has been edited by sheridanm962: 11 June 2013 - 04:06 PM

Person of Color

  • Senior Unpaid Intern at Viceland

#5

View Postsheridanm962, on 11 June 2013 - 04:05 PM, said:

Nvidia is better at delivering Anti Aliasing support/performance as well as very well done Vsync.


Nvidia only has a performance advantage when their cards have better memory bandwidth, and the same goes for ATI. Nvidia's current cards offer less bandwidth than ATI's most of the time.
Jeff

#6

I used to buy ATI Radeon cards (back around 2003 or so), but I had an issue with one, and have been buying Nvidia ever since. No issues here.
Tea Monster

  • Polymancer

#7

I've stated why I like Nvidia previously. It's not that I particularly like Nvidia (I used to be a Matrox guy), but they do the job and don't get in the way of what I'm doing. Every time I've owned an ATI card, I've played 'hunt the driver' and spent more time trying to get stuff working than actually using the damn thing.
Paul B

#8

ATi Catalyst drivers are always a big source of frustration, and the hardware they use seems way more prone to failure. You get what you pay for, though, and in my opinion there is no real comparison between the two products. I've been selling hardware for 13 years, and we only special order ATi for the people who come in and demand it. That way they can't come back to us and ask what kind of crap we sold them.

This post has been edited by Paul B: 12 June 2013 - 08:00 PM

Person of Color

  • Senior Unpaid Intern at Viceland

#9

View PostPaul B, on 12 June 2013 - 07:46 PM, said:

ATi Catalyst drivers are always a big source of frustration, and the hardware they use seems way more prone to failure. You get what you pay for, though, and in my opinion there is no real comparison between the two products. I've been selling hardware for 13 years, and we only special order ATi for the people who come in and demand it. That way they can't come back to us and ask what kind of crap we sold them.


That's not ATI's problem.

1. Company X buys a chip from the supplier.

2. Company X buys the surrounding components from multiple suppliers.

3. Company X assembles, packages, and sells the card.

4. Company X's reliability depends on steps 2 and 3, it's not ATI's fault.

I'm not really a fan of either brand, but honestly, if anyone has worse chips, historically speaking, it's Nvidia.

Some Nvidia chips have heat problems (FX 5800 Ultra, 8800 Ultra, GTX 295, GTX 480). Actually, the 480 runs so hot that, combined with the lousy reference PCB design, it's famous for VRM failure. Others run very hot (GTX 280, 465, 470, 8800GT), either due to the design of the chip or the reference PCB/cooler. Other chips like the GTX 280/285/295 have 512-bit GDDR3 memory buses, which is fucking retarded, hard to implement, and always results in an overly complicated PCB with many points of failure. Why they didn't use GDDR5 is beyond me; Radeons were already using it.
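To see why the wide-bus-plus-slow-memory approach was questionable, peak memory bandwidth is just bus width (in bytes) times the effective data rate. A rough sketch; the clock figures below are approximate public specs for a GTX 285 and an HD 4870, assumed here rather than taken from the post:

```python
def peak_bandwidth_gbs(bus_bits: int, data_rate_mtps: float) -> float:
    """Peak memory bandwidth in GB/s: bytes per transfer times transfers per second."""
    return bus_bits / 8 * data_rate_mtps / 1000

# GTX 285: 512-bit bus, GDDR3 at ~2484 MT/s effective (complex 16-chip PCB)
print(peak_bandwidth_gbs(512, 2484))  # ~159 GB/s
# HD 4870: 256-bit bus, GDDR5 at ~3600 MT/s effective (much simpler board)
print(peak_bandwidth_gbs(256, 3600))  # ~115 GB/s
```

The point being made: GDDR5's higher data rate buys back most of the bandwidth of a double-width bus without the routing complexity and extra points of failure.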

And don't even get me started on the G84/G86 notebook parts. My buddy's T61 just failed, and now I gotta replace the motherboard. Way to go, Nvidia: release a flawed design, then lie about it. They pissed off every OEM on the planet for a few years.

I can't think of any chips from ATI that have that issue. There have been some really shitty chips, like the 2900XT and the Radeon 7000, but they didn't kill themselves.

Nvidia also pissed off Microsoft and Sony; neither of them has nice things to say about their partnerships in the console industry.

I love Nvidia products, every one I've ever owned has been top notch, but they are one hell of a shitty company. I don't understand how Nvidia fanboys act like ATI is complete shit because VERY few companies in the PC industry have fucked people as hard as Nvidia and lived to tell the tale.

Also, how can you claim ATI's are less reliable when you barely see any and have customers special order cards? Customers are fucking dumb and naturally they are going to gravitate towards cheap, shitty cards like Powercolor while real enthusiasts will stick with better brands like MSI or XFX. The price difference between a high quality ATI or Nvidia card is pretty nonexistent these days.

This post has been edited by 486DX2: 12 June 2013 - 09:14 PM

Tea Monster

  • Polymancer

#10

When I was a lot younger, I may have been more interested in how many transistors it had or the nitty-gritty of how its memory pathways were laid down. Now I'm much more aware of how much time I spend trying to sort it out, and that is time taken away from doing what I want. At the end of the day, I decided that doing what I wanted was more important than what was, essentially, babysitting a badly thought-out machine.

It's like in the old days, it was great to look cool because you could sort out IRQ conflicts with sound cards and modems, but looking back, I'm really glad I don't have to arse around with that crap any more. More time with my lady, talking to the kids and playing Duke = priceless :lol:

This post has been edited by Tea Monster: 12 June 2013 - 11:22 PM

Forge

  • Speaker of the Outhouse

#11

i've owned both brands over the years. both performed satisfactorily.
the better bang for your buck battle and the driver dance have fluctuated back and forth over the years and will continue to do so
bottom line: buy cheap from either company - get cheap performance
Tea Monster

  • Polymancer

#12

Yeah, I can never justify dropping more than $150 on a graphics card. The only reason I've done that recently is that I've been starting to get a bit of money back here and there from the 3D stuff, so I decided to treat myself to something that would handle CUDA with some authority, as it would pay for itself.
Person of Color

  • Senior Unpaid Intern at Viceland

#13

I spent $230 on my current card in November 2011.

Sapphire Radeon 6950 Flex Edition 2GB.

Unlocked the unused shaders and overclocked it to 6970 speeds.

Still hauls ass; it's basically the poor man's 570, maybe 5% slower and $150 cheaper. Plus it has ~800MB more RAM than most 570s.
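As a rough sanity check on what the shader unlock is worth, peak shader throughput scales with shader count times clock. The 1408/1536 shader counts and 800/880 MHz clocks below are the public 6950/6970 specs, assumed here rather than quoted from the post:

```python
def peak_sp_gflops(shaders: int, clock_ghz: float) -> float:
    # Peak single precision: 2 FLOPs (multiply-add) per shader per cycle
    return shaders * 2 * clock_ghz

stock = peak_sp_gflops(1408, 0.80)     # HD 6950 stock: ~2252.8 GFLOPS
unlocked = peak_sp_gflops(1536, 0.88)  # unlocked, at 6970 clocks: ~2703.4 GFLOPS
print(f"{(unlocked / stock - 1) * 100:.0f}% more peak throughput")  # 20% more peak throughput
```

Paper GFLOPS aren't game frames, but a free ~20% from flipping a BIOS switch is hard to argue with.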

IMO it's much cheaper to spend the extra $100 and get an extra year or two out of the card.

That said, I'd never buy another Sapphire card. Their hardware is top notch, but their warranty sucks (2 years), and I have to use their god-awful TRiXX overclocking software for voltage adjustment. Right now, I have a PowerShell script, a batch file, and a resolution-changing program load at startup so my display doesn't get distorted after the clock change. Fortunately, someone posted how to do this online, so it was a two-minute fix.

This post has been edited by 486DX2: 13 June 2013 - 06:50 AM

Paul B

#14

View Post486DX2, on 12 June 2013 - 08:49 PM, said:

That's not ATI's problem.

1. Company X buys a chip from the supplier.

2. Company X buys the surrounding components from multiple suppliers.

3. Company X assembles, packages, and sells the card.

4. Company X's reliability depends on steps 2 and 3, it's not ATI's fault.

I'm not really a fan of either brand, but honestly, if anyone has worse chips, historically speaking, it's Nvidia.


I don't necessarily agree with this. ATI used to do a lot of stuff in house before 2001, and they were still crap back then; not much has changed since. If the OEMs decide to cut corners, they unfortunately bring down the entire product image. I don't understand why they cheap out on individual small ICs anyway. They might save 2 cents on any one component, but is that really worth developing a bad rep? As if they don't already have good enough margins for the volumes they sell.

View Post486DX2, on 12 June 2013 - 08:49 PM, said:

Also, how can you claim ATI's are less reliable when you barely see any and have customers special order cards? Customers are fucking dumb and naturally they are going to gravitate towards cheap, shitty cards like Powercolor while real enthusiasts will stick with better brands like MSI or XFX. The price difference between a high quality ATI or Nvidia card is pretty nonexistent these days.

Because we service all types of computers, even ones we don't sell. In fact, the ones we service the most are the ones other people build using ATI & AMD processors. Maybe it's different where you are from, but where I am, 90 percent of our service calls are for those cheap enthusiast PCs which overheat, lock up, and just have bad ATi hardware/drivers. So in theory, I should be promoting that crap to keep business booming, just not from our store. =P

As for Nvidia's chipsets on motherboards.. That's another story. That's just plain crap.

This post has been edited by Paul B: 13 June 2013 - 01:44 PM

Tea Monster

  • Polymancer

#15

View Post486DX2, on 12 June 2013 - 08:49 PM, said:

That's not ATI's problem.
(snip)

Also, how can you claim ATI's are less reliable when you barely see any and have customers special order cards? (snip).

But if you recall my story about the Pro cards they don't work either and ATI's service desk can't even tell you what driver version they are supposed to be using.

Paul B - I've had 3 Nvidia chipset motherboards, and none of them lasted more than a year before mysteriously dying.
Person of Color

  • Senior Unpaid Intern at Viceland

#16

View PostPaul B, on 13 June 2013 - 12:28 PM, said:

I don't necessarily agree with this. ATI used to do a lot of stuff in house before 2001, and they were still crap back then; not much has changed since. If the OEMs decide to cut corners, they unfortunately bring down the entire product image. I don't understand why they cheap out on individual small ICs anyway. They might save 2 cents on any one component, but is that really worth developing a bad rep? As if they don't already have good enough margins for the volumes they sell.


What are you talking about? ATI's in-house cards were great; the drivers were a total letdown. Anne Frankly, prior to the 8500, their chips weren't all that great either. Sapphire's earlier stuff... hoo boy... what a load of crap.

Once again, this is due to customers buying crap. There's a reason the twin-cam Dodge Neon was a cheap go-fast car in its heyday. It was a piece of shit. It could nearly keep pace with an SN-95 'Stang, but you got what you paid for.

I have a 2GB Sapphire 6950 Flex Edition. It has vapor chamber cooling like a GTX 580, dual BIOSes, a BIOS switch with a second BIOS that unlocks the unused shaders (no flashing needed!), voltage adjustment, and killer VRMs. The overclocking software I have to use sucks, but it's the best-built card I've ever owned. It's essentially a GTX 570 for a fraction of the price.

Quote

Because we service all types of computers, even ones we don't sell. In fact, the ones we service the most are the ones other people build using ATI & AMD processors. Maybe it's different where you are from, but where I am, 90 percent of our service calls are for those cheap enthusiast PCs which overheat, lock up, and just have bad ATi hardware/drivers. So in theory, I should be promoting that crap to keep business booming, just not from our store. =P


The keyword is "cheap." How is this the chip manufacturer's fault again? Retards on a budget building their first PC love their Foxconn/Powercolor/Zotac/Seagate garbage. "Oh wow, look at these great refurbished barebones deals!"

I owned a Zotac card ONCE, and only because no other company was manufacturing 2GB 460s and I wanted to run SLi later on. That piece of shit wouldn't even overclock 3MHz, and that's the manufacturer's fault. But it still works...

As for cheap video cards, you need to look on Newegg, there's all kinds of crappy Nvidia cards like Zotacs, Palits, and Sparkles. Even the cheap shit from EVGA can be pretty naff at times.

Quote

As for Nvidia's chipsets on motherboards.. That's another story. That's just plain crap.


I'm building Jimmy a gaming rig with a 750i SLi.

Worst chipset ever. Shit's all over the terrible KT266A I had a decade ago.

The nForce2 was great. That's it. 3 and 4 were meh, everything else sucks except those cool custom Apple chipsets with the fast integrated video. My 750a SLI's PCI-E bus took a fat crap a couple years ago.

View PostTea Monster, on 13 June 2013 - 03:37 PM, said:

But if you recall my story about the Pro cards they don't work either and ATI's service desk can't even tell you what driver version they are supposed to be using.


Yeah, I recall that story. Their Pro cards suck. We're not talking about those, they've never been very good.

This post has been edited by 486DX2: 14 June 2013 - 05:22 PM

Forge

  • Speaker of the Outhouse

#17

View Post486DX2, on 14 June 2013 - 05:18 PM, said:

As for cheap video cards, you need to look on Newegg, there's all kinds of crappy Nvidia cards like Zotacs, Palits, and Sparkles. Even the cheap shit from EVGA can be pretty naff at times.

*cough*PNY*cough*

View PostTea Monster, on 13 June 2013 - 03:37 PM, said:

- I've had 3 Nvidia chipset motherboards, and none of them lasted more than a year before mysteriously dying.

mine lasted two years before taking a crap. just started imploding section by section - first the pci slot died, then all the usb ports went out - had to use a serial mouse & keyboard, then those ports died and i tossed it.
the only other component i've ever had go out on me other than an occasional stick of ram was a P-133 cpu way back in the age of dinosaurs. Those were the days... all you had to do was release a little lock lever and out popped the chip, put the new chip in, push the lever down, hit the power button.

#18

Funny that, I just uploaded a video of my only surviving nForce board system... But that's OT. Still, short ramble time:

nForce chipsets were useless; do they still make them? I do not care. nForce 2 was fast when it worked, but that was a rarity - I never used one that was reliable. I never heard of an nForce 3 board that worked, and by the time the nForce 4 came around I was using Intel chipsets. I had a similar history with them to Forge's. And I miss the old days too, back when you could upgrade the memory on your graphics or sound card, replace the cache for the CPU, and all manner of other good ideas.

For graphics cards, I used to favour ATi. As with any product, I generally don't favour any one company as such; I just favour whoever makes the best product for me at the time. I shall elaborate:

I used to be an ATi man; the Magnum and Rage cards were excellent for what they cost, and my Radeon VE never let me down. The 7500 was very flexible, and I never found a program using a display mode it had problems with; it beat the shit out of my MX420. I did want a GeForce 4 Ti back then, but having since used one for myself, I can say they were overpriced and had stability issues, likely drawing too much power for the old AGP slots to handle (do not put one in a board with no ATX 12V power connector!), as well as compatibility problems (a lot of DX8 demos say the card doesn't support the features they need - not too much of an issue, as the PC scene post-'98 is rubbish IMO), but the performance is nice when it works.

My Radeon 9200 was a letdown; it did not run properly with Win98, though that may have been due to the problematic nForce 2 board I was running it in, because when it ended up in a Pentium II I own it worked well enough, save for some flickering in DOS mode - which I can forgive it for. It beat the shit out of the GeForce FX series anyway.

My FireGL V300 was pants; that was the first card in this machine, and it didn't run shit - even DXDiag's tests ran at no-frames-per-second. TRAOD used to run like crap too, though it is a crappy game. I replaced it with an Inno3D GeForce 7300GS, which was cheap and worked nicely. That was replaced with an 8600GT, which was garbage; it had washed-out colors (bad DSP/DAC?) and generally performed slower than my 7300, so I suspect it was faulty. I did consider jumping back to ATi at that point, but my friend's 3870 and its infinite problems with OpenGL (which I rely on a lot) put me off. I ended up with a 9600 GSO, which ran Crysis on Very High; this impressed me, so I then used a GTX 260, which, contrary to popular belief, is a good card if you have a properly cooled case and steady power - I used a dedicated PSU. I now use a GTX 460 and will probably use a Titan in my next build.

Thing is, I do still look at ATI cards when I want a new one, and I research what will run on each card. Sometimes I ask people who already own one to run something for me, as benchmarks only tell half the story. A fine example of this was the GeForce 3, which scored lower than the GeForce 2 primarily because it happened to be worse at those particular tasks; it was faster in every other respect... Not that I am saying the GeForce 3 was a good card, you understand.

SLI and CrossFire are a waste of money in my opinion.

Can we get 3DFX in on this argument? I want to bash them :lol:

This post has been edited by High Treason: 15 June 2013 - 05:03 AM

Forge

  • Speaker of the Outhouse

#19

View PostHigh Treason, on 15 June 2013 - 04:53 AM, said:

Can we get 3DFX in on this argument? I want to bash them :lol:

ouch.
back then when i was heavy into linux (redhat was my preferred flavor), voodoo was the only series outside of ati cards that had decent kernel/driver support and would work properly. AMD also had better response to *nix at the time.
AMD & ATI were for *nix
Intel & Nvidia were for micro$oft

some things change - some don't
back then when someone would show up on a linux bbs/forum complaining about not getting something to work right/can't find drivers . everyone would bash them to get a new card/board with a "better" brand

#20

Yes, nVidia's Linux support was crap last time I checked - I don't know how it is now, though. This doesn't affect me, of course... though there was a time when I ran Suse and Minix, and that was back in the days when I used ATI.

3DFX were never a viable option for me. Things like shaders and T&L were useless in their minds, and you could only buffer small textures (mine is limited to 128x128px). The other issue was that the early cards were 3D-only, meaning you had to have an existing 2D card; it introduces weird frequencies to the monitor, and that mechanical relay was a stupid idea (relays + ICs are not a good mix; would it have killed them to add a soft-on circuit?). The card was never popular on the demoscene for these reasons (and that's where the real pioneers usually are), so I'm inclined to find it useless.

There was also that card that had an external power brick; it didn't fit in anybody's case and required an extra cable. I would have added a pass-thru to the monitor or something to be decent, but you know, 3DFX, cut corners. I hear those cards were prone to catching fire too - guess you end up with two bricks that way... To be fair, I'm not sure they were ever released to the public.

Anybody who slates ATI's drivers doesn't know how lucky they are. If you install a 3DFX card, you have got it for life; removing the drivers and card is not enough. Go on, try and run a Direct3D program now... What's that? It won't run? It wants the 3DFX card back in? Now let's use the official uninstaller. Sorry bro, you done busted your Windows installation; we know it's been there for ten years, but you're gonna have to re-install it now.

A fine example of their poor performance is comparing the demos "To be continued" and "Dope" - the former is from 1998/99 and uses Glide, whereas the latter is from 1995 and uses your CPU... Whilst it may be somewhat unfair to compare straight-up assembly to an API, you can see that Dope runs far better and has almost the same set of effects. As for games, Interstate '76 supports almost every API from its time; 3DFX mode is, as it is in every game I have tried, slow, while the Direct3D and even PowerVR modes were far better - and that is assuming the 3DFX modes even work.

Even nVidia knows they were a joke...

Bashing complete
Person of Color

  • Senior Unpaid Intern at Viceland

#21

View PostHigh Treason, on 15 June 2013 - 04:53 AM, said:

I used to be an ATi man, the Magnum and Rage cards were excellent for what they cost, my Radeon VE never let me down. The 7500 was very flexible and I never found a program using a display mode it had problems with


Posted Image

Driver problems.

This post has been edited by 486DX2: 15 June 2013 - 09:41 AM

Forge

  • Speaker of the Outhouse

#22

View PostHigh Treason, on 15 June 2013 - 06:27 AM, said:

3DFX were never a viable option for me,... the other issue was that the early cards were 3D only meaning you had to have an existing 2D card

if you isntall a 3DFX card you have got it for life, removing the drivers and card is not enough,

never went near 3dfx cards. i got one of my friends hooked on linux and he saw the driver support for voodoo cards and decided to get one. it was choppy as f*ck for the few "3D" games for linux back then, and it didn't improve anything else (non-"3D") anywhere near worth the money he spent on it. I don't know if it was because he didn't set it up right, or if it didn't have enough memory to perform as it claimed. it pissed him off and he took it back, so i learned from his mistake (not that i ever intended to get one anyway)

luckily it's a lot easier to remove drivers from *nix than micro$quash.

This post has been edited by Forge: 15 June 2013 - 02:45 PM


#23

Is this $149 NVIDIA GTX 750 Ti gonna be any better than the 600 series? http://www.gamespot....e/1100-6417813/

I have an EVGA 660 Superclocked NVidia card in my desktop.

This post has been edited by DustFalcon85: 18 February 2014 - 07:24 AM

Forge

  • Speaker of the Outhouse

#24

no. that 750 is a piece of garbage


660 super: Cuda 960/Core 1046mhz/Memory 192bit

750 ti: Cuda 640/Core 1020mhz/Memory 128bit


marketing bullshit
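The spec lines above can be turned into rough peak-throughput numbers. Assuming the standard accounting of 2 FLOPs (multiply-add) per CUDA core per cycle, and using the core counts and clocks Forge quoted:

```python
# Peak single-precision GFLOPS: CUDA cores x 2 FLOPs per cycle x clock
def peak_gflops(cuda_cores: int, core_mhz: float) -> float:
    return cuda_cores * 2 * core_mhz / 1000

print(peak_gflops(960, 1046))  # GTX 660 SC:  ~2008 GFLOPS
print(peak_gflops(640, 1020))  # GTX 750 Ti:  ~1306 GFLOPS
```

On paper the 750 Ti gives up roughly a third of the 660's shader throughput along with a third of its memory bus width, which is the gap being pointed at here.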

This post has been edited by Forge: 18 February 2014 - 07:50 AM

Person of Color

  • Senior Unpaid Intern at Viceland

#25

LOL 750 Ti.

Brand new Maxwell architecture, and it still can't do double-precision GPGPU to save its own life. It's even worse than Kepler.

I recently bought a 4GB Asus DirectCU 770 because of the whole Radeon Litecoin shortage. I'm not exactly thrilled about financing a $370 piece of hardware that chokes to death on compute functions and lacks its own dedicated API.

I didn't have a choice though. My power supply failed and took my video card out with it. Guess what? The new card isn't any faster than the old one for most GPGPU functions. Way to suck a dick, Nvidia.
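The double-precision gap being complained about comes from the consumer parts' fixed FP64:FP32 ratios: Kepler GK104 runs FP64 at 1/24 of its FP32 rate, and first-generation Maxwell at 1/32. The ratios and approximate FP32 figures below are from public architecture specs, assumed here rather than from the post:

```python
# FP64 rate on these consumer cards is a fixed fraction of the FP32 rate
def fp64_gflops(fp32_gflops: float, ratio_denominator: int) -> float:
    return fp32_gflops / ratio_denominator

print(fp64_gflops(3213, 24))  # GTX 770 (GK104, 1/24): ~134 GFLOPS FP64
print(fp64_gflops(1306, 32))  # GTX 750 Ti (GM107, 1/32): ~41 GFLOPS FP64
```

For comparison, a Radeon 7970 of the same era did FP64 at a 1/4 ratio (roughly 947 GFLOPS), which is a big part of why compute and coin-mining workloads favored Radeons.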

This post has been edited by Protected by Viper: 18 February 2014 - 08:37 AM

Person of Color

  • Senior Unpaid Intern at Viceland

#26

GTX 770 experience so far:

-Deus Ex is jittery as hell with both the original and third party OpenGL renderer. I have to use the DirectX 10 renderer and turn off simulated multipass, so all the reflections look like shit. "Superior" OpenGL LOL.

-Intermittent, jittery pre-rendered cutscenes in Mass Effect 3. Pre-rendered. Come the fuck on Nvidia!

More driver bugs in the first month than my Radeon had in two years! *slow clap*

Now I know what you guys are thinking. "Viper! You have the fastest video card on the forum! Why are you bitching about it?!"

Well, gee, let me think. It's a $400 component that is fucking deficient in a few key areas. Nvidia dominates due to their reputation and their CIA-style "ATI drivers are bad" disinfo. Their silicon is TRASH.

This post has been edited by Protected by Viper: 01 March 2014 - 05:47 PM

Inspector Lagomorf

  • Glory To Motherland!

#27

So... your GIGABYTE mobo is bad, your ASUS laptop is bad, and your GTX video cards are bad...

Have you ever wondered if it's not your computer parts that are the problem, but you? :blink:
Person of Color

  • Senior Unpaid Intern at Viceland

#28

The video card itself is actually very nice, honestly it's the nicest one I've owned yet. But Kepler is a turd.

I just have high standards. They don't build shit like they used to. Companies today seem content with half-assing R&D, or creating things they aren't capable of, and having marketing stick a bunch of crap on the box.

I wouldn't count on most modern machines living to the eight year mark. Modern stuff either runs too hot or is too complex for the guys making it.

Call me cynical, but I've been building systems for over ten years and I've noticed a HUGE decline in quality for a while now. The actual build quality is better - the parts feel nicer, are more solid, and have all sorts of pretty details - but the engineering sucks and they don't last for shit. You used to be able to buy a product from a company with a good reputation and feel safe, but now every company from Asus to Gigabyte to EVGA has bad models scattered throughout their lineup. Worst of all, some of the problems don't crop up unless you really start using the thing, like the boot loops with my board.

On top of all this, there just isn't any competition anymore. Ten years ago, picking a motherboard was a big deal. You had other great manufacturers like Epox and Abit and second tier companies like Chaintech and Soltek who had some really nice models. You also had four different chipset manufacturers. With all of the consolidation and bankruptcies the remaining players don't really innovate like they used to.

Don't even get me started on processors. Holy shit. The stagnation. You know the market is fucked when your upgrade consists of a two generation old chip that, well, isn't really outdated. At all.

This post has been edited by Protected by Viper: 07 October 2014 - 08:08 AM

Forge

  • Speaker of the Outhouse

#29

that game has been going on for years between nvidia and ati / intel and amd

one of the companies will put out a kick-ass product, then sit on their ass and wait for the other company to outdo them - in the meantime, instead of making innovations and leaping ahead, they take their current product and fiddle-fuck with it, make a couple minor tweaks, then re-market it for a quick buck.

it's why i buy mid-tier, mid-range products after they're a generation or two behind the current shit.
they're cheaper and this dick-around game with these companies means they'll still be able to run the same things the "latest and greatest" shit does for the next five years, until it either melts down or it's time to upgrade again.
Plagman

  • Former VP of Media Operations

#30

View PostProtected by Viper, on 01 March 2014 - 05:45 PM, said:

GTX 770 experience so far:

-Deus Ex is jittery as hell with both the original and third party OpenGL renderer. I have to use the DirectX 10 renderer and turn off simulated multipass, so all the reflections look like shit. "Superior" OpenGL LOL.

-Intermittent, jittery pre-rendered cutscenes in Mass Effect 3. Pre-rendered. Come the fuck on Nvidia!

More driver bugs in the first month than my Radeon had in two years! *slow clap*


I think you might be confused about what a driver bug generally looks like; it's pretty hard to believe this would have a direct impact on displaying cutscenes. I'm not saying it's impossible, but it's a lot more likely that you're just doing something else wrong.

