Whelp, Nvidia is now officially on my shitlist forever.
#1 Posted 15 August 2014 - 02:26 PM
...And then they try to pull this shit.
This company's business ethics are so lacking they make Intel look like the fucking Red Cross.
That's it. I'm done. Anyone who buys an Nvidia card and isn't doing hardcore 3D modeling is a fucking idiot at this point. Sure, there are a few models I'd recommend. But overall, fuck this company.
This is exactly how you sell products based solely off name and image. I shouldn't have a love/hate relationship with ANYTHING that costs $400, period. God damn Litecoin shortage fucked me.
This post has been edited by Protected by Viper: 15 August 2014 - 02:34 PM
#3 Posted 15 August 2014 - 06:06 PM
#4 Posted 15 August 2014 - 06:28 PM
#5 Posted 15 August 2014 - 06:29 PM
This post has been edited by Jeff: 15 August 2014 - 06:31 PM
#6 Posted 15 August 2014 - 06:39 PM
TerminX, on 15 August 2014 - 06:28 PM, said:
Poor Lunick
#7 Posted 15 August 2014 - 11:02 PM
All caps because caps lock is cruise control for getting fucked by a company.
-Had to go to the source menu on the TV and rename HDMI2 to DVI PC to fix the overscan.
-Spent NINETY FUCKING MINUTES trying to get the color to look right. There was blurring, ghosting, and brightness changing like crazy depending what was on screen.
-Tried countless options, on both the TV and Nvidia control panel, only to discover this:
There is no way to get this card to display RGB 0-255, PERIOD! At least not with this TV!
Nvidia DOES NOT support 0-255 RGB over HDMI or DisplayPort for many TVs. It forces 16-235 on most TVs. YCbCr output also looks like shit. The colors look fucking awful. Brightness is fucked and violently changes on a whim depending on what's on screen. There is NO SWITCH in the drivers to force an EDID override (Nvidia sucks at detecting EDID settings anyway). There used to be a third party program that forced it using registry entries, but it doesn't work anymore with newer drivers. Radeons have these overrides out of the box, and auto-detect much better too. You know who else does? Fucking Intel of all people.
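For anyone wondering what the 16-235 vs 0-255 complaint actually means: limited-range ("video level") output squeezes the whole signal into 16-235, so a display expecting full range shows gray blacks and dim whites. Here's a minimal Python sketch of the expansion a driver would have to apply per channel - illustrative only, not Nvidia's actual pipeline:

```python
def limited_to_full(v):
    """Expand a limited-range (16-235) video level to full range (0-255).

    Values outside 16-235 are clamped first; a driver honoring the
    display's EDID would apply this per channel on every pixel.
    """
    v = max(16, min(235, v))             # clamp to legal limited range
    return round((v - 16) * 255 / 219)   # 219 steps (235-16) -> 255 steps

# Limited black (16) and white (235) map to the full-range extremes.
print(limited_to_full(16), limited_to_full(235), limited_to_full(126))
```

When a card outputs limited range to a full-range display and nothing does this expansion, black never drops below 16/255 - which is exactly the washed-out look described above.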
You know how I made the picture LESS SHITTY? I hooked up a DVI to HDMI cable I had lying around, along with a 3.5mm to RCA cable. Something changed, but the brightness was flickering and the colors were still a bit washed out. The Nvidia Control Panel saw the TV as an HDMI device still, so I was fucked.
But hey, you know why that's no good? Because I wanted to download a trial version of Tridef 3D and have some fun with my 3D glasses after a long day. I planned this ahead all week and got fucked by these idiots. Good luck working your voodoo magic and getting 120Hz 3D working with a DVI connection. Oh and Nvidia doesn't tell you this, but their 3DTV Play software is only 720p and most 3DTVs don't work with it, including this top of the line Samsung. There's no supported device list though. They just want you to click the big "BUY NOW" button and sign up for some bullshit download protection service, because hey, charging you $8 extra is more profitable than emailing you a product key, and they'll just hide the trial version link while they're at it. I mean hey, they already have your money, right? "Sorry but we can't refund software."
You know what worked perfectly with this TV? My Radeon. Nvidia has known about this problem for YEARS but chooses to go silent 90% of the time (What a fucking shock) or blame the manufacturer. The GTX 260m in my dead Asus G51Vx works better, but still has color issues. I chalked it up to the G92 architecture having crappy HDMI - it's old after all!
But no. All these years later and they're STILL fucking up the most open closed standard out there.
I'm selling this fucking card ASAP and getting an R9-290. No more. Fuck Nvidia. I am never speccing an Nvidia card in any build I do for anyone either, unless they absolutely need it for something like CAD or Maya or some shit.
This company doesn't give TWO FUCKS about their customers. They don't even give a shit about OEMs or system integrators! Look at the 7900GT "fourth day syndrome," the G84/G86 fiasco, high GTX 285/295 failure rates (512 bit memory interface six years ago. WHY?!), GTX 480 VRM failure rates, or their current fuckup, the GTX 880M. Their flagship mobile card throttles like a motherfucker for many users across multiple notebook manufacturers, and their response? Silence. Either that, or making shill accounts that just say "ATI DRIVERS SUCK" to try to keep people from switching.
Fuck this company. They only care about short term profits. The moment they sell a shipment of chips they dust their hands off and say "not my problem." If they don't care about OEMs they certainly don't care about you. I'm not buying from them again, not for a very long time. Eventually we'll actually stop buying a shitty product after a company fucks us over - it just takes us longer than it should.
Worst. Graphics card. Ever! I've owned TONS of fantastic Nvidia cards. This was my first "high end" graphics card purchase. Nvidia fucked over the wrong customer - never sell a Cavalier tarted up as a Cadillac. The Cimarron nearly destroyed a whole brand.
Jeff, on 15 August 2014 - 06:29 PM, said:
Factory overclocks suck, they exist to bilk people out of money. Rarely are they high enough to truly improve performance.
Nvidia wants me to buy a 780. They didn't want me to tweak my 770, despite it never reaching those speeds, period. Nvidia forces OEMs to lock their voltage to no more than a 10% increase, although every card can be unlocked via third party software, save for Asus ones. Too bad I bought an Asus before people figured it all out! My card has a HORRID chip too, it goes literally nowhere - even a 30MHz bump is impossible. Normally I could up the voltage, but I'm being squeezed by the power limits.
"But Viper, that would damage our profits!"
Oh yeah? I would never spend that much to begin with! Nvidia needs to take a crash course in fucking business ethics. Assholes. It's like Ford selling a Mustang whose ECU disables the whole car if you add anything more than a catback exhaust.
This post has been edited by Protected by Viper: 15 August 2014 - 11:19 PM
#8 Posted 15 August 2014 - 11:16 PM
Quote
2. You're selling a card? Oh, god, I hope you didn't overclock it because if it dies on the next owner you just might not have heard the end of it yet.
3. What is it with you and the racist undertones?
4. All overclocks suck. If your machine is running badly enough that you think overclocking is necessary, maybe you should take a step back and think about whether you really know what you're doing, or about the ad-ware, or the shit you left running. The performance must be that poor due to somebody's incompetence, and computers don't really make mistakes.
5. You know when a toddler doesn't get its own way?
6.
(Everyone else uses GIFs to speak, so can I...)
This post has been edited by High Treason: 15 August 2014 - 11:31 PM
#9 Posted 16 August 2014 - 02:24 AM
#10 Posted 16 August 2014 - 05:11 AM
UPDATE: I have the same level of hatred as you, Viper. But mine's aimed not at NVidia but at Time Warner Cable.
This post has been edited by DustFalcon85: 16 August 2014 - 05:58 AM
#11 Posted 16 August 2014 - 07:09 AM
High Treason, on 15 August 2014 - 11:16 PM, said:
O RLY? Is that also the case when you overclock a computer right after you've finished assembling it?
Protected by Viper, on 15 August 2014 - 11:02 PM, said:
All caps because caps lock is cruise control for getting fucked by a company.
-Had to go to the source menu on the TV and rename HDMI2 to DVI PC to fix the overscan.
-Spent NINETY FUCKING MINUTES trying to get the color to look right. There was blurring, ghosting, and brightness changing like crazy depending what was on screen.
-Tried countless options, on both the TV and Nvidia control panel, only to discover this:
There is no way to get this card to display RGB 0-255, PERIOD! At least not with this TV!
What were you trying to do, exactly?
The process of getting my nVidia video card to work with my HMZ-T1 head-mounted display (which uses HDMI) amounted to connecting the HDMI input of the visor (its ONLY input) to the HDMI output of the video card, switching the visor on, and switching the computer on.
This post has been edited by Altered Reality: 16 August 2014 - 07:33 AM
#13 Posted 16 August 2014 - 07:15 AM
#14 Posted 16 August 2014 - 08:14 AM
High Treason, on 15 August 2014 - 11:16 PM, said:
AMD and Intel don't sell defective chips to OEMs and then try to cover it up. Intel came clean about the P3 1.13GHz, the i820, Cougar Point, and others. AMD admitted the Phenom TLB bug immediately after they discovered it.
Nvidia is an arrogant shitty company with horrible values.
Quote
Look man, you don't know much about overclocking, we've already established that before. The only reason I popped my Biostar board was because I was drinking, hanging out with a friend, and typed in the wrong set of numbers.
I've overclocked everything from smart phones to laptops to AMD 5x86 chips...It always works, and it's always safe and reliable if you do it right.
How can you say no to free performance?
Quote
Chinese people are awesome. Their food, culture, and history are amazing.
Quote
My computer would eat yours alive, dude. Not that I actually care about who has the faster computer, I'm just trying to illustrate a point here. There's a HUGE difference between 3.3 and 4.5GHz, it's like going from something super fast to straight up science fiction.
Getting to skip an upgrade cycle for the cost of a $35 heatsink is badass.
Quote
You know when someone spends $400 on a "premium" product and it turns out to be anything but?
That warrants bitching. Four hundred bucks is a lot of money, especially when it's nothing more than a component of a larger system.
DustFalcon85, on 16 August 2014 - 05:11 AM, said:
UPDATE: I have the same level of hatred as you, Viper. But mine's aimed not at NVidia but at Time Warner Cable.
TWC? I feel your loss. My brother has TWC in Manhattan.
He gets speeds of 30/5, and Cablevision gives us 101/25 for the same price, and we always get over 120 down.
Altered Reality, on 16 August 2014 - 07:09 AM, said:
He's said before that "overclocking is not acceptable in a production environment." Once he tweaks his first expensive build he'll enter the promised land.
I went out and partied a couple months back and just for shits and giggles I ran Prime95 again. 16 hours stable...but apparently trying to discover a new prime number is more demanding than video editing. He'll see my point some day when his crazy fast CPU renders a video at the speed of light.
Now if I could only fix the damn boot loops on this Gigabyte board. It does it at stock clocks too... just not as often. The new modded BIOS I have seems to help so far; if it starts looping again I just cut the power when I'm done using it, then flick it back on when I go to use it again. Weird, but so far, so good.
Quote
The process of getting my nVidia video card to work with my HMZ-T1 head-mounted display (which uses HDMI) amounted to connecting the HDMI input of the visor (its ONLY input) to the HDMI output of the video card, switching the visor on, and switching the computer on.
HDMI to anything that isn't a television works fine with Nvidia. It's a real bummer too - I would have kept the card otherwise. The performance is fantastic.
Comrade Major, on 16 August 2014 - 07:14 AM, said:
It's disgusting anyway. But someone should really fix that. It's not good for the Duke community when over 30% of potential new members can't use your software.
Micky C, on 16 August 2014 - 07:15 AM, said:
Because it's a meme perpetuated by Nvidia fanboys. It held water in years past, but it certainly doesn't now. Nvidia DOES have longer hardware support, but if you don't keep hardware for 5+ years it isn't really an issue.
Even back in the early 2000's, during the R300 era, Radeons had MUCH better DirectX drivers, but Nvidia fanboys wouldn't admit it. My old Radeon 9700 Pro had the most stable drivers out of any card I ever owned. I still miss that card.
This post has been edited by Protected by Viper: 16 August 2014 - 08:31 AM
#15 Posted 16 August 2014 - 08:34 AM
Protected by Viper, on 16 August 2014 - 08:14 AM, said:
If it's not the top-spec version, it's defective.
Quote
And with that statement I question whether I can ever take anything you say seriously. Alcohol and overclocking? I'm sure I said something about incompetence, I fail to understand who would think that was a good idea.
If I know nothing about overclocking, can you explain to me why I currently have the world's fastest known Am5x86 in several benchmarks? Along with the fastest SX-class 40MHz system? Hmm... I wonder, seems my overclocking endeavors might be more successful than yours.
Quote
My dick is bigger than yours.
No, I suspect it would, probably because a large chunk of mine came out of a trash can and what was paid for came to less than £350.
Quote
As I didn't blow mine up overclocking it, I got to skip a second upgrade cycle and build a better machine.
Quote
That warrants bitching.
I've noticed.
Quote
It isn't. Stability is simply more important in that setting.
Quote
In this day and age, yeah. Why did you not just build a better one in the first place? These days you can run practically everything on a decent i3/i5 rig.
If you want to overclock, go ahead, I'm not telling you to stop... On the contrary I find it rather entertaining for several reasons.
This post has been edited by High Treason: 16 August 2014 - 08:36 AM
#16 Posted 16 August 2014 - 08:49 AM
High Treason, on 16 August 2014 - 08:34 AM, said:
Technically speaking, that is correct. But binned chips should still work fine.
Quote
It was a dark time in my life and another one of my friends was doing heroin .
Quote
That's fucking boss. Not gonna lie. That's freaking sick.
Quote
No, I suspect it would, probably because a large chunk of mine came out of a trash can and what was paid for came to less than £350.
I edited that comment because I realized it came off as a dick-measuring contest. I also drive a Mustang and winning races is irrelevant to me. I don't care if someone has something better than me. Envy is for teenagers.
Quote
Actually, the total cost of going to Intel was so low due to store credit from an unrelated extended warranty on a chair, and this used CPU I just ditched the AMD rig.
Quote
Of course, but perfect stability can easily be had when overclocking.
Quote
If you want to overclock, go ahead, I'm not telling you to stop... On the contrary I find it rather entertaining for several reasons.
They can run anything, but even in day to day use a heavily overclocked i5 feels much quicker.
The biggest bottleneck in my system is the old Crucial M4 128GB SSD I have now...lol. I'll probably replace it soonish, but only cause it's too small for my taste. I like the speed...
This post has been edited by Protected by Viper: 16 August 2014 - 08:50 AM
#17 Posted 16 August 2014 - 09:27 AM
I'm just being a dick for the sake of it; When things suck you drink but I can't do that, so I just act like a dick instead.
Also, if I'm honest, I can appreciate how annoying it is when you spend a lot of money on something and it doesn't do what you expect of it. That shit used to happen to me all the time - why do you think I dig in the trash so much? It's so I can't be disappointed; if I don't pay for it, I don't care, and then I just run it into the ground. You should see the state of the rig I have now: it leaks cooling fluid constantly, but that doesn't matter because the pump is dying and the fluid has gone a nasty color with age. Only three out of eight drives are functioning, and two of those have to use long cables that go out to the eSATA ports. Also, I have to put a sheet of card between the GPU and the bottom of the drive caddy or else the VRM goes short and burns holes in things. I went through the disappointment thing recently with a laptop I was sold as having Quadro graphics, only to find it was the Intel graphics version. A while back I had a GPU that was DOA, and a CD changer that worked once before failing to read the discs and refusing to give them back to me. Panasonic's own instructions stated I would have to pretty much destroy the mechanism beyond repair to get them out of there. I knew I should have just got another Teac or Pioneer...
I also have to be honest that I don't read so much into the AMD on the chart I posted as it scores lower than the second place 5x86 in most tests but the Doom realticks hold a lot of weight on the final score - that was a win for me though because I tailored the system towards games. Though it's the only non-PCI one in the list. I should drop an 83MHz POD in there really and see what happens.
#18 Posted 16 August 2014 - 09:59 AM
Hard disks can't keep up with the I/O demands of some software now. One of the chief reasons modern consoles and gaming systems need 8GB of RAM is that streaming world data off a hard drive isn't always doable these days. There are plenty of modern games that stutter when streaming textures or loading world data on the fly. World of Warcraft and Mass Effect 1 with the 4K HD texture pack suffer from this the worst in my experience. I've personally tweaked these on multiple computers. WoW has massive 10-30 second periods of hitches and FPS drops in populated city centers, and ME1 HD stutters like crazy even on a brand new, freshly defragmented WD Black. Neither can run without constant hitches on a conventional hard drive - it's just not possible. My buddy kept asking me if an SSD was really necessary, and he's an avid WoW player. He told me that within ten minutes of gameplay he knew I made the right decision. We're going to be using TONS of RAM until hard drives go the way of the dodo.
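If anyone wants to see the random-I/O gap for themselves, here's a tiny microbenchmark harness in Python - purely illustrative, the scratch file and sizes are made up for the demo, and absolute numbers depend entirely on the drive (for a real test, point it at a file on the target disk that's far bigger than your RAM, or you'll mostly be measuring the OS page cache):

```python
import os
import random
import tempfile
import time

def random_read_bench(path, block=4096, reads=200):
    """Time `reads` random block-sized reads from `path`; returns seconds."""
    size = os.path.getsize(path)
    t0 = time.perf_counter()
    with open(path, "rb") as f:
        for _ in range(reads):
            f.seek(random.randrange(0, size - block))  # jump to a random offset
            f.read(block)
    return time.perf_counter() - t0

# Scratch file just to demo the harness - replace with a file on the
# drive you actually want to measure.
with tempfile.NamedTemporaryFile(delete=False) as tmp:
    tmp.write(os.urandom(4 * 1024 * 1024))  # 4 MiB of random data
elapsed = random_read_bench(tmp.name)
os.remove(tmp.name)
print(f"200 random 4 KiB reads: {elapsed:.4f}s")
```

Run it on an SSD and then on a spinning disk with a big enough file and the difference is usually an order of magnitude or more - that's the seek latency games hit when streaming assets.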
As for stuff not working after dropping tons of money, I've been there, but I find these days companies figured out how to make stuff good enough so you stop liking it on Day 31...that's when you start discovering all the little niggles that drive you nuts.
This post has been edited by Protected by Viper: 16 August 2014 - 10:02 AM
#19 Posted 16 August 2014 - 10:01 AM
This post has been edited by Protected by Viper: 16 August 2014 - 10:01 AM
#20 Posted 16 August 2014 - 10:12 AM
It probably doesn't help that I know three people who had SSDs that died within a year (Edit: the drives, not the people). The WD Blacks came with a 5 year warranty for £90 each, seemed like a good deal. I have heard that they are more reliable now though, and once the price comes down and the capacity increases I'll be quick to throw the mech drives in the trash. After all, what would SSDs do in RAID 0? It'd just be silly.
Ugh, day 31. In my case it's Day 14, as that's the point when it cannot be returned to the store. I always hold my breath when I start a new device up on that day; you can almost guarantee there will be something not working - an error message appears, Windows loads in 16-color VGA mode, a clicking noise can be heard, or performance drops for no apparent reason, depending on what said device is.
This post has been edited by High Treason: 16 August 2014 - 10:12 AM
#21 Posted 16 August 2014 - 10:24 AM
High Treason, on 16 August 2014 - 10:12 AM, said:
I bet that if you asked those people how they treated those SSDs, they would reply: "Oh, very well! I defragged them at least once a week!"
#22 Posted 16 August 2014 - 11:43 AM
120Hz 1080p 3DTV is only possible through dual-link DVI or dual HDMI. It's the same way with ATI and Intel in that regard, so you can't really blame 'em. It's a bandwidth limitation of the HDMI interface, not the card. (120Hz at 1080p on a single HDMI cable isn't possible.)
DisplayPort, on the other hand, is another thing altogether. IIRC, that does have the necessary bandwidth available.
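The bandwidth point checks out with back-of-the-envelope math. Using the standard 1080p total timing of 2200x1125 pixels (active plus blanking), a quick sanity check in Python - the 165/330 MHz figures are the classic single- and dual-link DVI TMDS clock caps, used here as assumed limits (HDMI's nominal ceiling varies by spec version, but cards and TVs of this era topped out around the single-link figure for these modes):

```python
# Total (active + blanking) 1080p timing per the CEA-861 standard: 2200 x 1125.
def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock required to scan the full timing at a given refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

clk_120 = pixel_clock_mhz(2200, 1125, 120)  # pixel clock for 1080p @ 120Hz

SINGLE_LINK_MHZ = 165.0  # single-link DVI TMDS cap
DUAL_LINK_MHZ = 330.0    # dual-link DVI: two TMDS links in parallel

print(f"1080p120 needs ~{clk_120:.0f} MHz pixel clock")
print("fits single link:", clk_120 <= SINGLE_LINK_MHZ)  # False
print("fits dual link:  ", clk_120 <= DUAL_LINK_MHZ)    # True
```

So 1080p120 wants roughly 297 MHz of pixel clock - nearly double what a single 165 MHz link can carry, but comfortably inside dual-link DVI, which is exactly why the 3D vendors leaned on dual-link DVI for 120Hz.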
This post has been edited by StrikerMan780: 16 August 2014 - 11:47 AM
#23 Posted 16 August 2014 - 01:30 PM
it creates extra effort on his part to get his marshmallows golden brown 'cause he has to physically spin them over his cpu instead of getting a more efficient dual-coil toaster effect
This post has been edited by Forge: 16 August 2014 - 02:24 PM
#24 Posted 16 August 2014 - 02:45 PM
Altered Reality, on 16 August 2014 - 10:24 AM, said:
Yeah also people tended to buy crap back in the day. Many "famous brands" made SSD's that were total crap compared to the quality of everything else they made.
I only use Crucial or Samsung. My first Crucial M4 died (literally) on day 30; Micro Center took it back the next day, no problem. It was also the first time they'd seen one come back. I noticed the newer drive had different firmware though - I know some of the earlier ones had severe firmware bugs that would kill the drive. It's now on year 2 and it's been bulletproof. I love this drive. I just wish it was bigger.
StrikerMan780, on 16 August 2014 - 11:43 AM, said:
120Hz 1080p 3DTV is only possible through dual-link DVI or dual HDMI. It's the same way with ATI and Intel in that regard, so you can't really blame 'em. It's a bandwidth limitation of the HDMI interface, not the card. (120Hz at 1080p on a single HDMI cable isn't possible.)
DisplayPort, on the other hand, is another thing altogether. IIRC, that does have the necessary bandwidth available.
Yeah the 0-255 color fix seems to be VERY hit and miss. This card simply can't be forced. Also the overscan adjustment in the Nvidia Control Panel looks like shit - try sticking your face near the TV. ATI's does too but you rarely have to use it because they aren't staffed by arrogant idiots.
Forge, on 16 August 2014 - 01:30 PM, said:
it creates extra effort on his part to get his marshmallows golden brown 'cause he has to physically spin them over his cpu instead of getting a more efficient dual-coil toaster effect
I'd sell it anyway. This shit is a deal breaker. Also, short of lighting this thing on fire, it won't make any heat. Asus really worked their magic. As far as hardware quality goes this card is leaps and bounds beyond anything I've ever owned.
This post has been edited by Protected by Viper: 16 August 2014 - 02:47 PM
#25 Posted 16 August 2014 - 03:57 PM
i think i found the problem
This post has been edited by Forge: 16 August 2014 - 04:00 PM
#26 Posted 16 August 2014 - 06:42 PM
Had a 4870 X2 way back, and aside from having barely any games that properly supported CrossFire, one of the supposedly good "driver updates" introduced an issue for some people (one of those being me) that stopped the fan from ramping up when it was needed, and the card overheated so badly it killed itself.
Moved to Nvidia and haven't looked back since. Been through a 560 Ti and now running a 770; neither has had a single driver problem, and Shadowplay is 1000x better than Fraps since I can actually record for more than 30 seconds without it eating up my RAM like Fraps does for some reason.
This post has been edited by Bloodshot: 16 August 2014 - 06:42 PM
#27 Posted 16 August 2014 - 08:50 PM
I get hitching and jerky captures with Shadowplay on most games. It appears to be shader related. I also just reformatted a month or so ago and it still happens.
Saints Row The Third has almost no shaders working at all in my captured videos. There's bloom and bullet trails but that's it - and I get framerate drops in the videos despite it running at 60fps the whole time. Isn't 60fps capturing supposed to be the focal point of this technology? Even Kega Fusion is rough and uneven. The only game that seems to capture almost perfect is Mass Effect 3. It's the only game I've tried that's almost free from hitches.
Considering how long it took Nvidia to actually implement this tech in their drivers I wouldn't be surprised if there's something actually broken on the hardware level. I run my games from my SSD and capture onto my WD Black 1TB so it's not like it's a disk I/O issue...
I'm too lazy to re-encode the SR3 gameplay but this gives you an idea of why I dislike Shadowplay. Despite being a flat shaded 3D game, Virtua Racing Deluxe is actually really smooth. It's less rough before re-encoding, but I have yet to get this card to capture something well from Kega Fusion.
This post has been edited by Protected by Viper: 16 August 2014 - 08:51 PM
#28 Posted 16 August 2014 - 11:44 PM
#29 Posted 17 August 2014 - 12:20 AM
#30 Posted 17 August 2014 - 09:12 PM
Protected by Viper, on 15 August 2014 - 11:02 PM, said:
I didn't buy mine purely for the overclock. It was merely a bonus. The main reason was that it was the cheapest card with 4 GB of VRAM. That, and the 600 series and up support triple monitors without needing one of those TH2Gos. Can't do that with their older cards.
Last time I used Radeons was back in 2003. Some games I play do not play well with Radeons though, so I stuck with Nvidia.
This post has been edited by Jeff: 17 August 2014 - 09:15 PM