Duke4.net Forums: Whelp, Nvidia is now officially on my shitlist forever.


Whelp, Nvidia is now officially on my shitlist forever.

#31

Posted Image
1

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#32

Because Nvidia gave Plagman a $20 gift card to Applebees.

This post has been edited by Protected by Viper: 19 August 2014 - 01:32 AM

3

User is offline   Radar 

  • King of SOVL

#33

This thread was made at the perfect time. Just a few days earlier, I tried to connect my brand new laptop to my TV with HDMI. It wouldn't work at all, so I had to switch to VGA. I wonder if I would have ever figured it out.

This post has been edited by Yay Ponies: 19 August 2014 - 07:33 AM

0

User is offline   Robman 

  • Asswhipe [sic]

#34

Back in 2009 I built a Core 2 Duo E6750 rig for myself, and a couple of months later I liquid-cooled it and overclocked it from 2.66 to 3.6 GHz. Sure felt a lot faster to me! I still use it, actually, as I've put off the urge to upgrade.

I ended up pulling a noob move, though, and got water on the motherboard when I was topping up the fluid one day :(
The XFX 680i LT SLI mobo was great, but to replace it I bought an XFX 750i thinking it would be just as good or better. The new board couldn't overclock nearly as well.

I put an XFX 9800GT XXX Edition vid card in it and replaced the cooler with a much larger aftermarket one, as I like my cards cool & quiet. (Who doesn't?)

Anyway, I tend to choose Intel over AMD and Nvidia over ATI, as I see them as more of the computing "standard" by which others are measured. Which is kinda sad, because AMD appears to be Canadian, like myself. (I've driven past their buildings in Toronto quite a few times, anyway.)

I've always had fun trying to overclock anything that's overclockable, mostly my CPUs and vid cards, and I still have lots of older computers and parts to fart around with. I threw out my 286, 386, and 486 machines and an IBM XT, which I kick myself for doing.

I also learned the hard way that some AMD chips don't have thermal shutdown and can't be quickly "POSTed" without the cooler on. That bad habit ended fast, lol.
0

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#35

Thermal shutdown is handled by the BIOS, although pre-Athlon XP chips lack a thermal diode.

I just overclocked a Devil's Canyon Haswell chip yesterday for a friend, an i5-4690k. Hated it. Got a total dud, but Haswell still sucks regardless. Runs wayyyyyyy too fucking hot, even with the new models.

I didn't try too hard, but I got 4.2 GHz stable at 1.20 V. The fucking thing peaked at 91°C in Prime95 small FFT with a Hyper 212 EVO and a Zalman Z11 case. I might have been able to hit 4.3, but keeping that thing at 90°C is a nightmare.

Fortunately the R9-280x I put in that thing makes dick shit for heat. I love XFX cards.

This post has been edited by Protected by Viper: 19 August 2014 - 08:40 PM

0

User is offline   Robman 

  • Asswhipe [sic]

#36

It was an Athlon XP 2800+ that I fried that way.

Sadly the E6750 has to date been my last overclocking venture, although a couple of years back I did build my mother an Intel i7-2600K and fed it 16 GB of RAM, lol. An EVGA GTX 550? (iirc) vid card and a Sabertooth Z87? mobo. A 1200-watt Ultra PSU for total overkill, lol (it was cheap).

Also built her a cheapy Intel G2120 with 4 GB of RAM, just using the built-in video. This was to replace a P4 3.4 GHz HT in the family room whose mobo eventually died.

Heh, it's too much fun playing with puter hardware.
0

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#37

Not overclocking a Sandy Bridge is punishable by death.

It's so easy and the gains are fucking huge.

Posted Image

This post has been edited by Protected by Viper: 20 August 2014 - 11:48 PM

0

User is offline   Jeff 

#38

I have my 2600K running at 4.5 GHz with 1.3 V Vcore (HT enabled). My cooler is a Hyper 212+.

This post has been edited by Jeff: 21 August 2014 - 08:50 AM

1

User is offline   Striker 

  • Auramancer

#39

 Protected by Viper, on 19 August 2014 - 01:32 AM, said:

Because Nvidia gave Plagman a $20 gift card to Applebees.


Nah, there's actually a very long story as to why there are a lot of compatibility issues with AMD cards and modern OpenGL titles, and it stems back to the war between companies with proprietary OpenGL extensions, and the war between OpenGL and Direct3D. AMD/ATI, whether anyone wants to admit it or not, got the shit end of the stick in that deal. I originally posted an article on another forum that explains it in very clear detail, but I've since lost the link. I'll try to dig it up.

EDIT: Done, scroll down - http://programmers.s...-prefer-windows

This post has been edited by StrikerMan780: 21 August 2014 - 03:17 PM

1

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#40

AHAHAHAHA THE FAGS IN GREEN ALL BUTTHURT NO ONE WANTS TO BUY THEIR SHITTY TEGRA CHIPS! NVIDIA GETTIN' BTFO!

I had a Motorola Atrix back in the day. Worst phone I ever owned. Much of it was due to Nvidia's horribly buggy driver software; the rest was Motorola's awful reliability. But I was suckered in because it was the first dual-core phone...in the world! And it had Nvidia technology!

Rotate the phone? Kernel panic! In your pocket idling away? Kernel panic! Launch the browser? Kernel panic! This phone crashed hard and rebooted five times a month, if that was a GOOD month! The LG Optimus 2X had the exact same issues, so it wasn't Motorola.

The Atrix 2 I FORCED Motorola to replace it with had a TI OMAP chip and a PowerVR GPU, like an iPhone. It was better in every way, especially because I didn't have to spend 40 minutes re-encoding EVERY HD video file like with that shitty Tegra chip.

Also, the closed-source drivers are bullshit. What's the only LG phone with a closed-source kernel? You guessed it: the Tegra 2-based Optimus 2X.

http://techcrunch.co...aign=fb&ncid=fb
0

#41

NVidia sues Samsung and Qualcomm for patent infringement.

http://blogs.nvidia....s-patent-suits/
0

User is offline   Inspector Lagomorf 

  • Glory To Motherland!

#42

 DustFalcon85, on 07 September 2014 - 03:41 PM, said:

NVidia sues Samsung and Qualcomm for patent infringement.

http://blogs.nvidia....s-patent-suits/


This post combined with Viper's above post is just lulzy
0

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#43

 DustFalcon85, on 07 September 2014 - 03:41 PM, said:

NVidia sues Samsung and Qualcomm for patent infringement.

http://blogs.nvidia....s-patent-suits/


Nigga I just said that!
0

User is offline   Striker 

  • Auramancer

#44

Tegra 2 was crap, I won't deny. It shouldn't have left prototyping until it became more refined and stable. Tegra 3 was meh, but I like the Tegra 4; it's solid as a rock.

This post has been edited by StrikerMan780: 10 September 2014 - 10:22 AM

0

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#45

Crap was being polite. Holy hell it was awful.
0

User is offline   Striker 

  • Auramancer

#46

It was rushed to market before it was even close to a finished product, to catch up with the Android smartphone craze, imho.
0

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#47

They just wanted to be first to market, not best in the market.

That's the story of basically every Tegra chip ever.
0

#48

Post deleted. Moved the post over to another thread. Disregard this post.

This post has been edited by DustFalcon85: 19 September 2014 - 05:38 AM

0

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#49

 Yay Ponies, on 19 August 2014 - 07:32 AM, said:

This thread was made at the perfect time. Just a few days earlier, I tried to connect my brand new laptop to my TV with HDMI. It wouldn't work at all, so I had to switch to VGA. I wonder if I would have ever figured it out.


Oh man, reason #762 why this old Dell Studio XPS is the best laptop I've ever owned: it's got a Radeon 3670.

Prehistoric? Yes. Perfect HDMI? Oh, God, you have no idea...it's wonderful. Plugged it into the Samsung and it took immediately. <3 <3 <3 this laptop.

This post has been edited by Protected by Viper: 21 September 2014 - 12:02 AM

0

#50

 Protected by Viper, on 21 September 2014 - 12:02 AM, said:

Oh man, reason #762 why this old Dell Studio XPS is the best laptop I've ever owned: it's got a Radeon 3670.

Prehistoric? Yes. Perfect HDMI? Oh, God, you have no idea...it's wonderful. Plugged it into the Samsung and it took immediately. <3 <3 <3 this laptop.


Well, I <3 <3 my MSI GE Series GE60 0ND-667US. That sweet thing runs DOSBox very well. Best laptop, IMHO.

My other laptop makes DOSBox laggy whenever a level loads in Jazz Jackrabbit. Could it be Windows 8.1, or some bloatware I need to get rid of? Or could it be the i7-4700HQ CPU, or do I have to fiddle with the DOSBox options? Gotta find out what's causing the lag.
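The usual first thing to fiddle with there is the `[cpu]` section of dosbox.conf rather than the host hardware; a sketch with illustrative values (not tuned for Jazz Jackrabbit specifically):

```ini
[cpu]
# "dynamic" uses the recompiling core, which is the fastest choice
# for protected-mode games like Jazz Jackrabbit.
core=dynamic
# "max" gives the emulated CPU as many cycles as the host can spare;
# if "max" stutters, a fixed value (e.g. cycles=20000) is often steadier.
cycles=max

[render]
# Only raise frameskip if the host genuinely can't keep up (0 = off).
frameskip=0
```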

This post has been edited by DustFalcon85: 28 September 2014 - 03:38 PM

0

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#51

That's your "Other" laptop? Holy shit.

I'd buy a used quad core off eBay and jam it in that fucker.
0

#52

Hmm. NVidia apologizes for the errors and memory performance problems in the GTX 970 series.

http://www.gamespot....y/1100-6424915/
0

#53

Hey POC. What do you think about the new Nvidia GeForce GTX 1080?

http://www.gamespot....mpaign=homepage
0

User is offline   Inspector Lagomorf 

  • Glory To Motherland!

#54

I will bet my hat he responds with something along the lines of "Just another overpriced Jewforce card with crappy Ching components and shite drivers"
0

User is offline   MusicallyInspired 

  • The Sarien Encounter

#55

Apparently it's not much better (if at all) than the 980 Ti, from what I've read. The general consensus is that it isn't really worth it. But I guess it just dropped, so that could change...?
0

User is offline   Tea Monster 

  • Polymancer

#56

Because ATI, with their crappy drivers and non-compliant OpenCL (which, for an open standard, nobody can make work with anything), sucks.

For all its faults (and there aren't that many), Nvidia just works 95% of the time, which is just wonderful.

TL;DR: If you want to play games and get work done, get Nvidia. If you would rather spend your time on forums bitching about why your card won't work with your game/3D app/graphics app etc., or if you would rather just bitch at people rather than do stuff with your PC, then get ATI.
1

User is offline   MrBlackCat 

#57

Reading threads like this makes the idea of having modern PCs sound absolutely horrible. I will stick with my Chromebook! It just works... :)

MrBlackCat
0

User is offline   Mblackwell 

  • Evil Overlord

#58

 MusicallyInspired, on 17 May 2016 - 08:35 AM, said:

Apparently it's not much better (if at all) than the 980 Ti, from what I've read. The general consensus is that it isn't really worth it. But I guess it just dropped, so that could change...?


It's 30% faster than a 980 Ti on the reference model. Custom versions from vendors will retail for less (below the 980 Ti's MSRP), have better cooling, and likely reach higher clock speeds.
0

User is offline   Forge 

  • Speaker of the Outhouse

#59

If Viper can't hack the BIOS and overclock it by 5 volts, then it's a POS

Posted Image

This post has been edited by Forge: 17 May 2016 - 07:26 PM

2

User is offline   Person of Color 

  • Senior Unpaid Intern at Viceland

#60

 DustFalcon85, on 17 May 2016 - 07:18 AM, said:

Hey POC. What do you think about the new Nvidia GeForce GTX 1080?

http://www.gamespot....mpaign=homepage



 Inspector Lagomorf, on 17 May 2016 - 08:23 AM, said:

I will bet my hat he responds with something along the lines of "Just another overpriced Jewforce card with crappy Ching components and shite drivers"


Dude, I don't care who makes it, if you buy top of the line you're a flaming faggot.

Go ask anyone who bought an R9 290 at launch two and a half years ago what they think of their card.

Chances are they're still fucking in love with it. Go down ONE model and watch the value per dollar explode. It's the same damn product with a slightly defective core.

 Tea Monster, on 17 May 2016 - 02:14 PM, said:

Because ATI, with their crappy drivers and non-compliant OpenCL (which, for an open standard, nobody can make work with anything), sucks.

For all its faults (and there aren't that many), Nvidia just works 95% of the time, which is just wonderful.

TL;DR: If you want to play games and get work done, get Nvidia. If you would rather spend your time on forums bitching about why your card won't work with your game/3D app/graphics app etc., or if you would rather just bitch at people rather than do stuff with your PC, then get ATI.


Or if you want real Linux usability, go ATI.

"B-b-b-but muh Linux OpenGL performance!"

First off, if anyone games in Linux, I've got a bullet with your name on it. Go suck on a Glock.

Second off, if you want to do actual work or watch even the simplest video, you aren't going to get vsync working with an Nvidia card. It's just broken as fuck and they don't care.

I have two SSDs. If I want to play games I can reboot in seconds.
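(For what it's worth, the standard workaround for tearing with the Nvidia blob on Linux is the driver's ForceCompositionPipeline option; a sketch of the relevant xorg.conf fragment, where the identifier and mode string are illustrative:)

```
Section "Screen"
    Identifier "Screen0"
    # The composition pipeline trades a little latency for tear-free
    # output, independent of application-level vsync.
    Option "metamodes" "nvidia-auto-select +0+0 {ForceCompositionPipeline=On}"
EndSection
```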

This post has been edited by Person of Color: 17 May 2016 - 07:40 PM

0



All copyrights and trademarks not owned by Voidpoint, LLC are the sole property of their respective owners. Play Ion Fury! ;) © Voidpoint, LLC
