Old 03-08-2010, 02:48 PM   #1
Rob Williams
Editor-in-Chief
 
Rob Williams's Avatar
 
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,347
NVIDIA Teases with Tessellation Performance on GTX 480

Late last week, NVIDIA posted a video to YouTube that showcases the company's upcoming GeForce GTX 480 graphics card, based on its Fermi architecture. In the video, NVIDIA's Tom Peterson explains the perks the card has, how it will compare to the "competition", and of course, why you will want one. Tom emphasizes superb tessellation performance, while also explaining how tessellation works, with the help of Unigine's Heaven benchmark.



You can read the rest of our post here.
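If tessellation performance sounds like an abstract spec, a bit of rough arithmetic shows why it hammers a GPU's triangle throughput: under uniform subdivision, triangle counts grow roughly with the square of the tessellation factor. The mesh size and factors in the sketch below are made-up numbers for illustration, not anything from NVIDIA or Unigine.

Code:
# Back-of-the-envelope sketch of how a tessellation factor inflates triangle
# counts. Quadratic growth is the usual approximation for uniform subdivision;
# the base mesh size and the factors are invented for illustration.

def tessellated_triangles(base_triangles: int, tess_factor: int) -> int:
    """Approximate triangle count after uniform tessellation of every patch."""
    return base_triangles * tess_factor ** 2

base = 20_000  # hypothetical low-poly source mesh
for factor in (1, 4, 8, 16):
    print(f"tess factor {factor:2d}: ~{tessellated_triangles(base, factor):,} triangles")

# tess factor  1: ~20,000 triangles
# tess factor  4: ~320,000 triangles
# tess factor  8: ~1,280,000 triangles
# tess factor 16: ~5,120,000 triangles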
__________________
Intel Core i7-3960X, GIGABYTE G1.Assassin 2, Kingston 16GB DDR3-2133, NVIDIA GeForce GTX 770 2GB
Kingston HyperX 3K 240GB SSD (OS, Apps), WD VR 1TB (Games), Corsair 1000HX, Corsair H70 Cooler
Corsair 800D, Dell 2408WFP 24", ASUS Xonar Essence STX, Gentoo (KDE 4.11, 3.12 Kernel)

"Take care to get what you like, or you will be forced to like what you get!" - H.P. Baxxter
<Toad772> I don't always drink alcohol, but when I do, I take it too far.


Rob Williams is offline   Reply With Quote
Old 03-08-2010, 05:59 PM   #2
Relayer
E.M.I.
 
Join Date: Mar 2008
Location: New Zealand
Posts: 67
Default

While I do find it interesting that Fermi seems to handle tessellation much better than Cypress (being able to draw triangles on screen faster is an important spec), I also find it notable that, beyond that, Fermi doesn't appear to perform any better than Cypress.

Only 2 minutes of one benchmark isn't much to go by, though. Why no more performance figures? Seems strange. If they had more positive figures, you'd think they would show them, wouldn't you?
Relayer is offline   Reply With Quote
Old 03-08-2010, 07:45 PM   #3
b1lk1
Tech Monkey
 
b1lk1's Avatar
 
Join Date: Mar 2006
Location: Ontario
Posts: 821
Default

It is classic Nvidia to release as little as possible before a major launch. They want all that suspense, and as I state in most threads like this, Nvidia has little to prove to the Green army, who are just salivating at the chance to have these cards. They are not a tough sell, no matter the performance, and as long as they at least compete with the HD5870 on a base level, they will sell like wildfire.
__________________
Oldschool PC:
Intel E6300/Noctua NH-9U w/dual 92mm Panaflos
XFX 680i LT
Patriot PC2-6400 2X2GB LL
2 X BFG 8800GTX in SLI
Enermax Revolution 620W PSU
ASUS 24X DVD/Seagate 250GB SATA 2/Thermaltake A90/Windows 7 Ultimate 64
b1lk1 is offline   Reply With Quote
Old 03-08-2010, 09:01 PM   #4
Rob Williams
Editor-in-Chief
 
Rob Williams's Avatar
 
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,347
Default

Quote:
Originally Posted by b1lk1 View Post
They are not a tough sell, no matter the performance and as long as they atleast compete with the HD5870 on a base level, they will sell like wildfire.
But for the projected price of $200 more? I'd have a hard time believing that they'll sell as easily as you say they will if that turns out to be the case. I could believe it if it were even just $100 higher, but $200 is a lot harder to stomach. Plus, we're not even talking power consumption and temperatures. For all we know, ATI is going to clean house where those two things are concerned.

It's going to be an interesting month, that's for sure.
__________________
Intel Core i7-3960X, GIGABYTE G1.Assassin 2, Kingston 16GB DDR3-2133, NVIDIA GeForce GTX 770 2GB
Kingston HyperX 3K 240GB SSD (OS, Apps), WD VR 1TB (Games), Corsair 1000HX, Corsair H70 Cooler
Corsair 800D, Dell 2408WFP 24", ASUS Xonar Essence STX, Gentoo (KDE 4.11, 3.12 Kernel)

"Take care to get what you like, or you will be forced to like what you get!" - H.P. Baxxter
<Toad772> I don't always drink alcohol, but when I do, I take it too far.


Rob Williams is offline   Reply With Quote
Old 03-09-2010, 07:54 AM   #5
crowTrobot
E.M.I.
 
crowTrobot's Avatar
 
Join Date: Feb 2010
Location: Toronto, ON, Canada
Posts: 99
Default

A lot of other factors are also in play, like how it compares to the older GTX 200-series models, and how good it is at Folding, CUDA apps and 3D Vision, three things that are exclusively NVIDIA right now. I don't think NVIDIA users are so concerned about power consumption, especially those who fold.
crowTrobot is offline   Reply With Quote
Old 03-09-2010, 11:41 AM   #6
Tharic-Nar
Techgage Staff
 
Tharic-Nar's Avatar
 
Join Date: Nov 2009
Location: UK
Posts: 1,166
Default

Power consumption is a problem because the PCI-E Graphics (PEG) standard says that a single device cannot draw more than 300 watts. That's why the ATI 5970 was deemed an under-performer: it hit the 300-watt wall, but they encouraged overclocking. That's the difference: they can sell you a product that meets the 300-watt criterion and make it very easy to overclock (and consume more than 300 watts), but that is also the problem, since not everyone will. They could of course let the card consume more than 300 watts, but then they couldn't sell the card as a PCI-E compatible card, since it would be in breach of the standard. Also, not everyone folds, nor wants their card consuming 300 watts while in use with the fan screeching away.

I know, it's doubtful that Fermi will pull 300 watts (or at least I hope so), but it hardly screams efficient when it's doing the same work as its competitor with a 15-25% increase in power. If power weren't an issue, we'd be using 3,000-watt computers by now... and in one case, you could (EVGA's dual-socket board fully kitted out).
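For what it's worth, the 300-watt ceiling is just the sum of the usual PCI-E power budgets (75W from the slot, 75W from a 6-pin connector, 150W from an 8-pin), and the efficiency point is easy to put into rough numbers. The card wattages in the sketch below are invented purely for illustration, not measurements.

Code:
# Quick arithmetic behind the "300 watt wall" and the perf-per-watt point.
# Connector budgets follow the commonly cited PCI-E figures; the card
# wattages are invented purely for illustration.

PCIE_SLOT_W = 75    # power available through the slot itself
SIX_PIN_W = 75      # one 6-pin auxiliary connector
EIGHT_PIN_W = 150   # one 8-pin auxiliary connector

max_board_power = PCIE_SLOT_W + SIX_PIN_W + EIGHT_PIN_W
print(f"Slot + 6-pin + 8-pin budget: {max_board_power} W")  # 300 W

# Two hypothetical cards doing the same work (same frame rate):
competitor_w = 190            # made-up "Cypress-like" draw
power_hungry_w = 190 * 1.20   # 20% more power for the same output

deficit = 1 - competitor_w / power_hungry_w
print(f"Efficiency deficit at equal performance: {deficit:.0%}")  # ~17%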
Tharic-Nar is offline   Reply With Quote
Old 03-09-2010, 02:12 PM   #7
Rob Williams
Editor-in-Chief
 
Rob Williams's Avatar
 
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,347
Default

Quote:
Originally Posted by crowTrobot
I don't think nvidia users are concerned so much about power consumption esp those who fold.
That's true, but even so, the number of Folders is going to be awfully low compared to those who are purchasing the cards for pure gaming. As for 3D Vision, I don't really think that's a deciding factor for most people right now, given very few even have a 120Hz TV/monitor.

Quote:
Originally Posted by Tharic-Nar
They could of course let the card consume more than 300 Watt's, but then they couldn't sell the card as a PCI-E compatible card, since it would be in breach of the standard.
Would NVIDIA even be allowed to sell the card if it consumed more than 300W? From my understanding, the PCI-E implementors forum or whatever it is would step in and disallow it. I'm sure they have a 300W limit for a reason (although to be honest, I have no idea why it matters, given the power is mostly handled by the PSU, not the PCI-E slot).
__________________
Intel Core i7-3960X, GIGABYTE G1.Assassin 2, Kingston 16GB DDR3-2133, NVIDIA GeForce GTX 770 2GB
Kingston HyperX 3K 240GB SSD (OS, Apps), WD VR 1TB (Games), Corsair 1000HX, Corsair H70 Cooler
Corsair 800D, Dell 2408WFP 24", ASUS Xonar Essence STX, Gentoo (KDE 4.11, 3.12 Kernel)

"Take care to get what you like, or you will be forced to like what you get!" - H.P. Baxxter
<Toad772> I don't always drink alcohol, but when I do, I take it too far.


Rob Williams is offline   Reply With Quote
Old 03-09-2010, 03:42 PM   #8
Tharic-Nar
Techgage Staff
 
Tharic-Nar's Avatar
 
Join Date: Nov 2009
Location: UK
Posts: 1,166
Default

For the most part, I believe the limits are arbitrary; it's like putting transfer speed limits on flash memory (CompactFlash and such, Class III memory, etc.). There are probably reasons, largely relating to manufacturers, since it doesn't really affect end users.

As for selling a card over 300 watts, they would (probably) only stop the manufacturer if it claimed the card was PCI-E compatible; that is, it could not state in the specifications or on the box that the card used PCI-E, since that would be false advertising. They could probably get away with selling the cards, but it becomes a gray area, since the cards would in effect use the PCI-E interface to run but just couldn't declare it, and since the box wouldn't state what connection the card does use, no one would buy it unless they already knew they could just plug it in anyway.

Of course, they could just sell them as PCI-E 3.0 (draft) devices, like with 802.11n (draft), since the PCI-E 3.0 standard has not been finalized (but is supposedly due soon-ish). Last I checked, it allowed for up to 300 watts over the interface, with no mention of external power, so I'd assume it would be safe (but of course, this is speculation as always).

Last edited by Tharic-Nar; 03-09-2010 at 03:47 PM.
Tharic-Nar is offline   Reply With Quote
Old 03-09-2010, 07:02 PM   #9
b1lk1
Tech Monkey
 
b1lk1's Avatar
 
Join Date: Mar 2006
Location: Ontario
Posts: 821
Default

Irrational fanboyism will dictate the sales of these cards if they are truly $200+ overpriced. If they end up closer to the forecast price, then enthusiasts will join the fray. Any way you look at it, these cards are gonna sell, because there is a giant market just waiting for them, no matter the cost/performance.
__________________
Oldschool PC:
Intel E6300/Noctua NH-9U w/dual 92mm Panaflos
XFX 680i LT
Patriot PC2-6400 2X2GB LL
2 X BFG 8800GTX in SLI
Enermax Revolution 620W PSU
ASUS 24X DVD/Seagate 250GB SATA 2/Thermaltake A90/Windows 7 Ultimate 64
b1lk1 is offline   Reply With Quote
Old 03-10-2010, 09:18 AM   #10
Envy
E.M.I.
 
Envy's Avatar
 
Join Date: Mar 2010
Posts: 52
Default

Quote:
Originally Posted by b1lk1 View Post
It is classic Nvidia to release as little as possible before a major launch. They want all that suspense and as I state in most threads like this, Nvidia has little to prove to the Green army who are just salivating at the chance to have these cards. They are not a tough sell, no matter the performance and as long as they atleast compete with the HD5870 on a base level, they will sell like wildfire.
No offense, but you're an obvious Nvidia fanboy. Most people nowadays either get suggestions from professionals who know their stuff, or they ARE professionals who know their stuff. What I mean by this is that people aren't going to buy the 400 series just because it's Nvidia. They're going to go and buy from AMD/ATI. Why? First of all, it's cheaper. Second of all, nowadays AMD/ATI has about the same performance (a bit better or a bit worse) for a better price. Except for maybe the 5970, which is like $700 now, right? But it still blows the 295 out of the water.
Envy is offline   Reply With Quote
Old 03-10-2010, 12:04 PM   #11
Doomsday
Tech Junkie
 
Doomsday's Avatar
 
Join Date: Nov 2008
Location: KHI, PAK
Posts: 1,559
Default

I will take power consumption and temps into consideration! And IF they fail in comparison to ATI, I'll go for an ATI card, or the GTX 470+!! lol!
__________________
PSU: Corsair AX850 - Case: Cooler Master HAF X - CPU: Core i7-2600k - Cooler: Cooler Master V6 GT - Motherboard: Asus Z68 Maximus IV Extreme Z - Memory: Corsair Vengeance 8 GB-1600MHz - GPU: AMD MSI R6970 Lightning - HDD: WD Caviar Black 1TB, Seagate 2TB Barracuda Green - SSD: Intel 520 Series 120GB - K/B: Razer Lycosa Mirror - Mouse: Logitech G700 - MouseMat: Steel Series 4HD - LCD: Asus VG278H 27" - Speakers: Creative Inspire M4500 4.1 - Headset: Logitech G35 7.1



"Do not look at a man's prayers nor his fasts, rather, measure him by how well he deals with others, the compassion he shows his fellow man, his wisdom and his integrity" - Umar Ibn Al-Khattab


Doomsday is offline   Reply With Quote
Old 03-10-2010, 01:04 PM   #12
Rob Williams
Editor-in-Chief
 
Rob Williams's Avatar
 
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,347
Default

Quote:
Originally Posted by Tharic-Nar
With the selling of a card over 300 Watt, they would (probably) only stop the manufacturer if they stated that the card was PCI-E compatible, in that, they could not mark in the specifications or box that it used PCI-E, since it would be false advertising.
At that point, I just can't see a release happening, unless it had to happen. I couldn't imagine looking at a specs page for a GPU and not even seeing it list the interface. The 3.0 spec is one idea, but I'm not sure that would work either. I'm sure there is a lot more to the 3.0 spec than simply increased power allowances.

Quote:
Originally Posted by Envy
No offense but you're an obvious Nvidia fanboy.
b1lk1 is an ATI fanboy, actually ;-)

I tend to agree with you on this, because I just can't see people, no matter how devoted to NVIDIA they are, going out to spend 50% more on a GPU that performs like 10% better than another. Yes, there are a LOT of NVIDIA fanboys out there, but I'd have to imagine that they pale in comparison to the number of regular consumers, who have no brand preference.
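Put into rough numbers (the prices and percentages below are just the hypotheticals from this discussion, not benchmark results), the value gap is hard to ignore:

Code:
# Worked example of the "50% more money for 10% more performance" point.
# All figures are hypothetical and only mirror the percentages above.

card_a_price, card_a_perf = 400, 100   # baseline: $400, 100 fps (say)
card_b_price, card_b_perf = 600, 110   # 50% pricier, 10% faster

value_a = card_a_perf / card_a_price   # 0.250 fps per dollar
value_b = card_b_perf / card_b_price   # ~0.183 fps per dollar
print(f"Card B delivers {value_b / value_a:.0%} of Card A's value per dollar")  # ~73%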
__________________
Intel Core i7-3960X, GIGABYTE G1.Assassin 2, Kingston 16GB DDR3-2133, NVIDIA GeForce GTX 770 2GB
Kingston HyperX 3K 240GB SSD (OS, Apps), WD VR 1TB (Games), Corsair 1000HX, Corsair H70 Cooler
Corsair 800D, Dell 2408WFP 24", ASUS Xonar Essence STX, Gentoo (KDE 4.11, 3.12 Kernel)

"Take care to get what you like, or you will be forced to like what you get!" - H.P. Baxxter
<Toad772> I don't always drink alcohol, but when I do, I take it too far.


Rob Williams is offline   Reply With Quote
Old 03-10-2010, 05:06 PM   #13
Envy
E.M.I.
 
Envy's Avatar
 
Join Date: Mar 2010
Posts: 52
Default

Quote:
Originally Posted by Rob Williams View Post
At that point, I just can't see a release happening, unless it had to happen. I couldn't imagine looking at a specs page for a GPU and not even see it list the interface. The 3.0 spec is one idea, but I'm not sure that would work either. I'm sure there is a lot more to the 3.0 spec than simply increased power allowances.



b1lk1 is an ATI fanboy, actually ;-)

I tend to agree with you on this, because I just can't see people, no matter how devoted to NVIDIA they are, going out to spend 50% more on a GPU that performs like 10% better than another. Yes, there are a LOT of NVIDIA fanboys out there, but I'd have to imagine that they pale in comparison to the number of regular consumers, who have no brand preference.
Yeah, I mean, sure, Nvidia makes great cards, but that doesn't mean I have to devote myself to them. I prefer bang for the buck. It's like brand-name clothing today. Who cares so much about what you're wearing?
Envy is offline   Reply With Quote
Old 03-10-2010, 05:15 PM   #14
Rob Williams
Editor-in-Chief
 
Rob Williams's Avatar
 
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,347
Default

That's a fair comparison, but I'd argue that the brand of clothing can actually affect the design, the material, the overall quality and so forth. In that regard, you'd actually be paying for an improvement. That wouldn't be the case with these GPUs, though, if you are knowingly paying more for less.
__________________
Intel Core i7-3960X, GIGABYTE G1.Assassin 2, Kingston 16GB DDR3-2133, NVIDIA GeForce GTX 770 2GB
Kingston HyperX 3K 240GB SSD (OS, Apps), WD VR 1TB (Games), Corsair 1000HX, Corsair H70 Cooler
Corsair 800D, Dell 2408WFP 24", ASUS Xonar Essence STX, Gentoo (KDE 4.11, 3.12 Kernel)

"Take care to get what you like, or you will be forced to like what you get!" - H.P. Baxxter
<Toad772> I don't always drink alcohol, but when I do, I take it too far.


Rob Williams is offline   Reply With Quote
Old 03-17-2010, 10:10 AM   #15
Kougar
Techgage Staff
 
Kougar's Avatar
 
Join Date: Mar 2008
Location: Texas
Posts: 2,653
Default

I would imagine the 300-watt figure is intended to be a safety limit as much as a design standard to aim for. Entire quad-core computers run on 300W; you might see where I'm coming from if you imagine piping 300W through eight yellow wires plus a few of the delicate gold contacts in the PCI Express slot itself... then consider some company exceeding that limit. It's also a question of the power supply itself: the power draw on each PCIe rail is supposed to stay within a set limit so PSU manufacturers can adhere to their own specifications when designing their units.

It wouldn't matter for a company like PC Power & Cooling that uses a single, massive 12-volt rail, but that's against Intel's ATX specification. Most PSU manufacturers split the 12-volt power amongst 4-6 rails, with each PCIe rail artificially limited to some number I don't recall. If a GPU pulled less power from one rail and tried to make up for it by pulling a bit more from another capped rail, it would hit the limit and the card would just crash/error, even though the PSU had the power to spare.
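A minimal sketch of that rail-cap scenario, assuming a 20A per-rail over-current limit and an invented card draw (real PSUs and cards will differ):

Code:
# Why a multi-rail PSU can trip even with total capacity to spare.
# The per-rail limit and the GPU's draw are assumed numbers for illustration.

RAIL_LIMIT_A = 20.0                          # assumed per-rail over-current limit
RAIL_VOLTAGE = 12.0
rail_limit_w = RAIL_LIMIT_A * RAIL_VOLTAGE   # 240 W per rail

num_rails = 4
total_capacity_w = num_rails * rail_limit_w  # 960 W of 12 V capacity overall
gpu_draw_w = 280                             # card fed from a single capped rail

print(f"Total 12 V capacity: {total_capacity_w:.0f} W")
print(f"Per-rail cap: {rail_limit_w:.0f} W, GPU wants {gpu_draw_w} W")

if gpu_draw_w > rail_limit_w:
    # Over-current protection on that one rail shuts the card down,
    # even though the PSU as a whole has plenty of headroom left.
    print("OCP trips: the card crashes/errors despite spare total capacity")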
__________________
Core i7 4770K 4.2GHz
Gigabyte Z87X-UD5H
Crucial Ballistix Sport LP 1600MHz 32GB
EVGA GTX 480 HydroCopper FTW
ASUS Xonar DX
Corsair Neutron GTX 240GB | Windows 7 64-bit
Apogee XT + MCP655 & Thermochill Triple 140mm Radiator
Corsair AX1200 PSU | Cooler Master HAF-X

Kougar is offline   Reply With Quote