Old 10-19-2011, 10:32 PM   #1
Psi*
Tech Monkey
 
Psi*'s Avatar
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785
Default Tesla M2090

I have a Tesla C2070 for double precision number crunching. Paid ~$2,200 for it from a legit company, and it is at least 5X faster than my i7-990X OC'ed at 4.5GHz. Nice. I am happy.

But in my formerly rare-to-never ventures onto eBay, I am starting to see these new Tesla M2090s show up, and for well under $2K. This card has 512 CUDA cores versus 448 in the C2070, and my software guys tell me to expect a 20% improvement.
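Quick sanity check on that 20% number, rough arithmetic only (the clock bumps presumably make up what the core count alone doesn't):

```python
# Back-of-the-envelope: how much of the expected speedup comes from core count alone?
c2070_cores = 448
m2090_cores = 512

core_ratio = m2090_cores / c2070_cores
print(f"Core-count ratio: {core_ratio:.3f} (~{(core_ratio - 1) * 100:.0f}% from cores alone)")
# The M2090 also runs its shaders at a higher clock than the C2070,
# which would account for the rest of the quoted ~20% improvement.
```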

Background in a nutshell ... did I forget to mention that I OC the C2070 by 34%? I am using an MSI utility for their NVIDIA cards to handle the OC & fan control. When the C2070 was initially installed (virgin-like), the card's temp was around 83 deg C at idle. Confused, concerned, & frustrated, I pulled the card out & parked it for a few weeks until I could get more info. NVIDIA does offer a utility for setting the core, shader, & memory clocks as well as controlling the on-board fan ... but I could not get it to work at all, or at least to "remember" the settings. I have no recollection of why I decided to even try the MSI utility, but it was trivial to use. So easy that I haven't even tried to find an alternative, much less go back to the NVIDIA utility. And at idle, the fan keeps it at about 58 deg C. When crunching, it speeds up as necessary per the card temp. Just like you would want, what the heck?

Back to the M2090. This thing does not have an integrated fan. If you follow that link above, the pic of the card with a large heat-pipe heatsink is the only current offering. So while I wait for it to show up, I am thinking about what could be done. I cannot find any kind of picture or board layout (without the heatsink) to get an idea before it arrives. Since most people are probably buying these things for >$4K, and probably not with their own money, they may be a bit hesitant to dig into them much. Me, any way to get things done faster, I am all over it.

I will be posting pics of the card when it shows up. I suspect that I will pull the heatsink pretty quickly & am hoping that some GeForce GTX 580 cooling solution would fit right on. Maybe someone has WC-ed a GTX 580 & would sell the stock cooler!

Stay tuned ... pics sometime soon.
__________________
Win 7 64 bit, ASUS P6X58D Premium i7-990 @ 4.5 GHz
24 GB CORSAIR DOMINATOR
NVIDIA Tesla M2090 + NVS 290, Seasonic X750
Swiftech Apogee XT block, Indigo Extreme TIM
Swiftech MCR220-QP Radiator, Eheim 1040 pump, 1/2" ID Tubing
Psi* is offline   Reply With Quote
Old 10-20-2011, 11:45 AM   #2
DarkStarr
Tech Monkey
 
DarkStarr's Avatar
 
Join Date: Apr 2010
Posts: 634
Default

Well, it depends. The board layout COULD possibly be identical to the 580 3GB, just with higher-density RAM on it ... if so, any aftermarket cooler or waterblock SHOULD fit, but there is the possibility that they changed VRM locations or other stuff, so you will have to check it.
__________________
Intel Core i7 2700k (4.8 @ 100x48) Watercooled - 16GB Crucial Ballistix @ 1600 MHz 9-9-9-24 2T
8GB Corsair Vengeance @ 1600 MHz 9-9-9-24 2T - Asus Sabertooth P67 - Asus Radeon 7970 Ref. (Non-GHz)
Heatkiller 79xx Ni-Bl (Soon) @ 1050/1500 - 64GB Crucial M4 SSD - 3x Hitachi 1TB - Corsair TX950W
Azza Genesis 9000B - 2x Samsung SyncMaster S27A550H - Vizio 32" LCD

DarkStarr is offline   Reply With Quote
Old 10-20-2011, 12:08 PM   #3
TheCrimsonStar
Tech Monkey
 
TheCrimsonStar's Avatar
 
Join Date: Apr 2010
Location: Strawberry Plains, TN
Posts: 816
Default

Alright, nub question. What's the difference between a desktop graphics card and a workstation card? Can you use the workstation cards for gaming?
__________________

Intel Core i5-2500k @ 3.3GHz
ASUS Sabertooth P67
Corsair XMS 1600 8GB (2 x 4GB)
XFX Radeon HD 6970
Corsair CMPSU-850TX 850W
Corsair H80 High Performance Liquid CPU Cooler
Windows 7 Professional (x64)
TheCrimsonStar is offline   Reply With Quote
Old 10-20-2011, 01:30 PM   #4
Psi*
Tech Monkey
 
Psi*'s Avatar
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785
Default

@TheCrimsonStar ... not such a nooby sort of question. There is a lot of discussion about this on many web sites. The short answer is that these cards have superior double precision math capability versus the GTX 580, for instance. I run software that uses a great deal of double precision math, & it runs for hours even on this card. With single precision math the GTX 580 runs over the Teslas ... not sure that is still true with the M2090, tho.
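If anyone wants to see what the single vs. double precision distinction actually means, here is a trivial illustration in Python/NumPy (nothing to do with my actual solver, just the idea):

```python
import numpy as np

# float32 (single precision) keeps ~7 significant decimal digits;
# float64 (double precision) keeps ~16. A tiny increment shows the difference.
x32 = np.float32(1.0) + np.float32(1e-8)
x64 = np.float64(1.0) + np.float64(1e-8)

print(x32 == 1.0)  # the 1e-8 is lost entirely in single precision
print(x64 == 1.0)  # double precision still carries it
```

Hours-long simulations accumulate exactly this kind of rounding, which is why the double precision throughput of the Tesla cards matters for my work.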

There are lengthy, geeky posts by people who have written CUDA programs, Perl scripts, etc. that do enable some of the Tesla features on cards like the GTX 580. There have even been some firmware changes, but to the best of my scouring of the net they never quite get to the capability of the actual Tesla cards. I also chose the C2070 over the C2050 for the 6GB of on-board RAM (I wish there were more!)

In playing around with the card (the C2070 does have a dual-DVI video output) I got blazing speed in the normal video benchmarks ... FurMark, FluidMark, Heaven Benchmark, probably some others. Sorry, I did not keep the results, as I was anxious to start number crunching, but I will revisit & post.

Interestingly, it is reported that using the video port slows the math throughput. So I also have the recommended accompanying NVS 300 card, which has only 8 (I think) CUDA cores, for video support. On my ASUS P6X58D m/b the C2070 is the primary card & I cannot find a way of changing this. But this only matters when needing to go into the BIOS ... in other words the BIOS is only accessible via the C2070, & the NVS 300 doesn't output video until Windows 7 actually boots. So as long as there is nothing to change in the BIOS, the monitor stays on the NVS 300.

Wonder what fun it will be to get the machine going with the M2090? The M2090 has no video output at all ... ma-a-aybe it won't even be an issue?!?!
Psi* is offline   Reply With Quote
Old 10-20-2011, 01:42 PM   #5
Rob Williams
Editor-in-Chief
 
Rob Williams's Avatar
 
Join Date: Jan 2005
Location: Atlantic Canada
Posts: 13,350
Default

You overclocked a $2,500 workstation card? That is... just awesome.

But 34%? That seems rather high... the desktop cards can't even OC that much. What were the before and after clocks, if you don't mind me asking?

Quote:
Originally Posted by Psi*
the card's temp was around 83 deg C at idle
Did you ever get to the bottom of this? 83C is too damn high at idle... and at that point I am guessing the card wouldn't even get much hotter. The max temperature is something like 95-100C.

Quote:
Originally Posted by Psi*
I will be posting pics of the card when it shows up. I suspect that I will pull the heatsink pretty quickly & am hoping that some GeForce GTX 580 cooling solution would fit right on.
Does it need to be actively cooled, though? If it's not sold with an active cooler (which blows me away), it might be designed to run cooler somehow. You might just want to test it as is first. The lack of a fan concerns me though... dust build-up will only be amplified.

Quote:
Originally Posted by Psi*
I also have the C2070 versus the C2050 for the 6GB of on board RAM (I wish there was more!)
When crunching away, have you ever monitored the GDDR5 usage using a tool like GPU-Z? I'd be interested in knowing if that 6GB is utilized, and how often.

I'm intrigued by all this though. You have some serious kit right there.

Quote:
Originally Posted by TheCrimsonStar View Post
Alright, nub question. What's the difference between a desktop graphics card and a workstation card? Can you use the workstation cards for gaming?
What Psi* said, but I believe it can get even more complicated than that. Workstation cards will be slower for gaming than regular desktop cards due to their tweaked architecture and also the drivers. Workstation cards are meant for game designers, movie creators and then people like Psi* who can take full advantage of the super-fast mathematical performance and parallelism of the GPU.

One of the reasons workstation cards cost so much also ties into the support. Contrary to the best feature of its cards, NVIDIA gives unparalleled (what a horrible joke) support to its workstation customers. These customers aren't just trying to get a game to work, they're often in business where downtime is a non-option.
__________________
Intel Core i7-3960X, GIGABYTE G1.Assassin 2, Kingston 16GB DDR3-2133, NVIDIA GeForce GTX 770 2GB
Kingston HyperX 3K 240GB SSD (OS, Apps), WD VR 1TB (Games), Corsair 1000HX, Corsair H70 Cooler
Corsair 800D, Dell 2408WFP 24", ASUS Xonar Essence STX, Gentoo (KDE 4.11. 3.12 Kernel)

"Take care to get what you like, or you will be forced to like what you get!" - H.P. Baxxter
<Toad772> I don't always drink alcohol, but when I do, I take it too far.


Rob Williams is offline   Reply With Quote
Old 10-20-2011, 02:15 PM   #6
DarkStarr
Tech Monkey
 
DarkStarr's Avatar
 
Join Date: Apr 2010
Posts: 634
Default

I do suppose you have an epic option to turn the new card into a turbine, lmao. Strap a couple of Deltas (or at least some decent fans) to it and run it like that. It has a massive heatsink on it, so with some fans it would cool extremely well.

DarkStarr is offline   Reply With Quote
Old 10-20-2011, 03:29 PM   #7
Psi*
Tech Monkey
 
Psi*'s Avatar
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785
Default

I have a screen capture with the card running a model. Unfortunately I am a dunce at getting this uploaded to the forum & I have to leave right now ... later
Psi* is offline   Reply With Quote
Old 10-20-2011, 04:14 PM   #8
Optix
Basket Chassis
 
Optix's Avatar
 
Join Date: Dec 2009
Location: New Brunswick, Canada
Posts: 1,684
Default

You can overclock a $2k card but can't upload? Hehehehe.

Just giving you a hard time, Psi*. Some Scythe Gentle Typhoons would be money!
__________________

Intel i5 3570K, MSI Z77Z-GD55, 4x2GB Kingston Genesis 2133mhz, 120GB Mushkin Chronos SSD, 1TB Western Digital Caviar Blue, Intel 210 cache drive, MSI 7850 Power Edition OC, Corsair H100, Silverstone Strider 750w Gold, Killer2100 NIC, Corsair 600T SE White, LG W2242, ROCCAT Kone+, Isku & Kave, 3TB Seagate Backup+, 200GB Western Digital Scorpio Black external drive


Optix is offline   Reply With Quote
Old 10-20-2011, 04:48 PM   #9
RainMotorsports
Partition Master
 
RainMotorsports's Avatar
 
Join Date: Jul 2011
Posts: 359
Default

Quote:
Originally Posted by Rob Williams View Post
But 34%? That seems rather high... the desktop cards can't even OC that much. What were the before and after clocks, if you don't mind me asking?
While it does seem high, I have pulled 22% out of a laptop GPU: a 9800M GS, from 530/1325/800 to 650/1625/900. Dunno if I ever posted it here, but I burned it in for a good 2 hours.
RainMotorsports is offline   Reply With Quote
Old 10-20-2011, 05:59 PM   #10
Psi*
Tech Monkey
 
Psi*'s Avatar
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785
Default

@Optix ... that was your opening & someone was expected to take it!

If the attachment does not fit well on ppl's monitors let me know & I'll split future ones.

So this shows the number cruncher in the background behind GPU-Z, half of the MSI Afterburner display in the top right, CPU-Z in the lower right, & good ole Perf. Mon. left of center ... too much in one pic? There are soooo many views to be had.

Default clock rates for the GPU are: GPU clock ... 574, Memory ... 747, Shader ... 1147. So those OCs are 30%, 24%, & 30% respectively. I thought for sure that some benchmark program reported 34%. I have looked at too many at odd hours, so maybe I did dream that. But I am a liar by only 4%.
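For anyone checking my arithmetic, the implied OC'ed clocks fall straight out of those defaults (rough numbers only):

```python
# Default clocks (MHz) as reported for my C2070, plus the OC percentages
# I quoted above; the implied overclocked clocks fall out directly.
defaults = {"GPU": 574, "Memory": 747, "Shader": 1147}
oc_pct = {"GPU": 30, "Memory": 24, "Shader": 30}

for name, base in defaults.items():
    oc_clock = base * (1 + oc_pct[name] / 100)
    print(f"{name}: {base} MHz -> ~{oc_clock:.0f} MHz (+{oc_pct[name]}%)")
```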

Note this is on the i7-920 @ 4.2GHz. This is the machine that has the slow SATA 3 SSD response; we talked about this in some other thread. I have discovered that this is because of the OC. I'll find that thread & link it to this one. This is sort of a "when is an overclock not an overclock" situation.
Attached: C2070 running1.png (438.9 KB)
Psi* is offline   Reply With Quote
Old 10-20-2011, 10:13 PM   #11
Psi*
Tech Monkey
 
Psi*'s Avatar
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785
Default

A couple of useful comments about the MSI Afterburner plots:
GPU1 is the C2070 & GPU2 is the NVS 300 used for display.

So the noise on the GPU2 usage % is due to screen activity ... program updates & mouse moving around.

The top graph is GPU1 temperature & is in the low 70s C. So, yes, Rob, I figured out a solution to the high idle temps of the C2070. The MSI utility can simply set up a fan-control curve where the higher the GPU temp rises (as the input), the faster the fan spins ... so cool.
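The fan curve itself is nothing fancy ... just a piecewise-linear map from temperature to fan duty. A sketch of the idea (breakpoints made up for illustration, not my actual Afterburner settings):

```python
# Piecewise-linear fan curve: (temperature C, fan duty %) breakpoints,
# linearly interpolated between points, clamped at both ends.
# These breakpoints are invented for illustration only.
CURVE = [(40, 30), (60, 50), (75, 80), (85, 100)]

def fan_duty(temp_c):
    """Return fan duty % for a given GPU temperature."""
    if temp_c <= CURVE[0][0]:
        return CURVE[0][1]
    if temp_c >= CURVE[-1][0]:
        return CURVE[-1][1]
    for (t0, d0), (t1, d1) in zip(CURVE, CURVE[1:]):
        if t0 <= temp_c <= t1:
            # Interpolate between the two adjacent breakpoints
            return d0 + (d1 - d0) * (temp_c - t0) / (t1 - t0)

print(fan_duty(58))  # an idle-ish temperature
print(fan_duty(72))  # under load
```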

Lower is the GPU1 % usage at ~80%. This particular software never maxes CPUs or apparently GPUs to 100%.

The bar graph part of GPU-Z shows the card's memory usage at 3709 MB of the 6 GB total. The number cruncher does give an indication that 75% of the GPU memory is allocated for this problem. Yes, problems sent to the GPU must fit in that memory space. When they don't, the number cruncher just sends them to the i7-920. So on problems that get kicked to the CPU, I'll try to reduce the problem size to make it fit ... sometimes it makes sense, sometimes not.
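That fit-or-fall-back decision is about this simple (names & numbers here are hypothetical illustration, not the real solver's internals):

```python
# Sketch of a "fits on the GPU, or falls back to the CPU" decision.
# GPU_MEM_BYTES matches a 6 GB Tesla card; the reserved fraction is a
# made-up headroom figure for illustration.
GPU_MEM_BYTES = 6 * 1024**3
RESERVED_FRACTION = 0.25

def choose_device(problem_bytes):
    """Pick GPU if the problem fits in usable device memory, else CPU."""
    usable = GPU_MEM_BYTES * (1 - RESERVED_FRACTION)
    return "GPU" if problem_bytes <= usable else "CPU"

print(choose_device(3709 * 1024**2))  # roughly the run in the screenshot
print(choose_device(7 * 1024**3))     # oversized problem gets kicked to the CPU
```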
Psi* is offline   Reply With Quote
Old 11-09-2011, 09:48 AM   #12
Psi*
Tech Monkey
 
Psi*'s Avatar
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785
Default

I'm getting a 2nd M2090! Now I need a dual CPU system board.
Psi* is offline   Reply With Quote
Old 11-11-2011, 01:14 AM   #13
Kougar
Techgage Staff
 
Kougar's Avatar
 
Join Date: Mar 2008
Location: Texas
Posts: 2,653
Default

Plenty of dual SBE boards coming out in a week...
__________________
Core i7 4770k 4.2Ghz
Gigabyte Z87X-UD5H
Crucial Ballistix Sport LP 1600MHz 32GB
EVGA GTX 480 HydroCopper FTW
ASUS Xonar DX
Corsair Neutron GTX 240GB | Windows 7 64-bit
Apogee XT + MCP655 & Thermochill Triple 140mm Radiator
Corsair AX1200 PSU | Cooler Master HAF-X

Kougar is offline   Reply With Quote
Old 11-11-2011, 10:00 AM   #14
Psi*
Tech Monkey
 
Psi*'s Avatar
 
Join Date: Jun 2009
Location: Westport, CT
Posts: 785
Default

THAT is good news. I am all about getting a quality m/b with the current crop of CPUs (more modest price) & then upgrading the CPUs in the future if it makes sense.

As I understand it so far, there will be more PCIe lanes, now Gen 3, as well as a few legacy Gen 2. I forget the mix, but that doesn't matter so much. This next build may be more about how many PCIe slots there are & having a case large enough to accommodate double-wide cards. Having 2 Tesla M2090s is serious computing, but 4 would be perfect & the ultimate. Each of the Teslas of current manufacture is PCIe Gen 2 x16. And there still needs to be at least a single-slot video card for actual video.

Mark me as excited & anxiously awaiting the news & reviews! And hoping there isn't a double dip recession.
Psi* is offline   Reply With Quote
Old 03-30-2012, 12:09 AM   #15
tentonine
Obliviot
 
Join Date: Mar 2012
Posts: 2
Default

I am curious about how the cooling turned out with the M2090s. Did it work out without modification, or did you have to go with a GTX 580 cooler, or at least some fans?
I am thinking of buying an M2090 or another M-series card (there are some really cheap M2050s on eBay), but I'm concerned about this.

Thanks for any comments! I'd be particularly interested in hearing if you found a cooler for a GTX 580 (or some other card) to fit the M2090.
tentonine is offline   Reply With Quote