Thread: Zalman VF3000F GTX 570/580 GPU Cooler Review

Topic Review (Newest First)
05-27-2013 10:13 PM
Rob Williams
Quote:
Originally Posted by spixel View Post
Our Fanmates are the same. What I meant is that the medium should have been tested at 30 percent. Like you mentioned, turning the fan controller past 50 percent does not increase fan speed, so it was a bit pointless putting 50 as medium; instead, that should have been high. There are three distinct speeds on the Fanmate, but they only operate over half a turn of the control knob.
It's hard to judge what a true medium is when the sound and feel of the fan differ so little at >50% speeds. I do agree somewhat, however: 50% might seem like a medium, but if a fan doesn't kick in until well past 0% on the dial, then that could be adjusted.

Quote:
Originally Posted by spixel View Post
I think one reason your temps might have been high is poor case airflow. The Zalman dumps all of its heat into the case, and it doesn't look like you had an exhaust fan in your test setup because of the Corsair CPU cooler. There is also no direct fresh air going to the GPU, so it will keep recycling the hot air it blows out.
That's a good observation, and I agree. But with the advent of these AIO coolers, I can see many running into this "problem". I wouldn't mind a chassis design where I could mount a fan off the door to point towards the GPU. I have so much stuff crammed in my chassis, it heats up something awful. Almost to the point where I'm considering a better cooling solution.

For what it's worth, I'm still running this Zalman cooler, and it's been holding up well. Idling at 39C. I moved apartments since we last spoke, and this one happens to be a wee bit warmer.

The test bench with three monitors might be contributing to that a bit though ;-)
05-27-2013 02:29 PM
spixel
Quote:
Originally Posted by Rob Williams View Post
Hi spixel:

I admit I am still left a little confused. You state the RPM doesn't increase after 50%, but the medium and high temperatures differ... I'm not sure what's going on there, since mine remained the same. The Fanmate I have doesn't state low/medium/high; it's just a dial with the word "up" and an arrow showing which way to turn it. I judged the 50% mark based on how much I had to turn it from its lowest setting to its highest.

For what it's worth, I'm still using the cooler in the same PC it was tested in. Sitting idle at 35C at the moment... not too bad given the rather high ambient temp in here.

/me kicks his A/C.
Epic bump again!

Our Fanmates are the same. What I meant is that the medium should have been tested at 30 percent. Like you mentioned, turning the fan controller past 50 percent does not increase fan speed, so it was a bit pointless putting 50 as medium; instead, that should have been high. There are three distinct speeds on the Fanmate, but they only operate over half a turn of the control knob.
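As a toy illustration of the dial behaviour described above, the sketch below maps a dial position to one of three discrete speed steps. The breakpoints (roughly 0-15% low, 15-35% medium, constant from about 35% upward) are assumptions invented for the example, not measured Fanmate values; the only claims taken from the thread are that three distinct speeds sit within the first half-turn and that the RPM stops rising past about 50%.

```python
# Illustrative sketch only: an assumed mapping from dial position (0.0-1.0) to
# the three speed steps described above. The breakpoints are made-up values for
# the example; the thread only claims that all three steps live in the first
# half-turn and that nothing changes past ~50%.

def fanmate_speed(dial: float) -> str:
    """Return the assumed speed step for a dial position between 0.0 and 1.0."""
    if dial < 0.15:
        return "low"
    if dial < 0.35:
        return "medium"
    return "high"  # everything from ~35% up, including 50% and 100%, is one step

# Under this model, testing at 50% and 100% exercises the same "high" step,
# which is why testing the medium setting at ~30% would be more representative.
for position in (0.10, 0.30, 0.50, 1.00):
    print(f"{position:.0%} -> {fanmate_speed(position)}")
```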

I think one reason your temps might have been high is poor case airflow. The Zalman dumps all of its heat into the case, and it doesn't look like you had an exhaust fan in your test setup because of the Corsair CPU cooler. There is also no direct fresh air going to the GPU, so it will keep recycling the hot air it blows out.
09-21-2012 03:09 AM
Rob Williams
Hi spixel:

I admit I am still left a little confused. You state the RPM doesn't increase after 50%, but the medium and high temperatures differ... I'm not sure what's going on there, since mine remained the same. The Fanmate I have doesn't state low/medium/high; it's just a dial with the word "up" and an arrow showing which way to turn it. I judged the 50% mark based on how much I had to turn it from its lowest setting to its highest.

For what it's worth, I'm still using the cooler in the same PC it was tested in. Sitting idle at 35C at the moment... not too bad given the rather high ambient temp in here.

/me kicks his A/C.
09-20-2012 11:05 PM
spixel
Sorry for bumping such an old thread, but I just had to comment. I never understand how so many review sites out there seem to get things wrong. I bought one of these coolers and it dropped my 570 on load from 83 degrees to 47 degrees on the lowest fan speed, 44 @ medium fan and 40 @ high fan.

You're also not the only reviewer who got a bit confused by the low/medium/high settings of the cooler. I noticed right away that after 50 percent the RPM does not increase; however, there are still three clear low/medium/high settings with the Fanmate. You tested low/high/high in your review. I'm not sure why your results are so far off from mine, though; maybe it was not mounted properly.
08-17-2011 04:44 PM
Optix
Nerd rage tech debates are never over. Dis is teh interwebz!
08-17-2011 10:15 AM
RainMotorsports
Quote:
Originally Posted by Relayer View Post
Your overall complaint currently applies to both brands. Both companies' higher-end cards use too much power and run too hot, IMO. The HD 6900s aren't exactly quiet or cool. I think it's because we are still on 40nm when we should have been at 32nm by now. Soon, when 28nm arrives, I think we'll see lower-powered, cooler-running cards that give better performance.
I was so sure this was so over lol.
08-17-2011 09:42 AM
Relayer
Quote:
Originally Posted by 2Tired2Tango View Post
I'm sorry if this is a foregone conclusion sort of thing... But I don't get it...

What's with NVidia? Their chips are furnaces, they suck more power than some CPUs, and as far as I can tell, they're nothing to write home about...

Case in point... I recently had a system come to me with power problems. In the process of trying to diagnose the screwup I pulled out the NVidia card and stuck in a $39.00 ATI card, and the system straightened right out... But it didn't end there; suddenly it was playing 1080p AVI tests perfectly, and running DPC latency tests showed far lower latency than with the NVidia card... Games worked about the same... A $39.00 passively cooled ATI Radeon 4500 beat the pants off a $250 NVidia card with all its fan noise, external power, etc...

And now we have to use external coolers on them???

So why is NVidia such a big contender?
I'm sorry but I just don't understand it....

http://www.thesycon.de/deu/latency_check.shtml
Your overall complaint currently applies to both brands. Both companies' higher-end cards use too much power and run too hot, IMO. The HD 6900s aren't exactly quiet or cool. I think it's because we are still on 40nm when we should have been at 32nm by now. Soon, when 28nm arrives, I think we'll see lower-powered, cooler-running cards that give better performance.
08-11-2011 09:56 AM
2Tired2Tango
Quote:
Originally Posted by RainMotorsports View Post
I just have to say: a 9 dollar USB cord! That's expensive, lol. I hope it's USB 3.0. I know, I know, that wasn't the point. Ever since Monster invited Gizmodo to do a Monoprice vs Monster HDMI test in their own labs, I have been a Monoprice customer. Not that I ever bought anything from Monster, but the other options are not cheap either for HDMI. Can't speak for all the Chinese junk they have, but the cables are outstanding. I drove to 3 stores to find out a SATA cable was 20 bucks, and left; even with the gas burnt, I still saved 10 bucks going home and ordering it off Monoprice.
My point exactly....

I see this stuff all the time... Audiophiles who claim they can hear --actually hear-- bit jittering in a latched sound chip... measured in nanoseconds... Gamers who argue high frame rates on 60 Hz monitors... hot rodders who debate the merits of different paints in their race times... People, this is called "lost in the minutiae!" and more often than not it does little more than empty your wallet, which is the manufacturer's intent.

The days when the goal was to provide a quality product to fill a consumer need are over. These days it's all about finding new ways to part you from your money... and most often with expensive crap that, by and large, does nothing new or better.

Coolers are a good case in point... Zalman seems to design heatsinks artistically, looking like some giant alien flower growing in your computer, rather than on any scientific basis... they almost all ignore the primary rules of good cooling (a rough back-of-the-envelope sketch follows the list):
1) You need MASS to pull heat off the device.
2) You need surface area to dissipate the heat.
3) You need airflow to take the heat away.
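For anyone who wants rough numbers behind those three rules, here is a minimal lumped-model sketch. Every figure in it (a 250 W card, a 0.12 °C/W sink-to-air resistance, the heatsink mass) is an assumption chosen purely for illustration, not a measurement; the point is only that surface area and airflow set the steady-state temperature, while mass mainly sets how quickly you get there.

```python
# Minimal lumped thermal model (illustrative, assumed numbers only).
# Steady-state rise above ambient: dT = P * R, where R (deg C per watt) shrinks
# as surface area and airflow improve. Thermal mass (m * c) does not change the
# steady state; it sets the time constant tau = m * c * R, i.e. how quickly the
# heatsink warms up or cools down.

import math

power_w        = 250.0   # assumed GPU heat output
ambient_c      = 25.0    # assumed room temperature
r_sink_c_per_w = 0.12    # assumed heatsink-to-air thermal resistance
mass_kg        = 0.6     # assumed heatsink mass
c_j_per_kg_k   = 900.0   # specific heat of aluminium, roughly

steady_c = ambient_c + power_w * r_sink_c_per_w     # ~55 C in this example
tau_s    = mass_kg * c_j_per_kg_k * r_sink_c_per_w  # ~65 s time constant

def temp_after(t_s: float) -> float:
    """Heatsink temperature t_s seconds after load starts (first-order model)."""
    return ambient_c + (steady_c - ambient_c) * (1.0 - math.exp(-t_s / tau_s))

print(f"steady state: {steady_c:.1f} C, time constant: {tau_s:.0f} s")
for t in (10, 60, 300):
    print(f"after {t:>3} s: {temp_after(t):.1f} C")
```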

A long time ago, back when Tom's Hardware had just released their "What happens if you take off the heatsink" video and AMD was exposed as a fire hazard, I got into this debate and described a cooling solution for AMD chips that ended up being pretty much what they did with their x64 line... big plate on the chip, lots of mass, tall fins, and a decent fan... All very basic, but all based on sound science.

Now NVidia is doing the AMD thing... way too much power dissipation, tiny heat exchange surfaces, inadequate ventilation etc... and, as I pointed out from the start, their performance doesn't appear to be anything to brag about... especially when you consider there's probably very little difference in performance as the price climbs exponentially...

At the risk of over-driving this point: the guy whose computer I fixed with the $35.00 card went out and bought a far more expensive ATI card and got the same result... It's a common --and painfully stupid-- consumer mistake to think that "expensive equals better", when most often it merely means "expensive".

The one review I'd love to see, but which I'm betting no reviewer would ever do, is the "How much better is it?" comparison... $35.00 ATI vs $150 ATI... $40 NVidia vs $250 NVidia... just how much more performance does your money get you?
08-11-2011 04:39 AM
RainMotorsports
I just have to say: a 9 dollar USB cord! That's expensive, lol. I hope it's USB 3.0. I know, I know, that wasn't the point. Ever since Monster invited Gizmodo to do a Monoprice vs Monster HDMI test in their own labs, I have been a Monoprice customer. Not that I ever bought anything from Monster, but the other options are not cheap either for HDMI. Can't speak for all the Chinese junk they have, but the cables are outstanding. I drove to 3 stores to find out a SATA cable was 20 bucks, and left; even with the gas burnt, I still saved 10 bucks going home and ordering it off Monoprice.
08-10-2011 09:20 PM
2Tired2Tango
Quote:
Originally Posted by Kayden View Post
My mistake won't happen again, t2t2. I only went off of what you posted, and the "dude" reference wasn't meant to be demeaning; it's just how I choose to speak, no insult intended. I just won't comment on half-written explanations; you obviously did more than what you said, but how was I supposed to know? I let my passion for getting the right thing done get ahead of getting more info, but I seriously believed you had not done more than that because there was no indication that you had. I just won't let it happen again; sorry I didn't go there before now and that you took it the way you did. Again, my mistake; I won't let it happen again.
No worries... I should probably not have assumed you would understand I'd done the required tests in a semi-credible manner. I'll be more concise next time.
08-10-2011 08:32 PM
Kayden
My mistake won't happen again, t2t2. I only went off of what you posted, and the "dude" reference wasn't meant to be demeaning; it's just how I choose to speak, no insult intended. I just won't comment on half-written explanations; you obviously did more than what you said, but how was I supposed to know? I let my passion for getting the right thing done get ahead of getting more info, but I seriously believed you had not done more than that because there was no indication that you had. I just won't let it happen again; sorry I didn't go there before now and that you took it the way you did. Again, my mistake; I won't let it happen again.
08-10-2011 07:21 PM
2Tired2Tango
Quote:
Originally Posted by Kayden View Post
If there is a problem with power and a lower-power card resolves the problem, then it's a problem with the PSU, and if putting in another one doesn't fix it, then you now have two problems to sort, the second being a driver issue. I know there is one because you had to remove the Nvidia driver and then install the ATI drivers, and then it started to work just fine. What I don't understand is, if you saw the ATI card work better, why didn't you reinstall the Nvidia card and install its drivers again, to see if there was a performance difference from when it was originally in there?
We did... how do you think I arrived at my conclusions... Trust me, I'm not so dumb as to compare a malfunctioning system to a good one... Even I, with my mere 30 years of experience, know better than that. I've been fixing this stuff longer than half of my customers have been alive...


Quote:
I am sorry if it caused an upset, dude, but I only had a certain amount of info to work with and made the best logical conclusions I could; I just didn't agree with your troubleshooting process, not with whether you think Nvidia or ATI is better.
Actually it works like this... You assumed I was stupid and talked down to me... which is going to bother me every time. ESPECIALLY when people start calling me "Dude".
08-10-2011 07:18 PM
2Tired2Tango
Quote:
Originally Posted by marfig View Post
I see your point.

I think the single most important factor is screen resolution.
The test monitor on my bench is 1080p... Not sure what he's got at home.

Quote:
So what then? Well, most of the problem is a certain "moar is better" culture that surrounds the gaming community, coupled with lots of misinformation and an almost pathological vulnerability to the placebo effect; "I swear I play better at 120 fps than at 80 fps!" (a physiological impossibility).
Especially on a 60 Hz screen...

Sorry guys, I don't care what frame rate you're composing at... you're still watching 60 Hz.
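To make the arithmetic behind that explicit: a 60 Hz panel refreshes at a fixed interval no matter how fast the GPU renders, so at 120 fps roughly two frames are produced per refresh and the surplus is either dropped or shown only as partial tears.

```latex
\[
t_{\text{refresh}} = \frac{1}{60\,\text{Hz}} \approx 16.7\,\text{ms},
\qquad
t_{\text{frame}} = \frac{1}{120\,\text{fps}} \approx 8.3\,\text{ms}
\]
```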

Quote:
I won't deny anyone the pleasure of spending money on something really cool and powerful. Pride is a good thing. But then, for some reason, people try to come up with all sorts of justifications for this behavior. They shouldn't! It's perfectly fine. But they do anyway. And it's those justifications that create myths and misinformation.
I don't much care how people waste their money either... it's the "being misled" part that sticks in my craw... I once had a guy ask me --are you ready for this-- which power cord would sound better on his stereo... Yes, I said Power Cord... ROFL...

Quote:
But up the screen resolution (and the type of games one plays, or the work they do on the computer) and things start to become more justified. At 1920x1080, certain games will simply not run well on low-end cards. Gamers want their games to run at the top settings because these do affect the quality (the beauty) of what one sees on the screen. At that resolution they can't do this with modern games on a $100 card, and some will still have some trouble with a $200 card.
Don't get me wrong, I'm not saying there isn't a difference... I'm just wondering if it's anywhere near as big a difference as is claimed...
08-10-2011 06:15 PM
Kayden
Quote:
Originally Posted by 2Tired2Tango View Post
What part of...

Quote:
Case in point... I recently had a system come to me with power problems
Did you not understand?

Actually, I'm talking about an entire computer that straightened out and flew right with a much cheaper, much less power-hungry, cooler-running video card.

Even patching in a higher rated supply didn't bring that level of improvement.
I understood that and ran with what you said. I wasn't trying to get you upset, so just calm down for a sec so I can bring you into the fold of where my head is at.

If there is a problem with power and a lower-power card resolves the problem, then it's a problem with the PSU, and if putting in another one doesn't fix it, then you now have two problems to sort, the second being a driver issue. I know there is one because you had to remove the Nvidia driver and then install the ATI drivers, and then it started to work just fine. What I don't understand is, if you saw the ATI card work better, why didn't you reinstall the Nvidia card and install its drivers again, to see if there was a performance difference from when it was originally in there? What I would have done is put that video card back in with the slaved PSU; if it worked fine, I would then reconnect the original PSU and see if it worked. The problem might have been the driver all along, but that's how I would have gone about looking at the problem.

I am sorry if it caused an upset, dude, but I only had a certain amount of info to work with and made the best logical conclusions I could; I just didn't agree with your troubleshooting process, not with whether you think Nvidia or ATI is better.
08-10-2011 06:01 PM
marfig
I see your point.

I think the single most important factor is screen resolution. This has a strong effect on the card's performance. There's really no point in spending hard-earned money on a graphics card if I'm running a 1440x900 screen, for instance. Most entry-level cards (~$100) fill that screen resolution quite nicely. If there is a need to play certain heavy-duty games, I'd certainly suggest a card in the middle range (150 to 200 USD) for peace of mind. So when I see someone bragging about their >$300 card on such low resolutions, I'm left with a smile.
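For scale, the pixel counts behind those two resolutions work out as follows (plain arithmetic, no assumptions beyond the figures mentioned above):

```latex
\[
1440 \times 900 = 1{,}296{,}000 \ \text{px},
\qquad
1920 \times 1080 = 2{,}073{,}600 \ \text{px},
\qquad
\frac{2{,}073{,}600}{1{,}296{,}000} = 1.6
\]
```

So a 1080p screen asks the card to shade 60% more pixels per frame than 1440x900, which is roughly the extra load being argued about here.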

So what then? Well, most of the problem is a certain "moar is better" culture that surrounds the gaming community, coupled with lots of misinformation and an almost pathological vulnerability to the placebo effect; "I swear I play better at 120 fps than at 80 fps!" (a physiological impossibility).

I won't deny anyone the pleasure of spending money on something really cool and powerful. Pride is a good thing. But then, for some reason, people try to come up with all sorts of justifications for this behavior. They shouldn't! It's perfectly fine. But they do anyway. And it's those justifications that create myths and misinformation.

...

But up the screen resolution (and the type of games one plays, or the work they do on the computer) and things start to become more justified. At 1920x1080, certain games will simply not run well on low-end cards. Gamers want their games to run at the top settings because these do affect the quality (the beauty) of what one sees on the screen. At that resolution they can't do this with modern games on a $100 card, and some will still have some trouble with a $200 card.
This thread has more than 15 replies.