Sparkle Calibre P850+ 8500GT 512MB

by Rob Williams on August 21, 2007 in Graphics & Displays

Having trouble deciding on which entry-level GPU to pick up? Sparkle’s Calibre series might make your decision easier with its P850+. In addition to DX10 support and a unique LED readout, it offers an additional 200MHz on the core and 320MHz on the memory over reference 8500GTs.


When the time comes to purchase a new GPU, there are many names that might come to mind: eVGA, XFX, Sapphire, BFG and so on. But one name not likely to be thought of in North America is Sparkle. You might be surprised to know, though, that they’ve been around for quite a while.

Established in 1982 in Taiwan, they are one of the world leaders in manufacturing NVIDIA-based graphics cards, and are most popular in Europe, where their cards can be found for purchase quite easily. Months ago, they decided to dip their toes into the North American market, and should be slowly leaking their products into our favorite e-tailers in the coming months.

Calibre is a branch of Sparkle that specializes in GPUs for enthusiasts, so when compared to the regular line-up, Calibre cards will always look better, be more powerful and include unique features. Comparing a Calibre 8500GT to a Sparkle 8500GT is like night and day: while Sparkle’s version clocks the card at 450/400MHz (Core/Mem), the Calibre version clocks in at 650/720MHz (Core/Mem).

As I have come to find out, Calibre’s version is the highest factory-overclocked 8500GT on the market. Stock 8500GTs are clocked at 450MHz on the core, but Calibre upped that by 200MHz to settle at 650MHz. That’s quite the overclock, and the closest competitor I could find had theirs clocked at 600MHz. How much further can it possibly be pushed? You might just be surprised.

Closer Look

As you can tell by the chart below, this card is not for high-end gaming. Rather, it’s suited for those who a) don’t want to spend too much cash and b) don’t want the bare minimum either. Since Sparkle’s cards are difficult to find in North America, I have no idea on pricing. However, current prices for 8500GTs of this ‘calibre’ average ~$85 USD on these shores.

Brand and Model     Sparkle P850+ (GeForce 8500 GT)
Core Clock          650MHz
Memory Clock        1440MHz (effective; 720MHz GDDR3)
Memory Size         512MB GDDR3
Memory Interface    128-bit
Max Resolution      2560x1600; 1920x1080i (HDTV)
3D APIs             DirectX 10.0, OpenGL 2.0
Other Features      Dual-Link DVI, SLI Capable, HDCP Ready

GeForce                          8800 Ultra  8800 GTX  8800 GTS       8600 GTS  8600 GT  8500 GT  8400 GS
Core Clock (MHz)                 612         575       500            675       540      450      450
Shader Clock (MHz)               1500        1350      1200           1450      1190     900      900
Memory Clock (MHz)               1080        900       800            1000      700      400      400
Memory Total                     768MB       768MB     640MB / 320MB  256MB     256MB    256MB    256MB
Memory Interface                 384-bit     384-bit   320-bit        128-bit   128-bit  128-bit  64-bit
Memory Bandwidth (GB/sec)        103.7       86.4      64             32        22.4     12.8     6.4
Texture Fill Rate (billion/sec)  39.2        36.8      24             10.8      8.64     3.6      3.6
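As a side note, the memory bandwidth column in the chart above isn’t arbitrary; it falls straight out of the memory clock and bus width, since GDDR3 transfers data twice per clock. A quick sketch in Python using the chart’s figures (the function name is just for illustration):

```python
def bandwidth_gb_s(mem_clock_mhz, bus_bits):
    """Peak memory bandwidth in GB/sec: clock x 2 (DDR) x bus width in bytes."""
    return mem_clock_mhz * 2 * (bus_bits // 8) / 1000

# Figures from the comparison chart above.
print(bandwidth_gb_s(1080, 384))  # GeForce 8800 Ultra -> 103.68 GB/sec
print(bandwidth_gb_s(400, 128))   # stock GeForce 8500 GT -> 12.8 GB/sec
print(bandwidth_gb_s(720, 128))   # Calibre P850+ at 720MHz -> 23.04 GB/sec
```

By that math, the Calibre’s 720MHz memory on the same 128-bit bus works out to roughly 23 GB/sec, nearly double the reference 8500 GT.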

The packaging is like none other. It’s pure black with a subtle pattern, curved at the edges, and opens up at the top. What strikes me, though, is that it’s difficult to tell what’s inside… you have to look very closely to find the model information, which sits on the side in a very small font.

Included in the box are the software, a manual, a VGA-to-DVI adapter and a video-out cable.

The 8500GT might not be the most powerful card on the market, but does Sparkle ever have a knack for making it look great. Anything cloaked in a pure black scheme is usually a winner with me. It’s stylish, clean and has flames! Could it get much better than that?

At the back, you can see the two DVI ports as well as the video-out.

What the heck is that LED readout, you ask? I asked the exact same thing, but once the card was installed I found out it displays the GPU core temperature and doubles as a diagnostic readout if there is a problem. I quickly compared temperatures taken from this readout against Everest 4.0, and both were identical. It’s bling, really, but it looks cool and serves a purpose.

Lastly, the obligatory shot of the back of the card.

With our look at the physical card out of the way, let’s jump into our testing methodology and then 3D Mark.