Posted on March 9, 2009 8:55 AM by Rob Williams
Remember that ultra-cool all-in-one Eee Keyboard from ASUS that we saw at CES earlier this year? Well, it turns out that the company is aiming to release it sometime during Q2, most likely in June. That improves upon the Q3 date we were told in January, so things must really be shaping up. According to the Register, the keyboard will come in two flavors, wired and wireless (referring to how it connects to a display).
The latter isn’t going to be too cheap, however. The wired version (meaning you’d plug it straight into your TV, making it a less-than-ideal HTPC) will retail for $400, while the wireless version will tack an additional $200 onto that. It’s unknown at this point exactly what kind of wireless technology it will be implementing, but it’s probably safe to say that the connector of choice will be HDMI.
On a semi-related note, are you tired of seeing ASUS plaster the word “Eee” on everything from netbooks to nettops to peripherals and beyond? According to company Chairman Jonney Shih, there’s no sign of the name being diluted across their product line-up (although if their CeBIT booth is anything like their CES booth, I’d have to disagree just slightly). Either way, the Eee brand is on the minds of regular consumers, not just tech geeks, so it’s no wonder the company insists on using it wherever possible.

CEO Jerry Shen said the firm was working on two SKUs for the keyboard PC: a wireless version and a wired version – this simply refers to the way they connect to a monitor or TV. He said that given the economic situation it was dangerous to be specific on pricing this far out. But, he said, right now the firm expected the wired version should come in at around $400 (£285/€317), while the wireless version could be anywhere between $400 and $600 (£428/€476).
Read More
Comment (0)
Posted on March 9, 2009 8:15 AM by Rob Williams
If there’s one bill I begrudgingly pay each month, it’s the cell phone bill. The telecom companies boast about their great value on TV commercials, magazine ads and billboards, but I think most of us can see through the bull… cell phone plans are complete and utter ripoffs. But just how much of a ripoff are they? Well according to one independent study, the true cost may be as high as $3/minute. No joke.
The results are skewed thanks to those who have high-value plans but don’t use their phone often, but the fact remains… we all pay out the rear for phone plans, and chances are good that few of us actually get our money’s worth. I thought about this just the other day, when I received my $60 cell phone bill and noticed that I only talked on it for all of 2 hours during the entire month, and sent and received a total of 15 text messages. Ouch.
The report estimates that the average consumer pays between $0.50 and $1 per minute, a figure that isn’t helped by the numerous extra fees tacked onto your bill each month (such as the required “service” fee). The lesson to learn from this is that we all get ripped off, whether you feel like it or not. If you enjoy calling your money your own, it’s important to make sure your plan isn’t too large for your needs. Time for me to go back to the drawing board and figure out the same…
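For the curious, here’s a quick back-of-the-envelope sketch in Python using my own bill figures from above (your numbers will obviously differ), just to show how quickly the effective per-minute rate climbs on an underused plan:

# Rough effective cost-per-minute math, using the example bill above
monthly_bill = 60.00          # dollars
minutes_used = 120            # roughly 2 hours of talk time
texts = 15                    # sent and received

# Ignoring texts entirely, every minute of talk effectively cost:
per_minute = monthly_bill / minutes_used
print(f"Effective cost per minute: ${per_minute:.2f}")   # $0.50

# Even counting each text as a "minute" of use barely moves the needle:
per_unit = monthly_bill / (minutes_used + texts)
print(f"Counting texts as usage too: ${per_unit:.2f}")   # ~$0.44

Even my own numbers land right at the bottom of the study’s $0.50 – $1 range, and that’s with a fairly light month of usage.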

That $3-per-minute figure is skewed by the relatively small percentage of people who pay for a lot of minutes but barely use any. But even when those folk are taken out of the mix, most wireless customers still pay between 50 cents and $1 per minute, the study found. Shames said this wasn’t a problem just for San Diego residents. He said the findings of the report were representative of cellphone use and bills nationwide.
Read More
Comment (0)
Posted on March 6, 2009 2:00 PM by Rob Williams
If I had to guess, I’d assume that the majority of people reading our site are interested in overclocking in some way or another. Few of us can build a new machine and not have the word “overclock” linger in our head. Most of us can’t stand using a computer at its stock speeds, even if it happens to be extremely fast. There’s just something in us, and it’s hard to shake. Luckily, recent CPUs from both AMD and Intel have been catering to overclocking enthusiasts, with huge frequencies being reached… and most of them stable.
Unless you buy an Extreme Edition or Black Edition processor though, certain limits are imposed. When Core i7 was first launched, these limits were the CPU multiplier (applied to the base clock, or BCLK), the memory multiplier and also the QPI clock. On the i7-920 and i7-940, the maximum memory clock allowed was DDR3-1066, while the QPI was limited to 4.8GT/s. Well, I may be way out of the loop, but I somehow overlooked the fact that retail chips have since dropped the latter two limits… a great thing for overclockers.
While the CPU multiplier is still locked, Intel has confirmed to us that both of the non-EE models can utilize higher memory clocks along with a 6.4GT/s QPI bus. As we found out shortly after launch, the QPI doesn’t make a significant difference, at least past 4.8GT/s, but the increased memory headroom will be appreciated by many. That means a chance at lower latencies, higher bandwidth, and faster performance. Not a bad deal.
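To put rough numbers on why the lifted limits matter, here’s a small Python sketch of the theoretical peak bandwidth involved, assuming Core i7’s triple-channel controller with standard 64-bit DDR3 channels and QPI’s 2 bytes of payload per direction (these are on-paper peaks, not real-world throughput):

# Theoretical peak bandwidth in GB/s (illustrative only)
def ddr3_bandwidth(mts, channels=3):
    # Each DDR3 channel is 64 bits (8 bytes) wide; Core i7 runs three of them
    return mts * 8 * channels / 1000

def qpi_bandwidth(gts):
    # QPI carries 2 bytes per transfer in each direction; double for the bidirectional total
    return gts * 2 * 2

print(ddr3_bandwidth(1066))   # ~25.6 GB/s at the old DDR3-1066 cap
print(ddr3_bandwidth(1600))   # ~38.4 GB/s at DDR3-1600
print(qpi_bandwidth(4.8))     # 19.2 GB/s at 4.8 GT/s
print(qpi_bandwidth(6.4))     # 25.6 GB/s at 6.4 GT/s

In other words, the memory headroom is where the real gains lie, which lines up with our earlier finding that QPI speed beyond 4.8GT/s doesn’t change much in practice.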

As it turns out, Maximum PC stumbled upon an unpublicized feature of production 940 and 920 Core i7 parts. According to the publication, it had verified on three previous 920 and 940 CPUs that there was no access to Turbo Modes and that QPI was locked to 4.6GT/s along with locked memory settings.
Read More
Comment (0)
Posted on March 6, 2009 1:43 PM by Rob Williams
Whether you use free e-mail hosting or host your own e-mail, I think it’s fairly impossible nowadays to use the Internet and not check your e-mail account fairly often. If you happen to use a free service, chances are that it’s either Yahoo!, Hotmail or Gmail, each of which currently hosts millions of e-mail accounts. Yahoo! has a commanding lead in overall accounts, at 92.5 million, while Hotmail sits in second, and Gmail in fourth.
If you’re wondering why Gmail is not third, it’s because a little e-mail service called Zimbra (don’t worry, I’d never heard of it either) just announced that they’ve hit the 40 million mark, which is in all “accounts” impressive. Here’s the kicker… Zimbra is a paid service only, which makes the number even more noteworthy. Zimbra offers various services, but its primary product is software that businesses install on their own servers.
One reason Zimbra’s numbers saw a recent surge was Comcast’s decision to begin using the service. How many mailboxes that brought along is unknown, but it’s likely well over 10 million. It’s just too bad that this e-mail service isn’t for the regular Joe, because it looks rather robust. If you happen to run your own server and want to give it a try, there’s a community edition of the software that will let you take full advantage of what it offers. What you’ll lack is support, but if you’re techy enough, that might not matter.

Frankly, it’s a shame that Zimbra ended up with Yahoo, which has 92.5 million mailboxes. Though Zimbra is a standout in the industry, Yahoo’s own strength in consumer e-mail likely keeps Zimbra in second place for resources internally, especially since Zimbra’s enterprise-grade e-mail may not be a tight strategic fit. Zimbra would have been an exceptional match for Apple or Adobe with their design-savvy customer bases.
Read More
Comment (0)
Posted on March 5, 2009 3:23 PM by Rob Williams
It was revealed last month that Intel would be offering CPUs later this year that feature both a CPU core and GPU core on the same substrate, and AMD’s Fusion is on track to be released about a year later. So with the two leading CPU companies sharing similar goals of implementing the GPU onto the CPU, what does that mean for integrated graphics parts?
According to Jon Peddie Research, it means that typical IGPs will go the way of the dodo very soon… as early as 2012. Their reports show that during 2008, 67% of graphics chips shipped were of the IGP variety, and they suspect that by 2011, that number will drop to 20%, then in 2013, the number will drop once again, to under 1%. That’s a stark difference, but it seems reasonable to expect such a decline, given that a hybrid design seems to make all the sense in the world.
Companies like ATI and NVIDIA still have reason to feel safe where their bread and butter is concerned, however, as these hybrid designs, while faster than today’s IGPs, won’t be able to compete with discrete graphics, although it’s difficult at this point to make any real assumptions about the performance differences. At first, we’ll likely see these designs only in mobile and lower-end desktop PCs, or even HTPCs.

JPR numbers show that in 2008, 67% of all graphics chips shipped were of the integrated variety. The prediction is that by 2011 the percentage of IGPs shipped will be 20%. The decline in IGPs shipping means that there will be gains in discrete GPU shipments and a significant growth in CPUs that feature integrated graphics cores.
Read More
Comment (0)
Posted on March 5, 2009 12:55 PM by Rob Williams
As we pointed out in our review of NVIDIA’s GeForce GTS 250 the other day, CUDA and GPGPU applications in general are sure to see an increase in numbers this year. It sure didn’t take long to see some proof of that. On the opening day of CeBIT, Nero AG announced support for CUDA in their Move it application, a tool used to convert/recode various formats for use in many different media players.
I admit, I’d never heard of this application until now, but I assume it’s rather new. At first glance, it looks to serve a similar purpose to Badaboom, in that it can convert video using the CUDA architecture so that it works on various mobile devices, such as the iPod and PSP. In addition, it also handles audio and photos. I’m not sure what the file support is like, but this is one application we’ll be sure to get a hold of shortly to test out.
As you’d probably expect, the CUDA enhancements only benefit the video codecs at this time, as I don’t believe there are any audio-specific CUDA applications out there right now. Audio usually converts quite quickly to begin with, though, while it’s video that needs the help most. Either way, it’s good to see GPGPU applications on the rise, but it would be great to see more that utilize ATI’s graphics cards as well.

By using CUDA technology to tap the massive parallel processing power of NVIDIA Graphics Processing Units (GPUs), Nero Move it makes tasks such as customizing an HD video for an iPod, go from hours to minutes. Even more time can be saved when creating full HD video content in the H.264 video compression standard.
Read More
Comment (0)
Posted on March 4, 2009 12:34 PM by Rob Williams
As if Intel didn’t have enough to worry about in their ongoing Intel vs. NVIDIA saga, it looks like the green team is actually planning on making an x86 chip – at least eventually. This is pretty much on par with what a recent rumor mentioned, although there’s a little more to it than NVIDIA simply creating a CPU to compete with the likes of AMD and Intel.
At the Morgan Stanley Technology Conference in San Francisco, NVIDIA’s Senior VP of Investor Relations, Michael Hara, answered the question of whether or not NVIDIA would ever release a microprocessor, and his simple answer was, “the question is not so much I think if; I think the question is when”. That’s a bold statement, especially given that CEO Jen-Hsun Huang has shown zero interest in the prospect in the past.
But the deal seems to be that it wouldn’t be a desktop processor, but rather an SoC, similar to Tegra, only based on x86 rather than ARM (which is incompatible with all things x86). That makes complete sense, as it would open up far more opportunities for Tegra. But then the issue lies with the fact that NVIDIA doesn’t have an x86 license. Who do they have to get it from? Intel. You can put two and two together on that one.

However, Hara also pointed out that Nvidia’s x86 CPU wouldn’t be appropriate for every segment of the market, and would be mainly targeted at smaller system-on-chip platforms. “If you look at the high-end of the PC market I think it’s going to stay fairly discrete, because that seems to be the best of all worlds,” said Hara, adding that “a highly integrated system-on-chip is going to make sense” in the MID (mobile Internet device) and netbook markets.
Read More
Comment (0)
Posted on March 4, 2009 9:32 AM by Rob Williams
What do you think of when you think of a “performance PC”? Chances are, you picture a large chassis filled to the brim with high-end parts, one that not only delivers great performance, but looks the part as well. One crowd that has gone mostly ignored, though, is the HTPC/SFF crowd. Few manufacturers seem to care about mATX motherboards, and though there are some great ones out there, not many give the same feeling of completeness that their desktop counterparts do. Even the packaging is boring most of the time.
Well, that changes right now with the launch of ASUS’ Rampage II GENE mATX motherboard. This board is no slouch, and I’m having a hard time figuring out how they managed to fit so much onto it. First, it uses Intel’s X58 chipset, not one normally destined for smaller form-factors. It also packs in support for both CrossFireX and SLI, assuming you can fit in the two cards you wish to use (dual-slot coolers will be tough).
In addition, despite the lack of space, the board still includes six DIMM slots, allowing up to 24GB of RAM to be installed. It doesn’t stop there either… we also have 7 S-ATA ports, 1x eSATA, 1x IDE, 8-channel audio and more. As you can see in the photo below, this is one packed motherboard, and I find it impressive that so much fits into such a small area. Whether or not it will be a hot pick for SFF gamers is unknown (I’m sure pricing will determine that), but it’s definitely nice to have the option of a true enthusiast’s mATX board for once.

With the Rampage II GENE, ASUS has masterfully filled the micro ATX gap with its lauded ROG product line. Users can now harness the full power of the Intel Core i7 processor in a chassis the fraction of the size and weight of standard desktop PCs. The Rampage II GENE’s awesome combination of overclockability, tweakability and stability enable it to even outgun other full-sized motherboards.
Read More
Comment (0)
Posted on March 3, 2009 8:20 AM by Rob Williams
Being a Gunner, it’s not often I go out of my way to speak highly of Manchester United, but this story is far too good to ignore. We all know that the iPod and products like it are convenient for various reasons. You can bring music with you wherever you go, along with video, images and more. As we learned over the weekend though, there’s another use that Apple likely never thought of during the iPod’s development process.
The Carling Cup final was held on Sunday (sponsored by the popular lager of the same name), and featured Manchester United against Tottenham Hotspur. Due to a draw at the FT mark, an additional 30 minutes were played, which equated to a 120-minute match. Thanks to a persisting draw (0:0), the game could only be decided by a penalty shootout… a process in which each team sends five players to attempt a goal against the opposing team’s goalkeeper. All in all, it’s never a good way to end a game, but the winner has to be decided somehow.
Picture yourself standing in front of a huge football net, with no one else around except the opposing player straight ahead of you. It’s tough to predict what that player is going to do, so how do you get the upper hand? Why, you watch footage of your opponents’ last shootout on your coach’s iPod. Yes, seriously. Man U’s alternate GK, Ben Foster, noted how each player took their shots, which gave him an obvious advantage. Sure enough, the shootout ended 4:1 in Manchester United’s favor.
Surprisingly, the Football Association actually let this practice pass, which really impresses me, given they normally seem more than happy to hand out fines liberally. Perhaps we’ll be seeing even more of this in the future? Only time will tell.

“I’d been told (that) if O’Hara took a kick…he would probably go to my left,” Foster told the Sun, recalling how the information recorded on his iPod helped him out. “It was great; that was exactly what happened, and I managed to get a hand to it.” Thanks to Foster’s save, the United beat the Spurs 4 to 1 on penalties and, as you’re reading this, the players’ heads are feeling as if they have been subjected to unsafe decibel levels. The cause, however, will probably have been somewhat liquid.
Read More
Comment (0)
Posted on March 3, 2009 7:30 AM by Rob Williams
Yesterday, Intel made a surprise announcement regarding a “long-term strategic” cooperation with TSMC (Taiwan Semiconductor Manufacturing Company) to help produce Atom processors. Although at this point the announcement is only a memorandum of understanding, we can expect to see TSMC producing Intel’s 32nm Atom processors in the near future.
This announcement marks the first time Intel has ever ported a processor to an external source, and given the number of fabs that the company already operates, what benefit do they actually hope to see here? According to the press release, it’s so that Intel can gain more exposure to other markets, but as Ars Technica reports, there’s no doubt a lot more to it than that.
The problem, in theory, is simple. Intel’s desktop processors aren’t in high demand right now, but their Atom processors are… and they are selling them by the bucket-load. However, each Atom sells for significantly less than a desktop CPU, so Intel has to move a lot more of them to see the same profit. The situation is made even worse when Intel has to spend heavily on R&D and also devote fabs to Atom… a product that will never generate nearly as much revenue per fab as desktop processors do.
So by having TSMC help produce Atom CPUs, Intel can direct its R&D elsewhere, and also free up some space in its own fabs for other products. It’s still a sticky situation, though, because this move suggests that Intel cannot rely on Atom for much of anything… the money simply isn’t there. The economy isn’t helping their desktop CPU situation either, so until that all clears up, Intel is in a complicated position.

Intel’s margins on Atom are much thinner than those on regular desktop and server CPUs, and with each process shrink, Atom’s cost (and price) will go down. But fabs get more expensive with each shrink, so the result is that Intel has to sell many more Atoms at 32nm than it does at 45nm to make money. The demand for all of those Atoms may or may not materialize, which is why Intel will pay TSMC to fab them and share the risk that the demand may not be there. Meanwhile, Intel wants to save its (very costly) in-house fab capacity for high-margin products, like its CPUs.
Read More
Comment (0)
Posted on March 2, 2009 10:21 AM by Rob Williams
With the first release candidate for Windows 7 rumored to arrive next month, you might be wondering what kind of changes will be made from the beta. Well, you don’t have to wonder, as the developer team posted a blog update last Friday that details 36 different changes, all of which they look at in some depth.
If you’ve been using the beta for any length of time, then many of these changes should be pretty obvious. The first change mentioned regards the “Aero Peek” feature, which shows outlines of currently opened windows. Due to demand, this has now been tied into Windows’ ALT+TAB functionality. Also new are shortcuts for the quick launch: pressing the Windows Key + # (the number corresponding to the application’s spot in your quick launch) will go ahead and launch it. This can prove useful if you don’t currently have control of the mouse.
Another somewhat major change is the taskbar scaling, and the number of icons that can be placed on it. Because the launching/switching buttons have been unified, we’ve gained around 25% more space, depending on your resolution. Also, since Windows 7 doesn’t allow applications to automatically insert themselves into your quick launch, the OS will instead place a newly installed application towards the bottom of your start menu, so that you’ll see it immediately the next time you go in there. That allows you the option of dragging it to the quick launch, should you want to.
There’s a lot more that’s new than I’ve mentioned here, so if you want to study what’s en route, be sure to check out the blog post.

Building on the previous post that looked at the broad view of feedback, we want to start posting on the feedback and the engineering actions we’ve taken in responding to the feedback. We won’t be able to cover all the changes (as we’re still busy making them), but for today we wanted to start with a sampling of some of the more visible changes. We’re still on the same path working towards the release candidate and of course we know everyone is anxious for the next phase of our path to RTM.
Read More
Comment (0)
Posted on March 2, 2009 9:55 AM by Rob Williams
We haven’t reported on Amazon’s new Kindle 2 yet, but a new story has given me the push needed to talk about it a bit. First and foremost, the Kindle is a great e-Book reader, no question. One cool feature on the “2” though, was “text-to-speech”, which does exactly what you’d expect. Rather than have to go purchase an audio book, you could simply load up your e-book and let the Kindle 2 read it to you.
Now, given what I’ve heard from text-to-speech software in the past, I’m assuming the voice quality is nowhere near as good as a real audio book, but that doesn’t much matter, because the Authors Guild (there’s another guild??) had something to say about it, and it took Amazon very little time to cave in. Now, the text-to-speech feature is optional for book publishers, so while some books may use the feature without issue, others will ignore it.
This move, of course, is one of the dumbest the Authors Guild could have pulled off. As I mentioned, there is no way the quality of the voice is so good that it could actually replace a real audio book, and aside from that, if someone is actually keen on having this feature (someone who is blind or visually impaired, or who just wants to sit back and listen), the best option once again becomes piracy. So, not only would they lose an audio book sale, but they’d lose a book sale in general. I can’t wrap my head around organizations like this…

Well done to the Authors Guild! Amazon revealed last night that the text-to-speech feature in the Kindle 2 will now be optional for publishers. The guild had been tenaciously fighting this feature, arguing that it had the potential to turn the Kindle 2 into a de-facto audiobook player. Right or wrong, Amazon has caved, and now publishers will be able to dictate whether or not the Kindle 2 is able to read aloud their books.
Read More
Comment (0)
Posted on February 26, 2009 1:23 PM by Rob Williams
The release date for Microsoft’s much-anticipated Windows 7 has been up in the air for a while, and even though no one can assuredly pick a release date (last-minute delays are all too common), it appears that based on current plans, Microsoft hopes to begin shipping in September, or early October. This date comes from the President of Compal, a manufacturer that produces systems for numerous top-tier PC makers.
If true, then Windows 7 would be launching at both a convenient and an inconvenient time. If it’s late September, school will be in full swing, and the idea of upgrading a computer might not be on anyone’s mind. On the other hand, it would arrive well ahead of the holiday season. Whether or not it’s too far ahead, we’re not sure. Either way, if the OS does indeed come out in September, there are sure to be many pleased fans who won’t have to resign themselves to waiting until 2010 for the new OS.
I admit, I haven’t put much time into testing Windows 7 myself, but in the little time I’ve spent with it, I liked what I saw. Some call it nothing more than a robust service pack, and while that might appear to be the case on the surface, taking a deeper look will reveal otherwise. I really can’t wait for the new OS to arrive, because it looks like we’ll have a real reason to want to upgrade. For those who haven’t bothered to upgrade to Vista since its release, the jump will be even more impressive.

Amelia Agrawal, a spokeswoman for Microsoft, maintains the official company position that Windows 7 will be available within three years of when Vista shipped. However, the company’s upgrade program plans and other leaks have increasingly suggested that the public goal, which would put the release in early 2010, is deliberately conservative and meant to avoid embarrassment in the event of an unexpected delay.
Read More
Comment (0)
Posted on February 26, 2009 7:30 AM by Rob Williams
Ah, the age-old question. Whether you realize it or not, the market has many operating systems, but only a select few are large enough to be mentioned on a regular basis. On the desktop side of things, these of course include Microsoft’s Windows, Linux and Apple’s Mac OS X. So which is the biggest threat to Redmond’s largest company? Believe it or not, it’s Microsoft itself.
In a recent presentation given by Microsoft CEO Steve Ballmer, a slide was shown that exhibited the company’s interpretation of market share between the various OSes. Microsoft’s own Windows of course took first place by a rather significant margin, but second place was handed to “Windows Unlicensed”, also known as “pirated”. No one is naive enough to believe that Windows isn’t pirated by millions of people, but it’s kind of surprising to see it mentioned so starkly on one of the company’s own charts.
Third place lands with Linux, which inches just above Apple. That fact in itself is quite interesting, since this foil refers only to home and business PCs. However, this is on a worldwide scale, and given Apple doesn’t have as much success outside of the US as Linux does, the results here aren’t all too surprising. Aside from OS usage, the same foil also showcases R&D costs, and not surprisingly, Microsoft takes the cake with $8.2 billion invested (we assume from 2008). Apple in the same period invested $5.3 billion, while RIM invested $2.9 billion.

I think depending on how you look at it, Apple has probably increased its market share over the last year or so by a point or more. And a point of market share on a number that’s about 300 million is interesting. It’s an interesting amount of market share, while not necessarily being as dramatic as people would think, but we’re very focused in on both Apple as a competitor, and Linux as a competitor.
Read More
Comment (0)
Posted on February 25, 2009 1:08 PM by Rob Williams
AMD today released information regarding live public demonstrations of their upcoming Istanbul server processor, which will become available later this year. Istanbul isn’t considered a successor to current Shanghai-based Opterons, but rather a complement. These new server CPUs will be based on the same underlying architecture, but rather than stick to a Quad-Core design, these will feature six cores (Hexa-Core).
Desktop PC fans will have to settle down though, as AMD hasn’t mentioned anything regarding a potential six-core desktop chip. I think that move is inevitable, but it might not happen until later next year. Either way, the technology here is enough to get even non-server users excited. AMD boasts that they’ll be the first to offer six-core processors for 2P and 4P (2 or 4 processor) platforms. Pack four together in a 4P machine and you’ll have a staggering 24 cores at your command.
The coolest part about Istanbul might be the upgrade path it offers. Current users of recent Opteron processors will be able to upgrade to Istanbul with little issue, since the socket hasn’t changed. Luckily, neither has the thermal envelope, although I’m willing to bet that the power envelope has seen a slight increase (I can’t see reference to that anywhere, though). To prove just how easy it is to upgrade, AMD has posted a video (YouTube) that shows them taking Istanbul processors from one machine and replacing the Shanghai processors in another. The entire process took a little over eight minutes, which is rather impressive.
As we mentioned yesterday, AMD really needs to get some things sorted out in order to become stable again, and by the looks of things, Istanbul may very well be the next major step in AMD’s plan to strike back. Even Intel’s platforms don’t offer such simple transitioning, so Istanbul is really going to attract users of current Opterons.

Despite putting more cores in the processor, we managed to keep it in the same power and thermal ranges as our existing “Shanghai” processors. And since it fits into the same socket, our OEM customers should be able to bring products to the market quickly. End users will be able to quickly qualify and deploy these servers because the overall platform is the same as what they are using. In today’s challenging economic times, that’s music to the ears of IT departments both near, and as far away as Turkey.
Read More
Comment (0)
Posted on February 25, 2009 11:24 AM by Rob Williams
If there’s a fierce rivalry to ever take the proverbial cake, it has got to be Intel vs. NVIDIA. Both companies have been at each other’s throats for quite a while now, so when you hear one say something less-than-ideal about the other, it’s hard to know what to believe. The latest issue that’s arisen comes courtesy of Intel, who is trying hard to downplay NVIDIA’s ION platform (which we took a thorough look at here).
According to a document seen by Bit-Tech, Intel is warning their customers about NVIDIA’s supposed exaggerations about what ION can do. This includes jabs at ION for utilizing a chipset that’s both not new and riddled with issues, along with the claim that ION cannot completely handle HD playback (which it mostly can – the document refers to Tech Report’s article, where they said that 1080i content had issues, but 1080p did not).
There’s a lot more to it than that, so I recommend you check out the article there. Since that was posted, Fudzilla has received an official response from NVIDIA regarding Intel’s latest document, and they refute pretty much every claim. They mention that the graphics power of ION is 10x what Intel can produce, and that their MCP79 chipset has been picked up by many vendors worldwide. Regarding Intel’s power consumption claims, NVIDIA says that reviewer units didn’t have any power management in place, hence the slightly higher power draw.
One thing’s for sure… this is one heck of a messy duel. The truth is that right now, Intel’s platform exists, while NVIDIA’s is still en route. Intel plans to follow-up with an updated version of their platform later this year, which apparently gives NVIDIA a short “window of opportunity”. Either way, I’m still looking forward to ION, and despite the fact that we haven’t had one in our labs yet, we’ve been very impressed by what we’ve seen in our private meetings with the company. We’ll report more on this story as time goes on.

As well as this, Intel has also taken issue with Nvidia’s claims about the Ion’s benefits over Intel’s own Atom platforms. In response to Nvidia’s claims about HD video decoding, Intel says that “Intel offers full Hi-Def video decode with HW acceleration with the off-roadmap Mobile Intel GN40 Express Chipset.” The company also refers to an article on the Tech Report, saying that “Preliminary press reviews indicate Nvidia’s Ion HD playback may not be as good as Nvidia claims.”
Read More
Comment (0)
Posted on February 24, 2009 11:35 AM by Rob Williams
Two summers ago, Apple announced at their WWDC conference that their Safari browser would be released for the Windows platform, alongside their OS X offering. This was rather significant news, but thanks to some issues that plagued the original release, the browser didn’t remain in the limelight for too long. That changed a bit last spring when the company followed up with their 3.1 release, but even then, Safari has had a rough time making real headway with the likes of Mozilla Firefox and IE dominating the market.
Apple hopes that will change sooner rather than later, and to help coax more people into trying out their browser, they’ve today unveiled a beta version of 4.0, a high-tech offering that is the first to pass the gruelling Acid3 test (I verified, it does indeed score 100/100). According to Apple, it also renders JavaScript 4.2 times faster than Safari 3, and 3 times faster than Firefox 3… bold claims, and ones I’m sure will be investigated soon.
Improvements like those are nice, but there’s a lot new on the UI front as well. The browser sports a brand-new interface that bears little resemblance to the previous version, and I have to say, it looks quite nice. It’s not as minimal as Google Chrome, but it’s close. Despite the clean look though, there’s a lot of eye-candy, including smooth transitions and even a “flipper” feature that allows you to flip through recent sites and searches.
There’s too much new to mention here, but if you’re curious about trying out a new browser, or want to upgrade from Safari 3, the beta sure seems stable enough to use full-time. I could be wrong though, and there’s always a risk you take with betas. But given that Apple is actually making this the default download for Safari, even they must find it stable enough.

Apple is leading the industry in defining and implementing innovative web standards such as HTML 5 and CSS 3 for an entirely new class of web applications that feature rich media, graphics and fonts. Safari 4 includes HTML 5 support for offline technologies so web-based applications can store information locally without an Internet connection, and is the first browser to support advanced CSS Effects that enable highly polished web graphics using reflections, gradients and precision masks.
Read More
Comment (0)
Posted on February 24, 2009 10:54 AM by Rob Williams
When AMD released their Phenom II processors during CES, reviewers all over exclaimed just how great they were. Even though they didn’t compare to Core i7, or even beat out Core 2 in all tests, the fact that AMD came that close was nice enough, since their previous products were a bit lacking, and riddled with bad press throughout their lifetime. On raw performance, AMD’s processors wouldn’t be everyone’s first choice, but AMD has been doing a great job of remaining price-competitive, and that’s been extremely important. (For our review of AMD’s most-recent AM3 processors, you can click the image below.)
So with AMD back in action with nicely-performing products, how does their future look? According to Ars Technica, it’s really hard to say, but it’s not looking that great right now. To help put things into perspective, Joel Hruska first found a stable overclock on the Phenom II 940 (4.2GHz) and then compared it to the Core 2 Quad QX9650 (3.0GHz) and also the Core i7-920 (2.66GHz) and i7-965 EE (3.20GHz). On paper, it looks like one should obliterate the rest, but that’s not the case.
In some tests, the heavily overclocked Phenom II couldn’t even outpace the i7-920, which isn’t such a surprise given Intel’s optimizations for certain workloads (like multi-media and math algorithms), but it’s still troubling to see. The goal of the article was to show that even if AMD released new higher-clocked parts, it wouldn’t do them much good. AMD will need to deliver worthwhile architecture upgrades in order to a) surpass Core 2 performance and b) at least come close to Core i7 performance. Given the state AMD has been in lately, that’s going to be a difficult thing to pull off, as an ongoing discussion in our forums makes note of.
It’s far from impossible for AMD to strike back with a product that competes with Core i7, but given the economy and the major changes constantly underway within the company, it’s going to prove an extreme challenge.

Deneb’s comparative performance against the Core i7-965 and Core i7-920, however, is rather troubling. Even at 4.2GHz and with an IMC running at 2.53GHz (1120MHz memory clock), Deneb doesn’t always outperform Intel’s lower-end, 2.67GHz solution, much less the top-end i7-965. It’s true that the i7-965 is a $1,000 part today, but a Deneb clocked at the rates we tested (if such a thing existed for the commercial market) would run at least $1K as well.
Read More
Comment (0)
Posted on February 24, 2009 10:28 AM by Rob Williams
If you happen to hold a grudge towards GameStop for any reason at all, you’re bound to get nice and wound up from one of their most recent “training” videos that has been posted online. As Kotaku appropriately puts it, the title of the video should be, “How To Talk To Women And Shill Wii Non-Games“, because that’s essentially the entire point of the video.
Now, I haven’t worked in the retail sector all that much, but I worked there long enough to have endured a ridiculous training video, although I have to admit the one I saw was nowhere near this bad. My question is, though, do these videos actually help, or do they make things worse? I might be somewhat of a rebel, but if I worked at GameStop and had to watch this foolish mess, I don’t think I’d be walking back out onto the floor in a great mood.
The video shows employees how they can up-sell the customer, which is fine, but it also assumes that GameStop employees actually know something about what they’re selling. I’m not sure if all GameStops are like this, but in Canada, the stores (Electronics Boutique to be specific, same outlet) are not too impressive. I usually make small talk with people when I’m being rung in, but at our stores here, the people behind the counter know so little about current games that it’s kind of pointless. Recently, I went through the cash there and found out that the cashier didn’t even own one of the most recent consoles, because he “couldn’t afford it” (he’s been there for more than two years).
Maybe GameStop should start investing more in their employees first, rather than giving them a lackluster paycheck and expecting them to take a great enough interest in everything in the store to up-sell customers. </rant>

Poor GameStop staffers were subjected to “Understanding And Selling To Our Expanded Audience” in advance of the “Sharpen The Mind, Shape The Body” campaign-which has since expired-the sales effort that would give buyers of Wii Fit or My Fitness Coach a free subscription to Cosmo or Good Housekeeping. That way, they can unlock the secret of the male G spot in between bouts of getting fit with Jillian Michael’s Fitness Ultimatum. The video helps GameStop employees to understand the fairer sex in terms that anyone can understand: hunter, gatherer, and hopelessly confused mom.
Read More
Comment (0)
Posted on February 23, 2009 9:45 AM by Rob Williams
As much as the music industry would love to see an end to music piracy and album leaks, I think we can confidently say that it’s never going to happen, at least not until extreme measures are taken (which I doubt will ever happen). Take the last week’s worth of news for example, which is in all regards hilarious, and ironic. Not one, but two albums were leaked early, and guess what? Neither was due to reviewers who were sent early copies, or any end-user who happened to score one.
Last Wednesday, TorrentFreak posted about U2’s latest album (pictured below), “No Line on the Horizon”, which was leaked onto P2P networks and quickly racked up 100,000 downloads (I’m sure the number is much, much higher by now). What’s ironic about this story is that their record label tried hard to make sure no leaks occurred. They went to such extreme lengths as to not send out a single reviewer copy, and instead flew the press in to take part in private listening parties.
That kind of dedication didn’t accomplish much, however, as it appears that Universal Music Australia accidentally made the digital album available on their site. It was quickly fixed, but not before a few eagle-eyed fans noticed it. Although that’s the more notable of the two stories, just this past weekend, iTunes Norway goofed up as well, by making Kelly Clarkson’s latest album, “All I Ever Wanted“, available for purchase. Again, a few fans noticed, bought it up, and then it was taken down.

I can’t help but laugh… because the measures these companies go through seem to be, well, all for naught. Despite their dedication, someone will screw up (although it’s also very possible that someone screwed up on purpose), and as a result, many will download. I’m still fully confident that the crowd downloading early leaks is made up largely of fans and people who will download anything. The music industry wouldn’t have seen a dime from the latter regardless of whether the album was leaked early or not. The fans will always go out and purchase it, even if they managed to score an early leaked copy.
Read More
Comment (0)
Posted on February 23, 2009 9:17 AM by Rob Williams
Last Monday, we linked to an article posted by PC Perspective that took a look at the loss of performance on SSDs over time, with the target of focus being Intel’s X25-M. The article went on to explain that during certain usage scenarios, the sub-block level can become highly fragmented and cause extreme slow-down over time. The worst of it is that typical defragging applications, such as Diskeeper, are unable to work on the sub-block level, and as a result, attempting to defrag the drives will make the issues only worse.
Since that article was posted, it’s been getting a great deal of attention around the web, and as you might expect, Intel has been paying close attention. In a response to an editor at CNET, Intel notes that they’ve been unable to replicate the issue thus far, but they are of course still investigating. Intel is also quoted as saying, “In our estimation, the synthetic workloads they use to stress the drive are not reflective of real world use.”
I have no doubt that last point is true, but as mentioned in the originating article, the idea to investigate began when the writer noticed performance degradation after a few months of regular desktop use. Intel also goes on to mention that it’s typical to experience slowdown when the drive is full, and on an 80GB model, that’s not hard to pull off. Whether or not that factor played a role here, we’re unsure, but I’m confident we’ll be hearing from either Intel or PC Perspective with updates soon.

In response, Intel made a statement on Thursday. “Our labs currently have not been able to duplicate these results,” Intel said. “In our estimation, the synthetic workloads they use to stress the drive are not reflective of real world use. Similarly, the benchmarks they used to evaluate performance do not represent what a PC user experiences.”
Read More
Comment (0)
Posted on February 23, 2009 8:42 AM by Rob Williams
If you’re a regular reader of our news, you’re probably well-aware that I like to rant about different things from time to time. It could be anything… DRM, failed game sequels, digital music, the DTV transition, you name it. Once in a while though, I’ll receive a bit of flak for whatever I ranted about, and one perfect example of this was last week’s posting, “iMagic OS: Commercial Linux Distro Gone Wrong”. After posting, I received a rather straight-forward e-mail from Carlos La Borde, the CEO of iMagic OS. I’ll tackle some points here.
In the e-mail, Carlos stressed to me that iMagic OS isn’t “lackluster”, nor is their support non-existent. He noted that his team goes the extra mile where support is concerned, and that thousands of man-hours have been poured into the distro. I don’t doubt this, but as I mentioned to Carlos, my assumptions were all based on what I saw on their website. The fact is, the information was lacking, and given that there were no screenshots available for showing off some of the distro’s major features, I figured I had all of the information I needed to prepare a posting.
After discussing that issue, Carlos had the website’s “Why?” page updated, this time including a lot more of what people should expect to see when perusing a distro site (commercial or not). This would include a good deal of information about the product, and screenshots to show off the major (and unique) features. One shot in particular I wanted to see was for magicOffice, since as a touted feature, it made no sense that a screenshot wasn’t made available. I might understand the reasoning behind that now.

As mentioned on the software page, magicOffice was created “from scratch”. As far as I’m concerned, to build something from scratch would be like baking something from scratch. It would be pounding flour into dough to make a pie crust, rather than buying a pie shell at the market. The screenshot of magicWriter seen here, though, shows that it wasn’t built from scratch, but rather implements the completely free TinyMCE WYSIWYG editor. It’s unique to see TinyMCE used in a non-web application, but that doesn’t suddenly make it “from scratch”.
That issue aside, Carlos also clarified to me that the distro costs $80 for a reason. It includes CodeWeavers’ CrossOver Office, which is something I overlooked the first time around. Since that product on its own costs $70, it makes a bit more sense as to why the distro itself isn’t cheap. But with that feature tacked on, this distro isn’t for most Linux enthusiasts, but rather for those who come from Windows and want something a little more familiar.
Have my opinions of iMagic OS changed at all? Some have, and some haven’t. After reviewing the information here, and doing your own investigating, you can come to your own conclusions. The fact that Carlos posted in our forums posing as a customer doesn’t help their credibility either, but as I mentioned, that’s for you to weigh. Carlos is welcome to continue posting in our forums to answer any questions anyone may have, or to further explain what makes iMagic OS special.
Read More
Comment (0)
Posted on February 19, 2009 7:46 PM by Rory Buszka
A post made yesterday evening to Hulu’s official site blog states that, under pressure from their content providers, they will no longer be allowing streams of their content via the Boxee client. The blog post fails to identify exactly which content providers made the demand to remove Hulu from Boxee, but the list of ‘usual suspects’ is short enough. The blog post, written by Hulu’s CEO, Jason Kilar, expresses disappointment at the content providers’ decision, but reaffirms the company’s commitment to cooperating with the providers’ wishes — simply because without cooperation from content providers, there would be no Hulu.
It’s impossible to say at this point (without speculating wildly and groundlessly) who or what is to blame for the content providers’ decision to end Hulu service via Boxee. Still, it’s difficult to overlook the fact that the two technologies complemented each other in such a way as to create a compelling alternative to broadcast television, whether OTA, cable, or satellite-based — a fact that’s caught the attention of cable and satellite providers. The move is likely to be a bigger setback for Hulu than for Boxee, which continues to provide content from a number of cable channels like CNN and Comedy Central.
Boxee is currently limited to Apple hardware and PC hardware running Linux, but the company announced at this year’s CES that they hadn’t turned a deaf ear to the deluge of requests for a Windows-based client. Their ace in the hole, however, is likely the easily-hackable Apple TV, which many enterprising gadget-freaks have turned into a standalone Boxee player, much like Netflix’s Roku streaming box. It’s clear that the goal in preventing Hulu from streaming via Boxee is to prevent the type of presentation that Boxee affords, which allows easy enjoyment on large screens.
It’s difficult to see how Hulu comes out on top in this, but it’s easy to see how the content providers benefit — by keeping Hulu’s streaming content on the desktop and off the big screen, they satisfy cable and satellite providers, as well as the hundreds of over-the-air broadcast affiliates, while keeping you tuned-in to distribution methods that might have a difficult time competing with the flexibility of Hulu and Boxee.

The maddening part of writing this blog entry is that we realize that there is no immediate win here for users. Please know that we take very seriously our role of representing users such that we are able to provide more and more content in more and more ways over time. We embrace this activity in ways that respect content owners’ — and even the entire industry’s — challenges to create great content that users love. Yes, it’s a complex matter. A tough mission, and a never-ending one, but one we are passionately committed to.
Read More
Comment (0)
Posted on February 19, 2009 11:50 AM by Rob Williams
So, you’re playing your favorite FPS or MMORPG, and then it hits you, “Dude! I need to check my e-mail!”. Since you’re in a safe spot in the game, you decide it’s a good idea to pull off the ole Alt+Tab trick, check the e-mail, then bring the game back into focus. If it comes back into focus, you’re gold. If not, it can be one heck of a frustrating experience (especially if you were part of a hunting group).
Did you know, though, that there are tools that let you browse the web from inside your games? Well, I for one didn’t, but I admit that the idea intrigues me quite a bit. The cool thing is, there’s not just one solution, but four, or possibly even more. Even Steam has this functionality, although it must be difficult to find, as I’ve never managed to stumble upon it.
That aside, Gamespot has taken four different solutions for a spin, including PlayXpert, Rogue, Xfire and of course, Steam. The screenshot below shows off PlayXpert’s (or pXp for short) usage, and I have to say, it looks quite good. But in the end, each one has its own ups and downs, and for the most part, you’re going to have to do a fair amount of testing to see what works best for you in a particular title.

PlayXpert acts as your one-stop hub to the outside world all within the confines of a game. Through PlayXpert’s taskbar-like menu, you can access all sort of tools. You can easily browse the Web, play music, and chat over IM with your buddies right out of the initial install. According to PlayXpert CEO Charles Manning, “The program has broad compatibility and minimizes hits to performance by injecting its UI into the DirectX command stream as the information goes to the GPU.”
Read More
Comment (0)
Posted on February 19, 2009 11:36 AM by Rob Williams
The progress of technology is a fantastic thing, and I don’t think many will disagree with that (well, unless you happen to have a forced RFID chip implanted somewhere on your body), but to me, one of the best innovations has been wireless technology. Everything from wireless Internet to wireless keyboards, I love it. To sit at a desk and not kick cables around under it… it’s a great feeling. So, wouldn’t it be nice if everything was wireless?
As years pass, wireless technology will continue to improve, and we’ll no doubt be seeing a lot more of it in the years to come. In the future, we may even have wireless power, and since we saw a brief demonstration of just that at IDF last year, it’s definitely a possibility. But one thing’s for sure: if there’s one crowd that would appreciate cutting the cords most, it’s the home theater crowd.
Where we stand currently with wireless HD video/audio is a little complicated, and there don’t seem to be any extremely stable solutions right now – at least, none that are too affordable. Ars Technica attempts to take the confusion out of the technology as a whole, though, by looking at three companies who each promise to have the best solution available, and seeing as we’ll be hearing a lot more from each of them during 2009, there’s no better time than now to see what each one offers.

While cable costs do drop over time for most electronic hookups (like Ethernet, USB, and so on), the downward curve is likely to be slow for longer HDMI runs. Wireless technologies, in contrast, often see plummeting cost when they start being built in large numbers and integrated directly into existing electronics, in this case television sets and set-top boxes.
Read More
Comment (0)