Original Link: https://www.anandtech.com/show/2499
NVIDIA Analyst Day: Jen-sun Goes to the Mat With Intel
by Derek Wilson on April 11, 2008 12:00 AM EST- Posted in
- GPUs
Introduction
Hot on the heels of the launch of their 9800 series products, NVIDIA is holding a Financial Analyst Day. These are generally not filled with the normal technical glitz and glitter an Editors Day has, but the announcements and material covered are no less important to NVIDIA as a company. NVIDIA has an unusually large institutional ownership rate at 84% (versus 79% and 66% for AMD and Intel respectively) so the company holds these Analyst Days in part to keep its institutional investors happy and well informed about the company’s progress.
As far as we members of the press are concerned, however, Analyst Days are a valuable chance to learn about the GPU market; anything that could impact the bottom line helps us understand NVIDIA's direction, motivation, and even the reasoning behind some of the engineering decisions they make. Today saw a lot of posturing for battles to come, and we were not disappointed.
Waking up the Beast
Most of the morning was dedicated to NVIDIA taking some time to do a little PR damage control. They've stepped out to defend themselves against the doom and gloom statements of other players in the industry. With Intel posturing for a move into the graphics market and proclaiming the downfall of rasterization and discrete graphics at the same time, NVIDIA certainly has reason to address the matter.
And we aren't talking about some standard press release boilerplate filled with fluffy marketing speak. This time, Jen-sun Huang, the man himself, stepped out front and addressed some of the concerns others in the industry have put forth. And he was out for blood. We don't get the chance to hear from Jen-sun too often, so when he speaks, we are more than happy to listen.
One of the first things that Jen-sun addressed (though he didn't spend much time on it) is the assessment by Intel's Pat Gelsinger that rasterization is not scalable and won't suit future demands. He largely dismissed this statement as "wrong and pointless to argue about," but the aggregate of the arguments made over the day all relate back to it. The bottom line seems to be that Intel's current approach to graphics can't scale fast enough to meet the demands of future games, but that says nothing about NVIDIA's and AMD's solutions, which are at least one if not two orders of magnitude faster than Intel graphics right now. In fact, at one point Jen-sun said: "if the work that you do is not good enough … Moore's law is your enemy."
This seems as good a time as any to address the tone of the morning. Jen-sun was very aggressive in his rebuke of the statements made against his company. Many times he talked about how inappropriate it is for larger companies to pick on smaller ones through the use of deceptive marketing tactics (ed: Intel is 11.5 times as large as NVIDIA by market cap). To such attacks, he says "It's just not right!" and "we've been taking it, every single fricking day… enough is enough!" NVIDIA, Jen-sun says, must rely on the truth to carry its message in the absence of massive volumes of marketing dollars.
Certainly, things can be true even if they paint a picture slightly different from reality, but for the most part what Jen-sun said made a lot of sense. Of course, it mostly addresses reality as it is today and doesn't speculate about what may happen when Larrabee hits the scene or if Intel decides to really go after the discrete graphics market. And rightly enough, Jen-sun points out that many of Intel's comments serve not only to spread doubt about the viability of NVIDIA, but will have the effect of awakening the hearts and minds of one of the most tenaciously competitive companies in computing. Let's see how that works out for them.
Tackling the Market Share Myth
Largely, 2007 and 2008 have been big years for NVIDIA's growth. With the rock-solid performance domination of G80 since the last quarter of 2006 - a situation that is largely unheard of in the usually very fast-paced and aggressive graphics market - confidence in NVIDIA (or rather a lack of confidence in the competition) has helped bolster their position in the industry as a whole. AMD certainly still offers some good price/performance alternatives in the midrange, as they can compete on price with the added flexibility of their smaller GPU and fab process. But they don't have the high-margin high-end market or the mindshare to match NVIDIA right now.
The market gains made by the NVIDIA juggernaut, combined with some interesting insight into sales data, show NVIDIA as the current king of the roost in terms of desktop graphics sales. For a long time, Intel had been able to claim that it shipped a higher volume of computer graphics hardware than anyone in the world. This is true due to the pervasiveness of Intel's integrated chipsets on the desktop and in mobile solutions. Intel does offer a good solution for people who are uninterested in graphics performance or quality: a 2D solution that supports a minimal set of DirectX features but, as Jen-sun said, "is a joke" when compared to any real 3D hardware.
So what's different, aside from the already clear gains NVIDIA has made in the marketplace? NVIDIA says that something called double attachment is largely to blame for the inaccuracy of the data spread by Intel and analysts. Jen-sun claims that a huge proportion of Intel motherboards that include integrated graphics have discrete graphics cards plugged into them. The idea is that Intel basically gives away their integrated hardware, so there's no reason not to ship it in a system. But shipment numbers say nothing about the actual usage of these parts.
As an example, Jen-sun made the point that if Intel integrated a tiny graphics core onto all their CPUs, they could claim 100% market share of graphics on Intel systems using their current logic. The problem is that giving away crap doesn't mean people will use it. To help determine double attachment, NVIDIA looked at a couple of different metrics relating to CPU and GPU sales.
With total GPU shipments at 336 million units and total CPU shipments hitting only 273 million, double attachment can help explain why so many more GPUs were sold than CPUs: if CPU sales more closely represent the number of systems sold or built last year, there are a large number of computers with unused integrated graphics in them, each counting as two shipped GPUs. This overlap would mean that Intel's shipped graphics numbers greatly overinflate the market impact of their graphics products.
Of course, we can't ignore the fact that the average PC enthusiast will likely upgrade their graphics card before their CPU and that multi-GPU solutions do account for at least a few of those shipments. We can't discount all of these shipments as double attachment, but it seems at least plausible that NVIDIA's real market share is somewhere between 65% and 75% based on the numbers they showed us. This is definitely more impressive than what we see on the surface.
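To make the arithmetic concrete, here is a back-of-the-envelope sketch of that adjustment. Only the 336 million and 273 million shipment totals come from NVIDIA's presentation; the per-vendor split below is a made-up figure purely for illustration.

```python
# Rough sketch of the double attachment argument. Only the 336M/273M totals
# come from NVIDIA's presentation; the Intel split below is hypothetical.
total_gpus = 336e6   # total GPUs shipped last year
total_cpus = 273e6   # total CPUs shipped last year

# If CPU shipments roughly track the number of systems built, the surplus
# GPUs suggest many systems shipped with both an integrated and a discrete part.
surplus = total_gpus - total_cpus
print(f"GPU shipments exceed CPU shipments by ~{surplus / 1e6:.0f}M units")

# Hypothetical example: suppose Intel shipped 150M integrated GPUs.
intel_shipped = 150e6
raw_share = intel_shipped / total_gpus

# If the surplus units are integrated parts sitting unused beneath a discrete
# card, Intel's share of GPUs actually driving displays is much smaller.
effective_share = (intel_shipped - surplus) / total_cpus
print(f"Raw shipped share: {raw_share:.0%}, effective in-use share: {effective_share:.0%}")
```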
Intel's Graphics Performance Disadvantage
It is no secret that Intel's integrated graphics are very, very slow and nearly useless for most modern 3D graphics applications. So when Intel says we should see their integrated graphics parts increase in performance 10x by 2010, we should get excited, right? That is much faster than Moore's law in terms of performance over time (we should only expect a little more than 2x performance over a two-year period, not an order of magnitude).
The problem with this situation as noted by NVIDIA is that today's Intel integrated graphics parts are more than 10x slower than today's affordable discrete graphics parts.
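As a quick sanity check on that parenthetical, here is the rough scaling math, assuming (our assumption, purely for illustration) that performance doubles roughly every 18 to 24 months:

```python
# Expected performance scaling under a Moore's-law-style cadence, assuming
# (purely for illustration) a doubling every 18 to 24 months.
years = 2
for doubling_period_months in (18, 24):
    scaling = 2 ** (years * 12 / doubling_period_months)
    print(f"Doubling every {doubling_period_months} months -> ~{scaling:.1f}x over {years} years")
# Roughly 2.5x and 2.0x -- nowhere near the 10x Intel is promising, which is
# why a 10x jump would have to come from somewhere other than normal scaling.
```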
Looking at that chart, we can see that Intel integrated graphics will be competitive with today's sub-$100 hardware by the year 2010 in 3DMark06. With four-year-old games, NVIDIA's current hardware will still blow away Intel's 2010 integrated solution, and the margin climbs even higher if you look at a modern game. NVIDIA claims that these tests were done with low quality settings as well, but we can't speak to that as we weren't the ones running the tests. If that's the case, then the differences could be even larger.
The bottom line is that Intel doesn't have a solution that is within reach of current graphics hardware even if it could produce 10x performance today. In two years, after NVIDIA, AMD, and even S3 have again at least doubled performance from what we currently have, there's no way Intel can hope to keep up.
And NVIDIA wasn't pulling any punches. Jen-sun went so far as to say: "I wouldn't mind it if sometimes they just say thank you – that it's possible to enjoy a game on an Intel microprocessor [because of NVIDIA graphics hardware]." This is certainly an audacious statement of the facts, even if it happens to be the truth. AMD can take some of the credit there as well, of course, but the point is that Intel couldn't make it on its own as a gaming platform today without the help of its competitors. Believe me when I say that we are trying to put a neutral spin on all this, but Jen-sun was out for blood today, and he absolutely hit his mark.
To prove the point further, they went to the data Valve has been collecting through Steam. This data must be taken with the proper dose of salt – it doesn't reflect the entire GPU market by any means. The Steam data reflects a specific subset of GPU users: gamers. Even more to the point, this is a specific subset of gamers: gamers who play games using Steam who chose to report their system information (anonymously of course). Yes the sample size is large, but this is by no means a random sample of anything and thus loses some points in the statistical accuracy department.
Steam data indicates that NVIDIA GPUs are utilized in 60% of Steam gamers' boxes, while Intel GPUs are utilized in only 2% of those surveyed. As Jen-sun pointed out, this isn't about juggling a few percentage points: gamers who use Steam clearly do not use Intel GPUs to play their games. The picture gets even grimmer for Intel when you look at DX10-specific data: NVIDIA has about 87% of the GPUs running in DX10 boxes while Intel powers only 0.11% (of which Jen-sun said, "I think that's just an error," and he may well have been right). He was on target when he said that in this case "approximately zero seems statistically significant."
The implication is clear that while few gamers use Intel GPUs in the first place, gamers aren’t using Intel GPUs for DX10 gaming at all. We certainly buy that, as Intel does not even have a DX10 part on the market right now, but again this data is not a statistically accurate representation of the entire gaming population. Steam data is certainly not a bad indicator when taken in a proper context and as part of a broader look at the industry.
Further refuting the idea that Intel can displace NVIDIA, Jen-sun addressed Ron Fosner's assertion that multicore processors can handle graphics better than a dedicated graphics card ever could. This is where NVIDIA gets into a little counter-FUD action, with Jen-sun showing that adding cores does nothing for gaming or graphics benchmark performance. In this case, Intel was certainly referring to the ability of CPUs to handle graphics code written specifically for multicore CPUs. But for the time being, when you compare adding cores to your system to adding a more powerful GPU, NVIDIA offers up to 27x more bang for your buck.
Currently Intel has some fairly decent lab demos, and there have been murmurs of a software renderer renaissance (I'd love to see John Carmack and Tim Sweeney duke it out one more time in software; maybe that's just me), but there just isn't anything in production that even tries to show what a CPU can or can't do when in direct competition with a GPU for graphics quality and performance. And there are reasons for that: it still isn't practical to develop the software. Maybe when everyone is running their 8-core, 16-thread CPUs we'll see something interesting. But right now, and even in the next few years, rasterization is going to be the way to go, and pure FLOPS with massive parallelism will win every time over Intel's programmability and relatively light parallelism.
Which brings us to a point NVIDIA made later in the day: GPUs are already multicore to the point where NVIDIA likes to refer to them as manycore (as we saw Intel do with 100+ core concepts a few years back when they were first starting to push parallelism). It's a stretch for me to think of the 128 SPs in a G80 or G92 GPU as "cores" because they aren't really fully independent. But with the type of data GPUs normally tackle, it's effectively very similar. Certainly, no matter how you slice it, GPUs are much wider hardware than any current multicore CPU.
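To illustrate why the distinction matters less for graphics workloads, here is a tiny sketch (our own illustration, not anything NVIDIA showed) of the data-parallel pattern GPUs are built for: the same small program applied to a huge batch of independent elements, where width, not per-lane independence, is what pays off. NumPy stands in for the wide hardware here.

```python
import numpy as np

# One RGB triple per pixel of a 1080p frame: ~2 million independent work items.
pixels = np.random.rand(1920 * 1080, 3)

def shade(rgb):
    # A trivial "pixel shader": identical math applied to every element,
    # which is exactly the kind of work that maps onto many lockstep SPs.
    return np.clip(rgb * 1.2 + 0.05, 0.0, 1.0)

frame = shade(pixels)          # the whole frame shaded in one wide operation
print(frame.shape)             # (2073600, 3)
```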
The point NVIDIA needs to make is that the argument is far from over, as the battle hasn't even really begun. At one point Jen-sun said "[Intel] can't go around saying the GPU is going to die when they don't know anything about it." This is a fair statement, but NVIDIA can't write Intel off either. Intel certainly will know about GPUs if it truly intends to go down the path it seems destined to travel. Either it'll be pushing the CPU (and all its multicore glory) as a bastion of graphics power (for which some free multicore CPU graphics development tools might be nice, hint hint), or it will essentially be entering the graphics market outright with whatever Larrabee ends up actually becoming.
The Tenderloin and the Two Buck Chuck
As for the idea of Intel integrating a GPU onto their CPUs, NVIDIA painted a rather distasteful picture of mixing together something excellent with something incredibly sub par. The first analogy Jen-sun pulled out was one of someone's kid topping off a decanted bottle of '63 Chateau Latour with an '07 Robert Mondavi. The idea of Intel combining their very well engineered CPUs with their barely passable integrated graphics is an aberration to be avoided at all costs.
This isn't to say that CPUs and GPUs shouldn't work together, but that Intel should stick to what they know. In fact, NVIDIA heavily pushed the idea of heterogeneous computing but decried the idea that taking a system block diagram and drawing a box around the CPU and GPU would actually do anything useful. NVIDIA definitely wants their hardware to be the manycore floating point compute hardware paired with Intel's multicore general purpose processors, and they try to paint a picture of a world where both are critical to any given system.
Certainly both CPUs and GPUs are currently needed, and unless Intel can really pull out some magic, that won't change for the foreseeable future. NVIDIA made a big deal of relating this pair to Star Trek technology: you need both your impulse engines and your warp drive. Neither is useful for the task the other is designed for: short range navigation can't be done with a warp drive, and impulse engines aren't suitable for long distance travel requiring faster-than-light speeds. The bottom line is that hardware should be designed and used for the task that best suits it.
Again, this says nothing about what happens if Intel brings to market a competitive manycore floating point solution. Maybe the hardware they design will be up to the task, and maybe it won't. But Jen-sun really wanted to get across the idea that the current incarnation of the CPU and the current incarnation of Intel's GPU technology are nowhere near sufficient to handle anything like what NVIDIA's hardware enables.
Coming back to the argument that it's best to stick with what you know, Jen-sun stated his belief that "you can't be a great company by doing everything for everybody;" that Intel hardware works fine for running operating systems and for applications where visualization is not a factor at all: what NVIDIA calls Enterprise Computing (in contrast to Visual Computing). Going further, he postulates that "the best way for Google to compete against Microsoft is not to build another operating system."
Making another backhanded comment about Intel, Jen-sun later defended NVIDIA's recent loss of market share in low-end notebook graphics. He held that the market just wasn't worth competing in for NVIDIA and that other companies offered solutions that fit that market better. Defending NVIDIA's lack of competition in this segment, he doesn't say to himself: "Jen-sun, when you wake up in the morning, go steal somebody else's business," but rather "we wake up in the morning saying, 'ya know, we could change the world.'"
New Spin on Computer Marketing
Beyond all the FUD fighting, NVIDIA talked about a new push toward marketing computers not in terms of CPU speed or GPU model number or whatever the spec of the week may be, but in terms of what the computer is designed to do. NVIDIA, OEMs, and retailers have all gotten behind the idea that it would be a good practice to start building and marketing their systems not as low-end, midrange, or high-end, but as gaming computers, multimedia computers, workstations, or business application computers.
If system builders choose to balance CPU and GPU power to favor specific applications rather than just throwing all low-end, all midrange, or all high-end components at a system, they can deliver much better bang for the buck to people looking to use their PC for a specific purpose. NVIDIA refers to this idea as the Optimized PC Initiative. It's kind of a side note and not an NVIDIA-centric line of thought, but it is an idea that could really help the uninitiated understand what they are getting when they purchase a system. In fact, this is one of the areas that really impressed us with the Gateway P-6831 FX notebook: it is balanced for great midrange gaming performance.
Final Words
No one can deny Jen-sun's love for his company and his hardware, but while his presentation was impressive and impassioned, we must not discount Intel's ability to compete. They are impressive in capability (they've got a lot of brilliant engineers over there) and size (they've also got a lot of money). We also can't forget that Intel is a silicon company: they've got absolutely huge resources to dedicate to producing the most bleeding-edge silicon on which to build their ICs. With the sheer size and power requirements of today's GPUs, every little bit helps. A massively parallel floating point engine designed by Intel engineers to match the power of the GPU, and then fabbed on Intel's own silicon, could be a huge coup if they are only willing to really commit to the task and put their money (and their minds) where it matters most.
Now that both Intel and NVIDIA have hit the mat and acknowledged each other as true competitors, we hope to see some huge things happen in terms of computer graphics and massively parallel floating point computing in general. Today marks the beginning of a new era in the desktop PC world: the beginning of a battle between the world's greatest silicon company and the world's greatest dedicated IC design house.