I have to admit that I wasn’t particularly enthralled when NVIDIA announced Cg. It’s bad enough that we’ve got DirectX 9’s HLSL and OpenGL 2.0’s GSLang. As someone who got to program DirectX, OpenGL, Glide and CIF (ATI’s proprietary 3D API, RIP), I really hate competing graphics APIs and languages. I feel it’s a waste of time to reinvent the wheel in a different flavor, and I’d really rather be creating something new than porting code. On the other hand, I ran across this article by Colin Stoner that hits on some of the more recent uneasiness about Cg. Colin does raise some interesting points: even though Cg is open source and theoretically could output code for ATI chips, ATI doesn’t give a shit since they’re on the HLSL/GSLang wagon, and NVIDIA is getting games to brand themselves with the NVIDIA logo. Still, he’s really complaining about something that, on the whole, I don’t think is a bad direction for the PC graphics world.
I got a fair amount of (good-natured but strident) flak from NVIDIA for my “short-sighted” viewpoint. Yeah, well, sorry. Cg is so close to HLSL that I understood the need to get something out there while HLSL shaped up. But hey, HLSL’s here now, so why hasn’t Cg been merged into HLSL? The recent Cg book by Fernando and Kilgard (a really nice book, by the way; see the Gamasutra article) is being followed by another one, which seems strange to me. The Cg book is selling well, but it just came out; it’s a bit soon to be following up with another one. This seems to point to Cg being around for a while, and NVIDIA pushing some not insignificant resources at it. Let’s face it: NVIDIA ain’t stupid. They’ve got some of the smartest engineers in the business, even though I don’t care much for some of their marketing practices. What other reason could there be for Cg to exist when HLSL/GSLang could easily fill the role? It’s got to be because NVIDIA owns Cg. It owns a high-level rendering language. What can you do with that? Owning it is only an advantage if you can do something with it. NVIDIA quietly picked up some IP last year that could make the CineFX engine a truly cinematic experience. This is all sheer speculation on my part, but it’s what I’d do if I had those resources. NVIDIA’s market share is still twice as big as ATI’s, and a nice end run, executed correctly, could cut ATI off at the knees and put to rest any doubt about who’ll supply the chips for Xbox2.
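To make the “Cg is so close to HLSL” point concrete, here’s a minimal vertex shader sketch. As far as I can tell, this exact source compiles under both the Cg toolkit and DirectX 9’s HLSL without changes, since the two languages share C-style syntax, the float4/float4x4 types, mul(), uniform parameters, and output semantics. (The struct and parameter names are mine, purely for illustration.)

```
// Minimal transform-and-pass-color vertex shader.
// The same source should build as Cg (e.g. the vs_1_1 profile)
// or as DX9 HLSL (fxc /T vs_1_1 /E main) -- the syntax is shared.
struct VOut {
    float4 pos   : POSITION;   // clip-space position
    float4 color : COLOR0;     // interpolated vertex color
};

VOut main(float4 pos   : POSITION,
          float4 color : COLOR0,
          uniform float4x4 modelViewProj)  // set by the app
{
    VOut o;
    o.pos   = mul(modelViewProj, pos);    // object -> clip space
    o.color = color;                      // pass color through
    return o;
}
```

Given that a shader like this is effectively portable between the two, the duplication is exactly why I keep asking what Cg buys NVIDIA beyond ownership.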
So how can this be a good thing? DX9 is going to be here for a while, and ATI is settling into a slower R&D cycle. Ho hum. Nothing new on the horizon; it looks like smooth, straight sailing ahead. Unless someone decides to rock the boat.