DirectX 10.1 makes current hardware obsolete? Not!

Having just returned from Siggraph this year, I was fielding some questions from co-workers. “Didja hear that Microsoft announced DirectX 10.1?”, “Yeah”, “And that it makes all DX10.0 hardware obsolete?” – to which my witty reply was “huh?” I was there when the DX10.1 features were described, and I’m sure I would have noticed it if Microsoft had made such an announcement.

What I do remember was the description of the architecture for DX10 and why it breaks from the DX9 interface. Basically it’s the fact that the API has just gotten bigger and bigger, to the point where 1) the underlying hardware doesn’t work the way the API was originally laid out, and 2) the drivers are now these huge things that force legacy API calls to talk to the current hardware setup. DX10 (and OpenGL 3 for that matter) is where we make a clean break and get back to a thin driver layer over the GPU. Gone is the fixed-function lighting pipeline of yore. In fact the whole BeginScene – render – EndScene architecture is gone. Cap bits are finally going away and DX is adopting (waaaay too late) the OpenGL conformance test model. In other words, DX10.0 is an API specification. If you want your hardware to be certified as DX10.0 compatible, it has to run all the features that are in the DX10.0 spec (and, I assume, generate conforming output when tested against the API). Programmers now get to code to one API, not various flavors, and the consumer gets to know that a DX10 card will run all DX10 games.
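To give a sense of how much thinner the new model is, here’s a minimal, untested C++ sketch of bringing up a D3D10 device and pushing out a frame (the window handle hWnd and the 800x600 size are just placeholders). Notice there’s no BeginScene/EndScene pair and no cap bits to query – if device creation succeeds, the whole DX10.0 feature set is there:

    // Minimal D3D10 frame setup; hWnd is assumed to be an existing window.
    #include <d3d10.h>

    ID3D10Device*           device    = NULL;
    IDXGISwapChain*         swapChain = NULL;
    ID3D10RenderTargetView* rtv       = NULL;

    DXGI_SWAP_CHAIN_DESC scd = {0};
    scd.BufferCount       = 1;
    scd.BufferDesc.Width  = 800;
    scd.BufferDesc.Height = 600;
    scd.BufferDesc.Format = DXGI_FORMAT_R8G8B8A8_UNORM;
    scd.BufferUsage       = DXGI_USAGE_RENDER_TARGET_OUTPUT;
    scd.OutputWindow      = hWnd;
    scd.SampleDesc.Count  = 1;
    scd.Windowed          = TRUE;

    HRESULT hr = D3D10CreateDeviceAndSwapChain(
        NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL, 0,
        D3D10_SDK_VERSION, &scd, &swapChain, &device);

    if (SUCCEEDED(hr))
    {
        // Bind the back buffer as the render target.
        ID3D10Texture2D* backBuffer = NULL;
        swapChain->GetBuffer(0, __uuidof(ID3D10Texture2D), (void**)&backBuffer);
        device->CreateRenderTargetView(backBuffer, NULL, &rtv);
        backBuffer->Release();
        device->OMSetRenderTargets(1, &rtv, NULL);

        // A "frame" is now just: clear, draw calls, present.
        FLOAT clearColor[4] = { 0.0f, 0.2f, 0.4f, 1.0f };
        device->ClearRenderTargetView(rtv, clearColor);
        // ... set shaders/buffers and call device->Draw(...) here ...
        swapChain->Present(0, 0);
    }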

OK, so what’s the difference between DX10.0 and DX10.1? Basically what I heard was that DX10.1 is what Microsoft wanted for the Vista ship, but not all the major hardware vendors could get all the features into the current hardware generation. So what shipped was most of the features, minus some of the more esoteric things (like how 4-sample full-screen antialiasing is implemented). The reason DX10.1 is coming out so quickly is that Microsoft wants the spec out there so developers can see what’s going to be available in the near future, as well as to put a stake in the ground that hardware vendors have to meet. The new features in DX10.1 are:

  • TextureCube Arrays which are dynamically indexable in shader code.
  • An updated shader model (shader model 4.1).
  • Full 32-bit floating-point filtering.
  • The ability to select the MSAA sample pattern for a resource from a palette of patterns, and retrieve the corresponding sample positions.
  • The ability to render to block-compressed textures.
  • More flexibility with respect to copying of resources.
  • Support for blending on all unorm and snorm formats.

So, as the Microsoft guy said, it’s all about rendering quality. I doubt that when DX10.1 comes out your DX10.0 game will suddenly stop working. These are just enhancements to the API that don’t reflect the current state of the hardware, only where the hardware will be forced to go in the near future. The DX9 API is getting a final revision and will then be frozen, so any non-Vista OS will be able to run a DX9 (or 8, or 7) game, as will Vista, since the DX9 DLL will coexist with the DX10.x DLL on Vista. If you want to try it out you’ll need the Vista SP1 beta and the D3D10.1 Tech Preview – both will be downloadable from Microsoft.
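If you want a feel for what that coexistence might look like in code, here’s a rough, hypothetical sketch (assuming the d3d10_1.h header from the Tech Preview) of asking for a 10.1 device and falling back to plain 10.0 hardware – existing DX10.0 code paths keep running either way:

    // Hypothetical sketch: try to create a D3D10.1 device, fall back to
    // the 10.0 feature level on current DX10.0 hardware.
    #include <d3d10_1.h>

    ID3D10Device1* device = NULL;
    D3D10_FEATURE_LEVEL1 levels[] = {
        D3D10_FEATURE_LEVEL_10_1,   // try the new 10.1 feature set first
        D3D10_FEATURE_LEVEL_10_0    // fall back to plain DX10.0 hardware
    };

    HRESULT hr = E_FAIL;
    for (int i = 0; i < 2 && FAILED(hr); ++i)
    {
        hr = D3D10CreateDevice1(NULL, D3D10_DRIVER_TYPE_HARDWARE, NULL,
                                0, levels[i], D3D10_1_SDK_VERSION, &device);
    }

    if (SUCCEEDED(hr) && device->GetFeatureLevel() == D3D10_FEATURE_LEVEL_10_1)
    {
        // Safe to use the 10.1-only goodies (TextureCube arrays, selectable
        // MSAA sample patterns, blending on all unorm/snorm formats, ...).
    }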

Posted in DirectX, Graphics API | Leave a comment

Nival closes LA office, Kevin Bachus not seen…

I’ve always been fascinated by some of the folks who used to work in the DirectX/Xbox groups at Microsoft and how they manage to pursue (seemingly) lucrative, somewhat high-profile careers with nothing but a string of empty promises behind them. Kevin is one of my favorites. After leaving Microsoft he and a few other famous game industry names formed the Capital Entertainment Group, which made grandiose plans and folded after a year without really doing much. He was then hired as CEO of a company called Infinium, with a never-to-be-released gaming console/service called the Phantom. He was often seen singing praises about how great the Phantom would be, how great the service would be, and how, no, it wasn’t vaporware. Somehow, after a while, he managed to move the company offices from Florida to LA (where he lived), then left the company 5 months later. And then sued them for back pay. He was hired by Russian RTS developer Nival (Heroes of Might and Magic) (which was owned by the Florida-based Ener1 Group) as CEO in March 2006. Apparently Ener1 wasn’t happy with the low profits coming out of the LA office and quietly closed it in December 2006. No word on what Kevin is up to now…

Update: There’s a nice – albeit short – reflective guest piece by Kevin on the Xbox development effort that you can read here.

Posted in Ex-Microsoft, Too Weird | Leave a comment

Is Intel just getting back into the graphics business, or are they going to change it?

It’s no great secret that Intel has been eyeing the discrete graphics market. Intel typically owns about 30-40% of the desktop graphics market, but that’s strictly integrated (and hence usually considered underpowered) graphics. ATI and Nvidia own most of the rest of the market, and anyone who’s interested in playing 3D games wouldn’t consider using an integrated graphics solution if they could help it. Apparently the acquisition of ATI by rival AMD has prompted Intel to start talking openly about discrete graphics. In fact Intel has recently started aggressively hiring engineers (both software and hardware) for its Visual Computing Group. From the recruitment copy:

Join us as we focus on strengthening our leadership in integrated and high-throughput graphics and gaming experiences by developing innovative processing products based on a many-core architecture. We’re looking for engineers, developers, and architects who share our vision and understand what can happen when serious skills and vast resources join forces.

It would seem that Intel is pushing a many-core architecture for the 2008-2009 timeframe. Given Intel’s manufacturing chops they could, if they set their mind to it, make a pretty deep impression on the discrete graphics market. Given that timeframe you’re looking at a 10x to 20x performance boost over the current top GPU, Nvidia’s G80. Even if Intel makes a few missteps and produces something that’s underpowered compared to the best from ATI or Nvidia, they would still probably be competitive on price alone.

Or perhaps they could be doing something to surprise us all? Intel recently did something pretty smart: they hired Daniel Pohl, a recent Erlangen University graduate. Daniel coauthored a paper – Realtime ray tracing for current and future games – in which the case is made that the traditional hardware rendering pipeline – i.e. objects built up independently of each other, with depth created through the use of a z-buffer, and all object-object visual interactions (shadows, reflections, etc.) having to be added on – has just about reached the end of its lifetime. Daniel is pushing raytracing instead. Raytracing lets you build up complex scenes in which all the lighting, shadowing, transparency, reflection, caustics, etc. are handled by the ray tracer. Granted, the framerates Daniel gets on the custom raytracer board were about 10% of current consumer-level boards, but raytracing will be the next big step in graphics architecture, not a kludge like the doomed Talisman architecture. Raytracing really is the way to render scenes. Maybe Intel will be the first?
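To make the contrast concrete, here’s a tiny toy sketch of my own (not code from Daniel’s paper): the same ray-sphere intersection test that resolves visibility also gives you shadows by firing a second ray at the light – no z-buffer, no shadow maps:

    // Toy raytracing sketch: one intersection routine handles both
    // primary visibility and shadow queries. Ray directions are assumed
    // to be normalized.
    #include <cmath>

    struct Vec3 { float x, y, z; };
    static Vec3  sub(Vec3 a, Vec3 b) { return Vec3{ a.x - b.x, a.y - b.y, a.z - b.z }; }
    static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

    struct Sphere { Vec3 center; float radius; };

    // Distance along the ray to the nearest hit, or negative on a miss.
    static float intersect(Vec3 origin, Vec3 dir, const Sphere& s)
    {
        Vec3 oc   = sub(origin, s.center);
        float b   = dot(oc, dir);
        float c   = dot(oc, oc) - s.radius * s.radius;
        float disc = b * b - c;
        if (disc < 0.0f) return -1.0f;
        return -b - std::sqrt(disc);
    }

    // Is the point lit? Trace a ray toward the light and see whether the
    // occluder blocks it -- shadows fall out of the same visibility test.
    static bool isLit(Vec3 point, Vec3 toLight, const Sphere& occluder)
    {
        return intersect(point, toLight, occluder) <= 0.0f;
    }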

Posted in Graphics Hardware | Leave a comment

Using Aero Glass on Vista: my Microsoft Developer Network Article is up!

One of the nice things about Vista is that the display architecture was rewritten as a composition engine. Every window gets some off-screen memory to render itself to, and then all these windows are composited together onto the desktop. This means that windows are totally independent from whatever they are rendered over, and that it’s possible to stick effects into the composition pipeline. Vista’s Aero Glass interface is a simple demonstration of the power of this new architecture. The “glass” effect is the ability to tag regions of a window as “glass”; those areas are composited with whatever parts of the desktop lie underneath them and then blurred with a hard-coded pixel shader to give the impression of a frosted-glass edge on Aero Glass enabled windows. Sadly, the Basic version of Vista can’t run Aero Glass. But if you’re running the Premium, Business, or Ultimate versions and you have some recent (i.e. DX9) hardware, you’re all set. Take a look! You can find the article here.
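The client-side API for playing with this is tiny. Here’s a minimal sketch using the public DWM calls (hwnd is assumed to be your top-level window); the article walks through the full details:

    // Minimal Aero Glass sketch using the DWM API (dwmapi.h / dwmapi.lib).
    #include <windows.h>
    #include <dwmapi.h>
    #pragma comment(lib, "dwmapi.lib")

    void EnableGlass(HWND hwnd)
    {
        BOOL composited = FALSE;
        if (SUCCEEDED(DwmIsCompositionEnabled(&composited)) && composited)
        {
            // Extend the frame 40 pixels into the top of the client area;
            // the DWM blurs whatever shows through that region.
            MARGINS margins = { 0, 0, 40, 0 };   // left, right, top, bottom
            DwmExtendFrameIntoClientArea(hwnd, &margins);
        }
    }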

Posted in Code, Published Articles | Leave a comment

NVIDIA Releases Developer Tools

At GDC, graphics chip maker NVIDIA announced they are releasing a bunch of updated and new tools. The tool upgrades are FX Composer 2, PerfHUD 5, and ShaderPerf 2. They are also releasing a new GPU-accelerated texture tool, plus a Shader Library. The most interesting item is the DX10 SDK. Targeted at the GeForce 8 series of GPUs (the only GPUs that can run D3D10 so far), it’s a collection of samples for both OpenGL and DirectX, executables and source, that demonstrate and showcase DX10 features. The installer looks and acts like the Microsoft DX Sampler.

The DirectX collection makes up the bulk of the samples, while the OpenGL side is a bit thin. The samples include Rain, Smoke, Fur, Shadows, cloth simulation, Render Target usage, etc. The Direct3D SDK is 256MB while the OpenGL SDK is 45MB.

To compile the source you’ll need Microsoft Visual Studio 2005; to build the DX samples you’ll also need the Feb. 2007 DirectX SDK installed (which you can get here).

If you actually want to run the code you’ll need a DirectX 10 video card – which currently means an NVIDIA GeForce 8. There are videos of the programs, so even if you don’t have a DX10 video card you can still see them running.

Posted in DirectX, Graphics API, Graphics Hardware, OpenGL, PC Graphics | Leave a comment

Enhancing Vista Performance for under 10 bucks

I’m going to pass along one cool thing I found out about Windows Vista. Now, I normally try to put a lot of my working files on a RAM disk – I spend most of my day writing code, and a lot of that time is spent writing files to the hard disk. The Vista folks revisited this idea by enhancing Vista’s caching mechanism to let you add an inexpensive USB 2.0 memory device to your PC and boost performance by up to 100% in some situations. I was able to add a cheap 2GB SD memory card to my PC for under $10 (from Buy.com after rebate) and boost my PC’s performance. First let’s go over the technologies that make this possible:

SuperFetch: SuperFetch is an enhanced version of XP’s Prefetch feature; it examines which apps you load frequently and intelligently preloads them into memory. It is pretty dependent on how predictable you are, but if you tend to use a few apps frequently, SuperFetch will attempt to preload their data and can significantly speed up load times.

ReadyBoost: SuperFetch will preload data to your hard drive’s virtual memory page file so the data is in the right format to be read into memory – it’s just sitting on your hard disk. ReadyBoost is the next step. If you have a flash memory device on your machine (a USB drive, SD card, etc.) you can designate all or part of it as a ReadyBoost device, and the SuperFetch data will be loaded onto that device instead. When you insert a memory device you’ll see the “Speed up my System” option appear.

ReadyDrive: A ReadyDrive is a new piece of hardware – essentially a hard disk with ReadyBoost flash memory built into it. These hard disks are just starting to be built now, but in the meantime you can get most of the benefits of one by designating some USB memory for use as ReadyBoost.

After Windows tests and passes the device, you’ll get the ReadyBoost properties dialog.

Here are some facts that you should know before you start.

How fast?: The memory should be pretty fast: 2.5MB/sec throughput for reads, 1.75MB/sec for writes. Vista will test the memory and fail the device if it’s too slow. I picked up a cheap 2GB USB drive only to find it didn’t pass muster. Pick a good one. (Sometimes reformatting as NTFS will allow it to pass.)

What size?: The range is 250MB – 4GB. The suggested ratio is 1:1 (low end) to 1:2.5 (high end) of system memory to flash memory – so a machine with 1GB of RAM would want somewhere between 1GB and 2.5GB of flash. The current limitation is one ReadyBoost drive per machine.

Is the data secure?: All cache data is encrypted.

What happens when the drive is pulled?: The system falls back to the hard disk. The device holds a cache of what’s already on the hard disk, so the USB memory just speeds access to it. SD cards have one advantage – unlike most USB drives they don’t have an “in-use” LED, which in my floppy drive/USB/19-in-one card reader is a plus.

ExtremeTech did a test of some USB drives in their article: USB Flash Memory for Windows Vista ReadyBoost

Here’s a list of ReadyBoost tested devices.

For more info on these technologies see the Microsoft Vista Performance page

See Tom Archer’s Blog for a ReadyBoost Q&A

Posted in Miscellaneous | Leave a comment

New “Longhorn” build due at April WinHEC

In an e-mail to developers on Wednesday, Microsoft said it would offer a new developer preview release of Longhorn at the company’s Windows Hardware Engineering Conference (WinHEC), April 25-27 in Seattle. The preview will be the first public build of Longhorn in a year, and a lot has changed, internally, since Microsoft moved Longhorn to a new component-based structure that will make the system easier to install and modify. Given Longhorn’s schedule, the WinHEC build will be a pre-beta 1 release. Other topics include enhancements that Longhorn will bring to mobile computing, including support for secondary displays. Microsoft lists a session devoted to the hardware requirements for Longhorn. “This session explores the components that define a Windows Longhorn-ready PC and covers core system requirements, marketing considerations, and the timeline for customer awareness leading up to Windows Longhorn PC availability,” Microsoft said on the site.

Posted in Miscellaneous | Leave a comment

Take Two faces “murder training” lawsuit

Take Two, the publisher of the Grand Theft Auto game series, is once again facing yet another frivolous lawsuit, one that alleges its software was complicit in murder. The legal action was filed on behalf of the families of police force staff shot dead in Fayette, Alabama in 2003, allegedly by one Devin Thompson, who had been apprehended on suspicion of driving a stolen car. The lawsuit maintains that Thompson’s actions that day were inspired by the GTA series, games he is claimed to have played obsessively, and that the games amounted to “training” for the alleged killings. The lawsuit claims the video game “Grand Theft Auto” led Thompson to shoot two police officers, Arnold Strickland and James Crump, and a dispatcher, Leslie Mealer, to death in 2003, mirroring violent acts depicted in the popular game. Thompson is accused of killing the three men in June 2003 after being brought to the Fayette police station; he allegedly grabbed one of the officers’ guns, shot him and the other two, then fled in a patrol car. “What has happened in Alabama is that four companies participated in the training of Devin … to kill three men,” attorney Jack Thompson told The Tuscaloosa News, which reported the suit’s filing.

Thompson is now 18 years old, but at the time of the shootings he was 16. As such, the lawsuit claims, he should not have been sold GTA III and GTA: Vice City, which carry an M rating – for ‘mature audiences only’, i.e. anyone 17 years old or older. On that basis, the plaintiffs requested that the book also be thrown at retailers Wal-Mart and GameStop for allegedly allowing Thompson to buy the games. The suit also names Sony, as manufacturer of the PlayStation 2 console on which Thompson is said to have played the games. This isn’t the first time GTA has gotten its publisher and retail partners into trouble. At least two lawsuits relating to the game are currently pending against Take Two and, separately, Best Buy. The lawsuit was announced in the same week that the US Interactive Entertainment Merchants Association (IEMA) publicly criticized the California legislature’s attempt to ban the sale of violent games to children.

Posted in Game Industry | Leave a comment