Zafehouse 2 makes the move to Direct3D… without the 3D bit

About three weeks ago, I put the finishing touches on the Direct3D 9 renderer for Zafehouse 2. Previously, the game exclusively used GDI+, the default graphics API for .NET. Now it runs on a bizarre fusion, with Direct3D performing compositing and GDI+ painting the final image onto the screen.

Well, except when I use native GDI calls to paint instead. Actually, now that I think about it, GDI+ doesn’t do a heck of a lot for Zafehouse 2 anymore.

Now, this change raises a number of questions: Why not use Direct3D for the entire process? Why change APIs in the first place? Isn’t the Direct2D API in DirectX 10+ basically what you’re describing?

These are all fair questions, and I’m going to try to answer them.

Why not use Direct3D for the whole shebang?
Because I’ve already done so much groundwork for image rendering that replacing the engine wholesale would add a lot of time to the development process. Considering the game is already years overdue, I doubt anyone would be excited to hear I was delaying it just to rewrite a chunk of code that doesn’t need to be rewritten. It all works fine; I just needed something more powerful than GDI+ to do the heavy lifting.

Why move to Direct3D in the first place?
Primarily, it was a question of speed. Secondarily, I wanted to improve my knowledge of the DirectX API, having already implemented my own xWMA audio engine, which relies on the XAudio2 API. So the game already depended on DirectX, through the managed SlimDX wrapper.

Isn’t what you’ve described basically Direct2D?
It kind of is, and I could have just used the Direct2D API to quickly accomplish the GDI/GDI+/Direct3D fusion. Except Direct2D is a god-awful API. That, and Direct2D requires a DirectX 10-compliant graphics card, as well as Vista or Windows 7. It’s bad enough that Z2 now needs a DirectX 9.0c-compliant card, and I didn’t want to make the situation worse. By writing my own routines, Z2 remains compatible with Windows XP and graphics cards circa 2005. Which is pretty good, all things considered.

How does it work?
In technical terms, Zafehouse 2 generates a handful of D3D-compatible textures from PNGs at startup, for compositing purposes. This takes a couple of milliseconds. We also create some device-independent bitmaps for direct blitting, but we don’t need to worry about those here.
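For the curious, here’s roughly what that startup step looks like. This is a minimal sketch assuming SlimDX’s Direct3D9 wrapper, with placeholder file names rather than the game’s actual assets:

```csharp
using SlimDX.Direct3D9;

// Startup: turn each compositing PNG into a D3D9 texture, once, up front.
static Texture[] LoadCompositingTextures(Device device)
{
    // Placeholder asset names for illustration only.
    string[] sources = { "overlay_base.png", "overlay_blood.png", "overlay_fog.png" };
    var textures = new Texture[sources.Length];

    for (int i = 0; i < sources.Length; i++)
    {
        // D3DX decodes the PNG and uploads it to the device in one call.
        textures[i] = Texture.FromFile(device, sources[i]);
    }

    return textures;
}
```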

Whenever compositing is required, the game sends the textures to the D3D renderer, paints them to an off-screen surface, downloads the data into system memory, and then blits it directly to the GDI+ graphics object using GDI calls. In the case of Intel chips, we lock the surface and perform the blit directly. In the case of NVIDIA/AMD cards (which use dedicated video RAM), we copy the surface to another surface in system RAM using GetRenderTargetData, and then blit directly from that surface.

The reason for the two different paths: Intel GPUs don’t have dedicated video RAM, so locking the render surface is essentially locking another chunk of system RAM and doing a direct copy. As you can imagine, that’s faster than copying the surface to a second surface in system RAM and then copying from that.

For cards with dedicated video RAM, locking the surface directly is actually slower. GetRenderTargetData is heavily optimised for these chips, so a surface-to-surface copy followed by a blit is pretty fast, even on crappy low-end NVIDIA/AMD cards.
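To make the two paths concrete, here’s a rough sketch of the read-back step, again assuming SlimDX’s Direct3D9 types. The method names mirror the native API but may not match the game’s actual code, and the hasDedicatedVram flag stands in for however you decide to tell Intel apart from NVIDIA/AMD:

```csharp
using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;
using SlimDX.Direct3D9;

// After compositing into the render target, pull the pixels back into a
// GDI+ Bitmap so the existing GDI/GDI+ pipeline can blit it to the window.
static Bitmap ReadBackComposite(Device device, bool hasDedicatedVram, int width, int height)
{
    using (Surface renderTarget = device.GetRenderTarget(0))
    {
        Surface source;
        if (hasDedicatedVram)
        {
            // NVIDIA/AMD path: copy VRAM into a system-memory surface first,
            // then lock the copy. GetRenderTargetData is fast on these chips.
            source = Surface.CreateOffscreenPlain(device, width, height,
                                                  Format.A8R8G8B8, Pool.SystemMemory);
            device.GetRenderTargetData(renderTarget, source);
        }
        else
        {
            // Intel path: no dedicated VRAM, so locking the render target itself
            // is effectively locking system RAM (the target must be lockable).
            source = renderTarget;
        }

        var result = new Bitmap(width, height, PixelFormat.Format32bppArgb);
        var locked = source.LockRectangle(LockFlags.ReadOnly);
        try
        {
            BitmapData dest = result.LockBits(new Rectangle(0, 0, width, height),
                                              ImageLockMode.WriteOnly,
                                              PixelFormat.Format32bppArgb);
            // Copy row by row, in case the surface pitch differs from the bitmap stride.
            var row = new byte[width * 4];
            for (int y = 0; y < height; y++)
            {
                locked.Data.Position = (long)y * locked.Pitch;
                locked.Data.Read(row, 0, row.Length);
                Marshal.Copy(row, 0, new IntPtr(dest.Scan0.ToInt64() + (long)y * dest.Stride), row.Length);
            }
            result.UnlockBits(dest);
        }
        finally
        {
            source.UnlockRectangle();
            if (!ReferenceEquals(source, renderTarget))
                source.Dispose();
        }
        return result;
    }
}
```

From there the existing GDI/GDI+ path takes over and paints the bitmap to the screen.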

The result is a GDI/GDI+-based graphics engine that offloads compositing (alpha-blending and overlays) to video hardware. That means we get hardware acceleration for the expensive part, which GDI+ can’t offer on its own, as it’s entirely software-driven.

It’s true that copying data from video RAM to system RAM is a bit of a performance hog (and I’m sure there are more than a few coders out there shaking their heads), but in practice it’s much, much faster than even the most heavily-optimised GDI+ routines.

What is GDI+ good for then?
Rendering text. My god, it’s great at that. I actually implemented text rendering via D3D in my renderer, but it looked like garbage. GDI+, on the other hand, looks amazing, as it supports a number of quality features D3D’s Font class does not (hinting, gamma correction, etc.). So I’m happy to take a minor performance hit to keep text rendering in software.
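As a small illustration of why it wins on quality (stock System.Drawing here, not the game’s actual text code), the hinting and anti-aliasing options are a single property on the Graphics object:

```csharp
using System.Drawing;
using System.Drawing.Text;

// GDI+ text quality is one property away: grid-fitting (hinting) plus
// ClearType or plain anti-aliasing.
static void DrawLabel(Graphics g, string text, Font font, PointF position)
{
    g.TextRenderingHint = TextRenderingHint.ClearTypeGridFit; // or AntiAliasGridFit
    g.DrawString(text, font, Brushes.White, position);
}
```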

GDI+ also makes alpha-blending images easy, but the performance hit is quite nasty, even with double-buffering and razor-accurate dirty rectangles. Credit where credit’s due: it’s simple to use, just not a great performer.
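For reference, here’s what “easy” looks like; a trivial sketch (the names are mine), and also exactly the sort of call that chews up CPU when it covers a large area:

```csharp
using System.Drawing;

// One call does a full per-pixel alpha-blend of a 32-bit ARGB image.
// Clipping to the dirty rectangle limits how much actually gets redrawn.
static void CompositeOverlay(Graphics g, Image overlay, Rectangle dirtyRect)
{
    g.SetClip(dirtyRect);
    g.DrawImage(overlay, 0, 0, overlay.Width, overlay.Height);
    g.ResetClip();
}
```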

~ by Logan on April 26, 2011.

4 Responses to “Zafehouse 2 makes the move to Direct3D… without the 3D bit”

  1. Woot! Can’t wait, this is gonna be amazing. Hopefully it’ll get noticed by someone big, and you’ll get a job!

  2. Since I am an XP user, I can just say: wise choice. xD

  3. Still waiting on the game. I check back every month or two to see if it’s done, ever since the beginning of 2010. From what I’ve seen so far, it looks amazing. Thanks for putting time into what should be a great game.

  4. *nudge* It’s been a few months :(