Friday, February 04, 2005

Wednesday, February 02, 2005

GPGPU Forums kick ASS !!!

In many ways these forums are the best source of practical GPGPU programming information on the web. I will mine them extensively.

Some observations/thoughts:


  • The output of the fragment processor need not go directly to the framebuffer ... it can go to an offscreen buffer, which can have full floating-point precision if you want it to (32-bit floats per channel on NVIDIA's NV3x and up, 24-bit floats per channel on ATI's R3x0). You can get the result back down into system memory with a simple glReadPixels() call. Just don't call that too often, because it's a performance bottleneck.

  • Crypto on the GPU !!!

  • In the topic http://www.gpgpu.org/forums/viewtopic.php?t=226 it is suggested that genetic algorithms would map nicely to a GPU !!!

  • Article on NV40 architecture

Longer note:

Q: "Which Geforce FX would you recommend for experiments with GPU programming?"

A: "I would suggest an Nvidia 6800GT or an ATI X800. If you need 32-bit precision, you will need to stick with Nvidia for now, although the ATI boards are noticeably faster at the moment for floating point ops. I believe only ATI currently supports GLSL in their official drivers, but Nvidia does have devel drivers that are supposed to support GLSL.

The last-generation Nvidia part will probably do okay, but the PS3.0 support in the NV40 (6800 series) is very interesting. However, for the last-gen boards, I would stay away from the low-end boards, like the FX5200, and get something more midrange like the 5700. But since the NV40s are shipping now, the FX5950s are quickly dropping in price."

Monday, January 31, 2005

Quartz Compositing, OpenGL and the GPU



From www.apple.com/macosx/pdfs/Quartz_TB.pdf...

Mac OS X v10.2 takes computer graphics to the next level by using hardware to accelerate the Quartz Compositor. If a supported GPU is installed, Mac OS X v10.2 automatically enables Quartz Extreme, which moves the Quartz Compositor from the CPU to the GPU. This allows the CPU(s) to focus on application-specific needs. As a result, the entire system feels faster and more responsive when Quartz Extreme is enabled, and CPU use drops dramatically.

...

With Quartz Extreme, every pixel on the screen is sent through the Mac OS X OpenGL pipeline. Each onscreen element—2D, 3D, and video graphics—becomes an OpenGL texture applied to objects representing those elements. The elements are composited in real time to deliver the unique user experience offered by Quartz Extreme. You’ll enjoy more fluid, higher-frame-rate graphics in intricately composited scenes, such as complex, translucent 3D objects overlaying a full-motion DVD video. That means shadows will drop more quickly and transparent objects will layer faster—and Mac OS X can do more processing in the background while you work in the foreground.

A New Hope...



Two things happened today:

1. I talked with Maja about her user-re-authentication code. Most of it was written using some kind of visual programming bullshit. Totally non-portable (so it would seem). Seems a re-write might be in order. Bad news.

2. I had some thoughts regarding GPU-Sniffing. Quartz for Mac OS X uses OpenGL (hardware accelerated) to render the desktop. This must mean that the kernel has some kind of access to the OpenGL libraries. Well, my GPU-sniffer scheme needs to be able to make OpenGL API calls from within the kernel (via modifications to the video driver) to set up and drive the graphics pipeline. This suggests two possibilities:

i) Use Mac OS X as the dev environment. Interesting. Can I make these kinds of changes to the video driver in Mac OS X?

ii) I remember some time ago hearing about a project to make X use OpenGL to render the desktop for Linux. Maybe I can piggyback off this codebase.