As already mentioned, I added some code to irrEdit
and made the radiosity renderer run on the GPU instead of on the CPU. My first results were disillusioning: the computed lighting was the same, but the time needed to compute it was the same as well. No speed increase. Hmpf. Then I optimized some parts of the code, minimizing a few render state changes and texture transfers, but the speed increase still wasn't very big. Maybe my GPU or the AGP transfer was too slow? So I tried it out on my notebook, which has more modern hardware. There, the speed increase was very noticeable:
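By the way, minimizing render state changes usually just means filtering out redundant ones before they reach the driver. Here's a minimal sketch of that idea in C++; the state IDs and the `driverSetRenderState` call are hypothetical stand-ins, not irrEdit's actual code:

```cpp
#include <cstdio>

// Hypothetical stand-in for a graphics API's render state IDs.
enum RenderState { RS_ZWRITE = 0, RS_ALPHABLEND, RS_CULLMODE, RS_COUNT };

// Pretend this is the expensive driver call we want to minimize.
void driverSetRenderState(RenderState state, int value)
{
    std::printf("driver call: state %d -> %d\n", (int)state, value);
}

// Remembers the last value set for each state and skips redundant calls.
class StateCache
{
    int  current[RS_COUNT];
    bool known[RS_COUNT] = {};
public:
    void setRenderState(RenderState state, int value)
    {
        if (known[state] && current[state] == value)
            return; // value unchanged, no need to bother the driver
        current[state] = value;
        known[state] = true;
        driverSetRenderState(state, value);
    }
};

int main()
{
    StateCache cache;
    cache.setRenderState(RS_ZWRITE, 1); // reaches the driver
    cache.setRenderState(RS_ZWRITE, 1); // filtered out: redundant
    cache.setRenderState(RS_ZWRITE, 0); // reaches the driver again
}
```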
Computation time for this scene: 9 seconds, compared to roughly one minute on the CPU. I was also able to remove some of the artifacts, but as you can see at the edges, a few still remain. There's a lot of room for improvement, but I like it already.