# Tag Archives: computer graphics

Anything related to computer graphics

# The rendering equation

The rendering equation, the keystone of computer graphics rendering:

$$L_o(x, \vec w) = L_e(x, \vec w) + \int_\Omega f_r(x, \vec w', \vec w) \, L_i(x, \vec w') \, (\vec w' \cdot \vec n) \,\mathrm{d}\vec w'$$

(This is at the same time also a test for MathJax support on my weblog in order to provide correct mathematical typesetting using proper typography instead of some lame pixelated image.)
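To make the integral above a bit more concrete: it can be estimated numerically with Monte Carlo sampling over the hemisphere. The sketch below is a deliberately simplified toy, assuming a constant Lambertian BRDF ($f_r = \text{albedo}/\pi$) and constant incoming radiance `L_i` from every direction; the function names are my own, not from any particular renderer.

```python
import math
import random

def sample_hemisphere(rng):
    """Uniformly sample a direction on the unit hemisphere around n = (0, 0, 1)."""
    u1, u2 = rng.random(), rng.random()
    z = u1                               # cos(theta) uniform in [0, 1) gives uniform solid angle
    r = math.sqrt(max(0.0, 1.0 - z * z))
    phi = 2.0 * math.pi * u2
    return (r * math.cos(phi), r * math.sin(phi), z)

def estimate_outgoing_radiance(L_e, albedo, L_i, n_samples=100_000, seed=42):
    """Monte Carlo estimate of L_o = L_e + integral of f_r * L_i * cos(theta) d(omega'),
    for a Lambertian BRDF f_r = albedo / pi and constant incoming radiance L_i."""
    rng = random.Random(seed)
    f_r = albedo / math.pi               # Lambertian BRDF is a constant
    pdf = 1.0 / (2.0 * math.pi)          # uniform hemisphere sampling density
    total = 0.0
    for _ in range(n_samples):
        w = sample_hemisphere(rng)
        cos_theta = w[2]                 # n = (0, 0, 1), so cos(theta) is just w_z
        total += f_r * L_i * cos_theta / pdf
    return L_e + total / n_samples
```

For this toy setup the integral has the closed form `albedo * L_i`, so the estimate should converge to `L_e + albedo * L_i` as the sample count grows.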

# AmanithVG test on FreeBSD

I recently had the pleasure of being asked by Matteo Muratori of Mazatech s.r.l. to test their OpenVG implementation called AmanithVG. I proceeded to test their 1.0 candidate implementation on my FreeBSD 6.2-STABLE machine running X.Org 6.9 with the NVIDIA binary drivers. I had to install the graphics/libglut port to satisfy a dependency, but after that everything worked quite well. The speed was quite impressive, especially given that my current work PC is not exactly state of the art.

The possibilities that it, and OpenVG in general, present are impressive. It seems to be reasonably easy to create blurring filters that apply in real-time, overlapping images that allow colours to blend, zooming in and out of images, and so on. This opens a lot of possibilities for normal desktop environments as well, I think. If you look at what, say, Beryl is doing on the Unix desktops, I think it can benefit from what OpenVG does. Of course, OpenVG is really aimed at mobile phones and other handhelds, but I think we can easily look beyond just those platforms.
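As a rough illustration of what such a real-time blur filter computes (a real implementation would of course run on the GPU; this is only a CPU sketch with a made-up function name), here is a separable box blur over a grayscale image stored as a list of rows of floats:

```python
def box_blur_gray(image, radius):
    """Naive separable box blur; clamps sample positions at the borders.
    Two 1-D passes (horizontal, then vertical) give the same result as
    one 2-D box filter, at far lower cost for larger radii."""
    h, w = len(image), len(image[0])
    size = 2 * radius + 1

    # Horizontal pass.
    tmp = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = sum(image[y][min(max(x + dx, 0), w - 1)]
                      for dx in range(-radius, radius + 1))
            tmp[y][x] = acc / size

    # Vertical pass.
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            acc = sum(tmp[min(max(y + dy, 0), h - 1)][x]
                      for dy in range(-radius, radius + 1))
            out[y][x] = acc / size
    return out
```

The separable two-pass structure is the usual trick that makes such filters cheap enough for real-time use: cost grows linearly with the radius instead of quadratically.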

# Switching GPUs on the fly

According to a somewhat older article at LaptopLogic, NVIDIA has an idea to make the system use its simpler integrated GPU for handling day-to-day desktop graphics, while switching to the stronger and more featureful (and often more power-hungry and hotter) GPU when needed for 3D work or gaming.

An interesting idea, but with SoC designs coming from Imagination Technologies, Falanx, and other designers that reduce a GPU's power consumption and heat build-up, one can wonder whether such a design is compelling enough to work out.

# Intel planning something massive for GPU market?

Back in May, an item over at the Inquirer said they had been unofficially informed at E3 that Intel is working on getting back into the games graphics market.

In July we have the following item over at the Inquirer, in which Intel reveals its G965 graphics solution, which uses technology licensed from PowerVR, back then codenamed Eurasia and now known as GSX (Graphics Shader Accelerator). Furthermore, the article tells us that Intel will also start work on using Muse (Media Unified Shading Engine) for its graphics offering. Simplistically, compare Muse to what Cell did: cheap cores in a parallel setup, thus enabling improved calculations. In this case, with multiple GSX cores (due to the Universal Scalable Shader Engine, I think, which combines the pixel and vertex shaders into one design) it might even build a multi-pipeline graphics solution. The only funny thing is that Wikipedia's PowerVR article lists Muse as a mobile solution, according to a company presentation. And indeed, when one checks another article at the Inquirer, the roadmap does introduce both Muse and Athena, the first aimed at portable computing, the latter aimed at desktops (introducing programmable shaders). Of course, if the GSX is an SoC design, the lower energy consumption and smaller die size (and gate count) would be an advantage there.

Funnily enough, on the Intel site right now there are two job openings (posted at the end of July and in early August) in the United States: a Senior Graphics Software Engineer and a Software Engineer 3D Graphics. The first seems to focus on identifying, from the end-user (including game developer) point of view, where the bottlenecks in the hardware design lie. The latter seems to focus on driver improvements and driver support for OpenGL 2.0 and DirectX 10. Coincidence?

# OpenGL fully supported on Vista

A SIGGRAPH 2006 presentation by NVIDIA shows that Microsoft has revisited its stance on how it will support OpenGL within Windows Vista. You may recall from when I first wrote about this last year that Microsoft's initial plan was to layer OpenGL over DirectX.

This time last year…

• The plan for OpenGL on Windows Vista was to layer OpenGL over Direct3D in order to obtain the Aero Glass experience

The situation today…

• OpenGL accelerated ICD now fully supported under Windows Vista
• OpenGL works fully with the Aero Glass compositing desktop
• Performance and stability will rival Windows XP by driver release

So it seems some complaining still works given sufficient pressure.

# And while we’re at it

> This module contains stuff which Intel can't publish in source form, like Macrovision register stuff and other trade secrets. It's optional, so if you don't want to use a binary module, you don't get to use code written by Intel agents for these features. And, we also haven't figured out how and when to release this binary blob, so there's no way you can use it today. The driver remains completely functional in the absence of the binary piece, and in fact has no reduction in functionality from previous driver releases.

So much for that idea, eh?

# Graphics market in a bit of turbulence

With the recent news of AMD acquiring ATI, there was some incorrect reporting that AMD would be dropping the ATI brand in favour of tacking Radeon onto its own AMD brand name. According to Ars Technica the name drop was a miscommunication that spread like wildfire through the blogosphere as well as the online news. This is both the advantage and the disadvantage of the quick turn-around time of the online media. One would think that serious technical journalists, however, would verify this with AMD and/or ATI themselves before reporting.

In related news, Intel revealed a website where they are offering source code for their 965 IGP. It becomes even more interesting when put into perspective alongside an article from InfoWorld, which says that AMD is strongly considering open-sourcing at least a part of the ATI graphics drivers. One can only wonder how NVIDIA will react to these developments.