This mode is implemented by emulating the GLX server extension within the OpenGL API client-side library and using the HP Virtual Memory Driver (VMD) to perform the rendering through Xlib.
VGL provides flexibility for OpenGL users, but it does not provide the same level of performance as is available on servers that support GLX.
Use the function Bool hpglXDisplayIsVGL(Display *dpy, int screen) to determine whether a particular display connection is operating in VGL mode. The return value is True if dpy is in VGL mode; otherwise, False is returned. This is an HP-specific function that is not available in other implementations of OpenGL.
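For example, an application can check at startup whether it is running in VGL mode and adjust its expectations accordingly. The following is a minimal sketch; it declares the HP-specific prototype explicitly, since the header that exposes it may vary by installation:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    /* HP-specific query; declared here explicitly in case the installed
       headers do not expose the prototype. */
    extern Bool hpglXDisplayIsVGL(Display *dpy, int screen);

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);   /* connect to $DISPLAY */
        if (dpy == NULL) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        if (hpglXDisplayIsVGL(dpy, DefaultScreen(dpy)))
            printf("VGL mode: rendering in software through VMD\n");
        else
            printf("VGL mode is not in use for this display connection\n");

        XCloseDisplay(dpy);
        return 0;
    }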
Because HP VMD uses the X11 protocol to display the images, the targeted drawable may be local or remote. This may include rendering to X terminals, older HP devices, or a personal computer; the only requirement is that the output is directed to an X11 drawable. (See Chapter 1 for a list of supported VMD configurations.) VMD is also the driver used to render to GLX pixmaps.
When a GLX context is created for rendering three-dimensional graphics using OpenGL, GLX first checks whether the X server supports the GLX extension. If it does not, the Virtual Memory Driver is used. GLX examines the list of available X visuals and decides which ones can be software-extended to be GLX visuals (see the supported visuals list). Buffers are allocated in virtual memory for the OpenGL color and ancillary buffers. When the application issues a glFlush(), glFinish(), or glXSwapBuffers() call, the contents of the corresponding virtual memory color buffers are sent to the X11 window using X protocol.
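The following is an ordinary GLX program sketch; nothing in it is specific to VMD, and the window size, visual attributes, and drawing calls are placeholder choices. The comments mark the points where, under VMD, the virtual memory buffers come into play:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/gl.h>
    #include <GL/glx.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (dpy == NULL) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        /* Ask for a double-buffered RGBA visual with a depth buffer. */
        int attrs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_DEPTH_SIZE, 16, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attrs);
        if (vi == NULL) {
            fprintf(stderr, "no suitable GLX visual\n");
            return 1;
        }

        /* Under VMD, the virtual memory color buffers are allocated when the
           context is created; depth/stencil/accumulation buffers at first use. */
        GLXContext ctx = glXCreateContext(dpy, vi, NULL, True);

        XSetWindowAttributes swa;
        swa.colormap = XCreateColormap(dpy, RootWindow(dpy, vi->screen),
                                       vi->visual, AllocNone);
        swa.border_pixel = 0;
        swa.event_mask = ExposureMask;
        Window win = XCreateWindow(dpy, RootWindow(dpy, vi->screen),
                                   0, 0, 750, 600, 0, vi->depth, InputOutput,
                                   vi->visual,
                                   CWColormap | CWBorderPixel | CWEventMask, &swa);
        XMapWindow(dpy, win);
        glXMakeCurrent(dpy, win, ctx);

        glClearColor(0.0f, 0.0f, 0.4f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
        /* ... OpenGL drawing calls ... */

        /* Under VMD, this is the point at which the virtual memory color
           buffer is transferred to the X11 window using X protocol. */
        glXSwapBuffers(dpy, win);

        /* ... event loop and cleanup omitted ... */
        return 0;
    }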
Double buffering for VMD is implemented using the X11 Double-Buffering Extension (DBE). Double-buffered visuals are not available for HP OpenGL rendering with VMD on X servers that do not support DBE.
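An application that wants to know in advance whether double-buffered visuals can be offered on a given server can query the DBE extension directly. The following minimal sketch uses the standard Xdbe client library (link with -lXext):

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <X11/extensions/Xdbe.h>

    int main(void)
    {
        int major, minor;
        Display *dpy = XOpenDisplay(NULL);
        if (dpy == NULL) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        /* A nonzero return means the server supports DBE, so double-buffered
           visuals can be offered for VMD rendering. */
        if (XdbeQueryExtension(dpy, &major, &minor))
            printf("DBE %d.%d is supported by this X server\n", major, minor);
        else
            printf("DBE is not supported; VMD visuals will be single buffered\n");

        XCloseDisplay(dpy);
        return 0;
    }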
Because of the way VMD works (rendering to a virtual memory buffer and then displaying the image through X11 protocol), it behaves somewhat differently from hardware devices. In particular, since VMD renders to VM buffers, changes do not appear in the X11 window until a buffer swap or a glFlush/glFinish.
Resource usage needs to be taken into consideration, as well. VM buffers are allocated for all of the OpenGL color and ancillary buffers. Color buffers are allocated when the context is created; other buffers (depth, stencil, accumulation) are allocated at first use. These buffers can be quite large.
For example, consider an X11 window 750 pixels wide and 600 pixels high. The size of each VM color buffer for an 8-bit visual is:
(750 × 600) pixels × 1 byte/pixel = 450,000 bytes
Consider that an OpenGL application may use two color buffers (for double buffering), a 32-bit depth/stencil buffer, and a 48-bit accumulation buffer. The size of the virtual memory required then becomes 5,400,000 bytes. In addition, the amount of virtual memory required is correspondingly larger for 12-bit and 24-bit color buffers.
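That total is the sum of the individual buffers:

    2 color buffers × 450,000 bytes = 900,000 bytes
    depth/stencil buffer: 450,000 pixels × 4 bytes/pixel = 1,800,000 bytes
    accumulation buffer: 450,000 pixels × 6 bytes/pixel = 2,700,000 bytes
    900,000 + 1,800,000 + 2,700,000 = 5,400,000 bytes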
Following are the steps required to run HP's implementation of OpenGL "stereo in a window" mode:
1. Set the DISPLAY environment variable to point at the X server you want to check:

       export DISPLAY=myhost:x.y

2. Run the xglinfo utility:

       /opt/graphics/OpenGL/contrib/xglinfo/xglinfo

   The output from xglinfo lists the OpenGL capabilities of the specified X display, and includes all GLX visuals that are supported. If one or more of the listed GLX visuals are marked as stereo capable, then you can proceed to step 3.
3. Run the setmon command for your graphics device:

       /opt/graphics/common/bin/setmon <graphics device>

   Note that <graphics device> is a name such as "/dev/crt" that is included on the Screen line in the /etc/X11/X*screens file for the X server that you want to configure for stereo. The setmon command is interactive and presents the monitor configurations allowable for the specified device. Select one of the configurations that setmon lists as stereo capable. If none of the configurations indicate stereo capability, then your graphics device cannot be used for OpenGL stereo rendering.
4. After you successfully reconfigure your monitor, the X server is restarted, and you can verify the availability of GLX stereo visuals by running the xglinfo command again.
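Once xglinfo reports stereo-capable visuals, an application selects one through standard GLX by adding GLX_STEREO to the attribute list passed to glXChooseVisual. The following is a minimal sketch; the other attributes are placeholder choices:

    #include <stdio.h>
    #include <X11/Xlib.h>
    #include <GL/glx.h>

    int main(void)
    {
        Display *dpy = XOpenDisplay(NULL);
        if (dpy == NULL) {
            fprintf(stderr, "cannot open display\n");
            return 1;
        }

        /* Request a double-buffered, stereo-capable RGBA visual. */
        int attrs[] = { GLX_RGBA, GLX_DOUBLEBUFFER, GLX_STEREO,
                        GLX_DEPTH_SIZE, 16, None };
        XVisualInfo *vi = glXChooseVisual(dpy, DefaultScreen(dpy), attrs);
        if (vi == NULL) {
            fprintf(stderr, "no stereo-capable GLX visual on this screen\n");
            return 1;
        }
        printf("found stereo visual, id 0x%lx\n", (unsigned long) vi->visualid);

        /* The application then creates its window and context with this visual
           and renders each eye's view after selecting the corresponding buffer
           with glDrawBuffer(GL_BACK_LEFT) or glDrawBuffer(GL_BACK_RIGHT). */
        XCloseDisplay(dpy);
        return 0;
    }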