Actually you'd not even need to inject a libGL.so at all. You could just as well modify the PLT/GOT entries for the glDraw… functions to point at a dumper function that then jumps to the actual glDraw… call. This could be done using the system debugging APIs.
So the problem is twofold:
There are the OS-ABI OpenGL functions (i.e. OpenGL functions that are exposed because the operating system's ABI demands them) and there are the extended functions. The Windows ABI mandates OpenGL-1.1; on Linux it used to be OpenGL-1.2, but the recently released LSB bumped that to 2.1. These ABI-level functions are expected (by the ABI) to be provided by the base system interface libraries in the form of regular, non-hooked symbols.
And then there is the extended functionality, i.e. everything not covered by the base system ABI: modern OpenGL and extension functions.
Windows and Linux treat the latter case differently! On Windows, extended function pointers must be assumed to depend on the context they've been obtained from. On Linux, the GLX specification explicitly states that extended functions' pointers are not tied to a context, but to the GLX interface as a whole.
Regarding OpenGL, or rather its typical implementations, this leads to an interesting problem: dispatch. Every OpenGL function eventually must be dispatched to the right driver. With indirect GLX this is easy, since OpenGL calls are translated into GLX opcodes and transmitted via the GLX/X11 extension wire protocol; the X11 server then submits them to the graphics driver proper. (And honestly, IMHO transmitting command buffers to a server is the only proper way to do it: the benefits of direct GL stem mostly from the fact that the X11 protocol takes TCP transport into account; if we consider localhost-only connections, highly efficient RPC and zero-copy SHM protocols are possible.)
But as soon as you hit direct GL it becomes a TLS context jump table indirection mess. And only because the first implementers of OpenGL (who also wrote the specification) were not fully aware of some of the leeway their own words gave them: nowhere in the OpenGL specs is it stated that OpenGL functions shall reside in the global namespace, and nowhere does the specification forbid actual OpenGL implementations from adding a "context" first parameter (akin to the implicit `this` of a C++ class member function) to every function call.
Regarding checksumming the libraries: for OpenGL this is a no-go.
On Linux (and Solaris and the *BSDs) the actual OpenGL driver resides in libGL.so, due to the lack of a standardized ICD hooking mechanism (as exists on Windows). Hence the libGL.so on your system depends on the installed driver and its version. Also, on Linux people expect to be able to compile and install their libGL.so themselves. On Windows the OpenGL ICD resides in a DLL that gets loaded into the program by the graphics driver the moment an OpenGL context is created. That ICD again depends on the driver vendor and version. So checksumming is not possible either.
To make matters worse, the proprietary drivers of NVidia and ATI/AMD, if they detect a program with known issues but a broad audience (think of every AAA game ever), will actually patch parts of the program text in the memory image to silently fix bugs in that program. If you ever wondered why every big game release is usually accompanied by a driver update from NVidia and ATI/AMD: well, that's why.
But even if DLL/.so checksumming were applicable, you could still ptrace into the program binary and patch the PLT/GOT entries for `glDraw…` so that they jump to a little bit of dumper code (injected with ptrace as well) that extracts the data and then trampolines into the actual `glDraw…` function.