I don't think this is at all easily solvable - when the X server starts up, the
card is probably in console mode using the VGA emulation, which is pretty
brain-dead and doesn't touch much of the card memory (when you have 32M or 64M
on-card, that 640x480 gets lonely sitting in the corner). The X server first
has to pop it into the native NVidia/ATI/whatever graphics mode (remember, it
has to do that *before* it can access the video memory - you can't get there
while still in VGA emulation). Then it can proceed to clear out the on-card
memory. Unfortunately, if the X server pauses in between setting the mode and
clearing the memory, you get to see the uninitialized (and therefore left-over)
buffers. About the best you can do here is fix the server so it doesn't do any
time-intensive operations between the mode set and the clear.
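
To make the ordering concrete, here's a minimal sketch of the "mode set,
then clear immediately" sequence. This is not actual X server code - the
function names (set_native_mode, map_video_memory, load_fonts_and_cursors)
are hypothetical placeholders standing in for whatever the driver really
does:

#include <string.h>
#include <stddef.h>

extern int   set_native_mode(int w, int h, int depth);  /* hypothetical */
extern void *map_video_memory(size_t *len);             /* hypothetical */
extern void  load_fonts_and_cursors(void);              /* hypothetical slow setup */

void bring_up_display(void)
{
    size_t vram_len;
    void  *vram;

    /* Mode switch first: the video memory isn't reachable while the
     * card is still in VGA emulation. */
    set_native_mode(1280, 1024, 24);

    vram = map_video_memory(&vram_len);

    /* Clear *immediately* after the mode set -- any delay here is
     * exactly the window in which the stale buffer contents are
     * visible on screen. */
    memset(vram, 0, vram_len);

    /* Time-intensive initialization belongs after the clear, never
     * between the mode set and the clear. */
    load_fonts_and_cursors();
}

The point is just the ordering: nothing expensive between set_native_mode()
and the memset(), so the window where garbage is visible stays as short as
possible.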