RE: [GD-Windows] BitBlt() syncing to VBL
From: Brian H. <bri...@py...> - 2002-01-15 19:57:03
> The comment at the end about using 555 buffers is
> interesting. Maybe that's why you are using GDI, not DDRAW?

Actually, I'm using 555 BECAUSE I'm using GDI, not the other way
around. GDI's 16-bit DIB sections are 555.

> I have used GDI in the past to do 555 to primary blits with
> very few issues, it's very fast, esp. when you involve
> stretching. However, I know of very few displays that
> actually have a desktop running in 555 - almost everything
> these days has a 565 desktop (in 16 bit mode).

Correct.

> difference between the mac and PC is that the mac has a 555
> desktop? Maybe the PC is being forced to do an expensive
> conversion from 555 to 565.

It definitely is (as per the XLATEOBJ_hGetColorTransform thread of a
while ago), but I don't think that's accounting for all the
difference. Both machines have NVidia graphics accelerators, and
they're both running in 16-bit, and I don't think the GF series
support a 555 mode.

Brian