RE: [GD-Windows] BitBlt() syncing to VBL
From: Andy G. <an...@mi...> - 2002-01-15 20:15:46
CreateDIBSection supports 555 and 565 16-bit modes; check out the docs. I
would either work in 565 all the time (as most people will have this) or do
your own custom color-convert/blt code -- you should be able to easily match
or beat the speed of GDI, and it's kinda fun MMX code.

Andy.

-----Original Message-----
From: Brian Hook [mailto:bri...@py...]
Sent: Tuesday, January 15, 2002 11:57 AM
To: gam...@li...
Subject: RE: [GD-Windows] BitBlt() syncing to VBL

> The comment at the end about using 555 buffers is interesting. Maybe
> that's why you are using GDI, not DDRAW?

Actually, I'm using 555 BECAUSE I'm using GDI, not the other way around.
GDI's 16-bit DIB sections are 555.

> I have used GDI in the past to do 555 to primary blits with very few
> issues; it's very fast, esp. when you involve stretching. However, I
> know of very few displays that actually have a desktop running in 555 -
> almost everything these days has a 565 desktop (in 16-bit mode).

Correct.

> difference between the mac and PC is that the mac has a 555 desktop?
> Maybe the PC is being forced to do an expensive conversion from 555 to
> 565.

It definitely is (as per the XLATEOBJ_hGetColorTransform thread of a while
ago), but I don't think that's accounting for all the difference. Both
machines have NVidia graphics accelerators, they're both running in 16-bit,
and I don't think the GF series supports a 555 mode.

Brian

_______________________________________________
Gamedevlists-windows mailing list
Gam...@li...
https://lists.sourceforge.net/lists/listinfo/gamedevlists-windows
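
[Editor's note: Andy's first suggestion -- asking CreateDIBSection for a 565
surface instead of accepting GDI's default 555 layout -- can be sketched as
below. This is a minimal example, not code from the thread; the helper name
is illustrative. The key is BI_BITFIELDS compression with explicit RGB masks.]

```c
#include <windows.h>

/* Create a top-down 16-bit DIB section in 565 format. Passing
 * BI_BITFIELDS with explicit masks overrides GDI's default, which is
 * 555 for 16-bit sections. Returns the HBITMAP; *bits receives a
 * pointer to the pixel memory. */
HBITMAP Create565DIBSection(HDC hdc, int width, int height, void **bits)
{
    /* BITMAPINFO with room for the three DWORD color masks that
     * BI_BITFIELDS requires immediately after the header. */
    struct {
        BITMAPINFOHEADER bmiHeader;
        DWORD masks[3];
    } bmi;

    ZeroMemory(&bmi, sizeof(bmi));
    bmi.bmiHeader.biSize        = sizeof(BITMAPINFOHEADER);
    bmi.bmiHeader.biWidth       = width;
    bmi.bmiHeader.biHeight      = -height;      /* negative = top-down */
    bmi.bmiHeader.biPlanes      = 1;
    bmi.bmiHeader.biBitCount    = 16;
    bmi.bmiHeader.biCompression = BI_BITFIELDS;
    bmi.masks[0] = 0xF800;  /* red:   bits 15-11 */
    bmi.masks[1] = 0x07E0;  /* green: bits 10-5  */
    bmi.masks[2] = 0x001F;  /* blue:  bits 4-0   */

    return CreateDIBSection(hdc, (BITMAPINFO *)&bmi, DIB_RGB_COLORS,
                            bits, NULL, 0);
}
```

On a 565 desktop this lets BitBlt copy the section without a per-pixel
color transform.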
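
[Editor's note: a scalar sketch of the custom 555-to-565 convert Andy
mentions, for reference; function names are illustrative. An MMX version
would do the same bit shuffling on four pixels per 64-bit register. Red
shifts up one bit, green widens from 5 to 6 bits (replicating its high bit
into the new low bit so full green maps to full green), blue is unchanged.]

```c
#include <stdint.h>
#include <stddef.h>

/* xRRRRRGGGGGBBBBB (555) -> RRRRRGGGGGGBBBBB (565) for one pixel. */
static uint16_t px555to565(uint16_t p)
{
    uint16_t r = (uint16_t)((p & 0x7C00) << 1);  /* bits 14-10 -> 15-11 */
    uint16_t g = (uint16_t)((p & 0x03E0) << 1    /* bits  9-5  -> 10-6  */
                          | (p & 0x0200) >> 4);  /* replicate MSB -> bit 5 */
    uint16_t b = (uint16_t)(p & 0x001F);         /* bits 4-0 unchanged */
    return (uint16_t)(r | g | b);
}

/* Convert a run of n pixels, e.g. one scanline per call. */
static void convert_555_to_565(const uint16_t *src, uint16_t *dst, size_t n)
{
    for (size_t i = 0; i < n; i++)
        dst[i] = px555to565(src[i]);
}
```

Doing this yourself avoids the XLATEOBJ color transform GDI applies when
blitting a 555 section to a 565 desktop.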