From: Al T. <alb...@pr...> - 2002-06-07 13:43:44
Another wacky idea from the lurker: I was looking at a Voodoo card review (yes, they still exist) the other day talking about texture compression and the effect it has on the texture when finally rendered. While viewing the screenshots, it struck me that the artifacts greatly resembled JPEG's lossiness at about 50% quality.

Here's the idea: would it be beneficial to convert a texture to a low-quality JPEG, then back again, to take advantage of some of the inherent lossiness? In theory, it should reduce the size of the texture without affecting the bit depth, and it shouldn't require a ton of code to get working (use libjpeg). I don't foresee a big problem performance-wise if the textures are mangled during the loading stage, but I'm still learning... I'll poke around in Mesa for myself this weekend, if there's time.

-Al Tobey
From: Ian R. <id...@us...> - 2002-06-07 16:42:18
On Fri, Jun 07, 2002 at 09:42:22AM -0400, Al Tobey wrote:
> Here's the idea: would it be beneficial to convert a texture to a
> low-quality JPEG, then back again, to take advantage of some of the
> inherent lossiness? In theory, it should reduce the size of the
> texture without affecting the bit depth, and it shouldn't require a
> ton of code to get working (use libjpeg). I don't foresee a big
> problem performance-wise if the textures are mangled during the
> loading stage, but I'm still learning...

Wha...? Let me get this straight. You're suggesting that when an OpenGL user calls glTexImage?D with GL_COMPRESSED_* as the internal format, we compress the image as a JPEG. Then, when the texture is used, we decompress the texture and upload the uncompressed image (since no card that I know of can work directly with a JPEG as a texture) to the card?

It's an interesting idea, BUT unless you can get help from the card decompressing the JPEG on upload (perhaps the Radeon iDCT unit could help?) -or- you come up with some sort of blazing fast, hand-tuned, assembly-coded JPEG decoder, the performance will sink faster than the Titanic... and will rot at the bottom of the ocean for just as long. :)

Hmmm... I wonder if the fragment shader units on modern cards could be used to do a VRAM-to-VRAM decompression of such a texture... hmm... I still think the performance would be horrible, though.

--
Tell that to the Marines!
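For reference, the GL entry point being discussed looks like this on the application side. A minimal sketch, assuming ARB_texture_compression and a current GL context; the function name and data layout are placeholders, and the driver, not the app, picks the actual on-card encoding:

    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>   /* for GL_COMPRESSED_RGB_ARB */

    /* The app hands over plain RGB texels and asks for a generic
     * compressed internal format; whatever compression happens is
     * entirely the driver's business. */
    void upload_compressed(const GLubyte *texels, GLsizei w, GLsizei h)
    {
        glTexImage2D(GL_TEXTURE_2D, 0,
                     GL_COMPRESSED_RGB_ARB,  /* "compress this for me" */
                     w, h, 0,
                     GL_RGB, GL_UNSIGNED_BYTE, texels);
    }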
From: Michael <le...@nt...> - 2002-06-07 17:11:00
On Fri, Jun 07, 2002 at 09:42:08AM -0700, Ian Romanick wrote:
> It's an interesting idea, BUT unless you can get help from the card
> decompressing the JPEG on upload (perhaps the Radeon iDCT unit could
> help?) -or- you come up with some sort of blazing fast, hand-tuned,
> assembly-coded JPEG decoder, the performance will sink faster than the
> Titanic... and will rot at the bottom of the ocean for just as long. :)

On upload implies swapping, which is already a 'big problem performance-wise' for some applications and users, so you've got negative time to do the decompression... ;o)

--
Michael.
From: Ian R. <id...@us...> - 2002-06-07 18:05:46
On Fri, Jun 07, 2002 at 06:10:55PM +0100, Michael wrote:
> On upload implies swapping, which is already a 'big problem
> performance-wise' for some applications and users, so you've got
> negative time to do the decompression... ;o)

Exactly. My random thought about the Radeon was that we could store the texture in DMA space as a JPEG or a single frame of an MPEG or whatever, then use the Radeon's built-in engine to decompress it as it DMAs it into on-card memory. I would think that moving 1/10th the data across the bus and decompressing it on the fly would be faster than copying the original. Dunno... just a random thought, probably worthless.

--
Tell that to the Marines!
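To put rough numbers on that 1/10th figure (an assumed typical case, not a measurement): a 512x512 RGBA texture is 512 * 512 * 4 = 1 MB raw. At a JPEG-like 10:1 ratio only ~100 KB would cross the bus per upload, so an on-card decompression engine would have a large bandwidth budget to hide its work in.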
From: Michael <le...@nt...> - 2002-06-07 18:49:00
On Fri, Jun 07, 2002 at 11:05:21AM -0700, Ian Romanick wrote:
> Exactly. My random thought about the Radeon was that we could store the
> texture in DMA space as a JPEG or a single frame of an MPEG or
> whatever, then use the Radeon's built-in engine to decompress it as it
> DMAs it into on-card memory. I would think that moving 1/10th the data
> across the bus and decompressing it on the fly would be faster than
> copying the original. Dunno... just a random thought, probably
> worthless.

Offtopic, but in a similar sense, I did wonder whether my DC10+'s ability to compress/decompress JPEGs could be used for something like Mozilla. With a suitable interface for it to pass the image to, the card could even DMA the result straight into the framebuffer.

On a more on-topic theme, I once wanted to do something like this:

a) read an AVI frame, decompress it with the DC10+ into (preferably video) memory
b) do 3D frame rendering with that memory as a texture (without having to do any texture loading per frame)
c) capture the 3D window contents
d) feed that into the DC10+ to make an MJPEG frame
e) write that frame to a new AVI
f) rinse and repeat

Sort of a poor man's 3D video transitions, but the alternative at the time was doing software 3D video transitions and software MJPEG encoding on a P166 MMX, so you'll see that if the above managed 5 fps it would be quick enough. I suspect you could do the above with even my lowly 800 MHz Duron and just use software MJPEG en/decoding.

--
Michael.
From: Al T. <alb...@pr...> - 2002-06-07 17:33:17
So, I'm guessing that textures aren't loaded into memory before starting the rendering? I don't really mind waiting a few more seconds for textures to load (if they're all loaded up front) if my app/game will run faster as a result.

I think my idea was misunderstood (just an idea!). I suggest convert -> JPEG -> original format because JPEG will remove some data from the texture, resulting in something smaller. I'm only talking about load time, reducing the actual size of the texture up front. Not real-time compression/decompression.

Here is a file at 100% quality, and the same at 50% quality:

  22407 Jun  7 13:25 hal.jpg
   4403 Jun  7 13:25 hal1.jpg

Now, the data that has been removed from the first file to create the second is gone forever. So, I guess what I really mean is "stripping" the image. There are other algorithms (with libraries to use, even) that can do similar things.

So, here's what I'm thinking, just to be clear:

  load texture
  convert texture to lossy format
  convert back
  load the (hopefully) smaller texture into the proper place

Really, it's just size reduction, not compression. The desired effect is lowered bandwidth usage at the cost of lower-quality textures. Of course, people without enough memory to chew through this stuff quickly would not want this turned on.

-Al

On Fri, 2002-06-07 at 12:42, Ian Romanick wrote:
> Wha...? Let me get this straight. You're suggesting that when an
> OpenGL user calls glTexImage?D with GL_COMPRESSED_* as the internal
> format, we compress the image as a JPEG. Then, when the texture is
> used, we decompress the texture and upload the uncompressed image
> (since no card that I know of can work directly with a JPEG as a
> texture) to the card?
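For concreteness, here is the round trip Al describes, sketched with libjpeg's in-memory source/destination managers (jpeg_mem_dest/jpeg_mem_src, which assume libjpeg 8+ or libjpeg-turbo rather than the libjpeg of 2002; error handling is omitted). The catch the following replies point out is visible in the last line: the decompressed buffer is exactly width*height*3 bytes again, no smaller than the input.

    #include <stdlib.h>
    #include <jpeglib.h>

    /* Compress an RGB image to an in-memory JPEG at 'quality', then
     * decompress it again.  The pixels come back degraded, but the
     * buffer is the same size as the original. */
    unsigned char *jpeg_roundtrip(const unsigned char *rgb,
                                  int width, int height, int quality)
    {
        unsigned char *jpg = NULL;
        unsigned long jpg_size = 0;
        struct jpeg_error_mgr err;

        /* --- compress: rgb -> jpg --- */
        struct jpeg_compress_struct c;
        c.err = jpeg_std_error(&err);
        jpeg_create_compress(&c);
        jpeg_mem_dest(&c, &jpg, &jpg_size);
        c.image_width = width;
        c.image_height = height;
        c.input_components = 3;
        c.in_color_space = JCS_RGB;
        jpeg_set_defaults(&c);
        jpeg_set_quality(&c, quality, TRUE);
        jpeg_start_compress(&c, TRUE);
        while (c.next_scanline < c.image_height) {
            JSAMPROW row = (JSAMPROW)(rgb + c.next_scanline * width * 3);
            jpeg_write_scanlines(&c, &row, 1);
        }
        jpeg_finish_compress(&c);
        jpeg_destroy_compress(&c);

        /* --- decompress: jpg -> raw RGB again --- */
        struct jpeg_decompress_struct d;
        d.err = jpeg_std_error(&err);
        jpeg_create_decompress(&d);
        jpeg_mem_src(&d, jpg, jpg_size);
        jpeg_read_header(&d, TRUE);
        jpeg_start_decompress(&d);
        unsigned char *out = malloc((size_t)width * height * 3);
        while (d.output_scanline < d.output_height) {
            JSAMPROW row = out + d.output_scanline * width * 3;
            jpeg_read_scanlines(&d, &row, 1);
        }
        jpeg_finish_decompress(&d);
        jpeg_destroy_decompress(&d);
        free(jpg);
        return out;  /* width * height * 3 bytes -- same size as 'rgb' */
    }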
From: Brian P. <ba...@pi...> - 2002-06-07 17:56:54
On Fri, Jun 07, 2002 at 01:31:59PM -0400, Al Tobey wrote:
> So, here's what I'm thinking, just to be clear:
>   load texture
>   convert texture to lossy format
>   convert back

At this "convert back" part, you just expanded the image back to its original uncompressed size. No gain, and you lose image quality in the process. The "convert back" part needs to happen on the card or you get no compression.

>   load the (hopefully) smaller texture into the proper place

BAPper
From: Ian R. <id...@us...> - 2002-06-07 18:11:48
On Fri, Jun 07, 2002 at 01:31:59PM -0400, Al Tobey wrote:
> I think my idea was misunderstood (just an idea!). I suggest
> convert -> JPEG -> original format because JPEG will remove some data
> from the texture, resulting in something smaller. I'm only talking
> about load time, reducing the actual size of the texture up front.
> Not real-time compression/decompression.

As an aside, there is something that *might* be worth trying for cards that support paletted textures. If the texture is uploaded with GL_COMPRESSED_* as the internal format and the source texture is 16-bit or 24-bit, convert the source image to a paletted (i.e., 8-bit) image. This would cut the image to 1/2 or 1/3 its original size and, assuming the card supports paletted textures, would not require any "decompression" on upload.

--
Tell that to the Marines!
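A sketch of what that upload path could look like through EXT_paletted_texture, assuming the extension is advertised and that a color quantizer has already reduced the image to 256 indexed colors (the quantizer and the function name here are placeholders, not anything in Mesa):

    #define GL_GLEXT_PROTOTYPES
    #include <GL/gl.h>
    #include <GL/glext.h>   /* GL_COLOR_INDEX8_EXT, glColorTableEXT */

    /* 'indices' is one byte per texel, 'palette' is 256 RGBA entries.
     * The card stores 1 byte/texel plus a 1 KB palette instead of 2-3
     * bytes/texel.  On X, glColorTableEXT may need to be fetched via
     * glXGetProcAddress first. */
    void upload_paletted(const GLubyte *indices, const GLubyte *palette,
                         GLsizei w, GLsizei h)
    {
        glColorTableEXT(GL_TEXTURE_2D, GL_RGBA8, 256,
                        GL_RGBA, GL_UNSIGNED_BYTE, palette);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_COLOR_INDEX8_EXT,
                     w, h, 0, GL_COLOR_INDEX, GL_UNSIGNED_BYTE, indices);
    }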
From: Michael <le...@nt...> - 2002-06-07 18:13:05
On Fri, Jun 07, 2002 at 01:31:59PM -0400, Al Tobey wrote:
> So, here's what I'm thinking, just to be clear:
>   load texture

Let's say the texture loaded is a raw 16-bit 256x256 texture.

>   convert texture to lossy format

Now you have a JPEG of that texture - much smaller.

>   convert back

Now you have a raw 16-bit 256x256 texture image that quite probably looks worse than the original same-sized texture.

>   load the (hopefully) smaller texture into the proper place

How will it be smaller?

--
Michael.
From: Ian R. <id...@us...> - 2002-06-07 18:07:51
On Fri, Jun 07, 2002 at 01:31:59PM -0400, Al Tobey wrote:
> Here is a file at 100% quality, and the same at 50% quality:
>
>   22407 Jun  7 13:25 hal.jpg
>    4403 Jun  7 13:25 hal1.jpg

Oh. You're misunderstanding something here. You're starting with a JPEG, which is worthless to the card. A fairer example would be to take an uncompressed TGA or TIFF or PNG, convert it to JPEG, and convert it back. The final image will be the same size as the original, but it will have worse quality.

--
Tell that to the Marines!
From: Al T. <alb...@pr...> - 2002-06-07 20:53:48
Ok. I didn't understand the format that the card was expecting. A large misunderstanding on my part. Thank you for clearing that up.

-Al

On Fri, 2002-06-07 at 14:07, Ian Romanick wrote:
> Oh. You're misunderstanding something here. You're starting with a
> JPEG, which is worthless to the card. A fairer example would be to
> take an uncompressed TGA or TIFF or PNG, convert it to JPEG, and
> convert it back. The final image will be the same size as the
> original, but it will have worse quality.
From: Jacek <jp...@in...> - 2002-06-09 03:44:38
On Fri, Jun 07, 2002 at 09:42:22AM -0400, Al Tobey wrote:
> I don't foresee a big problem performance-wise if the textures are
> mangled during the loading stage, but I'm still learning...

Please keep in mind that some applications update textures every frame.
From: Mike M. <che...@ya...> - 2002-06-09 07:11:59
I was going to CC Ian but couldn't get his address.

--- Jacek Popławski <jp...@in...> wrote:
> On Fri, Jun 07, 2002 at 09:42:22AM -0400, Al Tobey wrote:
> > I don't foresee a big problem performance-wise if the textures are
> > mangled during the loading stage, but I'm still learning...
>
> Please keep in mind that some applications update textures every frame.

GASP!

Are there any cards or drivers that actually upload textures every frame? I can understand some very special apps using real-time generation of textures, but even that seems insane -- that is to say, if the whole image were uploaded every frame.

I know of at least one chip -- don't laugh -- the VIC-II (yes, in the C64) had support for this in 2D. A brief history lesson, for those who were never there or don't remember. In the C64 (I still have my C64 if it's of any use :), there was only MMIO and one interrupt. Several MMIO registers to bring up the card, if you will, and set the video mode. Then registers held an offset of where the sprite (a tiny page/frame/picture) lived, and several others held its position and whether it was above or below the text. So all you had to do to sprite-flip was one memory write, and then Mario's legs would move; then increment the X position address. There was also some feedback about things like when sprites collide, i.e., you were hit by that fireball.

Is there anything like this for DRI? This to me would seem to be an important thing to add to the supported feature list.
From: Jacek <jp...@in...> - 2002-06-09 07:36:44
On Sun, Jun 09, 2002 at 12:11:58AM -0700, Mike Mestnik wrote:
> Are there any cards or drivers that actually upload textures every
> frame?

I don't know any technical details, but AFAIK some emulators (snes9x? xmame?) draw the virtual screen onto a texture. And the animation is smooth.
From: F. <j_r...@ya...> - 2002-06-09 08:08:36
On 2002.06.09 08:41 Jacek Popławski wrote:
> On Sun, Jun 09, 2002 at 12:11:58AM -0700, Mike Mestnik wrote:
> > Are there any cards or drivers that actually upload textures every
> > frame?
>
> I don't know any technical details, but AFAIK some emulators (snes9x?
> xmame?) draw the virtual screen onto a texture. And the animation is
> smooth.

And video players - some of them use OpenGL to accelerate 2D. See e.g. http://www.mplayerhq.hu/DOCS/video.html#2.3.1.10 but I don't know if this is the case for 3DFX. And probably some games - I remember the RTCW intro, where there was a video projector displaying a movie onto the wall...

But I'm sure that these situations can be easily detected and avoided with careful planning: e.g., have a second texture age that says how many times a texture has been uploaded, and only do the compression once that number is reasonable (or just >1).

José Fonseca
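The per-frame update these posts describe boils down to one call. A sketch, assuming the texture object was allocated once with glTexImage2D beforehand and that each new frame has the same size and format (the function name is a placeholder):

    #include <GL/gl.h>

    /* Push a new video/emulator frame into an existing texture.
     * Respecifying the contents with glTexSubImage2D avoids
     * reallocating the texture object every frame. */
    void upload_frame(GLuint tex, const GLubyte *frame,
                      GLsizei w, GLsizei h)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexSubImage2D(GL_TEXTURE_2D, 0,
                        0, 0, w, h,      /* replace the whole image */
                        GL_RGB, GL_UNSIGNED_BYTE, frame);
    }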
From: Mike M. <che...@ya...> - 2002-06-09 08:59:53
--- Jacek Popławski <jp...@in...> wrote:
> On Sun, Jun 09, 2002 at 12:11:58AM -0700, Mike Mestnik wrote:
> > Are there any cards or drivers that actually upload textures every
> > frame?
>
> I don't know any technical details, but AFAIK some emulators (snes9x?
> xmame?) draw the virtual screen onto a texture. And the animation is
> smooth.

I'm glad to hear that it currently works well for current software. However, I wonder how things like bullet holes are done in Q3UT. I bet with a good design this could be made to look awesome and be fast. I know vanilla Q3 doesn't have damaged skins -- wonder why? Too slow. Imagine if the clothing got stained (blending the clothing with a red tint) at the exact place where the bullet hit. This would require both reading and writing to the texture.
From: Michael <le...@nt...> - 2002-06-09 09:25:28
On Sun, Jun 09, 2002 at 01:59:52AM -0700, Mike Mestnik wrote:
> I'm glad to hear that it currently works well for current software.
> However, I wonder how things like bullet holes are done in Q3UT.

With extra polygons textured with the bullet-hole texture, blended against the background, I bet.

Altering the texture means you need to duplicate it, otherwise one shot is going to generate a lot of bullet holes you didn't fire.

--
Michael.
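That decal approach might look roughly like this, sketched in era-appropriate immediate-mode GL (the corner coordinates, texture, and function name are made up; a real renderer would also apply glPolygonOffset so the decal doesn't z-fight with the wall):

    #include <GL/gl.h>

    /* Draw a small alpha-blended quad over the wall with the
     * bullet-hole texture; the wall's own texture is never touched. */
    void draw_decal(GLuint hole_tex, const GLfloat corners[4][3])
    {
        glBindTexture(GL_TEXTURE_2D, hole_tex);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);
        glBegin(GL_QUADS);
        glTexCoord2f(0, 0); glVertex3fv(corners[0]);
        glTexCoord2f(1, 0); glVertex3fv(corners[1]);
        glTexCoord2f(1, 1); glVertex3fv(corners[2]);
        glTexCoord2f(0, 1); glVertex3fv(corners[3]);
        glEnd();
        glDisable(GL_BLEND);
    }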
From: Mike M. <che...@ya...> - 2002-06-09 09:39:22
--- Michael <le...@nt...> wrote:
> With extra polygons textured with the bullet-hole texture, blended
> against the background, I bet.
>
> Altering the texture means you need to duplicate it, otherwise one
> shot is going to generate a lot of bullet holes you didn't fire.

You're right, it's not really useful. I know the driver must do this for videos, but hopefully there aren't too many other good reasons why an app would want to load textures after initial setup?
From: Michael <le...@nt...> - 2002-06-09 07:57:13
On Sun, Jun 09, 2002 at 12:11:58AM -0700, Mike Mestnik wrote:
> GASP!
>
> Are there any cards or drivers that actually upload textures every
> frame?

I think the key word was applications, not drivers/cards.

It's how RTCW does the in-game videos, unless I'm mistaken. Plus, you'll swap textures if the app uses more texture space per frame than you have free, so yes, in some situations you'll have a few swaps per frame (RTCW with textures on max on a 32 MB card, for instance - or simply running from one part of a level to another).

> Is there anything like this for DRI? This to me would seem to be an
> important thing to add to the supported feature list.

Well, it's Mesa (an OpenGL implementation) atop a hardware-specific card driver using the DRI architecture. What Mesa implements is OpenGL 1.x + extensions - that's the place to look to see what 'supported features' there are. What the drivers implement is typically hardware features that improve the performance of those OpenGL features + extensions - falling back to software if the state means the hardware implementation can't be used. Some hw features might not be enabled but, in principle, they could be, as they'll improve performance for an existing OpenGL feature (e.g. Radeon HyperZ). Some features might not have a corresponding OpenGL feature, in which case you're not likely to gain much by adding one to DRI unless either a) there is an OpenGL extension you can add to Mesa that will let you use the feature from an application, or you write one, or b) you invent your own API and plug it on top of DRI.

--
Michael.
From: Leif D. <lde...@re...> - 2002-06-09 17:39:52
On Sun, 9 Jun 2002, Michael wrote:
> I think the key word was applications, not drivers/cards.
>
> It's how RTCW does the in-game videos, unless I'm mistaken.

Do applications use texture sub-images for video? AFAIK, the DRI drivers all swap out and upload the entire texture image for glTexSubImage[123]D, where it could be optimized to only upload the changed portion of the texture.

--
Leif Delgass
http://www.retinalburn.net
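On the app side, the sub-image path already expresses exactly the information such an optimization would need. A sketch, assuming 'pixels' holds only the dirty rectangle (the function name is a placeholder); whether the bytes that actually cross the bus match the rectangle is up to the driver, which is the open question here:

    #include <GL/gl.h>

    /* Hand GL just the dirty rectangle (x, y, w, h) of an existing
     * texture.  GL_UNPACK_ROW_LENGTH tells GL the client buffer is
     * only w texels wide, i.e. it contains nothing but the rect. */
    void update_dirty_rect(GLuint tex, const GLubyte *pixels,
                           GLint x, GLint y, GLsizei w, GLsizei h)
    {
        glBindTexture(GL_TEXTURE_2D, tex);
        glPixelStorei(GL_UNPACK_ROW_LENGTH, w);
        glTexSubImage2D(GL_TEXTURE_2D, 0, x, y, w, h,
                        GL_RGB, GL_UNSIGNED_BYTE, pixels);
        glPixelStorei(GL_UNPACK_ROW_LENGTH, 0);
    }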
From: Mike M. <che...@ya...> - 2002-06-09 22:01:33
--- Leif Delgass <lde...@re...> wrote:
> Do applications use texture sub-images for video? AFAIK, the DRI
> drivers all swap out and upload the entire texture image for
> glTexSubImage[123]D, where it could be optimized to only upload the
> changed portion of the texture.

I was thinking the same thing; however, Michael pointed out that if the app really wanted this it could use extra polygons, correctly textured, to do the same thing. It's true that the card could have some feature that the app might benefit from, though, like accepting diffs.