From: PlasmaJohn <dr...@pr...> - 2003-04-08 13:04:21
Mike A. Harris wrote:
> On Tue, 8 Apr 2003, Ian Molton wrote:
>
> >isnt it about time we got something like
> >--enable-dodgylegalstatus-s3tc-code then?

...

> Doesn't matter to me much either way though. If someone does
> stick any patent encumbered code into Mesa, DRI, or XFree86
> however, I hope they do it in an easy-to-rip-out manner, so that
> distribution maintainers don't have to go through hell.

I've been thinking about this. How hard would it be to build this as a
loadable module/LD_PRELOAD hack?

Earlier NWN betas had some issues with one of the OpenGL calls that some
bright young lad(?) hacked around with an LD_PRELOAD.

If the module is easily compilable, or even available as a binary in a
sane IP domain, then the risk is the individual's.

BTW, this was such a pain to track down, so for posterity (and the
archives):

  Patent Number: 5,956,431

  System and method for fixed-rate block-based image compression with
  inferred pixel values.

  Date Filed: Oct. 2, 1997
  Date Issued: Sep. 21, 1999

I've seen a reference that this is a superset of S3TC.

John
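[Editor's note: a minimal sketch of the LD_PRELOAD interposer pattern John describes. Everything here is hypothetical and untested against any real driver: a usable shim would also have to interpose the actual texture-upload entry points (e.g. the glCompressedTexImage2DARB path) and any glXGetProcAddress lookups, not just advertise the extension string as this fragment does.]

```c
/* Hypothetical LD_PRELOAD shim: make the driver appear to export an
 * S3TC extension by patching the string returned for GL_EXTENSIONS.
 *
 * Build:  gcc -shared -fPIC -o s3tc_shim.so s3tc_shim.c -ldl
 * Use:    LD_PRELOAD=./s3tc_shim.so ./nwn        (names are examples)
 */
#define _GNU_SOURCE     /* for RTLD_NEXT */
#include <dlfcn.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

typedef unsigned char GLubyte;
typedef unsigned int  GLenum;
#define GL_EXTENSIONS 0x1F03

/* Append one extension token to a space-separated extension string. */
static char *append_ext(const char *base, const char *ext)
{
    char *out = malloc(strlen(base) + strlen(ext) + 2);
    if (out)
        sprintf(out, "%s %s", base, ext);
    return out;
}

/* Our glGetString shadows the driver's; dlsym(RTLD_NEXT, ...) finds
 * the real one further down the link chain. */
const GLubyte *glGetString(GLenum name)
{
    static const GLubyte *(*real)(GLenum) = NULL;
    static char *patched = NULL;

    if (!real)
        real = (const GLubyte *(*)(GLenum))dlsym(RTLD_NEXT, "glGetString");

    if (name == GL_EXTENSIONS) {
        if (!patched)
            patched = append_ext((const char *)real(name),
                                 "GL_EXT_texture_compression_s3tc");
        return (const GLubyte *)patched;
    }
    return real(name);
}
```

This keeps the patent-encumbered piece out of the main tree entirely, which is exactly the "easy to rip out" property Mike asked for.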
From: PlasmaJohn <dr...@pr...> - 2003-04-09 14:19:32
Sven Luther wrote:
> > Do you mean they shouldn't use it at all? I guess this is not an option,
>
> Well, at least not in their linux version. There is no real point in
> releasing a game most users have problem with, is there ?

"Most" users seem to be using Nvidia hardware with the Nvidia binary
driver, or the proprietary FireGL drivers with 9500/9700 boards. They
have no problems with S3TC.

NWN *does* run without S3TC support, it just "looks like ass". I realize
that this is not the fault of the DRI developers, but unless this
situation is resolved, you're probably going to lose most of the gamers.
All 5 of us.

> That said, are there no free compression methods they could use instead ?

3dfx's FXT1, but nobody uses it. Subjectively it looks worse.

http://www.digit-life.com/articles/reviews3tcfxt1/

> Also, it seems Sonic Blue is in liquidation right now, let's start a
> donation box or something such, and try to buy the patent from them, so
> there will be no such problems anyway.

S3 sold off their video technology to Via. Notice that the chipsets that
the EPIA motherboards use integrate an AGP Savage.

John
From: Alan C. <al...@lx...> - 2003-04-09 15:06:01
On Mer, 2003-04-09 at 15:19, PlasmaJohn wrote:
> S3 sold off their video technology to Via. Notice that the chipsets that
> the EPIA motherboards use integrate a AGP Savage.

VIA only own part of S3, and EPIA boards don't use Savage either.

EPIA uses the Trident video; EPIA-M uses Castle Rock, which is their own.
From: PlasmaJohn <dr...@pr...> - 2003-04-10 00:52:07
On 9 Apr 2003, Alan Cox wrote:
> On Mer, 2003-04-09 at 15:19, PlasmaJohn wrote:
> > S3 sold off their video technology to Via. Notice that the chipsets
> > that the EPIA motherboards use integrate a AGP Savage.
>
> VIA only own part of S3, and EPIA boards don't use Savage either.

Ok.

> EPIA uses the trident video
> EPIA-M uses castle rock, which is their own

I thought I heard that Castle Rock was Savage(-ish). Regardless, chalk it
up to a caffeine deficiency.

Thanks,
John
From: Alexander S. <Ale...@at...> - 2003-04-09 16:29:27
> I have an "Powered by ATI" card that the ATI drivers do not
> work with (but DRI does).

Which card/chip do you have, exactly?

-Alex.
From: Pawel S. <pa...@th...> - 2003-04-09 17:02:40
On 2003.04.09 18:28, Alexander Stohr wrote:
> > I have an "Powered by ATI" card that the ATI drivers do not
> > work with (but DRI does).
>
> Which card/chip are you having exactly?

A Radeon 8500LE by Sapphire - but I must say I have only tried the binary
drivers once, when I got the card half a year ago. I am aware that ATI
released an update, but I did not have time to check it out, since the
DRI drivers worked just fine for my primary applications (i.e. not
games). (I should have written "the ATI drivers did not work when I tried
them"; I apologise for the unnecessarily inflammatory statement.)

01:00.0 VGA compatible controller: ATI Technologies Inc Radeon R200 QL
        [Radeon 8500 LE] (prog-if 00 [VGA])
        Subsystem: PC Partner Limited Radeon R200 QL [Sapphire Radeon 8500 LE]
        Flags: bus master, stepping, 66MHz, medium devsel, latency 64, IRQ 11
        Memory at d8000000 (32-bit, prefetchable) [size=128M]
        I/O ports at d800 [size=256]
        Memory at d7000000 (32-bit, non-prefetchable) [size=64K]
        Expansion ROM at d7fe0000 [disabled] [size=128K]
        Capabilities: [58] AGP version 2.0
        Capabilities: [50] Power Management version 2

Also for the record: I used to have a GeForce2 MX, but the drivers used
to lock the machine up after a while (probably an interaction with the
motherboard; I frequently have quite high CPU/IO load), and I gave up on
doing any 3D with that card.

Pawel Salek

--
Pawel Salek                     http://www.theochem.kth.se/~pawsa/
Theoretical Chemistry Division, KTH        voice: +46 8 5537 8418
From: Martin S. <Mar...@un...> - 2003-04-09 19:23:49
Alexander Stohr <Ale...@at...> wrote:
>> I have an "Powered by ATI" card that the ATI drivers do not
>> work with (but DRI does).

> Which card/chip are you having exactly?

Take a Xelo Radeon 9100, for instance. On loading the kernel module
'fglrx.o':

Apr  9 20:32:11 quickstep kernel: [fglrx] Maximum main memory to use for locked dma buffers: 430 MBytes.
Apr  9 20:32:11 quickstep kernel: [fglrx:firegl_init] *ERROR* Device not found!

This is built from the 'fglrx-glc22-4.2.0-2.5.1' package. I thought this
_should_ work because the 9100 is supposed to be an improved 8500 board,

Martin.
--
 Unix _IS_ user friendly - it's just selective about who its friends are !
--------------------------------------------------------------------------
From: Alexander S. <Ale...@at...> - 2003-04-09 19:30:07
Okay, give it a try with the very next version of the driver.

-Alex.

> -----Original Message-----
> From: Martin Spott [mailto:Mar...@un...]
> Sent: Wednesday, April 09, 2003 20:37
> To: dri...@li...
> Subject: Re: [Dri-devel] Neverwinter Nights // Radeon (R100)?
>
> Alexander Stohr <Ale...@at...> wrote:
> >> I have an "Powered by ATI" card that the ATI drivers do not
> >> work with (but DRI does).
>
> > Which card/chip are you having exactly?
>
> Take a Xelo Radeon9100 for instance. On loading the kernel
> module 'fglrx.o':
>
> Apr  9 20:32:11 quickstep kernel: [fglrx] Maximum main memory
> to use for locked dma buffers: 430 MBytes.
> Apr  9 20:32:11 quickstep kernel: [fglrx:firegl_init] *ERROR*
> Device not found!
>
> This is built from the 'fglrx-glc22-4.2.0-2.5.1' package. I
> thought this _should_ work because the 9100 is supposed to be
> an improved 8500 board,
>
> Martin.
From: Martin S. <Mar...@un...> - 2003-04-09 20:49:58
Alexander, the following is not intended to attack you personally.

Alexander Stohr <Ale...@at...> wrote:
>> Alexander Stohr <Ale...@at...> wrote:
>> >> I have an "Powered by ATI" card that the ATI drivers do not
>> >> work with (but DRI does).
>>
>> > Which card/chip are you having exactly?
>>
>> Take a Xelo Radeon9100 for instance. On loading the kernel
>> module 'fglrx.o':

> okay, give it a try with the very next version of the driver.
> -Alex.

I bought ATI cards because there was a chance to run software I can build
myself from source - I've been 'maintaining' my Linux box just by using
the compiler for about eight years now. I've avoided Nvidia graphics
boards like the plague because of their driver policy. ATI had the good
taste to support open-source driver development. BUT anyone would be a
fool to think that I'd start using binary-only drivers with ATI boards.

If I were really forced to run binary-only drivers on Linux because of
missing features in XFree86, I promise that I'll avoid using ATI stuff,
because there are boards whose drivers are obviously much easier to
handle - and I _know_ there are quite a few people out there thinking the
same way. If people agree on using binary-only drivers, then most of them
are buying Nvidia. Period.

I'm really starting to get pissed off at ATI's driver policy! This
duplicated effort is horrible nonsense. Why doesn't ATI just release a
few objects for optional _clean_ inclusion into the DRI drivers - just
for the parts where they don't want to disclose documentation - and
concentrate on supporting the DRI project?

Please tell this to the people responsible for the appropriate decisions
- in case you have a channel to report it.

Greetings,
Martin.
--
 Unix _IS_ user friendly - it's just selective about who its friends are !
--------------------------------------------------------------------------
From: Jason C. <ja...@ds...> - 2003-04-08 19:32:30
I expect to be chewed out for this:

I've seen so many people question the idea of including S3TC code in the
DRI. While I agree that it is unlikely in the near future, and that there
are far more pressing issues with driver compatibility, I have run across
a way that some bright and determined individual might be able to start
laying a foundation for it. I would ask that any such individual please
take a look at this:

http://www-2.cs.cmu.edu/~dst/DeCSS/Gallery/

Read carefully and realize that Dr. David S. Touretzky put himself on the
line for this. Transcriptions of his courtroom battle are linked from
this site. He was only able to prevail through his creativity and his
logical argument.

The patent issue with regard to S3TC is different, and I fully admit that
I am not legally trained. So please do not take what I state here as any
form of consent or legal advice. But if someone were curious, determined
and (perhaps) foolhardy enough to try something similar to what
Dr. Touretzky has done with DeCSS, then they might just get away with it.

The problem happens later. If an individual posts S3TC code standalone,
they're probably safe. But if that individual then posts how to include
the code in the DRI, or any other OpenGL implementation without an
official license, then chances are the patent holders will attack those
implementations, even if they don't include the code in official
distributions. No one wants that to happen. Whether any lawsuit would
hold up in court is irrelevant next to the setback that would happen as a
result. S3TC is a mistake for open source, period. Why OpenGL can even
incorporate it on any operating system is beyond my comprehension.

I read an article some time ago (I wish I could reference it properly)
that questioned the future of texture compression of any sort. As newer
graphics cards become capable of so many billions of triangles/particles
per second, we end up with millions of smaller surfaces. Texturing these
surfaces becomes tedious, but the size of the texture needed to fill a
surface becomes smaller, forgoing the need for texture compression. So
while the number of surfaces increases, the size of textures decreases.
Holding this to be true, would we not need to implement vertex
compression in some way to compensate for this shift? I don't know if
this is an accurate prediction or even a valid argument, but it is
somewhat compelling nonetheless. I don't understand 3D operations well
enough to have a very strong opinion on the matter.

As a side note: whatever happened to FXT1 with 3dfx cards? I was under
the impression that FXT1 could actually work with S3TC textures. If that
were true, then would there have been a way to "trick" an application
requiring S3TC into using the FXT1 extension? A wrapper of some sort, or
just purposely mislabelling the extension name. Maybe that would have
patent issues as well. But what about games like Quake 3 or RTCW that
allow options like "compress_textures" to be enabled? Is this a "generic"
compression of some sort? It does not explicitly state
"s3tc_compress_textures" as UT2003 does. I was curious if anyone could
help me understand that. Are some textures pre-compressed while others
are compressed on the fly?

PlasmaJohn wrote:
> Mike A. Harris wrote:
> > On Tue, 8 Apr 2003, Ian Molton wrote:
> >
> > > isnt it about time we got something like
> > > --enable-dodgylegalstatus-s3tc-code then?
>
> ...
>
> > Doesn't matter to me much either way though. If someone does
> > stick any patent encumbered code into Mesa, DRI, or XFree86
> > however, I hope they do it in an easy-to-rip-out manner, so that
> > distribution maintainers don't have to go through hell.
>
> I've been thinking about this. How hard would it be to build this as a
> loadable module/LD_PRELOAD hack?
>
> Earlier NWN betas had some issues with one of the OpenGL calls that
> some bright young lad(?) hacked around with an LD_PRELOAD.
>
> If the module is easily compilable, or even available as a binary in a
> sane IP domain, then risk is the individual's.
>
> BTW, this was such a pain to track down, so for posterity (and
> archives):
>
>   Patent Number: 5,956,431
>
>   System and method for fixed-rate block-based image compression with
>   inferred pixel values.
>
>   Date Filed: Oct. 2, 1997
>   Date Issued: Sep. 21, 1999
>
> I've seen a reference that this is a superset of S3TC.
>
> John
From: Daniel V. <vo...@ep...> - 2003-04-08 20:00:00
> I don't know if this is an acurate prediction or even a valid argument.

Neither.

> Is this a "generic" compression of some sort?

Quake 3 uses S3_s3tc, if I'm not mistaken. You could talk to Timothee to
get him to change it to use ARB_texture_compression if S3_s3tc isn't
exposed, so the driver could pick FXT1 internally - though I doubt that's
worth it.

UT2003 uses precompressed textures.

-- Daniel, Epic Games Inc.
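[Editor's note: the "if S3_s3tc isn't exposed" test Daniel mentions is the classic GL 1.x extension-string check. A sketch of a correct token scan follows - the strings and app logic around it are hypothetical, but the pitfall it avoids (prefix matches via a bare strstr) is real.]

```c
#include <string.h>

/* Return 1 if `ext` appears as a complete token in the space-separated
 * extension string `exts` (as returned by glGetString(GL_EXTENSIONS)),
 * 0 otherwise.  A plain strstr() is not enough: it would also match
 * "GL_S3_s3tc" inside some longer extension name. */
int has_extension(const char *exts, const char *ext)
{
    size_t len = strlen(ext);
    const char *p = exts;

    while ((p = strstr(p, ext)) != NULL) {
        int start_ok = (p == exts) || (p[-1] == ' ');
        int end_ok   = (p[len] == ' ') || (p[len] == '\0');
        if (start_ok && end_ok)
            return 1;
        p += len;       /* false hit, keep scanning */
    }
    return 0;
}
```

An app could then prefer S3_s3tc when present and otherwise fall back to generic ARB_texture_compression uploads, letting the driver choose its own internal format (e.g. FXT1).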
From: Ian R. <id...@us...> - 2003-04-08 21:45:10
Jason Cook wrote:
> As a side note: What ever happened to FXT1 with 3Dfx cards? I was under
> the impression that FXT1 could actually work with s3tc textures. If that
> were true, then would there have been a way to "trick" an application
> requiring s3tc to use the fxt1 extension? A wrapper of some sort. Or
> just purposely mislabel the extension name. Maybe that would have patent
> issues as well. But what about games like quake3 or rtcw that allow for
> options to be enabled like "compress_textures". Is this a "generic"
> compression of some sort? It does not implicitly state
> "s3tc_compress_textures" as does UT 2003. I was curious if anyone could
> help me understand that. Are some texures pre-compressed while others
> are compressed on the fly?

The trick that 3dfx did, IIRC, was to decompress S3TC textures and
recompress them as FXT1 in the driver. Their driver exported one of the
S3TC extensions and swizzled the data around in software.

Quite a few drivers have a flag like "compress all textures" that causes
the driver to try to compress all textures on upload.

Supporting GL_3DFX_texture_compression_FXT1 is interesting for DRI
because we support two pieces of hardware, Voodoo4/5 and i830/i845G, that
support those formats in hardware. Perhaps the best route for open-source
drivers is to expose that extension on hardware that can support it and
encourage developers to use it.

Until the owner of the S3TC IP speaks up, my personal opinion is that
more discussion on our part is not likely to be of any use. :(

/me runs up the white flag.
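[Editor's note: the decompress half of the swizzle Ian describes is mechanically simple. The sketch below decodes one DXT1/S3TC block following publicly circulated descriptions of the format (two little-endian RGB565 endpoints, two interpolated colors, 2-bit indices packed LSB-first); it is an illustration only, and the same patent caveats this thread discusses would apply to shipping it.]

```c
#include <stdint.h>

/* Expand an RGB565 value to 8-bit-per-channel RGB. */
static void rgb565_to_rgb888(uint16_t c, uint8_t rgb[3])
{
    rgb[0] = (uint8_t)(((c >> 11) & 0x1F) * 255 / 31);
    rgb[1] = (uint8_t)(((c >>  5) & 0x3F) * 255 / 63);
    rgb[2] = (uint8_t)(( c        & 0x1F) * 255 / 31);
}

/* Decode one 64-bit DXT1 block into a 4x4 tile of RGB888 pixels.
 * `out` receives 16 * 3 bytes, row-major. */
void dxt1_decode_block(const uint8_t block[8], uint8_t out[48])
{
    uint16_t c0 = (uint16_t)(block[0] | (block[1] << 8));
    uint16_t c1 = (uint16_t)(block[2] | (block[3] << 8));
    uint8_t palette[4][3];
    int i, px;

    rgb565_to_rgb888(c0, palette[0]);
    rgb565_to_rgb888(c1, palette[1]);

    if (c0 > c1) {              /* 4-color mode: two interpolants */
        for (i = 0; i < 3; i++) {
            palette[2][i] = (uint8_t)((2 * palette[0][i] + palette[1][i]) / 3);
            palette[3][i] = (uint8_t)((palette[0][i] + 2 * palette[1][i]) / 3);
        }
    } else {                    /* 3-color mode: midpoint + black */
        for (i = 0; i < 3; i++) {
            palette[2][i] = (uint8_t)((palette[0][i] + palette[1][i]) / 2);
            palette[3][i] = 0;
        }
    }

    for (px = 0; px < 16; px++) {
        /* 2-bit palette index per pixel, one byte per row, LSB-first */
        int idx = (block[4 + px / 4] >> (2 * (px % 4))) & 0x3;
        out[px * 3 + 0] = palette[idx][0];
        out[px * 3 + 1] = palette[idx][1];
        out[px * 3 + 2] = palette[idx][2];
    }
}
```

A 3dfx-style driver would run blocks through something like this and then feed the raw RGB into its FXT1 encoder, all behind the S3TC extension entry points.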
From: Daniel V. <vo...@ep...> - 2003-04-08 22:26:35
> Supporting GL_3DFX_texture_compression_FXT1 is interesting for DRI
> because we support two pieces of hardware, Voodoo4/5 & i830/i845G, that
> support those formats in hardware. Perhaps the best route for
> open-source drivers is to expose that extension on hardware that can
> support it and encourage developers to use it.

Implementing GL_3DFX_texture_compression_FXT1 is a waste of time - using
FXT1 on those cards for generic GL_ARB_texture_compression compression
might not be.

-- Daniel, Epic Games Inc.
From: Leif D. <lde...@re...> - 2003-04-08 22:54:21
On Tue, 8 Apr 2003, Daniel Vogel wrote:
> > Supporting GL_3DFX_texture_compression_FXT1 is interesting for DRI
> > because we support two pieces of hardware, Voodoo4/5 & i830/i845G,
> > that support those formats in hardware. Perhaps the best route for
> > open-source drivers is to expose that extension on hardware that can
> > support it and encourage developers to use it.
>
> Implementing GL_3DFX_texture_compression_FXT1 is a waste of time -
> using FXT1 on those cards for generic GL_ARB_texture_compression
> compression might not be.

It's not a waste of time, since you need the former to do the latter, but
it's true that there probably aren't (m)any apps out there that will
supply pre-compressed FXT1 textures.

--
Leif Delgass
http://www.retinalburn.net
From: Ian R. <id...@us...> - 2003-04-08 23:03:16
Daniel Vogel wrote:
>> Supporting GL_3DFX_texture_compression_FXT1 is interesting for DRI
>> because we support two pieces of hardware, Voodoo4/5 & i830/i845G, that
>> support those formats in hardware. Perhaps the best route for
>> open-source drivers is to expose that extension on hardware that can
>> support it and encourage developers to use it.
>
> Implementing GL_3DFX_texture_compression_FXT1 is a waste of time -
> using FXT1 on those cards for generic GL_ARB_texture_compression
> compression might not be.

That's the same thing. ARB_texture_compression doesn't define any
compressed formats. The layered extensions, like
3DFX_texture_compression_FXT1, add specific formats that can be used as
source and internal formats.

One of the things an app can do is query the internal format of the
texture. If the app does:

  glTexImage2D( GL_TEXTURE_2D, 0, GL_COMPRESSED_RGB_ARB, width, height,
                border, GL_RGB, GL_UNSIGNED_BYTE, data );

and we compress the texture using FXT1, then:

  glGetTexLevelParameteriv( GL_TEXTURE_2D, 0, GL_TEXTURE_INTERNAL_FORMAT,
                            &value );

should set <value> to GL_COMPRESSED_RGB_FXT1_3DFX.

If that much is done, the only thing left to do to support the full FXT1
extension is supporting the app sending in precompressed textures. That
sounds like less than 10% of the work to me. :)

I can see the wisdom in not spending much time optimizing the fallback
decompression paths, though.
From: Daniel V. <vo...@ep...> - 2003-04-09 00:12:40
> That's the same thing.

Sorry, there was a "just" missing. It should've read:

"Implementing just GL_3DFX_texture_compression_FXT1 is a waste of time -
using FXT1 on those cards for generic GL_ARB_texture_compression
compression might not be."

-- Daniel, Epic Games Inc.
From: Ian R. <id...@us...> - 2003-04-09 01:37:11
Daniel Vogel wrote:
> > That's the same thing.
>
> Sorry, there was a "just" missing. It should've read:
>
> "Implementing just GL_3DFX_texture_compression_FXT1 is a waste of time -
> using FXT1 on those cards for generic GL_ARB_texture_compression
> compression might not be."

Ah! Gotcha. You are 100% correct, then. :)