RE: [GD-Windows] DirectX Pure Device / HW Vertex Processing Stuff
From: Vek <ve...@ho...> - 2001-09-23 20:21:33
Thanks for the response. Dang... it's way, WAY above that, so I guess this has nothing to do with AGP. I'm going to do it the long way: rip the renderer out of my engine, create some test models in MAX (one with lots of tris but one texture, one with lots of textures but the same number of tris, one with many submodels and one texture, etc.) and try to see why it's grinding so badly.

For some reason it drops terribly when I mess with the vertex buffers myself before throwing them to the video card... but it also drops significantly when simply throwing pre-built, static models (in vertex buffers) directly to the card, using a vertex shader with no pixel shader. The real question, I suppose, is what would cause a program to go SLOWER using hardware vertex processing than when using software? And I mean a heck of a lot slower. :(

-Nick Lawson
nl...@ho...

PS. If nobody's seen this kind of thing before and responds, I will go ahead with the renderer tests and post the results here. (Unless someone objects, of course!) =)

-----Original Message-----
From: gam...@li... [mailto:gam...@li...] On Behalf Of Dirk Ringe
Sent: Sunday, September 23, 2001 3:21 AM
To: gam...@li...
Subject: RE: [GD-Windows] DirectX Pure Device / HW Vertex Processing Stuff

Simple stuff: run BenMark from NVIDIA. If you are below 20 million triangles per second, then there is an AGP config error.

Greets,
Dirk

-----Original Message-----
From: gam...@li... [mailto:gam...@li...] On Behalf Of Vek
Sent: Sunday, September 23, 2001 7:35 AM
To: gam...@li...
Subject: [GD-Windows] DirectX Pure Device / HW Vertex Processing Stuff

I guess this is the best place to ask this, since DX is Windows-specific and it's not really about algos or such. I recently got hold of GeForce3 hardware to test my vertex shaders / pixel shaders on, in my game engine. To my surprise, though, simply flipping the engine into 'hardware vertex processing mode' cut my framerate to about 50% of what it is in software vertex processing mode. This is when dumping raw vertices to the card to transform, light, and render via a vertex shader... using index buffers and vertex buffers, no pixel shaders involved. It also drops to 5% of what it is in software vertex processing mode if I additionally mess with those vertices (transforming them manually in my program instead of in a vertex shader).

Are there AGP specifics I'm missing out on? Or some situations I'm not aware of? I'm setting the device up almost exactly like the NV demos, etc... except those run at several hundred frames a second in the same situations. I would have thought that hardware vertex processing would be the same speed or faster... not 5% as fast. 500 polygons are enough to bring it to a crawl.

If I can't untangle this soon, I'll take the renderer code out and place it into a nice 'clean room' to mess with it, and find out what's going on by brute force ;) But I was hoping someone out there had already run into this.

-Nick Lawson
ve...@ho...
project: http://members.home.net/vektuz/gw2k
hooray.

_______________________________________________
Gamedevlists-windows mailing list
Gam...@li...
https://lists.sourceforge.net/lists/listinfo/gamedevlists-windows
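
A minimal sketch of the hardware-vs-software vertex processing switch being compared in the thread, assuming stock d3d8.h; the present-parameters and window setup are not shown in the thread and are omitted here, and the function name is only illustrative:

    #include <d3d8.h>

    IDirect3D8*       g_pD3D    = NULL;
    IDirect3DDevice8* g_pDevice = NULL;

    HRESULT CreateDeviceSketch(HWND hWnd, D3DPRESENT_PARAMETERS* pPresentParams,
                               bool useHardwareVP)
    {
        g_pD3D = Direct3DCreate8(D3D_SDK_VERSION);
        if (!g_pD3D)
            return E_FAIL;

        // The behavior flag is the only thing that changes between the two
        // modes; D3DCREATE_PUREDEVICE is only valid together with hardware
        // vertex processing.
        DWORD behavior = useHardwareVP
            ? (D3DCREATE_HARDWARE_VERTEXPROCESSING /* | D3DCREATE_PUREDEVICE */)
            : D3DCREATE_SOFTWARE_VERTEXPROCESSING;

        return g_pD3D->CreateDevice(D3DADAPTER_DEFAULT, D3DDEVTYPE_HAL, hWnd,
                                    behavior, pPresentParams, &g_pDevice);
    }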
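And a minimal sketch of the "mess with the vertex buffers myself" case, assuming a hypothetical MyVertex struct and MYVERTEX_FVF (neither appears in the thread). The point is only the usage, pool, and lock flags: a buffer the CPU rewrites generally wants D3DUSAGE_DYNAMIC | D3DUSAGE_WRITEONLY in D3DPOOL_DEFAULT and a D3DLOCK_DISCARD lock, so the driver can keep it in AGP memory and hand back a fresh region instead of stalling until the GPU has finished reading the old contents:

    IDirect3DVertexBuffer8* pVB = NULL;
    HRESULT hr = g_pDevice->CreateVertexBuffer(numVerts * sizeof(MyVertex),
                                               D3DUSAGE_DYNAMIC | D3DUSAGE_WRITEONLY,
                                               MYVERTEX_FVF,
                                               D3DPOOL_DEFAULT,
                                               &pVB);
    if (SUCCEEDED(hr))
    {
        BYTE* pData = NULL;
        if (SUCCEEDED(pVB->Lock(0, numVerts * sizeof(MyVertex), &pData,
                                D3DLOCK_DISCARD)))
        {
            // ... write the CPU-transformed vertices into pData here ...
            pVB->Unlock();
        }
    }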