gamedevlists-general Mailing List for gamedev (Page 67)
From: Tom N. <t.n...@vr...> - 2002-12-10 14:13:27
|
Hi, In the scripting thread, Brian mentioned "faster development" (i.e. no rebuilds) as one of the reasons one might use scripting. Would anyone actually go so far as to implement a scripting engine just for this purpose? Or, to take it one step further, would you consider writing your application in a different language altogether (not C/C++), just for the sake of improving programmer productivity? The reason I ask is because Brian brought up the build times for Quake 2. Our current project is probably roughly the same size as Q2 (110K lines of code), and takes just under 5 minutes to rebuild. Quake 2 took 40 seconds to build on the same machine -- something I can only dream of. But! You may not have heard about this, but a bunch of guys have taken it upon themselves to translate the entire Quake 2 source code to Delphi/Object Pascal (see http://sourceforge.net/projects/quake2delphi/). They have > 150K lines of code, and it builds in less than 3 seconds -- so fast that my watch isn't really accurate enough to time it. If it is indeed a fact that using STL in a C++ project would badly increase compile times, then this might be seen by some as a valid argument against the use of STL. By the same logic, if you knew that you could get your work done faster using another programming language, would you do it? _Has_ anyone actually done it (e.g. even if only for internal tools)? I'd love to hear about the build times for large(-ish) Java or C# projects, for example. -- Tom |
From: Noel L. <ll...@co...> - 2002-12-10 14:02:00
|
On Mon, 09 Dec 2002 18:39:37 -0600 Chris Carollo <cca...@io...> wrote:
> Quake2:
> .c - 188 files, 3.9MB
> .h - 78 files, 0.6MB
>
> Our codebase (which includes the editor, and admittedly a ton of code
> that could use some re-engineering/stripping):
> .cpp - 1188 files, 14.6MB
> .h - 1450 files, 5.8MB

For another data point, our code base for MechAssault was:

.cpp - 1164 files (13.2 MB)
.h - 1291 files (3.5 MB)

Game-specific code and engine code are split exactly half and half (it just worked that way). So considering that it's roughly one order of magnitude larger than the Quake 2 source code, our full compile times (around 15 minutes) are not totally out of line. Still, I'd like them to be shorter, and especially the link times, which is the most important thing for iterative programming. I imagine using incremental linking with VC7 will help, but every time I've tried it we've had problems and had to turn it off.

--Noel |
From: <cas...@ya...> - 2002-12-10 13:49:51
|
Thatcher Ulrich wrote:
> it seems nobody under the age of 30 can be bothered to type "make" :).

hey, I'm just 21 ;-)

> The simple STL-alike array<> and hash<> templates that I use for my
> personal stuff take about 350 lines, and I consider them very
> worthwhile. (They're mediocre, but public domain, if anyone wants the
> link.)

yep, I like them and use slightly modified versions.

I'm getting damn scared with those project sizes. My projects don't usually have more than 2Mb of code (without external libraries) and usually build in 1-2 minutes. I don't use STL, mainly because it's a pain to debug, but never thought it could affect performance in such ways!

Ignacio Castaño
cas...@ya... |
From: Jamie F. <ja...@qu...> - 2002-12-10 13:30:12
|
i think one key point that hasn't been touched on (enough? :) yet is that the scripting language (whether data definition, a full language, or whatever) should be appropriate to the people who will use it. some designers can think like a coder. some can't. similarly with artists. if you pitch the scripting at the appropriate level, it will work great.

this is my main complaint with generic scripting systems: they've been designed at a particular level of complexity, and that simply can't be appropriate for every user.

jamie |
From: Javier A. <ja...@py...> - 2002-12-10 13:21:24
|
Brian Hook <bri...@py...> wrote: > And C++ hierarchies, to coin a phrase from Scott Bilas, tend to > "harden" over time. Yeah, make as little use of hierarchies as possible. In general, avoid any kind of dependencies between your classes. It's a code design issue, not so much a C++ issue. (read below) > But in the process of writing them > in ANSI C and acutely examining the dependencies that existed, I > found that I was writing MUCH cleaner code, and not just because I > expected others to read the code. I believe this has more to do with your experience in both languages than anything else. You've huge experience writing C code, but (from what I can infer) little experience working with large C++ codebases. What's worse, C++ can be badly used in more varied and obscure ways than C or other simpler languages, and some design philosophies even encourage these uses. Flat hierarchies, single inheritance and interfaces all make your life damn easy. Javier Arevalo Pyro Studios |
From: Root, K. <kr...@fu...> - 2002-12-10 10:26:07
|
My 2 cents =) I was just doing a full rebuild of one of our server applications with its services on my development PC - here are some details.

.cpp - 1357 files (17.3MB)
.c - 303 files (6.8MB)
.h - 2039 files (11.9MB)

Full rebuild time - ~17 minutes. That code consists of 113 sub-projects (components in the server and helper services). We make heavy use of STL in the code, along with our huge template library which wraps some STL templates, Windows API functions and so on. My development PC is 2xP3 1000, 512 RAM, 2x40GB IDE (in RAID).

Konstantin Root
Software Engineer
fusionOne Estonia
www.fusionone.com / www.mightyphone.com |
From: Javier A. <ja...@py...> - 2002-12-10 09:09:57
|
Brian Hook <bri...@py...> wrote:
>> On macosx, compile for the same code takes about 45 minutes.
>> On linux, about 30 for a full rebuild.
>
> That seems...weird. Granted, these are different machines, but in my
> experience VC hasn't been that much faster unless you're heavily
> leveraging incremental linking and precompiled headers.

For what it's worth, during the tentative (and aborted) Mac port of Commandos 2, they started a Debug build of the game on the mac (note - it's a HUGE codebase, the Win32 retail exe was 8 megs IIRC), and two days later it still had not finished. Can't give details about machine, RAM, etc, but frankly, it was scary. And no, it wasn't using STL.

Javier Arevalo
Pyro Studios |
From: Ray <ra...@gu...> - 2002-12-10 08:11:05
|
Hm, I did a little test with -O0, -O1, -O2, and -O3 on this 1 file that takes forever to compile.

-O0: 12 seconds
-O1: 46 seconds
-O2: 1 minute 15 seconds
-O3: 4 minutes 40 seconds
(times are approximate)

using this command-line:

date ; c++ -c -D_POSIX -Wall -DDarwin -D_BIGENDIAN -fpascal-strings -O2 -fomit-frame-pointer -ffast-math -mdynamic-no-pic -DIMAGE_DIR=\"images\" -DTEXTURE_DIR=\"textures\" -DC3D_ENABLED -I../../vendetta2/oe -I../../gk -I../../vendetta2 -o bin.macosxppc.release/cradar.o gkobjects/cradar.cpp ; date

This is on my state-of-the-art ;) g3-350 mac with osX and gcc3.1. Optimizing sure takes a long time for some reason on this file. It uses std::vector and it doesn't use that for very much. It does use one of our home-brew templates though. But other files using that same template don't take that long to compile. It also does a fair amount of 3d math and calls a bunch of virtual functions. I just don't get it.

Oh well, I'm going to sleep now.

- Ray |
From: Javier A. <ja...@py...> - 2002-12-10 07:49:03
|
Noel Llopis <ll...@co...> wrote: > We're using C++ and STL, and even though we have a good physical > organization (like Lakos described), compiling all our libraries and > game code for one platform could take up to 15-20 minutes. That's only > for a full build; a quick change somewhere still requires building > that file and then doing a link, which is about 30-40 seconds. > > I find 30-40 seconds for a trivial change close to unbearable. > Especially when it takes another 10 seconds for the game to start and > another 10 seconds for the level to load. We're using C++ (MSVC6) with no STL, project broken down into about 12 subprojects, two of them are DLLs and the rest are .libs. Total size of the game codebase (.cpp + .h) is 2316 files (1099 are .cpp) taking up 30 megs (538K lines of actual code). In my "development machine" (as little hands-on development as I do these days), a P3-600 with 512Mb of RAM, any change to a .cpp takes about 3-5 seconds to recompile. A full rebuild is in the order of 20 minutes. The game takes less than one second to start up, and most missions / maps take less than 5 seconds (many are about 2 seconds) to load in release build, about twice as much in debug. All these times are lower for the programmers, who use P4-1.7 + 512Mb machines. All machines have a 7200 rpm primary hard drive. We're not doing anything exceptional about compile times, other than keeping an eye on dependencies, using PCHs with some care (one of the reasons for the number of subprojects), not using STL, and following common Lakos wisdom. About load times, we have kept all our data file organisation very straightforward, and most of them are read sequentially (we have a log warning whenever a file seek operation goes backwards, and the only things that trigger it are WAV format loading and graphics reload). Javier Arevalo Pyro Studios |
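The backward-seek warning Javier mentions is a cheap way to keep load paths sequential. As a purely illustrative sketch (not Pyro's actual code; the class and names are invented), a thin wrapper over stdio can track the furthest byte read so far and complain whenever a seek lands before it:

// Illustrative only -- a file wrapper that logs backward seeks so
// non-sequential reads show up during development.
#include <cstddef>
#include <cstdio>

class SeqCheckedFile {
public:
    explicit SeqCheckedFile(const char* path)
        : m_fp(std::fopen(path, "rb")), m_maxPos(0), m_path(path) {}
    ~SeqCheckedFile() { if (m_fp) std::fclose(m_fp); }

    bool IsOpen() const { return m_fp != 0; }

    std::size_t Read(void* dst, std::size_t bytes) {
        std::size_t got = std::fread(dst, 1, bytes, m_fp);
        long pos = std::ftell(m_fp);
        if (pos > m_maxPos) m_maxPos = pos;   // remember the high-water mark
        return got;
    }

    void Seek(long offset) {
        if (offset < m_maxPos) {
            // Backward seek: this file is not being consumed sequentially.
            std::fprintf(stderr, "WARNING: backward seek in %s (to %ld, high-water %ld)\n",
                         m_path, offset, m_maxPos);
        }
        std::fseek(m_fp, offset, SEEK_SET);
    }

private:
    std::FILE*  m_fp;
    long        m_maxPos;   // furthest byte position read so far
    const char* m_path;
};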
From: Ray <ra...@gu...> - 2002-12-10 07:41:44
|
I guess I should say that my mac is a g3-350 with gcc3. I think that's why it takes like 45 minutes to compile. We also get the compiler to generate an internal compiler error on a couple of our files.

420 .h files (1.7 megs)
455 .cpp files (4.0 megs)
140 .c files (2.5 megs) (mostly libpng/zlib/libjpg)

my win32 machine is an athlon 1700+ in win2k
linux box is pIII-450 with gcc 2.9something
mac box is g3-350 w/os10.2 gcc3

So that's not really comparing oranges to oranges.

We use Jam for mac and linux. BUT we don't use the built-in version of Jam on the mac because its internal Jamrules are not compatible with a couple of things we do. (I'm not quite sure what, because I wasn't the one to get it working.) We also have a problem with it not correctly generating dependencies for a few files, so it doesn't compile them when it should. I also can't figure out how to force Jam to compile a specific .cpp file for me. It's pretty much either the main target app or nothing.

We were just playing around with trying to get things to compile faster and found that gcc3.x is much slower than gcc 2.x. I also discovered gcc (at least on the mac) has some form of precompiled header support which I didn't try to use yet. All I can say is gcc is ssslllllllllllooooooooooooowwwwwwwwww. Faster hard drives help compiling speed too.

I just can't understand why stl is the main culprit for that much of a slow-down. Speaking of slow-down, my brain slowed down so I think I will go to bed now.

- Ray |
From: BG <arc...@ma...> - 2002-12-10 06:35:15
|
On Monday, December 9, 2002, at 04:10 PM, phi...@pl... wrote:
> Personally, I think they (ND) were a little loopy to even try it. The PS2
> already has a very steep learning curve without trying to make it lean back
> on itself. I also have to wonder how things would have been different if
> they'd put a similar amount of effort into improving and extending gcc.

Well it is an *intriguing* way to learn a platform...! o_O

With regards to gcc, as handy as it is it *does* bring a lot of baggage with it. Considering it's going to be used primarily in-house and on a single hardware spec, "rolling your own" doesn't have the scare factor it might otherwise have. Besides, sometimes starting from fresh does have its advantages rather than trying to improve an existing code base.

Anyway, I'm always for a developer getting a little brave and exploring a different route...

--------------------------------------------------------------------------------------------------
Copyright 2002 archie4oz email -- End User Licensing Agreement -- By reading the above post you grant archie4oz (email author of said listed party name) the right to take your money, eat your cat, and urinate on your house. In addition you give archie4oz (above mentioned) the right to use your sister in anyway he sees fit. If you do not agree to these terms then DO NOT READ the above email. |
From: Thatcher U. <tu...@tu...> - 2002-12-10 05:53:38
|
On Dec 09, 2002 at 02:38 -0800, Brian Hook wrote: > > What are the alternatives other than rolling your own? > > Rolling your own? =) > > For C++, unless you want a base root Object class, there's no > alternative, since STL is templatized and gives you the static type > checking that many feel is so vitally important. Actually I think there's a really good alternative, if your primary beef with STL is compile times: minimal vector<> and map<> workalikes. Basic vector<> comprises about 95% of my usage of STL, and map<> is an additional 4.9%. One thing I learned from scripting languages is that array and hash are the only two basic data structures that really matter. The simple STL-alike array<> and hash<> templates that I use for my personal stuff take about 350 lines, and I consider them very worthwhile. (They're mediocre, but public domain, if anyone wants the link.) -- Thatcher Ulrich http://tulrich.com |
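For flavor, the kind of minimal vector<> workalike being described can be very small indeed. This is a rough sketch in the same spirit, not Thatcher's actual array<>/hash<> code, and it deliberately assumes POD-style element types (it uses realloc and never runs constructors or destructors):

// Minimal vector<> workalike sketch; fine for PODs, wrong for types
// with constructors/destructors.
#include <cassert>
#include <cstdlib>

template <class T>
class array {
public:
    array() : m_data(0), m_size(0), m_capacity(0) {}
    ~array() { std::free(m_data); }

    int size() const { return m_size; }

    T& operator[](int i) { assert(i >= 0 && i < m_size); return m_data[i]; }

    void push_back(const T& v) {
        if (m_size == m_capacity) reserve(m_capacity ? m_capacity * 2 : 8);
        m_data[m_size++] = v;
    }

    void reserve(int n) {
        if (n <= m_capacity) return;
        m_data = (T*) std::realloc(m_data, n * sizeof(T));   // grows geometrically
        m_capacity = n;
    }

private:
    T*  m_data;
    int m_size, m_capacity;
};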
From: brian h. <bri...@py...> - 2002-12-10 05:38:16
|
> I suffered severe compile-time shock when I first started working at > Oddworld, so I remember timing a build of Soul Ride (my immediate > prior project) on the machine at work. Stats: > > full build: ~45 seconds (VC6) > 1.80M total > 1.46M .cpp (94 files) > 0.15M .c (19 files) > 0.18M .h (88 files) > (71K lines total) Then we have very similar build times and sizes for our respective "small" projects. So the question is -- what are the really big chunks that "big" project adds that hoses everything? STL is the obvious commonality, because large projects sans STL still seem like they compile pretty quickly. > The reason this works is because the compiler only reads headers (and > builds data structures from them) once for the whole lump. As opposed > to reading, building structures, compiling a few hundred lines, and > then flushing everything out again, once for each source file. Our > typical #include line-count is in excess of 25K lines (because of > STL). So it would seem that STL is directly the culprit for lots of these build time explosions. > Unfortunately the link time didn't seem to drop any, which is the Are your codebases built into a single monolithic EXE, or an EXE and multiple DLLs? The latter approach seems to help link times appreciably. > nobody under the age of 30 can be bothered to type "make" :). Kids these days. =) Speaking of which, I'm curious if anyone has tried using jam (Just Another Make)? make isn't even quite 80s technology, much less 2000+ technology. It's mind bogglingly obtuse, yet somehow manages to survive even today. jam is an attempt to sort out this mess somewhat more cleanly, and Apple has migrated to it for ProjectBuilder. Brian |
From: Thatcher U. <tu...@tu...> - 2002-12-10 05:28:24
|
On Dec 09, 2002 at 02:17 -0800, Brian Hook wrote:
> I can't help but wonder if your slow rebuilds on Linux and OS X are the
> result of using gcc and STL? Because that factor of 9 is just mind
> bogglingly dramatically different, enough so that it would raise alarm
> bells. gcc has issues, but damn, not THAT many issues.

In addition to being slower than VC in general, gcc is surprisingly slow at compiling STL code, in my limited experience. A factor of 9 sounds awfully high to me too, though. At one point I got curious and measured the code pulled in by "#include <vector>" using various STL's -- VC6's default STL pulls in 122KB, gcc pulls in 181KB, and STLport-4.5 pulls in 203KB.

I suffered severe compile-time shock when I first started working at Oddworld, so I remember timing a build of Soul Ride (my immediate prior project) on the machine at work. Stats:

full build: ~45 seconds (VC6)
1.80M total
1.46M .cpp (94 files)
0.15M .c (19 files)
0.18M .h (88 files)
(71K lines total)

About 0.16M of the .cpp was written by a contractor who used STL; the rest was written by me with little to no STL. The .c is Lua 3.2.

The Oddworld code base at that time was about 200K+ lines (dunno how many megs) and used STL. I think I measured a full build at like 12 minutes (VC7), but my memory could be wrong.

Our new codebase is relatively cleaner than before IMO, but still uses STL. A full game-only build still takes ~12 minutes (VC7) on my (relatively slow 1GHz) laptop. (This particular laptop is much slower at compiling than the machine used for the above timings.) Here are the size stats:

5.83M total
4.33M .cpp (538 files)
1.50M .h (568 files)

The last time compile times came up, someone on this list mentioned doing "lumped" builds, where you use a script to #include a whole bunch of .cpp files into one big compilation unit. I cooked up some make/perl monstrosity to try that out with Oddworld's code, and got some stunning results: 3 minute full builds! The reason this works is that the compiler only reads headers (and builds data structures from them) once for the whole lump, as opposed to reading, building structures, compiling a few hundred lines, and then flushing everything out again, once for each source file. Our typical #include line-count is in excess of 25K lines (because of STL).

Unfortunately the link time didn't seem to drop any, which is the biggest determinant of our usual compile cycle time (and the build process changes were deemed too wacky to be adopted, since it seems nobody under the age of 30 can be bothered to type "make" :).

-- Thatcher Ulrich http://tulrich.com |
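For anyone wondering what a "lumped" build unit actually amounts to, it is nothing more exotic than a generated translation unit along these lines (the file names here are invented for illustration; the actual make/perl script isn't shown in the thread). The win is that the 25K+ lines of headers get parsed once per lump instead of once per .cpp:

// lump_renderer.cpp -- hypothetical generated file, do not edit by hand.
// A build script emits one of these per batch of source files.
#include "renderer/mesh.cpp"
#include "renderer/skin.cpp"
#include "renderer/shadow.cpp"
#include "ai/pathfind.cpp"
#include "ai/behavior.cpp"
// ...a few dozen more per lump. Note that file-level statics and anonymous
// namespaces can now collide across files, one reason the approach can be
// deemed "too wacky" for an existing codebase.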
From: Dan T. <da...@cs...> - 2002-12-10 03:54:22
|
Just to add my 2 cents... I just built linux on our instruction servers at UW, which aren't the best machines, doing a make clean then make all. I know it isn't building the full 110 megs (68 megs of .c files) for our particular implementation due to architecture and driver stuff, but I think 4:41 (5 minutes...) is a nice build time for an operating system. For me it kinda puts things in perspective, anyway, despite the optimizations and such.

-Dan |
From: mike w. <mi...@ub...> - 2002-12-10 03:52:49
|
> My instinct is that a scripting language as an "extension" (vs. "as
> data") seems ill advised at times, because you kind of have
> responsibilities that are mixed and often poorly defined. Using it as
> your primary application language means it's no longer "scripting", it's
> just the language you happen to be developing in and you're now calling
> into a separate native engine for system specific tasks. That seems
> completely reasonable if you feel the language itself is much more
> productive than C or C++ or whatever.

good point, didn't see the difference initially.

the way things have worked generally is that they start out as simple 'entities': simple structs with custom pragmas that the editor recognizes, providing public and private data for this particular entity (private data is used by the game engine at runtime, public is exposed to the level designer as the entity's properties). the designer can drop these into their levels (lights, etc), and if the complexity of the feature becomes too unwieldy to edit via entities (as adding many of a similar entity can become), then that functionality is moved into scripting.

for example, the players in-game actions are currently read in via an ini file - there are basic 'set' actions that the player can do, essentially a reference to an animation & that's about it. this is being updated to provide links to scripts (or functions within one) that will allow for more complicated actions (all custom-created by the designer without needing to recompile the engine) such as special moves, animated camera motions during specific in-game actions (pull off a special finishing move & trigger a flyby camera sequence), etc...

of course, certain features are scriptable by default as well, and anything simpler is left as entities. the combination seems to work well & provides an extremely good range of possibilities for the designer.

cheers,

mike w
www.uber-geek.ca |
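To make the "entity struct with custom pragmas" idea concrete, here is a purely hypothetical sketch; the pragma names and fields are invented and are not mike's in-house system. The point is that an editor-side parser scans the struct for the markers and builds the designer-facing property grid from the "public" half, while the engine fills in the "private" half at load time:

// Hypothetical annotated entity; an offline tool parses these markers.
struct LightEntity {
#pragma editor_public              // invented marker: fields below appear in the editor
    float radius;                  // shown in the property grid
    float r, g, b;                 // light colour
    char  triggerScript[64];       // name of the script run when toggled

#pragma editor_private             // invented marker: runtime-only state, hidden from designers
    int   handle;                  // engine-side light handle, filled in at load time
    bool  active;
};

Unrecognized pragmas are ignored by compilers (typically with a warning), which is what makes this kind of in-source annotation cheap to retrofit.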
From: Brian H. <bri...@py...> - 2002-12-10 02:32:30
|
> Tony Hawk's Pro Skater 4 has 10.6 MB of code (engine > included), and takes 7.5 minutes for a full code rebuild on a > 2Ghz/1GB PC using GCC 2.93 (Targeting the PS2) That's downright speed-of-light compared to most people's comments, and to me is a pretty clear indicator that the problem isn't necessarily just the size of the code base. Brian |
From: Brian H. <bri...@py...> - 2002-12-10 02:27:54
|
> this is how we're implementing things - we are using the
> scripting language in the very way that brian and others have
> been trying to avoid - as a complete language with the
> requisite 'language constructs', switch statements, etc...

Okay, to avoid muddling semantics, there's "the language" and there's "how it's used". It's entirely possible to use Python as an extension language called by an app, as an application language that calls an engine, or somewhere in between. I use Lua, ostensibly a scripting language, for data definitions. I wouldn't call it a scripting language in my case though.

My instinct is that a scripting language as an "extension" (vs. "as data") seems ill advised at times, because you kind of have responsibilities that are mixed and often poorly defined. Using it as your primary application language means it's no longer "scripting", it's just the language you happen to be developing in and you're now calling into a separate native engine for system specific tasks. That seems completely reasonable if you feel the language itself is much more productive than C or C++ or whatever.

Brian |
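As an illustration of the "Lua as data definition" usage Brian describes, the sketch below uses the Lua C API to execute a file of plain assignments and read a global back out; no game logic lives in the script. This uses the modern Lua 5.x calls, which post-date this thread, and weapons.lua / shotgun_damage are invented names rather than Brian's actual setup:

// Reading plain data out of a Lua file (Lua 5.x API, illustrative only).
extern "C" {
#include "lua.h"
#include "lualib.h"
#include "lauxlib.h"
}
#include <cstdio>

int main() {
    lua_State* L = luaL_newstate();
    luaL_openlibs(L);

    // weapons.lua (hypothetical) might contain nothing but:  shotgun_damage = 25
    if (luaL_dofile(L, "weapons.lua") != 0) {
        std::fprintf(stderr, "error: %s\n", lua_tostring(L, -1));
        lua_close(L);
        return 1;
    }

    lua_getglobal(L, "shotgun_damage");            // pull the value back out
    int shotgunDamage = (int)lua_tonumber(L, -1);
    lua_pop(L, 1);
    std::printf("shotgun_damage = %d\n", shotgunDamage);

    lua_close(L);
    return 0;
}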
From: Douglas C. <zi...@ho...> - 2002-12-10 02:25:10
|
> Out of curiosity, what's the general size of your codebase?

.cpp - 605 files, 4mb
.h - 317 files, 1mb

That includes all our code (including support for 2 consoles). The reason we have 1/2 the number of .h files is because our plugins for the editor only require a .cpp file per property page / utility plugin. There would probably be 150 .h files if I didn't have to export our 'action' classes so the property pages could get to them, as they're never included by anything other than a single, corresponding .cpp file in the game.

We're extremely data driven, using an action/value/event system instead of scripting -- but that includes all behaviors (save a couple FSMs in code) for the player, NPCs, GUI, triggers, cinematics, etc. I did spend a while early on making sure dependencies wouldn't be an issue, and it seems to have paid off pretty well so far. |
From: Brian H. <bri...@py...> - 2002-12-10 02:19:22
|
> Does anyone out there have a game project of 10-20MB in size
> (source & header files), that builds in 5 minutes or less? I
> would love to hear about it.

I think that part of the problem is having a game project that large. With size comes almost unavoidable complexity, which is what I was talking about -- what exactly is causing them to get that huge?

I know the games I'm personally working on are not that dramatically complex, they're cheezy puzzle games, but at the same time nearly all games (especially cross platform ones) end up having to do a minimal set of operations regardless of overall complexity. For example, no matter what type of game, you're looking at having some kind of unarchiver (unless you just dump all your files straight onto disk, which I suppose some people still do); some kind of TGA/JPG/PNG/BMP/whatever loader; some kind of streaming audio support (MP3, Miles, Ogg Vorbis); some kind of graphics/math stuff; if appropriate, some kind of networking layer built on UDP; some kind of audio wave file loader/playback; possibly some GUI stuff; memory management and file format loaders; cross-platform file system stuff; cross-platform endianness mucking about; cross-platform registry/preferences manipulation; etc. etc.

My total source base (including third party open source stuff), including all cross platform stuff, is about 5.5MB (of which MySQL is 2MB). So at any given time, a full rebuild is probably around 4.5 - 5MB. Nothing huge, but even so, a full rebuild for me is around one minute or less. Take my code, multiply by 4 for a 20MB code base, and we're still talking about 5 minutes or so, tops.

It's possible that there's some magic project size that causes MSVC to choke as well. Hmmm, actually, one thing I should probably mention is that my projects consist of tons of subprojects. I don't have a single monolithic DSP that has like 300 files or anything like that, and it's possible that this prevents symbol tables or other internal compiler data from getting out of control.

> I wonder if in some ways you cross an invisible line where
> complexity and inter-dependency increases dramatically as
> your project size grows.

My guess is that this is true. At some point you lose the ability to have a gestalt understanding of your code base as a whole, and once you're at that point, programmers just start chucking things into directories as they see fit.

> Large-scale projects are not a new beast -- there are whole
> books written on the subject (J. Lakos. Large-Scale C++
> Software Design, for example).

The problem is that many of the books are either written with a late 80s mindset towards software engineering, which is often at odds with the new and trendy extreme programming we're seeing, or they're long, gory discussions about how things exploded on some large project, thinly disguised post-mortems.

One of my former managers expressed discomfort about a project, and he said something like "I feel like we're trying to build a Boeing 777, where everyone puts together all their pieces at the very end and it fires up and works" and I was like "Yeah, that's exactly what's going on, and it'll work just fine, this is how modern software engineering works". I had complete faith that, with well defined interfaces and unit tests, assembling disparate subsystems 6 months into a project would not be that tough. Holy crap was I wrong. These days, I'm the complete opposite.

To paraphrase a friend of a friend, the most important thing you can do is to have the game running ALL THE TIME. Like, from day one, you should have SOMETHING that compiles, builds and runs. Every day there should be a running, functional executable. If you can't build one, or can't guarantee this, then this implies that there is something seriously wrong with your process. I know that sounds rather extreme, but from the projects I've observed, this is more true than not. When you hear phrases like "We'll put those pieces together in a few weeks", alarm bells should be going off.

> Brian
> has a strong suspicion much of it is accidental and could be
> removed if the right choices were made during the course of
> development.

That's a fairly accurate assessment of my feelings, coupled with a strong feeling that a lot of the complexity is also designed in from the outset because of a belief that it's necessary. The case in point I use is smart pointers, which almost always end up costing more than they're worth in projects I've seen. Those are typically designed in from day one because they seem like a good idea, but then in practice enough ugly things rear their heads that everyone ends up regretting them at the end. And, as I've lamented earlier, I do feel that STL is a contributor to this as well, but that's borderline religious.

> I would love to see some larger projects that
> build quickly so that I could gain more confidence in this theory.

My only addition to this is that I'm somewhat suspicious that projects actually need to be that large, but that's neither here nor there. I doubt anyone with a gigantic project feels that it's a bunch of bloated crap, but I know that I feel a good chunk of my code could just be tossed or at least refactored into something more manageable, and I only have a few megabytes of code.

Brian |
From: Mick W. <mi...@ne...> - 2002-12-10 02:19:02
|
Tony Hawk's Pro Skater 4 has 10.6 MB of code (engine included), and takes 7.5 minutes for a full code rebuild on a 2Ghz/1GB PC using GCC 2.93 (Targeting the PS2). I've heard VC is a lot faster than GCC, so it might get to sub-5 minutes there.

I think it can get a lot faster. We did some restructuring of one of our subsystems using the Lakos methodology, and it sped it up noticeably. But you really have to apply it to everything. The code is pretty interwoven in places.

We don't make much use of templates, and no STL, which I think helps the build time quite a bit. Also no exceptions or RTTI.

Mick |
From: mike w. <mi...@ub...> - 2002-12-10 02:18:28
|
they weren't console games, but didn't thief and system shock 2 use the same engine, with the exception of scripts etc? i recall reading that they used almost exactly the same executable up until the very last minute...and created two very unique games at the same time.

this is how we're implementing things - we are using the scripting language in the very way that brian and others have been trying to avoid - as a complete language with the requisite 'language constructs', switch statements, etc... but we didn't create the language or parser - they are a free library that we've plugged in (www.simkin.co.uk). this has worked out extremely well in our case: not only does the scripting language give the 'non-programmers' powerful tools for modifying the internals of the game, but it also lets us avoid recompiling the main engine as much as possible.

the designers use worldcraft-style editors, placing entities that are tied to scripts for a wide range of features. everything from the skybox to cinematics, to Pawn/NPC behaviors is accessible without recompiling. sure the programmers might have to create the base script 'templates' for more complex scripts, but once the framework is there even complete newbies can customize these scripts to create their own behaviors.

i couldn't see doing it any other way...

mike w
www.uber-geek.ca

----- Original Message -----
From: "Mickael Pointier" <mpo...@ed...>
To: <gam...@li...>
Sent: Monday, December 09, 2002 5:09 AM
Subject: Re: [GD-General] Scripting

> Concerning console games using scripting languages, I can at least say that
> "Time Commando" (PlayStation, 1996) and "Little Big Adventure" (PlayStation,
> 1997) are both based on scripts.
>
> Actually the game engine by itself is only a bunch of functionalities:
> detect collision, move and animate actors, play sound and movies, display
> stuff, and so on. 100% of game logic is scripted.
>
> The reason is the variety of actors we had. Time Commando contains 10
> history periods, each with around 30 custom items/actors (opponents,
> weapons, traps, bonuses, ...), so it was a lot faster to code these custom
> things in the scripting language than directly in the engine, because even
> the more complex actors never had more than 4 pages of script text (not
> including resource definition, but you need that in C++ anyway).
>
> In Little Big Adventure, it allows the designers to add a lot of small stuff
> here and there that makes the game world really interesting. You want some
> random sound here and there, a squirrel that climbs to the nearest tree when
> you approach? That's simple to do. In most standard engines, you would have
> to derive a SquirrelClimbingTree class from some other weird hierarchy that
> made perfect sense at the beginning of the development but that resembles
> some weird artistic sculpture at the end of the project...
>
> Another reason is that Time Commando was coded by a grand total of 4
> programmers on two simultaneous versions (PC and PlayStation), so we were
> happy to let the level designers "make the game" and try things.
>
> One of the things I notice each time a "scripting vs complete engine" thread
> begins is that most people talk about lua, python, c, perl or whatever other
> off-the-shelf language they can think of, and the eventual problems of
> interfacing with the game engine, memory management, debugging, ... but
> almost no one is talking about the cool thing there is in a well designed
> scripting language: the fact that you can easily code separate actors as if
> you were in a perfectly multitasking system.
>
> When you code some actor in C/C++, you have explicit entry and exit points
> that are used at every refresh frame. In a nicely designed custom scripting
> system you do not have these. For what it's worth, your actor is living
> _alone_ in a forever loop (or until he dies, gets reset, changes behavior,
> or whatever).
>
> Consider the following script fragment (it's the kind of thing we had in
> Time Commando, except that the C++-like syntax was not there, so we had
> "TestAlive(CowBoy)"):
>
> ==============
> BeginActor Indian
>   UseWeapon bow
>   While CowBoy.isalive
>     Shoot CowBoy
>     Next
>   EndWhile
>   PlayAnim Victory
> EndActor
>
> BeginActor CowBoy
>   UseWeapon gattling
>   While Indian.isalive
>     Shoot Indian
>     Next
>   EndWhile
>   PlayAnim Victory
> EndActor
> ==============
>
> The "Next" command simply tells the interpreter to memorise the current
> program counter for the current actor, and move on to the next actor.
>
> Using this system you gain the fact that you know in which order the actors
> are executing their code, so you can perform complex choreographies: you
> know that this particular actor is performing this or that exactly at this
> particular moment, so you can perform particularly nasty things. Never tried
> to synchronise complex objects?
>
> In Time Commando we had to dynamically split/merge characters in a way that
> could not be noticed by the player. I'm not talking here of kiddy things
> like some actor on a moving platform, I'm talking here of a cowboy riding on
> his horse (one single animated mesh containing the horse and the rider) and
> then having it climb down from the horse and go to the saloon (two separate
> objects), or a tank on a battlefield where you have the turret that turns
> independently of the tank base, and the gunner on the turret that looks
> around independently of the direction the turret is looking at...
>
> In summary: it worked fine.
>
> Mickael Pointier |
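The "Next" mechanism in the quoted message is essentially a hand-rolled cooperative scheduler. A minimal, purely illustrative C++ sketch of the idea (not the actual Time Commando interpreter; the opcodes and programs are invented) follows: each actor keeps its own program counter, and NEXT just saves it and yields to the next actor, so every actor can be written as if it ran alone in a forever loop.

// Toy cooperative actor interpreter: NEXT saves the program counter and yields.
#include <cstddef>
#include <cstdio>
#include <vector>

enum Op { OP_SAY, OP_NEXT, OP_GOTO };

struct Instr { Op op; const char* text; int target; };

struct Actor {
    const char* name;
    std::vector<Instr> program;
    std::size_t pc;                 // resumed from here every frame
};

// Run each actor until it hits NEXT, then move on -- one game frame.
void RunFrame(std::vector<Actor>& actors) {
    for (Actor& a : actors) {
        for (;;) {
            const Instr& i = a.program[a.pc];
            if (i.op == OP_SAY)       { std::printf("%s: %s\n", a.name, i.text); ++a.pc; }
            else if (i.op == OP_GOTO) { a.pc = (std::size_t)i.target; }
            else /* OP_NEXT */        { ++a.pc; if (a.pc >= a.program.size()) a.pc = 0; break; }
        }
    }
}

int main() {
    std::vector<Actor> actors = {
        { "Indian", { {OP_SAY, "Shoot CowBoy", 0}, {OP_NEXT, 0, 0}, {OP_GOTO, 0, 0} }, 0 },
        { "CowBoy", { {OP_SAY, "Shoot Indian", 0}, {OP_NEXT, 0, 0}, {OP_GOTO, 0, 0} }, 0 },
    };
    for (int frame = 0; frame < 3; ++frame) RunFrame(actors);   // deterministic actor order
    return 0;
}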
From: mike w. <mi...@ub...> - 2002-12-10 02:09:28
|
the full sourcecode to our game engine currently runs about 20 megs. can't say i've done a full recompile in a while, so i'm not sure how long it would take to build the entire sucker.

the major benefit we have to compiling the source is the complete separation of various parts of the engine into dlls & other libraries - we only need to recompile these as necessary (ie if we make low-level changes to the drivers or graphics engine). the main source itself is only about 5 megs - the rest is in 'compile as necessary' libraries.

this also lets us have various 'versions' of the various libraries in-house. for example we have a 'stable' driver, ie tested working versions of stuff that the designers can use and know the functionality is there, and we also have 'development' drivers that are the bleeding edge drivers that most of the designers don't really need to access. only one or two designers have access to the dev drivers - and they are used to test & break the newer features before we unleash them on the rest of the team....

we have separate dlls for open gl drivers, software drivers, d3d drivers, network libraries, etc... works out well: if certain machine configurations barf on the newer updates, they can roll back to the previous drivers & still be productive while the programmers work on bug-fixes for their particular test-cases...

mike w
www.uber-geek.ca |
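A hedged sketch of the swappable driver DLL arrangement described above; the IRenderDriver interface, the CreateRenderDriver export, and the DLL names are all invented for illustration and are not the actual uber-geek code. The engine only ever talks to an abstract interface, and which DLL provides it is decided at startup, so designers can be pointed at the stable driver while a couple of people run the development one:

// Loading a renderer driver out of a DLL chosen at startup (Win32, illustrative).
#include <windows.h>
#include <cstdio>

struct IRenderDriver {                       // minimal interface the DLL implements
    virtual bool Init(void* hwnd) = 0;
    virtual void DrawFrame() = 0;
    virtual void Shutdown() = 0;
    virtual ~IRenderDriver() {}
};

typedef IRenderDriver* (*CreateDriverFn)();

IRenderDriver* LoadDriver(const char* dllName, HMODULE* outModule) {
    HMODULE dll = LoadLibraryA(dllName);     // e.g. "render_gl_stable.dll"
    if (!dll) return 0;
    CreateDriverFn create =
        (CreateDriverFn)GetProcAddress(dll, "CreateRenderDriver");
    if (!create) { FreeLibrary(dll); return 0; }
    *outModule = dll;
    return create();
}

int main() {
    HMODULE module = 0;
    // A config setting could swap in "render_gl_dev.dll" for the brave.
    IRenderDriver* driver = LoadDriver("render_gl_stable.dll", &module);
    if (!driver) { std::printf("no usable render driver\n"); return 1; }
    // ... init, run the game loop against the interface, then tear down ...
    driver->Shutdown();
    delete driver;
    FreeLibrary(module);
    return 0;
}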
From: Michael M. (GAMES) <md...@mi...> - 2002-12-10 01:50:21
|
Does anyone out there have a game project of 10-20MB in size (source & header files), that builds in 5 minutes or less? I would love to hear about it. I wonder if in some ways you cross an invisible line where complexity and inter-dependency increases dramatically as your project size grows. Similar to the way the dynamics of a team changes as you go from 8 developers to 40. Large-scale projects are not a new beast -- there are whole books written on the subject (J. Lakos. Large-Scale C++ Software Design, for example). I wonder how much of the complexity in our larger game projects is "accidental" and how much is "essential", to borrow a term from Brooks. Brian has a strong suspicion much of it is accidental and could be removed if the right choices were made during the course of development. I would love to see some larger projects that build quickly so that I could gain more confidence in this theory. -Michael |
From: Brian H. <bri...@py...> - 2002-12-10 01:25:01
|
> Hearing all these sub-five-minute build times is making my
> mouth water, but do we have a codebase that's just WAY bigger
> than those that are sporting such fast builds?

I really think this is the crux of the issue -- why are the code bases so big and take so long to build? Is it poor engineering or just a casual, subconscious feeling that modern systems can handle it? This is probably a sweng list thing at this point, but hey, nothing is OT here as long as it pertains to games =)

Obviously using language features or libraries that make for excessive build times is a contributing factor, but I also think that there's kind of an underlying assumption of "Hey, we're making big, complex games, so this is just kind of expected". A tacit acceptance that convoluted engineering is simply a real and necessary part of game development these days. That the task is so huge and daunting that whatever happens is probably necessary. I just don't buy into that, but maybe I'm naïve that way.

Quake 2 and Quake 3 were pretty technologically advanced when they were released, and they never suffered from the intense bloat in terms of code and compile times we see today. I don't think John would have tolerated it. Some would argue that the games were really limited, crude, etc. but I don't think the level of refinement you'd see in a game today vs. Q2 can account for a factor of 10 to 200 delta in build times.

Maybe it simply is the proliferation of header files and source files so that with 2000+ .obj files link times are excessive. Or maybe people have just gotten real sloppy about dependency checking. Or maybe software really, truly is that much more complex than it was just a few short years ago. Without sufficient data, it's hard to say.

But based on my experiences, almost every time I've seen a project like this spiral out of control in terms of complexity, it's because of overengineering features in the build system/class hierarchies because they're "neat" or "useful" but in practice turn out to cause significant hits in productivity. Are "smart pointers" for garbage collection worth the headache that they so often cause? In lieu of just telling a team, "Hey, manage your memory effectively"? I've never had a problem with memory leaks once clear memory allocation and ownership policies were defined. That seems like a more practical solution than coming up with some inane templatized ref-counting pointer system that's incredibly fragile or has to be back-doored consistently.

I'm aware of at least three or four projects that take more than an hour to perform a full rebuild on modern workstation class hardware. Systems probably 2-3x faster than my current one. This means that those code bases are taking 100-200x longer to build than Quake 2 (call it 36 seconds vs. 3600 seconds, with the former on a P3/933 and the latter on a P4/2.8). I'm literally jaw-droppingly amazed that this is routinely tolerated and, in many cases, flat out rationalized. I had one friend say "I know it takes a long time, but we can't give up STL's string class". The STL string class is worth THAT? Another friend defended his team's build times with some vague hemming and hawing and a blanket "Software today is more complex, that's just how it is".

I'll buy that -- software today is more complex, sure. It's bigger, has to do more stuff. But I don't see how it's become 100 to 200 times bigger and more complex in three years. That's one product cycle.

I would LOVE it if someone could step forward and show how the products have become that much more complex from one generation to another, something with concrete examples. Maybe you guys can do that and show the delta from DX -> DX2, because I think that would be really good information to know for the industry as a whole.

-Hook |