Thread: [GD-General] asset & document management
From: Enno R. <en...@de...> - 2003-05-15 11:55:07
What kind of tools do other game developers use? I've been a die-hard CVS fan for many, many years now, and two years ago I added bugzilla to the list of tools I wouldn't want to work without. But I realize that there are still things I lack.

One thing is document management. A game can create lots and lots of documents, and just putting them into revision control isn't enough - I need a way to search for them. I want fulltext search through all of them, keywords, categories, comparison of revisions, and more - or at least a significant subset of those features. I worked on a project that used MS Sharepoint for this, and while that was okay, it had its own set of problems (it still became messy after a while, it didn't work in non-IE browsers, and it's not good at storing HTML documents). Has anyone had good experience with other such software?

The other thing I miss is asset management for large binaries - stuff that we can't or shouldn't put into CVS. I know there's alienbrain, but for some reason the evaluation didn't leave our artists very happy. It's also not exactly cheap, IIRC. Do you know anything else? Are you using something that makes you say "I wouldn't know how I could live without it"?

Enno.
From: Kent Q. <ken...@co...> - 2003-05-15 13:41:30
At 01:54 PM 5/15/2003 +0200, you wrote:
> What kind of tools do other game developers use?
>
> I've been a die-hard CVS fan for many, many years now, and two years ago
> I added bugzilla to the list of tools I wouldn't want to work without.
> But I realize that there are still things I lack.

A good bug tracker is indispensable. Has bugzilla gotten any easier to install since last year? I looked at it and ran away in horror.

> One thing is document management. A game can create lots and lots of
> documents, and just putting them into revision control isn't enough - I
> need a way to search for them. I want fulltext search through all of
> them, keywords, categories, comparison of revisions, and more. [...]
> Has anyone had good experience with other such software?

Annoyingly, by far the best product of this type I've ever used is Lotus Notes. I say "annoyingly" because it's got a strange design dating back to 1993. But it has the best metaphor for managing sets of related documents I've ever seen, and in modern versions you can use it pretty effectively without having to install the Notes client on everyone's desktop. We have used it for everything from bug tracking to design databases to actual code repositories and code generators for components that had to fit within a special template.

But with all that said, we're now moving to custom XML designs for all our document types. We've built the beginnings of a general-purpose XML forms editor tool (there are such things available from people like Altova as well). Having the data in an XML format means that it's pretty easy to build/find tools to transform, catalog, index, and search the files. But you also want a decent editor to use while working with them.

> The other thing I miss is asset management for large binaries - stuff
> that we can't or shouldn't put into CVS. [...]

We did large assets ourselves through a custom system that stored them on a server and kept a database catalog of them. Alienbrain initially looked more like a tool kit than an actual application, but I've heard lately that it's gotten better.

I'd also say that artists' opinions on data organization are highly suspect. It's been my experience (and yes, I'm generalizing) that artists have to be dragged kicking and screaming into any kind of structured data organization. They're visual people and tend not to care that much about silly things like textual names. The key is to find a system that lets you find the assets you need with minimal impact on artist productivity.

Seems like a lot of people build their own, here.

Kent

Kent Quirk, CTO, CogniToy
ken...@co...
http://www.cognitoy.com
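[A minimal sketch of the kind of cataloging tool Kent describes - walking a tree of XML design documents and building a fulltext index over them. Only the Python standard library is used; the directory layout and the two helper names are hypothetical, not anything from Kent's actual tool.]

import os
import xml.etree.ElementTree as ET
from collections import defaultdict

def index_documents(root_dir):
    """Build an inverted index: word -> set of XML document paths."""
    index = defaultdict(set)
    for dirpath, _, filenames in os.walk(root_dir):
        for name in filenames:
            if not name.endswith(".xml"):
                continue
            path = os.path.join(dirpath, name)
            try:
                tree = ET.parse(path)
            except ET.ParseError:
                continue  # skip malformed documents rather than aborting
            # itertext() yields all character data, regardless of schema,
            # so the index works for any custom document type.
            for chunk in tree.getroot().itertext():
                for word in chunk.lower().split():
                    index[word].add(path)
    return index

def search(index, *words):
    """Return the documents containing all of the given words."""
    sets = [index.get(w.lower(), set()) for w in words]
    return set.intersection(*sets) if sets else set()

The point of the XML-first approach is visible here: because the content is structured text, a useful catalog/search tool is a page of code rather than a product evaluation.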
From: Mickael P. <mpo...@ed...> - 2003-05-15 14:01:52
> > The other thing I miss is asset management for large binaries -
> > stuff that we can't or shouldn't put into CVS. [...]
>
> We did large assets ourselves through a custom system that stored
> them on a server and kept a database catalog of them. [...]
>
> Seems like a lot of people build their own, here.

Yep, we also checked Alien Brain here, back in 1999 when it was still called "NxN MediaStation", and at that time it was painfully slow (something like 4 minutes between the moment you opened the client view and the moment the data was ready on screen...), and it looked more like a "do it yourself" package. Since we didn't have an unlimited amount of time to invest in that, we tried an alternative solution: developing our own minimalistic system. Since that first version, AlienBrain seems to have been optimised a lot, but so far I have never found a third-party review of the product on a real project that doesn't look like a copy of the white papers available on their website, so I will wait for someone to tell me it worked great for them on a real product before going into evaluation again...

So, back to our own product. It's minimalistic in the sense that it does not perform archiving and cannot merge. Basically it's an "exclusive checkout" based system that simply allows people to get/put/add/checkout/checkin files from a common network repository. It's built entirely from Windows 2k+ shell extensions, so it's cleanly integrated into the Windows Explorer (right-click contextual options, additional columns to display the status of files, a property page showing the history of a file, and so on...).

Among the options is a small "send message" command that appears when a user selects a file someone else has checked out. It uses the Windows messenger service, so the user can send a "please could you give me access to that file, thanks" that appears immediately as a modal box on the other person's screen (like network printer message feedback). There is also a small "view difference" option that can redirect to any third-party differencing software depending on the file type (pictures go through our own picture-difference program; for text types it can be windiff or Araxis Merge...). Got the idea?

Our artists use it because it doesn't force them to use any tools other than the ones they already had. They can perform checkin/checkout from any Explorer instance, which means it also works inside applications' file selectors. And since it does not perform archiving, the speed is similar to that of a straight copy from/to the network, so they don't get the feeling of losing time while working.

It's not a great product, but well, it works. It has been used on three products (one of them finished on 3 different targets) without any major problem.

Mickael Pointier
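[A minimal sketch of the exclusive-checkout idea Mickael describes, using per-file lock files on a network share to mark ownership. The repository path and lock naming are hypothetical; a real implementation would live in a shell extension, but the locking logic is the same.]

import os, getpass, shutil

REPO = r"\\server\assets"   # hypothetical network repository

def _lockfile(name):
    return os.path.join(REPO, name + ".lock")

def checkout(name):
    """Claim exclusive write access by creating a lock file atomically."""
    # O_EXCL makes the create fail if someone else already holds the lock.
    fd = os.open(_lockfile(name), os.O_CREAT | os.O_EXCL | os.O_WRONLY)
    os.write(fd, getpass.getuser().encode())
    os.close(fd)
    shutil.copy2(os.path.join(REPO, name), name)  # local working copy

def checkin(name):
    """Copy the working file back to the repository and release the lock."""
    with open(_lockfile(name)) as f:
        owner = f.read()
    if owner != getpass.getuser():
        raise PermissionError("%s is checked out by %s" % (name, owner))
    shutil.copy2(name, os.path.join(REPO, name))
    os.remove(_lockfile(name))

One caveat: atomic create-with-O_EXCL has historically been unreliable on some network filesystems, which is one reason commercial systems route locking through a server process instead of the share itself.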
From: Enno R. <en...@de...> - 2003-05-15 21:01:18
Mickael Pointier wrote:
> So, back to our own product. It's minimalistic in the sense that it does
> not perform archiving and cannot merge. Basically it's an "exclusive
> checkout" based system that simply allows people to
> get/put/add/checkout/checkin files from a common network repository.

I've also had a look at unison, because someone on the list mentioned it. I don't like the idea of not being in control of the archiving - even though I clearly don't want to keep all the old versions either.

If I can't buy the asset management system I like, this is what I think I'd like to build: CVS has a lot of features (webcvs, logs, branching, tags, etc.) that I wouldn't want to miss. Also, I really like TortoiseCVS, because it has the "integrate into the Explorer shell" functionality you mentioned, and that makes many people feel a lot more comfortable than a command line, especially designers and artists. On the other hand, large binaries that undergo several changes a day are not manageable in CVS in the long run.

What I would really want is to be able to say "only keep the last 5 versions, treat the rest as if they were 0-length files". It might be possible to extend CVS like that - the CVS archive files are fairly easy to read, and I could write an external process that locks a directory once a day and kicks out old versions. Obviously, this only works on binaries where I don't have diffs but always full files. In addition, I would like to be able to "nail down" an archived version so it doesn't get flushed out. Anything with a tag on it, for example, would be kept.

If that's workable, I'd get the power of CVS without the huge storage requirement - and something the artists already know from other places. Does anyone see a problem with that?

Enno.
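[A minimal sketch of the pruning policy Enno proposes, run against a hypothetical directory of full-file revisions (one file per stored version, e.g. texture.tga.r41 - this naming is an illustration, not real CVS ,v storage). It keeps the newest N revisions plus anything "nailed down" by a tag, and truncates the rest to zero length rather than deleting them, so the revision history stays intact.]

import os, re

KEEP_LAST = 5
rev_pat = re.compile(r"^(?P<base>.+)\.r(?P<num>\d+)$")  # hypothetical naming

def prune(directory, tagged=frozenset()):
    """Truncate all but the newest KEEP_LAST revisions of each binary.

    tagged is a set of (base name, revision number) pairs to preserve.
    """
    revisions = {}  # base name -> list of (revision number, path)
    for name in os.listdir(directory):
        m = rev_pat.match(name)
        if m:
            path = os.path.join(directory, name)
            revisions.setdefault(m.group("base"), []).append(
                (int(m.group("num")), path))
    for base, revs in revisions.items():
        revs.sort()  # oldest first
        for num, path in revs[:-KEEP_LAST]:
            if (base, num) in tagged:
                continue  # "nailed down": tagged revisions are kept
            open(path, "w").close()  # truncate to 0 bytes, keep the entry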
From: Mickael P. <mpo...@ed...> - 2003-05-16 07:18:27
Enno Rehling wrote:
> I've also had a look at unison, because someone on the list mentioned
> it. I don't like the idea of not being in control of the archiving -
> even though I clearly don't want to keep all the old versions either.

Well, we are lucky to have a special server with an automatic "snapshotting" function. Basically, in a parallel copy of the tree, we can get back any file in the state it was in 12 hours ago, one day ago, 2 days ago... up to one week ago. So yes, there is no physical archiving of files, but there is still a history of the modifications made by each person, and we have never lost anything so far, since we also have the real daily/weekly/monthly/yearly backups :)

Mickael Pointier
From: Daniel V. <vo...@ep...> - 2003-06-28 03:01:56
I figured the OS usage statistics below for Unreal Tournament 2003 (gathered at 6 PM EST, 06/10/2003) might be useful to some folks, and I'd be curious to see numbers for other online games if people are willing to share.

Dedicated servers currently online (absolute numbers):

Linux/x86     925
Win2K         859
Win98/WinME    28
WinNT          38
WinXP         299

Last OS used by unique clients connecting to the master server (percentages):

Linux/x86    0.73
Win2K       10.70
Win98/WinME 15.35
WinNT        0.01
WinXP       73.19

--
Daniel, Epic Games Inc.
From: Parveen K. <pk...@al...> - 2003-06-28 20:04:51
Here are the stats that Daniel provided earlier this year as well. Back then there were more Win98/ME client machines than Win2K machines, but it looks like users are moving away from Win98/ME to XP.

On Fri, 2003-06-27 at 20:01, Daniel Vogel wrote:
> Dedicated servers currently online (absolute numbers).
>
> Linux/x86     925 --> 43.04%
> Win2K         859 --> 39.97%
> Win98/WinME    28 -->  1.30%
> WinNT          38 -->  1.76%
> WinXP         299 --> 13.91%
>
> Last OS used for unique clients connecting to masterserver
> (percentages).
>
> Linux/x86    0.73
> Win2K       10.70
> Win98/WinME 15.35
> WinNT        0.01
> WinXP       73.19

On Sat, 2003-01-04 at 13:09, Daniel Vogel wrote:
> UT 2003           Client     Server
>
> Linux              0.69 %    40.66 %
> Windows 98/ME     19.63 %     2.74 %
> Windows NT         0.00 %     6.03 %
> Windows 2000      11.87 %    34.96 %
> Windows XP        67.78 %    15.62 %
From: Ivan G. <dea...@ga...> - 2003-06-29 09:39:22
Thanks for the info! I'm very happy with the fact that most clients use WinXP :)))

-------------------------------------------------
Leave all your expectations behind, or they'll pull you down on your way to the top.
-------------------------------------------------
-Ivan
From: Brian H. <bri...@py...> - 2003-08-16 04:33:59
Say you have a patch installer that needs to update the following files:

GAME.EXE
GAME.DLL
NETAPI.DLL

The typical way would be to download the patch, apply it to temporary files, then copy over the files upon completion:

PATCH GAME.EXE NEWGAME.EXE
PATCH GAME.DLL NEWGAME.DLL
PATCH NETAPI.DLL NEWNETAPI.DLL

//Assume UPDATE renames existing to .EXT.OLD
//and copies NEW*.EXT to *.EXT
UPDATE GAME.EXE
UPDATE GAME.DLL
UPDATE NETAPI.DLL

Obviously nothing destructive happens until after the patches are applied and you can verify that everything is okay. But what do you do if something bad happens during the destructive update portion - e.g. lost connection to a remote disk, power outage, system crash, etc.? The brute force mechanism seems a little, well, brute force; I was wondering if there's something more elegant I could do.

Brian
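[A minimal sketch of the two-phase flow Brian outlines, assuming the patches have already been applied to the NEW* temporaries and verified. Renaming rather than copying means every original survives as .OLD until the whole set has been swapped, so a crash at any point leaves enough on disk to recover. File names follow Brian's example; the recover() policy is my own addition, not anything from the thread.]

import os

FILES = ["GAME.EXE", "GAME.DLL", "NETAPI.DLL"]

def update_all():
    # Phase 1 (still recoverable): move the originals aside.
    # A rename on the same volume is atomic, unlike a copy.
    for f in FILES:
        os.rename(f, f + ".OLD")
    # Phase 2: move the verified NEW* files into place.
    for f in FILES:
        os.rename("NEW" + f, f)
    # Phase 3: only now discard the old versions.
    for f in FILES:
        os.remove(f + ".OLD")

def recover():
    """Run at startup if the previous update may have been interrupted."""
    if (not any(os.path.exists("NEW" + f) for f in FILES)
            and all(os.path.exists(f) for f in FILES)):
        # Phase 2 completed, so the new version is whole:
        # roll forward by finishing the cleanup.
        for f in FILES:
            if os.path.exists(f + ".OLD"):
                os.remove(f + ".OLD")
        return
    # Otherwise roll back to the old versions.
    for f in FILES:
        if os.path.exists(f + ".OLD"):
            if os.path.exists(f):
                os.remove(f)  # a half-installed new copy
            os.rename(f + ".OLD", f)

The roll-forward branch matters: once phase 2 has finished, rolling back would need the NEW* files again, so the only safe recovery direction is forward.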
From: Jason M. <jma...@li...> - 2003-08-16 05:14:01
You would probably want a way to recognize that something bad happened that stopped the update from completing. I'm not sure what you mean by the brute force mechanism, but the first thing that comes to my mind is to create a temporary file, say "patching.txt", that would be created before the patch begins and deleted when it is finished. The game and/or patch installer could check for the existence of this file when it starts up, and if it's there you would know something funky happened while updating and the update needs to be redone (or better yet, continued from where it got interrupted).

-Jason
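[A minimal sketch of Jason's sentinel-file idea. The one subtlety is ordering: the sentinel has to reach the disk before the first destructive step and be removed only after the last one. The file name follows Jason's suggestion; the helper names are hypothetical.]

import os

SENTINEL = "patching.txt"

def begin_update():
    with open(SENTINEL, "w") as f:
        f.write("update in progress\n")
        f.flush()
        os.fsync(f.fileno())  # make sure the marker hits the disk first

def end_update():
    os.remove(SENTINEL)  # only after every file is verified in place

def update_was_interrupted():
    """Check at startup, before the game trusts its own binaries."""
    return os.path.exists(SENTINEL)

begin_update() goes before the first destructive rename, end_update() after the last verification; anything that finds the sentinel at startup re-runs or resumes the patch.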
From: Brian H. <ho...@py...> - 2003-08-16 12:40:41
> what you mean by the brute force mechanism

The old "set a flag before the patch, clear it after verification" thing that was mentioned. You get a bit of a recursive problem if you try to back out to the older versions and then find that you failed doing THAT, etc. But setting checkpoint flags probably works well enough, I guess. Bleh.

-Hook
From: Colin F. <cp...@ea...> - 2003-08-17 03:05:26
I suggest the following: MAKE YOUR "MAIN" APP A "LAUNCHER", not the actual application.

For example, have a root directory for your application:

c:\app

such that you have your friendly app icon and name in there:

c:\app\happy.exe

This is the only binary in the directory, and this executable is the one specified by any shortcuts on the desktop and the Start menu. But this app is just a simple Win32 app that is just WinMain() with a call to CreateProcess() or system() and some extra logic.

Okay, your real application, with associated DLLs, etc., is in a sub-directory:

c:\app\ver2003aug

like the following:

c:\app\ver2003aug\happyv2003aug.exe
c:\app\ver2003aug\happyv2003aug.dll
c:\app\ver2003aug\ijl_v2003aug.dll
c:\app\ver2003aug\manifest_v2003aug.bin

If you patch your application, you basically do whatever reckless procedure suits you, resulting in a totally new version directory:

c:\app\ver2004jan

like the following:

c:\app\ver2004jan\happyv2004jan.exe
c:\app\ver2004jan\happyv2004jan.dll
c:\app\ver2004jan\ijl_v2004jan.dll
c:\app\ver2004jan\manifest_v2004jan.bin

Okay, anything bad can happen during initial installation or patching! If the initial installation TOTALLY fails, the launcher app won't even be available. If the tiny launcher app was installed but nothing else, this won't lead to any craziness. And any additional success or corruption during the copying or patching of files, on a continuum, will never be a problem.

The launcher searches all sub-directories for manifest_*.bin files. Any such file that is not internally consistent (i.e., doesn't match its own stated hash code within the file) is rejected. Thus we have a list of candidate application versions, which we can sort from newest to oldest. We consider the newest manifest first: we look at the list of all files mentioned in the manifest, and we verify the existence of each file and its recorded hash code. If all files exist AND match all recorded hash codes, then we execute the main application as a new process. The end!

The updater always creates new subdirectories, and the main application NEVER changes. There are no "state" files recording the progress of updates or installations or the "latest version", etc. The launcher actively decides which version, among the completely correct versions, is the latest.

The beauty of this scheme, depending on how difficult it is to hack your manifest_*.bin file format, is that you get some protection against cracks or random corruption. Also, you will be 100% sure of all file versions! All files will be consistent, otherwise you will not run. Furthermore, the launcher app can report corruption, or even do updates itself, and make the user aware of available versions. Perhaps it would be useful for the user to have the ability to select versions, for testing, etc.

You can give the user the option of removing old versions. But the great thing is that you can always ensure that at least one working version exists (assuming the initial install was successful). You will never execute a corrupt version (of the EXE, associated DLLs, or even data, if you choose to verify the data too). Disk space may be an issue, especially when patching huge files, but I think giving the user the option of removing old versions (after both old and new versions have been fully verified) will conserve disk space to any desired level.

Well, this scheme seems so simple and appealing that I'm sure it is very common! (I can't possibly have an original idea...)

--- Colin
cp...@ea...
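[A minimal sketch of Colin's launcher logic. The manifest format here is a hypothetical stand-in - a text file listing "md5 filename" pairs, with the hash of its own body on the first line - where Colin describes a binary format, but the verify-then-launch flow is the same. It also assumes version directory names sort chronologically, which Colin's ver2003aug/ver2004jan examples happen to do across years but not within one.]

import glob, hashlib, os, subprocess, sys

def md5_of(path):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def load_manifest(path):
    """Return {filename: md5} if the manifest is internally consistent."""
    with open(path, "r") as f:
        stated = f.readline().strip()   # hash of the remaining lines
        body = f.read()
    if hashlib.md5(body.encode()).hexdigest() != stated:
        return None                     # tampered or truncated manifest
    entries = {}
    for line in body.splitlines():
        digest, name = line.split(None, 1)
        entries[name] = digest
    return entries

def launch(app_root):
    # Newest candidate version first; skip anything that fails any check.
    for manifest in sorted(glob.glob(os.path.join(app_root, "ver*",
                                                  "manifest_*.txt")),
                           reverse=True):
        entries = load_manifest(manifest)
        if entries is None:
            continue
        verdir = os.path.dirname(manifest)
        if not all(os.path.exists(os.path.join(verdir, n)) and
                   md5_of(os.path.join(verdir, n)) == d
                   for n, d in entries.items()):
            continue
        exe = next((n for n in entries if n.endswith(".exe")), None)
        if exe is None:
            continue
        subprocess.Popen([os.path.join(verdir, exe)])
        return
    sys.exit("no intact version found - please reinstall")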
From: Mike W. <mi...@ge...> - 2003-08-16 07:16:12
I'd think that you'd have your updater application set an .ini or registry entry indicating 'startedupdate=1' or something, and then have some kind of launcher application that checks the application and reads this setting on every launch. If 'startedupdate' is set but not 'finishedupdate', prompt the user to finish the update they started. The launcher could check what state the update is in: if the files have already been downloaded locally, the user doesn't have to repeat that step, and you can md5 the files to make sure they are valid before applying the changes (taking into account terminated downloads) and download any 'broken' files as necessary...

Or use VISE AutoUpdate, which does all this fun stuff for you ;]

mike w
www.gekidodesigns.com
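[A minimal sketch of the md5 verification step Mike describes: checking locally downloaded patch files against expected checksums before anything is applied, and reporting which ones need to be re-fetched. The checksum table is a hypothetical input, e.g. shipped alongside the patch.]

import hashlib, os

def file_md5(path, chunk_size=65536):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)  # stream in chunks: patch files can be large
    return h.hexdigest()

def broken_downloads(expected):
    """expected: {filename: md5 hex digest}. Returns files to (re)download."""
    redo = []
    for name, digest in expected.items():
        if not os.path.exists(name) or file_md5(name) != digest:
            redo.append(name)  # missing, truncated, or corrupt
    return redo

Only the files this returns need to be fetched again, which is how a terminated download resumes without starting over.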
From: Mike W. <mi...@ge...> - 2003-08-16 08:08:24
doh, sorry bout the double post
my bad

mike
From: Mike W. <mi...@ge...> - 2003-08-16 09:14:16
nm, just realized i'm subscribed twice... can't figure out what my 'other' email address i'm subscribed with is... hmm...
From: Enno R. <en...@de...> - 2003-05-15 20:52:52
Kent Quirk wrote:
> A good bug tracker is indispensable. Has bugzilla gotten any easier to
> install since last year? I looked at it and ran away in horror.

Well, I really can't remember, it's been so long. I think you really just do apt-get install bugzilla and go through the configuration dialogs. You need a bit of understanding of what it is you're installing there, but nothing horrid.

We've since adapted bugzilla a lot, and use it for all task tracking. Basically, nobody does anything that isn't in bugzilla, which gives us a lot of control and predictability. People do time estimates on their tasks, we use milestones extensively, and we're thinking of tying it closer together with CVS (in our system you can already get from the webcvs changes to the bugzilla task that caused the change, but as of now you can't do it the other way round). And we have some rudimentary export from bugzilla to MS Project to make Gantt charts of what's going on. Nice.

> Annoyingly, by far the best product of this type I've ever used is Lotus
> Notes. I say "annoyingly" because it's got a strange design dating back
> to 1993.

I remember Notes from the mid-nineties, at university. It was weird, and while I can see that it would do parts of what we want, it's just so clunky to use that I can't see it being widely accepted here.

> But with all that said, we're now moving to custom XML designs for all
> our document types. [...]

So you make your design documents, slides, project plans, etc. all in those? Or are you saying you use the XML for indexing the documents that you make? Anyway, we're not big enough to spend a lot of time on building our own software solutions, I'm afraid.

> I'd also say that artists' opinions on data organization are highly
> suspect. [...] The key is to find a system that lets you find the assets
> you need with minimal impact on artist productivity.

Yep, absolutely. OTOH, I've had some who were absolutely delighted after they learned about CVS and what it offers them - for small, textual resources, that is. We've pretty much rolled our own in the past, too, and it's just not as powerful or hassle-free as a good commercial product would be, so I'd really like to switch. :-/

Enno.
From: Thatcher U. <tu...@tu...> - 2003-05-15 19:45:47
On May 15, 2003 at 01:54 +0200, Enno Rehling wrote:
> The other thing I miss is asset management for large binaries -
> stuff that we can't or shouldn't put into CVS. [...]

We're using Perforce; it handles binary assets just fine. We have done a bunch of scripting to automate some content processes, e.g. so that artists can hit a button in Maya to do the appropriate edit/checkout. This seems to be working pretty smoothly for us.

My (third-hand) understanding of AlienBrain is that they are adding value by including a lot of this convenience scripting and integration for popular tools.

--
Thatcher Ulrich
http://tulrich.com
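[A minimal sketch of the kind of "checkout button" Thatcher mentions, as a Python function that shells out to the Perforce command line - "p4 edit" is the standard Perforce command to open a file for editing. Hooking it to an actual Maya shelf button would go through Maya's own scripting API, which is not shown; the scene-path argument is a placeholder.]

import subprocess

def checkout_for_edit(scene_path):
    """Open the current scene file for edit in Perforce before saving.

    Assumes the p4 client (P4PORT/P4CLIENT) is configured for this user.
    """
    result = subprocess.run(["p4", "edit", scene_path],
                            capture_output=True, text=True)
    if result.returncode != 0:
        # Typical causes: file outside the client view, or not in the depot.
        raise RuntimeError("p4 edit failed: " + result.stderr.strip())
    return result.stdout.strip()

# Hypothetical usage from a Maya shelf button:
# checkout_for_edit(current_scene_file)

The design point is that the artist never sees the version control system at all: the button makes the file writable and records who has it open, and the rest of the workflow is unchanged.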
From: Enno R. <en...@de...> - 2003-05-15 21:04:02
Thatcher Ulrich wrote:
> We're using Perforce; it handles binary assets just fine. We have
> done a bunch of scripting to automate some content processes, e.g. so
> that artists can hit a button in Maya to do the appropriate
> edit/checkout. This seems to be working pretty smoothly for us.

How does it handle an artist who creates 182 versions of an 80 megabyte binary file in the course of 3 weeks? I suppose CVS would end up with 14 GB of archives, by which time it has probably long croaked :-)

I've looked at Perforce in the past; it's a really nice product, similar to what we already know. But does it handle really, really large amounts of data?

Enno.
From: Thatcher U. <tu...@tu...> - 2003-05-16 04:33:44
On May 15, 2003 at 11:00 +0200, Enno Rehling wrote:
> How does it handle an artist who creates 182 versions of an 80 megabyte
> binary file in the course of 3 weeks? I suppose CVS would end up with
> 14 GB of archives, by which time it has probably long croaked :-)

No problem so far...

> I've looked at Perforce in the past; it's a really nice product, similar
> to what we already know. But does it handle really, really large amounts
> of data?

I don't know the actual size of our repository w/ history, but we throw everything into Perforce, including tons of automatically built assets, and the systems people assure me we're in no danger of running out of disk space. The performance continues to be very good as well. Knock on wood...

I've personally used CVS for binary assets on much smaller projects, but haven't stress-tested it to nearly the same extent.

--
Thatcher Ulrich
http://tulrich.com
From: Jamie F. <ja...@qu...> - 2003-05-16 10:24:30
We use CVS for binary assets as well as code. Generally, we commit the source asset files to CVS (e.g. max, maya files, etc.) and maintain a build process that turns those files into the final binary assets. This has worked for one project and many demos, and continues to work (touch wood) for two ongoing projects...

Jamie
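[A minimal sketch of the kind of build step Jamie describes: walking a tree of committed source assets and regenerating any final binary asset that is missing or older than its source. The extensions and the "convert_asset" command are hypothetical placeholders for whatever exporters the pipeline actually uses.]

import os, subprocess

SRC_EXT = {".max": ".mesh", ".mb": ".mesh", ".psd": ".tex"}  # hypothetical

def build_assets(src_root, out_root):
    """Rebuild every output asset whose source is newer (or missing)."""
    for dirpath, _, files in os.walk(src_root):
        for name in files:
            base, ext = os.path.splitext(name)
            if ext not in SRC_EXT:
                continue
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(dirpath, src_root)
            out = os.path.join(out_root, rel, base + SRC_EXT[ext])
            if (not os.path.exists(out) or
                    os.path.getmtime(out) < os.path.getmtime(src)):
                os.makedirs(os.path.dirname(out), exist_ok=True)
                # "convert_asset" stands in for the real exporter
                subprocess.check_call(["convert_asset", src, "-o", out])

The attraction of this split is that only the (smaller, mergeable) source files live in CVS, while the bulky game-ready binaries are reproducible on demand and never need version history at all.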
From: Mickael P. <mpo...@ed...> - 2003-05-16 07:36:23
Enno Rehling wrote:
> I've looked at Perforce in the past; it's a really nice product, similar
> to what we already know. But does it handle really, really large amounts
> of data?

It would be interesting to have some hard numbers here about the size of the assets you are all managing on your projects.

I've been on holiday for 2 weeks, and I just made a synchronization of my local repository for the project I'm working on. Here are the numbers:

* 13736 files
* 3438 folders
* 1650 files were modified; getting them from the network represented a total of 1.29 gigabytes.

The whole synchronization operation took 7 minutes and 26 seconds, for an average transfer speed of 2.95 megabytes/second.

That's for the main game-ready asset folder. For what we call "rawdata" (where the artists do their experimentation), we have a total of 70036 files (39.2 gigabytes) in 5525 folders.

When I see these numbers, I wonder how it's possible to get fast versioning programs that perform CRCs, compression, archiving...

Mickael Pointier
From: Thatcher U. <tu...@tu...> - 2003-05-16 13:29:12
On May 16, 2003 at 09:38 +0200, Mickael Pointier wrote:
> It would be interesting to have some hard numbers here about the size
> of the assets you are all managing on your projects.

I'm almost completely sync'd, so I don't have handy transfer numbers. Here are some numbers for the full content tree:

7.6 GB
28050 files

I exclude some branches of the content tree because I'm usually on the other end of a 768Kb/s DSL line. An empty sync using Perforce takes about 1.3 seconds over DSL. When files need to be transferred, the transfer rate is limited to the DSL speed (Perforce doesn't appear to use any rsync-like tricks, which would be nice). But generally I sync to the full repository whenever I feel like I need to (several times a day); the delay is not a consideration. On a LAN, syncs are obviously much faster, but I don't have figures for that.

--
Thatcher Ulrich
http://tulrich.com
From: Ivan-Assen I. <as...@ha...> - 2003-05-16 13:47:56
> ... on the other end of a 768Kb/s DSL line....
> ... rsync-like tricks, which would be nice...

On a semi-related note: what can you recommend for a folder sync between two Windows machines, both of which are behind firewalls with uncooperative BOFHs? Use of an FTP server on a third machine is permitted.

Any ideas?
From: Thatcher U. <tu...@tu...> - 2003-05-16 19:27:56
On Fri, 16 May 2003, Ivan-Assen Ivanov wrote:
> On a semi-related note: what can you recommend for a folder sync
> between two Windows machines, both of which are behind firewalls
> with uncooperative BOFHs? Use of an FTP server on a third machine
> is permitted.

My preference would be rsync via ssh, plus whatever scripts you need. cygwin includes all the necessary tools. If the firewalls are effective, then this would involve rsync'ing to/from the third machine as an intermediary. YMMV

-Thatcher
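[A minimal sketch of the relay flow Thatcher suggests, as a Python wrapper around the rsync and ssh binaries that cygwin provides. The host name and staging directory are hypothetical; -a (archive), -z (compress), --delete (mirror removals), and -e ssh (tunnel over ssh) are standard rsync options.]

import subprocess

RELAY = "user@third-machine.example.com"   # hypothetical intermediary
RELAY_DIR = "sync-staging/"

def push(local_dir):
    """Machine A: mirror a local folder up to the intermediary."""
    subprocess.check_call(
        ["rsync", "-az", "--delete", "-e", "ssh",
         local_dir + "/", RELAY + ":" + RELAY_DIR])

def pull(local_dir):
    """Machine B: mirror the intermediary's copy down."""
    subprocess.check_call(
        ["rsync", "-az", "--delete", "-e", "ssh",
         RELAY + ":" + RELAY_DIR, local_dir + "/"])

The reason this gets through uncooperative firewalls is that both machines only ever make outbound ssh connections to the third host; nothing has to accept an inbound connection.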
From: Brian H. <ho...@py...> - 2003-05-16 19:40:44
> My preference would be rsync via ssh, plus whatever scripts you
> need. cygwin includes all the necessary tools. If the firewalls are
> effective, then this would involve rsync'ing to/from the third
> machine as an intermediary.

If you need one-way propagation, then rsync works fine, but if you need to reconcile in multiple directions, I highly recommend unison (which is free) over ssh.

Brian