Re: [Algorithms] Complexity of new hardware
From: Conor S. <bor...@ya...> - 2009-04-26 16:11:51
Where I'd like to break in: Haskell is not yet ready for large games (or medium-to-large software development at large), because it's an academic language that hasn't yet progressed to programming in the large (the big picture). Most of the "safety" in Haskell is for local problems, not for the large architectural decisions, which are the hardest to change later down the line (and in fact, Haskell provides very little guidance or mechanism in these areas). At least "object oriented" (in the general sense) programming languages try to provide a layered mechanism for programming in the large.

Functional programming is a very good tool, but it's too pure a tool for production software. Most production software has areas that are "do this, then do that", for which a pure functional language still has awkward and heavy abstractions (i.e. an extra level of thought that isn't necessary for the functionality required). It is also interesting that when Tim Sweeney said in his programming-language-for-the-future talk that the "graphics engine" would be "functional", he didn't mention that rendering (as it currently stands) occurs in order and is highly stateful. Graphics hardware requires that you set your states, then issue your rendering commands, in order, which is a highly imperative way to think. This really shows that large problems tend to be made up of mixed solutions that don't follow any one set of rules.

The interaction with evaluation is purely a tools problem. It has been shown that you can write C# in a REPL, and there is no reason why C++ couldn't work in a REPL if C# can (as long as you can isolate the illegal behaviour).

All of this is not to write off functional programming. I love functional programming, and I think it's the key to the code re-use (or in Charles Simonyi's words, "going meta") that has been missing from a lot of the current "promised land" languages.
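[The "set your states, then issue your draw calls, in order" pattern can of course be expressed in Haskell — but only by reaching for a monad, which rather illustrates the point about an extra layer of abstraction for plain imperative sequencing. A minimal sketch; the render-state fields and commands here are hypothetical, not any real graphics API:]

```haskell
-- Hypothetical render state threaded through a State monad: the do block
-- reads imperatively ("do this, then do that"), but the sequencing is an
-- explicit library abstraction rather than the language's default mode.
import Control.Monad.State

data RenderState = RenderState { blendEnabled :: Bool, boundTexture :: Int }
  deriving (Show, Eq)

type Render = State RenderState

setBlend :: Bool -> Render ()
setBlend b = modify (\s -> s { blendEnabled = b })

bindTexture :: Int -> Render ()
bindTexture t = modify (\s -> s { boundTexture = t })

drawQuad :: Render ()
drawQuad = return ()  -- a real backend would emit a draw command here

frame :: Render ()
frame = do            -- set states, in order, then draw
  setBlend True
  bindTexture 3
  drawQuad

main :: IO ()
main = print (execState frame (RenderState False 0))
```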
I think C# 3.0 was a move in the right direction (somewhere between F#, Haskell, C# 3.0, Cyclone and C99 is probably the "sweet spot" right now). I also think the next C++ standard is moving in the right direction with lambdas/closures (if not in dispensing with a whole lot of crud that a re-worked systems language doesn't need). But I do think functional programming is not a paradigm people should be grabbing with both hands (only one hand and a pinky or so). Functional programming is part of the general solution to "better programming", but taking it to extremes (like Haskell) is not the answer, in the same way that software transactional memory is not the answer to scalable parallel computation (only some of the time; STM still has an analogue of deadlock, namely the pathological case of cross-referential transactions). There is no silver bullet for either of these problems, and what we should be looking for is the best balance of tools without a level of complication that makes us put all our mental effort into the mechanisms of computation rather than the outcomes we're trying to achieve.

Cheers,
Conor

----- Original Message ----
From: Sam Martin <sam...@ge...>
To: Game Development Algorithms <gda...@li...>
Cc: and...@ni...
Sent: Sunday, 26 April, 2009 3:58:08 AM
Subject: Re: [Algorithms] Complexity of new hardware

Yeah, that's what I'm talking about! :) I was trying to resist getting excited and going into over-sell mode, but likely understated how much potential I think there is here. To highlight just two more points I think are important:

- Haskell stands a very good chance of allowing games to really get on top of their (growing) complexity. I think this is best illustrated in the paper "Why Functional Programming Matters", http://www.cs.chalmers.se/~rjmh/Papers/whyfp.html. Well worth a read if you've not seen it before.

- It can be interactively evaluated and extended.
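[As an aside on the STM caveat above: GHC's `Control.Concurrent.STM` gives the optimistic-concurrency model Conor is referring to — transactions only retry when they actually conflict, and heavily cross-referential transactions degenerate into repeated retries, the pathological case he mentions. A minimal sketch with a hypothetical account-transfer transaction:]

```haskell
-- Two shared TVars; a transfer between them runs atomically. If another
-- transaction touched the same TVars concurrently, this one retries --
-- optimistic concurrency rather than locking.
import Control.Concurrent.STM

transfer :: TVar Int -> TVar Int -> Int -> STM ()
transfer from to amount = do
  balance <- readTVar from
  writeTVar from (balance - amount)
  modifyTVar' to (+ amount)

main :: IO ()
main = do
  a <- newTVarIO 100
  b <- newTVarIO 0
  atomically (transfer a b 30)
  final <- (,) <$> readTVarIO a <*> readTVarIO b
  print final  -- (70,30)
```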
Working with C/C++ we get so used to living without this that I think we potentially undervalue how important a feature it is.

Cheers,
Sam

-----Original Message-----
From: Sebastian Sylvan [mailto:seb...@gm...]
Sent: Sat 25/04/2009 19:16
To: Game Development Algorithms
Cc: and...@ni...
Subject: Re: [Algorithms] Complexity of new hardware

On Wed, Apr 22, 2009 at 5:52 PM, Sam Martin <sam...@ge...> wrote:

> > Wouldn't that be a tough sell? You'd already be competing with free
> > implementations of Lua, Python, JavaScript and their ilk on the low end,
> > and built-in languages like UnrealScript on the high end.
>
> I don't think there's a market for that kind of scripting DSL. A new
> language would need to eat into the remaining C++ development burden
> that isn't suitable for implementing in Lua, say. Which is plenty.
>
> > Doesn't this bring us back full circle? I recall a statement from a
> > month ago saying that we all need to think differently about how we put
> > together massively parallel software, because the current tools don't
> > really help us in the right ways...
>
> Another reason to consider pure functional languages. This is a much
> deeper topic that I'm now about to trivialise, but the referential
> transparency of these languages makes them particularly suitable for
> parallel evaluation. For example, GHC (arguably the most mature Haskell
> compiler) can compile for an arbitrary number of cores, although it's
> still an active research area as I understand it.

Being a massive Haskell fanboy myself, let me jump in with some other cool things it does that relate to game development.

1. It's starting to get support for "nested data parallelism". Basically, flat data parallelism is what we get with shaders now; the problem with that is that the "per-element operation" can't itself be another data parallel operation.
NDP allows you to write data parallel operations (on arrays) where the thing you do to each element is itself another data parallel operation. The compiler then has a team of magic pixies that fuses/flattens this into a series of flat data parallel applications, eliminating the need to do it manually.

2. It has software transactional memory. So when you really need shared mutable state, you can still access it from lots of different threads at once with optimistic concurrency (only block when there's an actual conflict). Yes, there are issues, and yes, it adds overhead, but if the alternative is single-threaded execution and the overhead is 2-3x, then we win once we have 4 hardware threads to spare.

3. Monads! Basically this allows you to overload the semicolon, which means you can fairly easily define your own embedded DSLs. This lets you write certain code a lot more easily. You could have a "behaviour" monad, for example, abstracting over all the details of entities in the game doing things that take multiple frames (so you don't need to litter your behaviour code with state-machine code, saving and restoring state, etc. — you just write what you want to do, and the implementation of the monad takes care of anything that needs to "yield").

4. It's safe. Most code in games isn't systems code, so IMO it doesn't make sense to pay the cost of using a systems programming language for it (productivity, safety).

5. It's statically typed with a native compiler, meaning you could compile all your scripts and just link them into the game for release and get decent performance. Not C-like (yet, anyway!), but probably an order of magnitude over most dynamic languages.

--
Sebastian Sylvan
+44(0)7857-300802
UIN: 44640862
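[Sebastian's point 3 — "overloading the semicolon" — can be made concrete with a hand-rolled sketch. This hypothetical Behaviour monad is not from any shipped engine; it just shows how the bind operator can hide the yield-across-frames state-machine plumbing he describes:]

```haskell
-- A computation either finishes with a value, or yields until next frame.
data Behaviour a = Done a | Yield (Behaviour a)

instance Functor Behaviour where
  fmap f (Done a)  = Done (f a)
  fmap f (Yield k) = Yield (fmap f k)

instance Applicative Behaviour where
  pure = Done
  Done f  <*> b = fmap f b
  Yield k <*> b = Yield (k <*> b)

instance Monad Behaviour where
  Done a  >>= f = f a           -- the overloaded "semicolon": run next step
  Yield k >>= f = Yield (k >>= f)

wait :: Behaviour ()            -- suspend until the next frame
wait = Yield (Done ())

-- Straight-line behaviour code; no explicit state machine needed.
walkToDoor :: Behaviour String
walkToDoor = do
  wait                          -- frame 1: take a step
  wait                          -- frame 2: take another step
  return "arrived"

-- The game loop resumes a behaviour one frame at a time.
run :: Behaviour a -> (Int, a)  -- returns (frames taken, result)
run = go 0
  where go n (Done a)  = (n, a)
        go n (Yield k) = go (n + 1) k

main :: IO ()
main = print (run walkToDoor)   -- (2,"arrived")
```

[In a real engine the loop would interleave many such behaviours, resuming each by one `Yield` per tick; the monad instance is what spares the behaviour code from carrying that suspension state explicitly.]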