From: Andrew G. <and...@gm...> - 2008-08-27 21:18:33
> > The current implementation holds that the behavior is undefined
> > if you pass NULL as function arguments:
> >
> >     glmMatInverse2f (&foo, NULL); // BOOM

If the behavior (in DEBUG mode) is for this function to cause an assertion, then that behavior IS in fact defined. If this function is called without assertions enabled then, while the behavior is definitely undefined, it will also likely crash the app (dereferencing NULL). Differences in behavior between debug and optimized mode can be toxic.

I think that a lot of the time, when developers indicate behavior is "undefined", the behavior is actually well defined for a particular build but should not be relied on, as it may change. (When I say "build" here I do not mean the difference between a debug and a release build.)

Which would be preferred? Invalid results (a call to MatInverse that seems to do nothing) or an app crash? I realize that I'm just babbling and not offering a solution. I think I would prefer a NULL or singular matrix to make the function return NULL and not modify *out. That behavior is at least testable with unit tests. The disadvantage is that it may be difficult, under normal circumstances, to detect the root cause of a bug if a NULL or singular matrix doesn't cause an instant failure. However, this function would not affect the stability of the app. A minimal sketch of that contract follows at the end of this message.

Looking at the #ifndef's in MatInverse2f and 3f (missing in 4f), I see a reliance on assert() in release code. When compiled with NDEBUG, the *assert( fabs( det ) < GLM_ERROR_TOLERANCE )* will be absent. Is this (undefined) behavior what you expect?
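For concreteness, here is a minimal sketch of the contract I'm describing. The GLMmat2 type, the tolerance value, and the row-major layout are assumptions of mine for illustration only, not the library's actual definitions:

    #include <math.h>
    #include <stddef.h>

    #define GLM_ERROR_TOLERANCE 1e-6f   /* assumed value for illustration */

    /* Hypothetical 2x2 matrix type; the real layout in glm may differ.
     * Row-major: m[0] m[1] / m[2] m[3]. */
    typedef struct { float m[4]; } GLMmat2;

    /* Proposed contract: return out on success, NULL on a NULL argument
     * or a singular matrix, leaving *out untouched on failure. Unlike
     * assert(), this check survives an NDEBUG (release) build. */
    GLMmat2 *glmMatInverse2f(GLMmat2 *out, const GLMmat2 *in)
    {
        if (out == NULL || in == NULL)
            return NULL;

        /* Copy into locals so the function also works when out == in. */
        const float a = in->m[0], b = in->m[1];
        const float c = in->m[2], d = in->m[3];
        const float det = a * d - b * c;

        if (fabsf(det) < GLM_ERROR_TOLERANCE)
            return NULL;            /* singular: *out is not modified */

        const float inv = 1.0f / det;
        out->m[0] =  d * inv;
        out->m[1] = -b * inv;
        out->m[2] = -c * inv;
        out->m[3] =  a * inv;
        return out;
    }

A caller can then check if (glmMatInverse2f(&foo, &bar) == NULL) and handle the failure, and a unit test can assert both the NULL return and that foo was left unchanged.

pudman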