From: Braden M. <br...@en...> - 2002-04-26 04:17:17
On Thu, 2002-04-25 at 04:35, -oli- wrote:

> At 00:44 25.04.2002 -0400, you wrote:
> >On Sun, 2002-04-21 at 10:03, -oli- wrote:
> > > How can I find out from within the class NodeAudioClip if the
> > > AudioClip node is [currently] a child (i.e. source) of a Sound node?
> >
> >You can't really do that.
> >
> > > I need this to determine whether the audio data has to be played or
> > > not, because this check is done in the method NodeAudioClip::update().
> >
> >Perhaps this logic belongs in Sound rather than AudioClip.
>
> But the determination of whether the clip is playing takes place in
> NodeAudioClip::update(), _not_ in any method of the class NodeSound,
> which makes sense, because the update() method is called periodically
> from VrmlScene::update() and the time dependence of the sound is in the
> AudioClip node (not in the Sound node).

Hm... You're right.

> If I did it from NodeSound::render(), it would only be updated when the
> scene/view changes, e.g. when the user navigates through the world.
>
> >Remember, MovieTextures can be sound sources, too. As such, it seems
> >like we'd want to reuse the same logic regardless of what our sound
> >source is.
>
> IMHO this has to take place in NodeMovieTexture::update().

Yes. We might be able to share this logic in a method on SoundSourceNode (in
the rearch branch), but that really doesn't get us any closer to an answer to
your question.

How does the runtime know to render a visually rendered node? It knows
because the renderer walks the tree, and by design it won't visit a node
that shouldn't be rendered. Do we need something like an AudioRenderer? That
is, a class that works like the Viewer, only for audio instead of geometry?

--
Braden McDaniel
e-mail: <br...@en...>
<http://endoframe.com>
Jabber: <br...@ja...>
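
A minimal sketch of what a shared SoundSourceNode method might look like. The
class, field, and method names are illustrative assumptions about the rearch
branch, not the current NodeAudioClip/NodeMovieTexture code; the active-state
test just restates the VRML97 time-dependent node rules that AudioClip and
MovieTexture each implement separately today:

    // Illustrative sketch only -- not the existing OpenVRML API.
    // AudioClip and MovieTexture would inherit from SoundSourceNode and
    // share the "is this clip currently playing" computation instead of
    // duplicating it in their update() methods.
    class SoundSourceNode {
    public:
        virtual ~SoundSourceNode() {}

        // Recompute the active state from the VRML97 time-dependent node
        // rules; called periodically (e.g. from VrmlScene::update()).
        void updateActiveState(double now) {
            const bool shouldBeActive =
                now >= this->startTime
                && (this->stopTime <= this->startTime || now < this->stopTime)
                && (this->loop || now < this->startTime + this->duration);
            if (shouldBeActive != this->active) {
                this->active = shouldBeActive;
                // An isActive eventOut would be sent from here.
            }
        }

        bool isActive() const { return this->active; }

    protected:
        double startTime, stopTime, duration;
        bool loop, active;
    };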
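And a rough sketch of the AudioRenderer idea, by analogy with Viewer: the
traversal, not the source node, decides which sounds reach the audio back
end, so an AudioClip that is not reachable as the source of a Sound node
simply never gets played. The interface below is hypothetical, just to make
the analogy concrete:

    // Hypothetical interface, not existing code.  Analogous to Viewer,
    // but for audio: the scene traversal calls insertSound() only for
    // Sound nodes it actually visits, passing their currently active
    // source.
    class SoundSourceNode;  // the shared base sketched above

    class AudioRenderer {
    public:
        virtual ~AudioRenderer() {}

        // Called during traversal for each Sound node whose source
        // (AudioClip or MovieTexture) is active at the current time.
        virtual void insertSound(const float location[3],
                                 float intensity,
                                 bool spatialize,
                                 const SoundSourceNode & source) = 0;
    };

With something like this, NodeAudioClip::update() would only have to maintain
its own active state; whether the samples are actually rendered would be
decided by whichever Sound nodes the traversal hands to the AudioRenderer.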