09/03/01
Can You Synch Audio in LiveMotion SWF Files?
Some quick and sloppy tips for matching audio with end frames
The problem is especially severe for users of Adobe LiveMotion, which doesn't have anywhere near the scriptability of Flash for taking care of problems like this. So here's a quick and dirty trick for getting the job done.
First of all, this is not going to fix any problems with lip synch. This is a fix for simply ending the presentation's visual and aural components together. In the first case, we have the problem of the audio track outlasting the animation. In the second and more common case, the audio ends well before the graphics. There's no particularly pretty solution to either problem: you either cut off the end of your audio, or you cut off the end of your animation. But even these rather ugly options can be a little bit tricky.
We'll start with the problem of the audio ending before the animation does. As you know, you can't attach behaviors to audio files in LiveMotion, so you have to do it in a roundabout way. Note: As always with LiveMotion, you should be working from scratch. Too many changes to an existing file can corrupt it permanently.
1. Place your audio file. (For this example, mine is called "Crambone," which I will use for future reference.)
2. Create a small object, and call it "Trigger." Change Trigger's opacity to zero in the Opacity palette.
3. In your timeline, select both Trigger and Crambone, and group them together using the Object > Group (Command-G) command. For purposes of this example, we'll rename the group "Audio Group" to avoid confusion.
4. Expand Audio Group and select Trigger. Drag its timeline in point all the way to the end so that it uses only a single frame in the animation, which should be the frame corresponding with the last frame of Crambone (as in the picture below).
5. Still with Trigger selected, go to your Rollover palette. Create a new custom state called "Bye."
6. Select the Normal state in the rollover palette. Normally you can't get behaviors to be triggered by the Normal state. But you can get the Normal state to change its own state into another, and that other state can take on any behavior. Sound weird? It is. But it's the way you have to do it for now. So, in the Normal state, click the Edit Behaviors button. Then add the "Change State" behavior with a target of Trigger and a state of Bye, as in the illustration below.
7. So the concept is that as soon as the Trigger object loads, which coincides with the last frame of the audio track, it switches itself to its Bye state. So now you can set the Bye state to trigger anything you want. For example, you could simply add the Stop behavior and target that at the composition, which will make the whole animation stop. Or it could trigger a fade to black or whatever: basically, any tricks that you might pull with a mouseover, except this will be automatically triggered (no user interaction) at the last frame of the audio.
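LiveMotion builds this state logic for you when it exports the SWF, but if the Normal-changes-itself-to-Bye trick still sounds weird, it can be sketched in code. This is a hypothetical model only, not LiveMotion's actual output: the names `makeTrigger`, `changeState`, `onLoad`, and the behavior `log` are all invented for illustration.

```javascript
// Hypothetical model of the Trigger pattern (all names invented).
const log = [];

function makeTrigger() {
  const states = {
    // The Normal state can't carry arbitrary behaviors, but it CAN
    // change the object's state...
    Normal: (self) => self.changeState("Bye"),
    // ...and the custom Bye state can carry any behavior we like,
    // e.g. a Stop targeted at the composition.
    Bye: () => log.push("stop composition"),
  };
  return {
    state: "Normal",
    changeState(name) {
      this.state = name;
      states[name](this); // run the new state's behaviors
    },
    // The frame where Trigger appears is the last frame of the audio,
    // so "loading" here stands in for reaching that frame.
    onLoad() {
      states.Normal(this);
    },
  };
}

const trigger = makeTrigger();
trigger.onLoad();           // Trigger's single frame arrives
console.log(trigger.state); // "Bye" -- its Stop behavior has fired
```

The point of the indirection is the same as in the steps above: the state you can't hang behaviors on (Normal) does exactly one thing, hand control to a state you can.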
Once you get this concept, it's easy to see how it can be done in reverse: to get the end of an animation to trigger the end of an audio file. There is, however, one key difference. When you stop an animation, you can easily force your composition to fade to black (or white or whatever). I'm not going to go into this technique here; I've explained it many times in other tutorials and in the LiveMotion forum. But with audio, you can't just "cover it up" by controlling a sound's opacity, as you would with graphics.
No, there are really only two solutions that I can see. You either trigger a Stop All Sounds behavior, or you trigger Stop All Sounds and cover up the abrupt ending with some sort of conclusive sound effect or drum lick or something like that.
The way it works is to take your longest animation track (or the one most likely to run long) and group it with a second Trigger object, just as you did with your audio track. Your second Trigger object would similarly have two states, Normal and Bye, and the Normal state would automatically switch to the Bye state, which would then trigger the Stop All Sounds event as well as a secondary event, such as playing your conclusive sound effect.
One note on this: I've noticed that when combining the Stop All Sounds behavior with a behavior that plays another sound, it's best to have the Stop All Sounds behavior LAST in the list. This seems counterintuitive, but it's the only way I've been able to get such a contradictory set of behaviors to function properly.
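Here's one hypothetical way to make sense of that ordering. The assumptions are mine, not the article's: suppose a played sound doesn't actually start until the whole behavior list has finished, and suppose Stop All Sounds both silences current sounds and swallows any Play request that comes after it in the same list. Only the observed conclusion, that Stop All Sounds belongs last, comes from experience with LiveMotion; the mechanism sketched here is a guess, and `runStateBehaviors` and its behavior objects are invented for the model.

```javascript
// Invented model of running a state's behavior list in order; none of
// this is LiveMotion's real API. Assumptions: played sounds start only
// after the list finishes, and Stop All Sounds mutes later plays in the
// same list.
function runStateBehaviors(behaviors, player) {
  let muted = false;
  const pending = []; // sounds queued to start after the list finishes
  for (const b of behaviors) {
    if (b.type === "playSound" && !muted) {
      pending.push(b.sound);
    } else if (b.type === "stopAllSounds") {
      player.playing.length = 0; // silence everything currently audible
      muted = true;              // assumed: also swallows later plays
    }
  }
  player.playing.push(...pending); // queued sounds finally start
}

// Stop All Sounds LAST: the old track stops, the drum lick still plays.
const stopLast = { playing: ["Crambone"] };
runStateBehaviors(
  [{ type: "playSound", sound: "drum lick" }, { type: "stopAllSounds" }],
  stopLast);
console.log(stopLast.playing); // ["drum lick"]

// Stop All Sounds FIRST: under this model, the drum lick is muted too.
const stopFirst = { playing: ["Crambone"] };
runStateBehaviors(
  [{ type: "stopAllSounds" }, { type: "playSound", sound: "drum lick" }],
  stopFirst);
console.log(stopFirst.playing); // []
```

Whatever the real mechanism inside the SWF player, the practical takeaway is the same: put Stop All Sounds at the bottom of the behavior list.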
Now, this is, as advertised, a pretty sloppy way to do it. But I've been asked, and there's the answer. If you have any further questions or need clarification, please visit us in the Adobe LiveMotion forum here at Digital Media Net. Please also share any alternate ideas (besides the obvious one) for handling these sorts of situations.
Source: Digital Media Online. All Rights Reserved