They just released the API update for 2.1. I didn't see anything in the update notes that would allow for the creation of audio playback visualizations. If it's in the API, I can't find it. I'm just a novice programmer, but I would think you need something that can return the frequency/tone being played in order to render a visualization as it changes from beat to beat. Here is the description of the "Media" API (which is where I think this support would be):
"Provides classes that manage various media interfaces in audio and video. The Media APIs are used to play and, in some cases, record media files. This includes audio (e.g., play MP3s or other music files, ringtones, game sound effects, or DTMF tones) and video (e.g., play a video streamed over the web or from local storage).
Other special classes in the package offer the ability to detect the faces of people in Bitmaps (FaceDetector), control audio routing (to the device or a headset), and control alerts such as ringtones and phone vibrations (AudioManager)."
It may be buried in the API somewhere, but I think if it were, someone would have built a visualizer with it by now.
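To make my first point more concrete, here's roughly what I mean by "something that can return the frequency being played." If the API exposed the raw PCM samples as a song plays, even a small DFT over each buffer would give you per-band magnitudes to drive a visualization. Below is just a plain-Java sketch of that idea, not anything from the 2.1 API; the PCM buffer is faked with a 440 Hz sine wave, since I don't see any way to pull those samples out of MediaPlayer.

```java
// Sketch only: given one buffer of 16-bit mono PCM samples, compute the
// magnitude of a handful of frequency bands with a naive DFT. This is the
// kind of per-frame data a visualizer would need from the media framework.
public class FrequencyBands {

    // Returns the magnitude of `bands` evenly spaced frequency bins for one
    // PCM buffer. O(n * bands), which is fine for small buffers.
    public static double[] bandMagnitudes(short[] pcm, int bands, int sampleRate) {
        double[] magnitudes = new double[bands];
        int n = pcm.length;
        for (int b = 0; b < bands; b++) {
            // Target frequency for this band, up to half the sample rate.
            double freq = (b + 1) * (sampleRate / 2.0) / (bands + 1);
            double re = 0, im = 0;
            for (int i = 0; i < n; i++) {
                double angle = 2 * Math.PI * freq * i / sampleRate;
                re += pcm[i] * Math.cos(angle);
                im -= pcm[i] * Math.sin(angle);
            }
            magnitudes[b] = Math.sqrt(re * re + im * im) / n;
        }
        return magnitudes;
    }

    public static void main(String[] args) {
        // Fake one buffer: a 440 Hz sine at 44.1 kHz, just to show the output shape.
        int sampleRate = 44100;
        short[] pcm = new short[1024];
        for (int i = 0; i < pcm.length; i++) {
            pcm[i] = (short) (Short.MAX_VALUE * 0.5
                    * Math.sin(2 * Math.PI * 440 * i / sampleRate));
        }
        double[] bands = bandMagnitudes(pcm, 8, sampleRate);
        for (int b = 0; b < bands.length; b++) {
            System.out.println("band " + b + ": " + bands[b]);
        }
    }
}
```

Running that prints eight band magnitudes, with the band nearest 440 Hz clearly the largest; feed numbers like those into a drawing loop and you have bars that move with the music. The missing piece is still getting the real PCM samples out of playback, which is exactly what I can't find in the API.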
As for the live wallpaper in the video, my only guesses are:
A: It's actually a video file playing a pre-recorded visualization with the audio.
B: They are using the speech-to-text function to generate a visualization (i.e., each word it thinks is being said maps to a different waveform in the visualization).
I'm not sure B can be run against an MP3 file, so the answer is more likely A.
I don't know why this wouldn't be something built into Android, but it looks like we'll have to wait for an update before we get MP3 visualizers.
If I'm wrong and someone knows of a way to do this, please let me know!