Access it here: AudioTrackr.
I built a fun little thingy with the Web Audio API. It visualizes the frequency levels and waveform of individual tracks in various songs. It was born out of a desire to stress test the Audio API. Turns out it’s pretty snappy (the slow part, as usual, is just drawing on the canvas).
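If you’re curious how the per-track visualization works, it’s built around the Web Audio API’s AnalyserNode — a rough sketch of the idea (the function name, FFT size, and canvas drawing here are my own illustration, not code from the actual app):

```javascript
// Minimal sketch: pipe a track through an AnalyserNode and draw its
// frequency bars on a canvas. Names and sizes are illustrative.
function visualizeTrack(audioElement, canvas) {
  var audioCtx = new (window.AudioContext || window.webkitAudioContext)();
  var source = audioCtx.createMediaElementSource(audioElement);
  var analyser = audioCtx.createAnalyser();
  analyser.fftSize = 256;

  // audio element -> analyser -> speakers
  source.connect(analyser);
  analyser.connect(audioCtx.destination);

  var freqData = new Uint8Array(analyser.frequencyBinCount);
  var canvasCtx = canvas.getContext('2d');

  (function draw() {
    requestAnimationFrame(draw);
    analyser.getByteFrequencyData(freqData);
    canvasCtx.clearRect(0, 0, canvas.width, canvas.height);
    var barWidth = canvas.width / freqData.length;
    for (var i = 0; i < freqData.length; i++) {
      var barHeight = (freqData[i] / 255) * canvas.height;
      canvasCtx.fillRect(i * barWidth, canvas.height - barHeight,
                         barWidth - 1, barHeight);
    }
  })();
}
```

As noted above, the analysis itself is cheap — the canvas drawing in that loop is where the time goes.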
Compatibility: Chrome, Firefox Aurora (24+), and iOS 6+. Unfortunately, Firefox and iOS will load a bit slowly, due to their (current) lack of support for createMediaElementSource (see below). Also, iOS is pretty finicky, so try reloading if it doesn’t work. It would work in Opera 15, but I’m only using mp3 files. Safari on Mac has an issue I haven’t been able to resolve yet.
Oh, and I built it with Angular, which I don’t have much experience with, so I’m sure I messed it up. I tried to keep as much as I could out of the Angular code (seemed to improve performance a bit to not have everything tied to Angular’s scope). If you have any suggestions for improvement, please leave me a comment.
Here are a few takeaways:
- With the Web Audio API, you typically need to download an entire audio file and then decode the entire thing before you can start playing it. However, there’s this cool new thing called createMediaElementSource, which came out of WebRTC. It allows you to plug an HTML5 <audio> element into the API — which means you can take advantage of the element’s combined buffering/streaming/decoding capabilities. This is awesome, because it means I can start playing multiple files almost right away, instead of waiting forever for all of them to download and decode.
- Speaking of which, I had to set up some extra subdomains to serve the files from, since browsers will only allow a certain number of simultaneous HTTP connections to a single hostname. And if those connections happen to be audio files that the browser started and then paused downloading – waiting to see if you’re actually going to play them – you’re in big trouble because it will never start downloading the rest of the files.
- iOS will only play one Audio element at a time. And there’s no good way to detect that (other than to actually try it).
- iOS decodes audio buffers much faster than my quad-core desktop computer (good thing we’re using <audio> on the desktop).
- The API has changed since iOS first implemented it. Thankfully Mozilla is on top of things.
- Feel free to fork it on GitHub — you could add mute/solo buttons for each track!
- Thanks to Mix.js for inspiration.
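The createMediaElementSource point in the first takeaway looks roughly like this in code — the else branch is the download-everything-then-decode path that Firefox and iOS fall back to (function names and the callback shape are my own sketch, not AudioTrackr’s actual code):

```javascript
// Sketch: hook an <audio> element into the Web Audio API so the browser
// handles buffering/streaming/decoding, with a full-download fallback.
// Names here are illustrative.
function createTrackSource(audioCtx, url, onReady) {
  if (typeof audioCtx.createMediaElementSource === 'function') {
    // Streaming path: playback can begin almost immediately.
    var el = new Audio(url);
    onReady(audioCtx.createMediaElementSource(el), el);
  } else {
    // Fallback: fetch and decode the entire file before playing.
    var xhr = new XMLHttpRequest();
    xhr.open('GET', url);
    xhr.responseType = 'arraybuffer';
    xhr.onload = function () {
      audioCtx.decodeAudioData(xhr.response, function (buffer) {
        var node = audioCtx.createBufferSource();
        node.buffer = buffer;
        onReady(node, null);
      });
    };
    xhr.send();
  }
}
```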
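The subdomain trick from the second takeaway boils down to spreading track URLs across hostnames so the per-hostname connection limit can’t serialize the downloads. A tiny helper along these lines (the hostnames are hypothetical, not the ones I actually set up):

```javascript
// Spread audio files across subdomains to dodge per-hostname
// simultaneous-connection limits. Hostnames are hypothetical.
var AUDIO_HOSTS = [
  'http://audio1.example.com',
  'http://audio2.example.com',
  'http://audio3.example.com'
];

function trackUrl(filename, trackIndex) {
  // Deterministic assignment: a given track always maps to the same
  // host, so the browser cache stays useful across page loads.
  var host = AUDIO_HOSTS[trackIndex % AUDIO_HOSTS.length];
  return host + '/' + filename;
}
```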
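And since the only way to detect iOS’s one-Audio-element limit is to actually try it, the test ends up looking something like this — play two elements and see whether the first one survives (the timeout value is a guess, and on iOS this has to run from a user gesture):

```javascript
// Rough runtime check: can this browser play two <audio> elements at
// once? iOS pauses the first when the second starts. Illustrative only.
function canPlayConcurrently(urlA, urlB, callback) {
  var a = new Audio(urlA);
  var b = new Audio(urlB);
  a.play();
  b.play();
  // Give the browser a moment, then see if both are still playing.
  setTimeout(function () {
    var ok = !a.paused && !b.paused;
    a.pause();
    b.pause();
    callback(ok);
  }, 500);
}
```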