Today we’re grabbing our dancing shoes, because we’re going to build a basic music visualizer using Tailwind CSS and JavaScript.
A music visualizer is a tool or software that creates visual representations of music or sound. It takes audio signals, analyzes them in real time, and translates them into dynamic, often colorful animations or graphics that respond to different elements of the music, such as beats, rhythm, frequency, and amplitude.
These visualizations can range from simple waveforms to complex geometric patterns and effects. Music visualizers are commonly used in media players, live performances, or as standalone applications to enhance the listening experience by providing a visual counterpart to the audio.
Popular examples of music visualizers include those built into media players like Windows Media Player, Winamp, or YouTube’s audio visualizers. Many artists also use them during live shows to create immersive multimedia experiences.
Music visualizers are often used in live performances by DJs, bands, and electronic music artists. They create immersive environments by synchronizing visuals with the music, enhancing the overall experience for the audience. For instance, when a DJ plays an energetic track, the visuals might explode with rapid color transitions and pulsating shapes to match the tempo and rhythm.
Many media players, like Windows Media Player, iTunes, or VLC, include built-in music visualizers. While listening to music, users can view different types of visual animations that respond to the song being played, adding a visual dimension to the auditory experience.
Music visualizers are commonly used on YouTube and other streaming platforms as visual content for songs, particularly in channels that specialize in ambient music or electronic beats. Rather than using static images, creators use visualizers to generate dynamic, engaging visuals that react to the music being streamed.
In digital audio workstations (DAWs) like FL Studio and Ableton Live, producers use visualizers to monitor audio signals. The visual feedback helps them fine-tune elements like frequency balance and audio effects in their tracks.
Some entertainment apps, especially in virtual reality (VR) and gaming, use music visualizers as part of their user experience. For example, VR platforms offer immersive visualizer apps where users can experience their favorite songs in a visually stunning, 3D environment.
Music visualizers are also used in interactive art installations where sound and visuals merge to create a synchronized sensory experience. These installations often allow viewers to see how the music is represented visually, making the experience more engaging and accessible.
The audio element plays the audio file, which is pretty obvious. We are going to use the native HTML audio element, but you could also use a library like Howler.js or SoundJS to play audio files.

IDs

audio: Assigns a unique ID to the audio element. This is important for accessing the element in JavaScript.

Attributes

controls: Adds the browser's built-in playback controls to the audio element.
src: Specifies the path to the audio file.

The canvas element displays the music visualizer. We will use the native HTML canvas element.

IDs

visualizer: Assigns a unique ID to the canvas element. This is important for accessing the element in JavaScript.

<div class="max-w-xl items-center w-full mx-auto">
  <audio id="audio" controls src="/music/sample.mp3"></audio>
  <canvas id="visualizer"></canvas>
</div>
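The wrapper uses Tailwind's max-w-xl, w-full, and mx-auto utilities to center the player. The canvas itself ships unstyled; if you want it to match, you can add a few utilities of your own. The classes below are just an example, not part of the original markup:

<!-- Example styling only: stretch the canvas and round its corners -->
<canvas id="visualizer" class="w-full rounded-lg bg-gray-900"></canvas>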
First, we grab the audio element, the canvas element, and the canvas's 2D drawing context.

const audio = document.getElementById("audio");: Gets the audio element and assigns it to the variable audio.
const canvas = document.getElementById("visualizer");: Gets the canvas element and assigns it to the variable canvas.
const ctx = canvas.getContext("2d");: Gets the canvas's 2D rendering context and assigns it to the variable ctx.
.const audio = document.getElementById("audio");
const canvas = document.getElementById("visualizer");
const ctx = canvas.getContext("2d");
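A quick note before we continue: a canvas has a default drawing buffer of 300 by 150 pixels, and the drawing code later reads canvas.width and canvas.height. If the bars look blurry or squashed, one option is to size the buffer to match the element on screen. This is an optional tweak of mine, not part of the tutorial's code:

// Optional: match the drawing buffer to the canvas's rendered size.
// Without this, drawing happens on the default 300x150 buffer and gets scaled.
canvas.width = canvas.clientWidth;
canvas.height = canvas.clientHeight;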
Next, we declare three variables that will be initialized once the audio starts playing.

let audioContext;: Declares the audioContext variable; it stays undefined until we set it up.
let analyser;: Declares the analyser variable, which will hold the Web Audio analyser node.
let dataArray;: Declares the dataArray variable, which will hold the frequency data for each frame.
.let audioContext;
let analyser;
let dataArray;
We will create a function called initializeAudio that runs when the audio element fires its play event. This function will initialize the audio context, create the analyser, and set up the data array.
if (!audioContext): Checks whether audioContext is still undefined, so the setup only runs once.
audioContext = new (window.AudioContext || window.webkitAudioContext)();: Creates a new audio context using the standard AudioContext constructor, falling back to the prefixed webkitAudioContext in older Safari.
analyser = audioContext.createAnalyser();: Creates a new analyser node and assigns it to the analyser variable.
analyser.fftSize = 256;: Sets the FFT size to 256.
const bufferLength = analyser.frequencyBinCount;: Reads the number of frequency bins, which is always half the FFT size, so 128 here.
dataArray = new Uint8Array(bufferLength);: Creates a new Uint8Array of that length and assigns it to the dataArray variable.
const source = audioContext.createMediaElementSource(audio);: Wraps the audio element in a media element source node and assigns it to the source variable.
source.connect(analyser);: Connects the source to the analyser.
analyser.connect(audioContext.destination);: Connects the analyser to the audio context's destination so we can still hear the audio.
requestAnimationFrame(visualize);: Requests an animation frame and kicks off the visualize loop.
audio.addEventListener("play", initializeAudio);: Adds a listener to the audio element's play event that calls the initializeAudio function.

function initializeAudio() {
  if (!audioContext) {
    audioContext = new (window.AudioContext || window.webkitAudioContext)();
    analyser = audioContext.createAnalyser();
    analyser.fftSize = 256;
    const bufferLength = analyser.frequencyBinCount;
    dataArray = new Uint8Array(bufferLength);
    const source = audioContext.createMediaElementSource(audio);
    source.connect(analyser);
    analyser.connect(audioContext.destination);
  }
  requestAnimationFrame(visualize);
}
audio.addEventListener("play", initializeAudio);
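Browsers only allow audio to start after a user gesture, and they can leave an AudioContext in the "suspended" state. Creating the context inside the play handler, as we do here, normally satisfies that policy, but if you ever hit silent playback you can resume the context explicitly. The guard below is a defensive addition of mine, not part of the tutorial's code:

// Defensive: some browsers leave a context "suspended" until resume()
// is called from a user gesture such as pressing play.
audio.addEventListener("play", () => {
  if (audioContext && audioContext.state === "suspended") {
    audioContext.resume();
  }
});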
We will create a function called visualize
that will be called every frame to update the visualization. This function will get the audio data, calculate the bar width, and update the canvas.
const WIDTH = canvas.width;: Reads the width of the canvas.
const HEIGHT = canvas.height;: Reads the height of the canvas.
const barWidth = (WIDTH / dataArray.length) * 2.5;: Calculates the bar width from the canvas width and the number of frequency bins.
analyser.getByteFrequencyData(dataArray);: Copies the current frequency data from the analyser into the data array, as values from 0 to 255.
ctx.fillStyle = "rgb(17, 24, 39)";: Sets the fill style to a dark blue-gray.
ctx.fillRect(0, 0, WIDTH, HEIGHT);: Fills the entire canvas with that color, clearing the previous frame.
for (let i = 0; i < dataArray.length; i++): Iterates over each frequency bin in the data array.
const barHeight = dataArray[i] / 2;: Calculates the bar height from the value of the current bin.
const hue = (i / dataArray.length) * 360;: Calculates a hue from the bin's position, sweeping through the full color wheel.
ctx.fillStyle = `hsl(${hue}, 100%, 50%)`;: Sets the fill style to a fully saturated color at that hue.
ctx.fillRect(i * barWidth, HEIGHT - barHeight, barWidth, barHeight);: Draws a bar of the current color, anchored to the bottom of the canvas.
requestAnimationFrame(visualize);: Requests the next animation frame so the visualization keeps updating.

function visualize() {
  const WIDTH = canvas.width;
  const HEIGHT = canvas.height;
  const barWidth = (WIDTH / dataArray.length) * 2.5;
  analyser.getByteFrequencyData(dataArray);
  ctx.fillStyle = "rgb(17, 24, 39)";
  ctx.fillRect(0, 0, WIDTH, HEIGHT);
  for (let i = 0; i < dataArray.length; i++) {
    const barHeight = dataArray[i] / 2;
    const hue = (i / dataArray.length) * 360;
    ctx.fillStyle = `hsl(${hue}, 100%, 50%)`;
    ctx.fillRect(i * barWidth, HEIGHT - barHeight, barWidth, barHeight);
  }
  requestAnimationFrame(visualize);
}
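If you would rather draw a waveform than frequency bars, the same analyser also exposes time-domain samples through getByteTimeDomainData. Here is a minimal sketch of an alternative drawing loop; the function name is mine, it reuses the tutorial's analyser and dataArray, and it would replace visualize rather than run alongside it:

// Alternative sketch: draw the waveform instead of frequency bars.
// Samples come back as bytes centered on 128 (silence).
// Assumes initializeAudio has already set up analyser and dataArray.
function visualizeWaveform() {
  analyser.getByteTimeDomainData(dataArray);
  ctx.fillStyle = "rgb(17, 24, 39)";
  ctx.fillRect(0, 0, canvas.width, canvas.height);
  ctx.strokeStyle = "hsl(200, 100%, 50%)";
  ctx.lineWidth = 2;
  ctx.beginPath();
  const sliceWidth = canvas.width / dataArray.length;
  for (let i = 0; i < dataArray.length; i++) {
    const y = (dataArray[i] / 255) * canvas.height;
    if (i === 0) ctx.moveTo(0, y);
    else ctx.lineTo(i * sliceWidth, y);
  }
  ctx.stroke();
  requestAnimationFrame(visualizeWaveform);
}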
Putting it all together, here is the full script:

const audio = document.getElementById("audio");
const canvas = document.getElementById("visualizer");
const ctx = canvas.getContext("2d");

let audioContext;
let analyser;
let dataArray;

function initializeAudio() {
  if (!audioContext) {
    audioContext = new (window.AudioContext || window.webkitAudioContext)();
    analyser = audioContext.createAnalyser();
    analyser.fftSize = 256;
    const bufferLength = analyser.frequencyBinCount;
    dataArray = new Uint8Array(bufferLength);
    const source = audioContext.createMediaElementSource(audio);
    source.connect(analyser);
    analyser.connect(audioContext.destination);
  }
  requestAnimationFrame(visualize);
}

audio.addEventListener("play", initializeAudio);

function visualize() {
  const WIDTH = canvas.width;
  const HEIGHT = canvas.height;
  const barWidth = (WIDTH / dataArray.length) * 2.5;
  analyser.getByteFrequencyData(dataArray);
  ctx.fillStyle = "rgb(17, 24, 39)";
  ctx.fillRect(0, 0, WIDTH, HEIGHT);
  for (let i = 0; i < dataArray.length; i++) {
    const barHeight = dataArray[i] / 2;
    const hue = (i / dataArray.length) * 360;
    ctx.fillStyle = `hsl(${hue}, 100%, 50%)`;
    ctx.fillRect(i * barWidth, HEIGHT - barHeight, barWidth, barHeight);
  }
  requestAnimationFrame(visualize);
}
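One rough edge worth knowing about: the play event fires every time playback resumes, and each call to initializeAudio requests another animation frame, so pausing and playing repeatedly stacks redundant drawing loops. A minimal guard, offered as a suggested tweak rather than part of the original code:

// Suggested tweak: only start the animation loop once.
let animating = false;

function startVisualization() {
  if (!animating) {
    animating = true;
    requestAnimationFrame(visualize);
  }
}

// Then call startVisualization() from initializeAudio instead of
// calling requestAnimationFrame(visualize) directly.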
In this tutorial, we learned how to create a basic music visualizer using Tailwind CSS and JavaScript. We covered the basics of the HTML audio element and the canvas element, as well as the JavaScript code that initializes the audio context, creates the analyser, and updates the visualization.
I hope you found this tutorial helpful and have a great day!
/Michael Andreuzza