Displaying and downloading audio files with React & Next.js

Introduction

  • As a music producer, I am always looking for sounds and samples to use in my music. I also inevitably end up with a lot of audio files that I don't use, or even remember that I have.
  • I wanted a simple way to display and listen to these audio files, and to download them easily.
  • Other websites and sources already do this, such as Looperman, Splice, and Freesound, but I wanted to build my own version that I could customize and control (and tailor to my own style).
    • It's also worth mentioning that none of these websites have an interactive audio player that lets you actually visualize the waveform of the audio file, which is something I wanted to incorporate.
  • The website (clxrity.xyz, based on my artist name) is still very much a work in progress. This is merely a prototype of what I want to achieve.

Part 1 - Building an audio waveform visualizer package

Note: This package is very much still in the works.

  • At the core of this project is an audio waveform visualizer package that I am building. This package will allow you to visualize the waveform of an audio file, and also interact with it (e.g. play, pause, seek, etc.).
npm i @clxrity/react-audio
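
  • Once installed, the components can be imported directly, assuming the package exposes the <Waveform /> component described below as a named export:

import { Waveform } from '@clxrity/react-audio';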

Web Audio API

AnalyserNode

  • The visualizer is built on the Web Audio API, and in particular the AnalyserNode interface, which exposes real-time frequency and time-domain data about an audio source without altering the audio itself.

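  • Throughout the package, the analyser state is passed around as an AnalyzerData object. Its definition isn't shown in this post, but based on how it is used below, it presumably looks something like this (field names are assumptions):

interface AnalyzerData {
    analyzer: AnalyserNode; // the node that exposes the frequency data
    bufferLength: number;   // analyzer.frequencyBinCount
    dataArray: Uint8Array;  // reusable buffer for getByteFrequencyData()
}
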
Canvas

import { ComponentPropsWithRef } from 'react';

interface CanvasProps extends ComponentPropsWithRef<'canvas'> {
    analyzerData: AnalyzerData;
    color: string;
    size?: {
        width: number;
        height: number;
    };
}
  • A reference to the canvas element is created using the useRef hook.
import { ElementRef, useRef } from 'react';

const canvasRef = useRef<ElementRef<'canvas'>>(null);
  • I created a draw() function that is called from a useEffect hook whenever the analyzer data changes.
    • I also created an animateBars() utility function that fills the canvas with the waveform data.
const draw = (
    analyzerNode: AnalyserNode,
    bufferLengthCount: number,
    dataArrayDigits: Uint8Array,
) => {
    const canvas = canvasRef.current;
    if (!canvas || !analyzerNode) return;

    const canvasCtx = canvas.getContext('2d');
    if (!canvasCtx) return; // getContext() can return null

    const animate = () => {
        requestAnimationFrame(animate);
        canvas.width = canvas.width; // resetting the width clears the canvas
        animateBars({
            analyzer: analyzerNode,
            canvas,
            canvasCtx,
            dataArray: dataArrayDigits,
            bufferLength: bufferLengthCount,
            color, // comes from the component's props
        });
    };

    animate();
};
  • The animate() function is called recursively using the requestAnimationFrame function.
    • The canvas.width = canvas.width line is used to clear the canvas on each frame.
    • The animateBars() function is called with the necessary arguments to fill the canvas with the waveform data.
// utils/animateBars.ts
 
interface AnimateBarsParams {
    analyzer: AnalyserNode;
    canvas: HTMLCanvasElement;
    canvasCtx: CanvasRenderingContext2D;
    dataArray: Uint8Array;
    bufferLength: number;
    color: string;
}
 
export default function animateBars({
    analyzer,
    canvas,
    canvasCtx,
    dataArray,
    bufferLength,
    color,
}: AnimateBarsParams) {
    /**
     * @see https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode/getByteFrequencyData
    */
    analyzer.getByteFrequencyData(dataArray);
    canvasCtx.fillStyle = color; // color of the bars
    const HEIGHT = canvas.height;
    const barWidth = Math.ceil(canvas.width / bufferLength) * 2.5;
 
    let barHeight;
    let x = 0;
 
    for (let i = 0; i < bufferLength; i++) {
        barHeight = (dataArray[i] / 255) * HEIGHT;
 
        canvasCtx.fillRect(x, HEIGHT - barHeight, barWidth, barHeight);
 
        x += barWidth + 1;
    }
}
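
  • Inside the <Canvas /> component, draw() then just needs to be kicked off whenever new analyzer data arrives. A minimal sketch of that wiring, assuming the AnalyzerData shape from earlier (the published component may differ):

useEffect(() => {
    // pull apart the AnalyzerData object into the pieces draw() expects
    const { analyzer, bufferLength, dataArray } = analyzerData;
    draw(analyzer, bufferLength, dataArray);
}, [analyzerData]);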

Waveform component

  • While I also created multiple other components for playing and interacting with an audio file, the most important component that utilizes the <Canvas /> component is the <Waveform /> component.
    • This contains multiple states and functions that are used to interact with the audio file.
      • Values stored in state (with the useState hook) include:
        • The analyzer data (as the AnalyzerData interface type mentioned previously)
        • The audio context (as AudioContext)
        • The audio source (the URL/path to the audio file)
      • Values stored in refs (with the useRef hook) include:
        • The <audio> element itself (audioElement)
        • The MediaElementAudioSourceNode created from it (sourceNode)
  • The <Waveform /> component takes in the following props:
    • It also extends the ComponentProps type from React, as well as a base Props interface that I created for all the components to extend from.
import { ComponentProps } from 'react';

export interface WaveformProps extends ComponentProps<'div'>, Props {
    color?: string;
    size?: {
        width: number;
        height: number;
    };
    /**
     * The size of the Fast Fourier Transform used to determine the frequency domain.
     * Defaults to 2048.
     * Valid options: 32 | 64 | 128 | 256 | 512 | 1024 | 2048 | 4096 | 8192 | 16384 | 32768
     */
    fftsize?: number;
}
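
  • Used from a page, this might look something like the following (the track prop and its shape are assumptions; it presumably comes from the base Props interface, since the effect below reads track.src):

<Waveform
    track={{ src: '/audio/kick-loop.wav' }} // hypothetical track object
    color="#6366f1"
    size={{ width: 600, height: 120 }}
    fftsize={2048}
/>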

  • Within a useEffect hook, if a track source is present, the audio source is set and the audio element is loaded.
useEffect(() => {
    if (track.src && audioElement.current) {
        setAudioSrc(track.src);
        audioElement.current.load();
    }
}, [track.src, audioElement]);
  • After spending many hours stuck on an issue where the audio file would not play, I realized that browsers block playback until the user interacts with the page (due to the Autoplay Policy).
    • I had to create a handleUserGesture() function to manage the audio context state.
const handleUserGesture = () => {
    if (!audioCtx) {
        // state updates are async, so bail out here and let the
        // next interaction see the newly created context
        setAudioCtx(new AudioContext());
        return;
    }
    if (audioCtx.state === 'suspended') {
        audioCtx.resume();
    }
};
  • This will be called (for now) when the user clicks anywhere on the page.
useEffect(() => {
    window.addEventListener('click', handleUserGesture);
 
    return () => {
        window.removeEventListener('click', handleUserGesture);
    }
}, [audioCtx]);
  • Next, I created an audioAnalyzer() function to handle setting the AnalyzerData state.
const audioAnalyzer = () => {
    if (sourceNode.current) {
        return; // a source node can only be created once per element
    }
    if (audioElement.current && audioCtx) {
        // setAnalyzerData here is a helper, not the raw useState setter:
        // it wires the <audio> element into an AnalyserNode and stores
        // the resulting AnalyzerData in state
        setAnalyzerData(
            audioElement.current,
            audioCtx,
            sourceNode,
            fftsize,
        );
    }
};
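
  • The internals of that helper aren't shown here, but conceptually it has to create an AnalyserNode, connect the media element to it, and produce the AnalyzerData that ends up in state. A rough sketch under those assumptions (the name buildAnalyzerData and its signature are hypothetical):

import { MutableRefObject } from 'react';

function buildAnalyzerData(
    audioEl: HTMLAudioElement,
    audioCtx: AudioContext,
    sourceNode: MutableRefObject<MediaElementAudioSourceNode | null>,
    fftsize: number,
): AnalyzerData {
    const analyzer = audioCtx.createAnalyser();
    analyzer.fftSize = fftsize;

    // an element can only be wrapped in a source node once, hence the ref
    sourceNode.current = audioCtx.createMediaElementSource(audioEl);
    sourceNode.current.connect(analyzer);
    analyzer.connect(audioCtx.destination); // keep the audio audible

    return {
        analyzer,
        bufferLength: analyzer.frequencyBinCount,
        dataArray: new Uint8Array(analyzer.frequencyBinCount),
    };
}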
  • This function will be called whenever the audio element is played.
useEffect(() => {
    const current = audioElement.current;

    if (current) {
        current.src = audioSrc;
        current.addEventListener("play", audioAnalyzer);

        audioAnalyzer();
    }

    return () => {
        current?.removeEventListener("play", audioAnalyzer); // remove event listener when the component unmounts
        audioCtx?.close(); // close the audio context when the component unmounts
    };

}, [audioElement, audioCtx]);
  • With all of this, and additional UI components, I was able to create a simple audio player that visualizes the waveform of the audio file.
    • You may view the completed code for the <Waveform /> component here
return (
    <div {...props} /* ...other attributes */>
        <Player
            audioElement={audioElement}
            /* ...other props */
        />
        {analyzerData && (
            <Canvas
                analyzerData={analyzerData}
                color={color}
                size={size}
            />
        )}
    </div>
);

Part 2 - Building the website

To be continued...