Note: This package is very much still in the works.
```bash
npm i @clxrity/react-audio
```
To visualize the waveform of the audio file, I used the Canvas API. I created a `<Canvas />` component that takes in the following props:
```tsx
import { ComponentPropsWithRef } from 'react';

interface CanvasProps extends ComponentPropsWithRef<'canvas'> {
  analyzerData: AnalyzerData;
  color: string;
  size?: {
    width: number;
    height: number;
  };
}
```
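The `AnalyzerData` type itself is defined elsewhere in the package, so it isn't reproduced here; judging by how `draw()` consumes it below, a reasonable sketch looks something like this (an assumption, not the exact definition):

```tsx
// Hypothetical shape of AnalyzerData, inferred from how draw() uses it below.
interface AnalyzerData {
  analyzer: AnalyserNode;  // the Web Audio AnalyserNode feeding the visualization
  bufferLength: number;    // analyzer.frequencyBinCount
  dataArray: Uint8Array;   // buffer that getByteFrequencyData() writes into
}
```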
The canvas element itself is referenced with the `useRef` hook:

```tsx
import { ElementRef, useRef } from 'react';

const canvasRef = useRef<ElementRef<'canvas'>>(null);
```
The component defines a `draw()` function that is called inside a `useEffect` hook whenever the analyzer data changes. It relies on an `animateBars()` utility function to fill the canvas with the waveform data:

```tsx
const draw = (
  analyzerNode: AnalyserNode,
  bufferLengthCount: number,
  dataArrayDigits: Uint8Array,
) => {
  const canvas = canvasRef.current;
  if (!canvas || !analyzerNode) return;

  const canvasCtx = canvas.getContext('2d');
  if (!canvasCtx) return;

  const animate = () => {
    requestAnimationFrame(animate);
    canvas.width = canvas.width; // resetting the width clears the canvas
    animateBars({
      analyzer: analyzerNode,
      canvas,
      canvasCtx,
      dataArray: dataArrayDigits,
      bufferLength: bufferLengthCount,
      color,
    });
  };

  animate();
};
```
The `animate()` function is called recursively using `requestAnimationFrame`. The `canvas.width = canvas.width` line is used to clear the canvas on each frame, and `animateBars()` is called with the necessary arguments to fill the canvas with the waveform data.

```tsx
// utils/animateBars.ts
interface AnimateBarsParams {
  analyzer: AnalyserNode;
  canvas: HTMLCanvasElement;
  canvasCtx: CanvasRenderingContext2D;
  dataArray: Uint8Array;
  bufferLength: number;
  color: string;
}

export default function animateBars({
  analyzer,
  canvas,
  canvasCtx,
  dataArray,
  bufferLength,
  color,
}: AnimateBarsParams) {
  /**
   * @see https://developer.mozilla.org/en-US/docs/Web/API/AnalyserNode/getByteFrequencyData
   */
  analyzer.getByteFrequencyData(dataArray);
  canvasCtx.fillStyle = color; // color of the bars

  const HEIGHT = canvas.height;
  const barWidth = Math.ceil(canvas.width / bufferLength) * 2.5;
  let barHeight;
  let x = 0;

  for (let i = 0; i < bufferLength; i++) {
    barHeight = (dataArray[i] / 255) * HEIGHT;
    canvasCtx.fillRect(x, HEIGHT - barHeight, barWidth, barHeight);
    x += barWidth + 1;
  }
}
```
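To tie these pieces together inside `<Canvas />`, `draw()` runs from a `useEffect` whenever the data changes. The exact effect and markup aren't shown above, but a minimal sketch, assuming `AnalyzerData` exposes `analyzer`, `bufferLength`, and `dataArray`, could look like this:

```tsx
// A sketch of the effect and markup inside <Canvas /> — assumed, not the exact source.
useEffect(() => {
  const { analyzer, bufferLength, dataArray } = analyzerData;
  draw(analyzer, bufferLength, dataArray);
}, [analyzerData, color]);

return (
  <canvas
    ref={canvasRef}
    width={size?.width ?? 300}
    height={size?.height ?? 100}
  />
);
```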
The component that renders `<Canvas />` is the `<Waveform />` component, which owns the `AnalyzerData` state (the interface type mentioned previously). The `<Waveform />` component takes in the following props, extending the `ComponentProps` type from React and the default `Props` interface which I created for all the components to extend from:

```tsx
import { ComponentProps } from 'react';

export interface WaveformProps extends ComponentProps<'div'>, Props {
  color?: string;
  size?: {
    width: number;
    height: number;
  };
  fftsize: number; /*
    - The size of the Fast Fourier Transform used to determine the frequency domain.
    - The default value is 2048.
    - Valid options include: 32 | 64 | 128 | 256 | 512 | 1024 | 2048 | 4096 | 8192 | 16384 | 32768
  */
}
```
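The shared `Props` interface isn't reproduced in this excerpt. Based on how `track.src` is used further down, it presumably carries at least the track metadata; a hypothetical minimal shape:

```tsx
// Hypothetical — the real Props interface lives in the package and may differ.
interface Props {
  track: {
    src: string;    // URL of the audio file
    title?: string; // optional display metadata
  };
}
```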
In a `useEffect` hook, if a track source is present, the audio source state is set and the audio element is loaded:

```tsx
useEffect(() => {
  if (track.src) {
    setAudioSrc(track.src);
    audioElement.current?.load();
  }
}, [track.src, audioElement]);
```
Browsers only allow an `AudioContext` to start (or resume) after a user interaction, so I wrote a `handleUserGesture()` function to handle the audio context state and attached it to a window click listener:

```tsx
const handleUserGesture = () => {
  if (!audioCtx) {
    setAudioCtx(new AudioContext());
  }
  if (audioCtx?.state === 'suspended') {
    audioCtx.resume();
  }
};

useEffect(() => {
  window.addEventListener('click', handleUserGesture);
  return () => {
    window.removeEventListener('click', handleUserGesture);
  };
}, [audioCtx]);
```
Next, an `audioAnalyzer()` function handles setting the `AnalyzerData` state:

```tsx
const audioAnalyzer = () => {
  if (sourceNode.current) {
    return; // if the source node is already set, bail out
  }
  if (audioElement.current && audioCtx) {
    setAnalyzerData(
      audioElement.current,
      audioCtx,
      sourceNode,
      fftsize,
    );
  }
};
```
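The `setAnalyzerData()` helper isn't shown in this excerpt. A minimal sketch of what it might do with the standard Web Audio API, assuming it ultimately stores an `AnalyzerData` object in state (the names here are placeholders, not the package's actual implementation):

```tsx
import { MutableRefObject } from 'react';

// Hypothetical helper: wires the <audio> element through an AnalyserNode and
// stores the pieces <Canvas /> needs. setAnalyzerDataState is assumed to be
// the React state setter for the AnalyzerData object.
function setAnalyzerData(
  audio: HTMLAudioElement,
  ctx: AudioContext,
  source: MutableRefObject<MediaElementAudioSourceNode | null>,
  fftsize: number,
) {
  const analyzer = ctx.createAnalyser();
  analyzer.fftSize = fftsize;

  // An element can only be wrapped in a media element source once,
  // which is why sourceNode.current is checked before calling this.
  source.current = ctx.createMediaElementSource(audio);
  source.current.connect(analyzer);
  analyzer.connect(ctx.destination);

  setAnalyzerDataState({
    analyzer,
    bufferLength: analyzer.frequencyBinCount,
    dataArray: new Uint8Array(analyzer.frequencyBinCount),
  });
}
```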
The analyzer is attached when the audio element starts playing, and everything is torn down when the component unmounts:

```tsx
useEffect(() => {
  const current = audioElement.current;
  if (current) {
    current.src = audioSrc;
    current.addEventListener("play", audioAnalyzer);
    audioAnalyzer();
  }
  return () => {
    current?.removeEventListener("play", audioAnalyzer); // remove event listener when the component unmounts
    audioCtx?.close(); // close the audio context when the component unmounts
  };
}, [audioElement, audioCtx]);
```
Finally, here is the render output of the `<Waveform />` component:

```tsx
return (
  <div {...props} /* ... */>
    <Player
      audioElement={audioElement}
      /* ... */
    />
    {analyzerData && (
      <Canvas
        analyzerData={analyzerData}
        color={color}
        size={size}
      />
    )}
  </div>
);
```
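For context, using the component from an app might look something like this (assuming a named `Waveform` export and the `track` shape sketched earlier):

```tsx
// Example usage — prop names beyond those shown above are assumptions.
import { Waveform } from '@clxrity/react-audio';

export default function NowPlaying() {
  return (
    <Waveform
      track={{ src: '/audio/demo.mp3' }}
      fftsize={2048}
      color="#22d3ee"
      size={{ width: 600, height: 120 }}
    />
  );
}
```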
To be continued...