Tags: reactjs, next.js13, wav, midi, audio-player

Playing MIDI files in a React website, or converting MIDI to WAV before playing


I have a Next.js React website that uses an API to generate continuations for MIDI files. The API receives a MIDI file, processes it, uploads the processed MIDI file to the cloud, and returns its URL to the website.

I have built an audio player component (with the help of Claude, as I've never done that before). However, the sound it produces is sub-optimal: the various instruments overlap each other, and the drums sound like high-pitched mechanical noise.

I'll attach the component code below but I have several questions:

  1. Is there a better way to play MIDI files than writing my own audio player? I still want to be able to customize the component myself.
  2. Provided that the MIDI files are not corrupted, what in my code causes this dissonance, which doesn't occur in players such as Windows Media Player?
  3. Would a better option be to convert the files to a different format (such as WAV) that is more than just a representation of musical notation, and play that instead? If so, how does one go about the conversion? Are there libraries that handle it, or do I need to write the conversion myself?

I'm a novice in the audio processing world so any help would be very much appreciated.

Attached is the full component code (minus the customizations):

'use client';

import React, { useState, useEffect, useRef, useCallback } from 'react';
import styled from 'styled-components';
import * as Tone from 'tone';
import { Midi } from '@tonejs/midi';
import { FaPlay, FaPause, FaStepBackward, FaStepForward } from 'react-icons/fa';

interface AudioPlayerProps {
  midiUrl: string;
}

interface NoteData {
  time: number;
  note: string;
  duration: number;
  velocity: number;
}

const AudioPlayer: React.FC<AudioPlayerProps> = ({ midiUrl }) => {
  const [isPlaying, setIsPlaying] = useState(false);
  const [duration, setDuration] = useState(0);
  const [currentTime, setCurrentTime] = useState(0);
  const [error, setError] = useState<string | null>(null);
  const [progress, setProgress] = useState(0);

  const synthRef = useRef<Tone.PolySynth | null>(null);
  const midiRef = useRef<Midi | null>(null);
  const notesRef = useRef<NoteData[]>([]);

  const initializeAudioContext = useCallback(async () => {
    if (!synthRef.current) {
      await Tone.start();
      synthRef.current = new Tone.PolySynth(Tone.Synth, {
        envelope: {
          attack: 0.02,
          decay: 0.1,
          sustain: 0.3,
          release: 0.8,
        },
      }).toDestination();
    }
  }, []);

  useEffect(() => {
    const loadMidi = async () => {
      try {
        await initializeAudioContext();
        const midi = await Midi.fromUrl(midiUrl);
        midiRef.current = midi;
        setDuration(midi.duration);

        notesRef.current = midi.tracks.flatMap(track => 
          track.notes.map(note => ({
            time: note.time,
            note: note.name,
            duration: note.duration,
            velocity: note.velocity
          }))
        ).sort((a, b) => a.time - b.time);

      } catch (error) {
        console.error('Error loading MIDI file:', error);
        setError('Error loading MIDI file');
      }
    };

    loadMidi();
  }, [midiUrl, initializeAudioContext]);

  const togglePlayPause = useCallback(async () => {
    await initializeAudioContext();
    if (isPlaying) {
      Tone.getTransport().pause();
    } else {
      Tone.getTransport().cancel();
      Tone.getTransport().seconds = currentTime;
      
      notesRef.current.filter(note => note.time >= currentTime).forEach(note => {
        Tone.getTransport().schedule((time) => {
          synthRef.current?.triggerAttackRelease(note.note, note.duration, time, note.velocity);
        }, note.time);
      });

      Tone.getTransport().start();
    }
    setIsPlaying(!isPlaying);
  }, [isPlaying, currentTime, initializeAudioContext]);

  const handleProgressChange = useCallback((newTime: number) => {
    setCurrentTime(newTime);
    setProgress((newTime / duration) * 100);
    Tone.getTransport().seconds = newTime;
  }, [duration]);

  const handleProgressBarInteraction = useCallback((event: React.MouseEvent<HTMLDivElement>) => {
    const bounds = event.currentTarget.getBoundingClientRect();
    const x = event.clientX - bounds.left;
    const clickedValue = (x / bounds.width) * duration;
    handleProgressChange(clickedValue);
  }, [duration, handleProgressChange]);

  const rewind = useCallback(() => {
    handleProgressChange(Math.max(currentTime - 10, 0));
  }, [currentTime, handleProgressChange]);

  const fastForward = useCallback(() => {
    handleProgressChange(Math.min(currentTime + 10, duration));
  }, [currentTime, duration, handleProgressChange]);

  useEffect(() => {
    const interval = setInterval(() => {
      if (isPlaying) {
        const current = Tone.getTransport().seconds;
        setCurrentTime(current);
        setProgress((current / duration) * 100);
        if (current >= duration) {
          setIsPlaying(false);
          Tone.getTransport().stop();
        }
      }
    }, 100);

    return () => clearInterval(interval);
  }, [isPlaying, duration]);

  const formatTime = (time: number) => {
    const minutes = Math.floor(time / 60);
    const seconds = Math.floor(time % 60);
    return `${minutes}:${seconds.toString().padStart(2, '0')}`;
  };

  return (
    <PlayerContainer>
      <ProgressBarContainer
        onClick={handleProgressBarInteraction}
        onMouseDown={handleProgressBarInteraction}
      >
        <ProgressBar width={progress} />
        <ProgressCircle left={progress} />
      </ProgressBarContainer>
      <Controls>
        <Button onClick={rewind} disabled={!synthRef.current}>
          <FaStepBackward />
        </Button>
        <PlayButton onClick={togglePlayPause}>
          {isPlaying ? <FaPause /> : <FaPlay />}
        </PlayButton>
        <Button onClick={fastForward} disabled={!synthRef.current}>
          <FaStepForward />
        </Button>
      </Controls>
      <TimeInfo>
        <span>{formatTime(currentTime)}</span>
        <span>{formatTime(duration)}</span>
      </TimeInfo>
      {error && <div style={{ color: 'red', marginTop: '1rem' }}>{error}</div>}
    </PlayerContainer>
  );
};

export default AudioPlayer;

Solution

  • You will probably want to load actual samples for the instruments you are going to be using to get better sound quality. Right now every track, drums included, is played through a single basic Tone.Synth oscillator, which is likely a large part of why the instruments blur together and the percussion comes out as high-pitched noise. A minimal Tone.Sampler sketch is included at the end of this answer.

    The combination of tools - Tone.js and @tonejs/midi (which uses midi-file under the hood) - can also be problematic depending on the file being parsed.

    MIDI files are binary and notoriously varied in how they are written; I come across many files that are oddly formatted, and many that confuse the midi-file library.

    In your case, I would recommend trying a different playback solution:

    https://github.com/fraigo/javascript-midi-player

    It may be worth a try. Feel free to DM me.
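
    As a rough sketch of the samples idea (not a drop-in from your code - the samplerRef name, the Salamander sample URLs, and the channel check are illustrative assumptions): inside the component, swap the PolySynth for a Tone.Sampler loaded with a handful of pitched samples, and drop the drum track (MIDI channel 10, exposed as track.channel === 9 by @tonejs/midi) rather than feeding its notes to a melodic synth - that is what is producing the high-pitched mechanical noise.

    const samplerRef = useRef<Tone.Sampler | null>(null);

    const initializeAudioContext = useCallback(async () => {
      if (!samplerRef.current) {
        await Tone.start();
        // A Sampler repitches the nearest loaded sample for notes in between,
        // so a few samples per instrument are enough.
        samplerRef.current = new Tone.Sampler({
          urls: {
            A2: 'A2.mp3',
            C4: 'C4.mp3',
            'D#4': 'Ds4.mp3',
            'F#4': 'Fs4.mp3',
            A4: 'A4.mp3',
          },
          release: 1,
          baseUrl: 'https://tonejs.github.io/audio/salamander/', // example sample host
        }).toDestination();
        await Tone.loaded(); // wait for all samples to download and decode
      }
    }, []);

    // When flattening the MIDI, skip the percussion track instead of
    // playing its notes as pitches through the melodic instrument.
    notesRef.current = midi.tracks
      .filter(track => track.channel !== 9)
      .flatMap(track =>
        track.notes.map(note => ({
          time: note.time,
          note: note.name,
          duration: note.duration,
          velocity: note.velocity,
        }))
      )
      .sort((a, b) => a.time - b.time);

    // Scheduling is unchanged: Sampler shares triggerAttackRelease with PolySynth.
    // samplerRef.current?.triggerAttackRelease(note.note, note.duration, time, note.velocity);

    If you need full General MIDI instrument coverage per track, a SoundFont-based player such as the one linked above will get you there with far less work than assembling sample sets by hand.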