This repository has been archived by the owner on Jun 9, 2019. It is now read-only.

Alright, I give up. Can someone help, please? #65

Open
FireController1847 opened this issue Jan 31, 2018 · 3 comments

Comments

@FireController1847

Now, I know this isn't really an issue, but I am completely lost. I'm quite new to this, so please bear with me. This is my current attempt:

const Speaker = require("speaker");
const AudioContext = require("web-audio-engine").StreamAudioContext;
const decode = require("audio-decode");
const fs = require("fs");
const context = new AudioContext();

console.log("Running");
(async () => {
  const file = fs.readFileSync("./myfile.mp3");
  console.log("Decoding");
  const buffer = await decode(file);
  console.log("Decoded");
  const amp = context.createGain();

  const source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(amp);
  source.start();

  amp.gain.value = 1;
  amp.connect(context.destination);
  
  context.pipe(new Speaker());
  context.resume();
  console.log("End");
})();

I am attempting to pipe the stream into a new Speaker and play the attached mp3 file (essentially creating a mini audio player). The issue is, this does literally nothing. It doesn't error. It doesn't work. It does nothing. All it does is print my three logs, and then there's absolutely no audio. I can get the example on the main page to work, but that's it; I can't seem to find anything else that will work. Can someone help me out, please?

@mohayonao
Owner

mohayonao commented Jan 31, 2018

audio-decode returns an AudioBuffer for audiojs, and WAE is not compatible with it.

So, source.buffer = buffer; does not work.

You should convert the audiojs AudioBuffer to WAE's AudioData format.
Probably something like this:

const wae = require("web-audio-engine");
const decode = require("audio-decode");

wae.decoder.set("mp3", async (file) => {
  const buffer = await decode(file);
  // Convert the audiojs AudioBuffer into WAE's AudioData shape.
  const audioData = {
    sampleRate: buffer.sampleRate,
    channels: Array.from({ length: buffer.numberOfChannels }, (_, ch) => {
      return buffer.getChannelData(ch);
    })
  };
  return audioData;
});
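
Then pass the raw mp3 file to context.decodeAudioData instead of assigning the audiojs buffer to source.buffer; it should go through the decoder registered above. A rough sketch (file path is just a placeholder):

const Speaker = require("speaker");
const fs = require("fs");

// Assumes the "mp3" decoder above has already been registered on wae.decoder.
const context = new wae.StreamAudioContext();
const file = fs.readFileSync("./myfile.mp3"); // placeholder path

context.decodeAudioData(file).then((audioBuffer) => {
  const source = context.createBufferSource();
  source.buffer = audioBuffer; // a WAE AudioBuffer now, not the audiojs one
  source.connect(context.destination);
  source.start();

  context.pipe(new Speaker());
  context.resume();
});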

@FireController1847
Author

FireController1847 commented Jan 31, 2018

Alright, so I'm using your example to do something like this:

const Speaker = require("speaker");
const wae = require("web-audio-engine");
const AudioContext = wae.StreamAudioContext;
const decode = require("audio-decode");
const fs = require("fs");

async function decodeMP3(file) {
  const buffer = await decode(file);
  const audioData = {
    "numberOfChannels": buffer.numberOfChannels,
    "length": buffer.length,
    "sampleRate": buffer.sampleRate,
    "channels": Array.from({"length": buffer.numberOfChannels}, (_, ch) => {
      return buffer.getChannelData(ch);
    })
  };
  return audioData;
}
wae.decoder.set("mp3", decodeMP3);
wae.decoder.set("m4a", decodeMP3);

const context = new AudioContext();

console.log("Running");
(async () => {
  const file = fs.readFileSync("./mysong.mp3");
  console.log("Decoding");
  const buffer = await context.decodeAudioData(file);
  console.log("Decoded");
  const amp = context.createGain();

  const source = context.createBufferSource();
  source.buffer = buffer;
  source.connect(amp);
  source.start();

  amp.gain.value = 1;
  amp.connect(context.destination);
  
  context.pipe(new Speaker());
  context.resume();
  console.log("End");
})();

process.on("unhandledRejection", console.trace);

The issue is, this doesn't work (e.g. TypeError: Failed to decode audio data). So, I changed "channels" to "channelData", and now I get a very distorted, kind of slow output, with the "stereo" effect sounding weird. Sorry, I'm so new to this >.< Here's what gave the distorted output:

async function decodeMP3(file) {
  const buffer = await decode(file);
  const audioData = {
    "numberOfChannels": buffer.numberOfChannels,
    "length": buffer.length,
    "sampleRate": buffer.sampleRate,
    "channelData": Array.from({"length": buffer.numberOfChannels}, (_, ch) => {
      return buffer.getChannelData(ch);
    })
  };
  return audioData;
}

I've also attempted to use node-lame to decode it, but I got completely lost >.< I can confirm that when I use the same file converted to WAV, it works fine.

@FireController1847
Author

I've now found that multiplying the buffer's sample rate by 2 fixes the weird slowdown of the song, but I'm still having an issue where the song is playing twice: once in one ear and once in the other. To be specific, my left ear starts from the beginning of the song and works fine, but my right ear starts from already halfway through the song.
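
That makes me think the two channels are really one interleaved array ([L0, R0, L1, R1, ...]) that got split in half, which would also explain the half-speed playback. A rough sketch of de-interleaving the samples instead, assuming that's actually what the decoder is handing back:

// Assumption: "interleaved" is one Float32Array of [L0, R0, L1, R1, ...]
// rather than one Float32Array per channel.
function deinterleave(interleaved, numberOfChannels) {
  const frames = Math.floor(interleaved.length / numberOfChannels);
  const channelData = Array.from({ length: numberOfChannels }, () => new Float32Array(frames));
  for (let i = 0; i < frames; i++) {
    for (let ch = 0; ch < numberOfChannels; ch++) {
      channelData[ch][i] = interleaved[i * numberOfChannels + ch];
    }
  }
  return channelData;
}

// Hypothetically, inside decodeMP3 this would replace the getChannelData() mapping:
// audioData.channelData = deinterleave(rawSamples, buffer.numberOfChannels);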
