
Playing / Not playing states and/or audio file duration #60

Open
kidfribble opened this issue Mar 14, 2017 · 4 comments

Comments

@kidfribble
Contributor

Ideally, for both the pre-saved audio and the list of saved and uploaded audio, I could have some way of showing in the UI that an audio file is playing and knowing when it has finished playing. Even better if I can get a stop event and a duration to work with. I don't see this as essential, but it would be a valuable addition.

Right now play() in notes-list.js returns a promise, which is fine for confirming that playback has begun, but I don't see a way to detect that it has completed.

@gr2m
Contributor

gr2m commented Mar 14, 2017

looking into this now

@gr2m
Contributor

gr2m commented Mar 15, 2017

ok, audio.play() starts playback (it does not return a promise, btw). You can pause it with audio.pause(); after pausing, audio.play() will resume. You can get the duration in seconds from audio.duration. If you want to play from the start, set audio.currentTime = 0, then call audio.play(). More info at https://developer.mozilla.org/en-US/docs/Web/API/HTMLMediaElement
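To tie this to the UI question in the issue, a small sketch (the helper and UI callback names here are hypothetical, not from notes-list.js): the 'ended' event is the signal that playback has completed, and 'loadedmetadata' is when audio.duration becomes available.

```javascript
// Hypothetical helper: wire an <audio> element to UI state so the
// interface can show playing / not-playing and the file's duration.
function trackPlayback (audio, ui) {
  // fires when playback starts or resumes
  audio.addEventListener('play', () => ui.setPlaying(true))
  // fires on pause; 'ended' fires once playback reaches the end of the file
  audio.addEventListener('pause', () => ui.setPlaying(false))
  audio.addEventListener('ended', () => ui.setPlaying(false))
  // audio.duration (seconds) is only known once metadata has loaded
  audio.addEventListener('loadedmetadata', () => ui.setDuration(audio.duration))
}
```

The same listeners would work for both the pre-saved and the uploaded audio elements.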

@gr2m
Contributor

gr2m commented Mar 15, 2017

if you want to show audio waveforms, we can probably do that with https://github.com/bbc/waveform-data.js. I’d prefer to use canvas instead of pulling in all of d3, to reduce the size of the build the user has to download
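A sketch of the canvas-friendly part (the function name and output shape here are our own, not waveform-data.js's API): reduce the amplitude samples to one min/max pair per pixel column, which is all a canvas waveform renderer needs to draw vertical bars.

```javascript
// Hypothetical helper: collapse raw amplitude samples into one
// { min, max } pair per pixel column for canvas rendering.
function peaksPerColumn (samples, columns) {
  const perColumn = Math.ceil(samples.length / columns)
  const peaks = []
  for (let c = 0; c < columns; c++) {
    const slice = samples.slice(c * perColumn, (c + 1) * perColumn)
    if (slice.length === 0) break
    peaks.push({ min: Math.min(...slice), max: Math.max(...slice) })
  }
  return peaks
}
```

Drawing is then one fillRect per column, from max down to min, scaled to the canvas height, with no d3 dependency.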

@gr2m
Contributor

gr2m commented Mar 15, 2017

I tried this

// assuming waveform-data v2’s Web Audio builder (the exact import path may differ)
const webAudioBuilder = require('waveform-data/webaudio')

const audioContext = new AudioContext()
const reader = new FileReader()
reader.addEventListener('loadend', () => {
  // decode the recorded blob and build waveform data from it
  webAudioBuilder(audioContext, reader.result, (error, waveform) => {
    if (error) {
      console.log(`\nerror ==============================`)
      console.log(error)
      return
    }

    console.log(waveform.duration)
  })
})
reader.readAsArrayBuffer(blob)

But unfortunately it fails with DOMException: Unable to decode audio data.

My last hope is that we can record the volume over time during the actual recording, store that information together with the audio file, and then use it for the visualization. But I have to postpone this.
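The record-time idea could be sketched like this (the helper name is ours; the samples would come from an AnalyserNode's getFloatTimeDomainData during recording): compute one RMS level per frame of time-domain samples and store the resulting volume/time series next to the audio file.

```javascript
// Hypothetical helper: RMS volume of one frame of time-domain samples
// (e.g. from AnalyserNode.getFloatTimeDomainData during recording).
// Collecting one value per animation frame yields a volume/time series
// that can be stored alongside the recorded audio.
function rmsLevel (samples) {
  let sum = 0
  for (const s of samples) sum += s * s
  return Math.sqrt(sum / samples.length)
}
```

Since the levels are computed while recording, this avoids decoding the blob afterwards entirely.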

@kidfribble kidfribble self-assigned this Mar 15, 2017
@gr2m gr2m unassigned gr2m and kidfribble Sep 28, 2019