I am creating a UWP app in React Native Windows and I'm trying to get data from a specific microphone input into an audio buffer. However, when I start my AudioGraph and call getFrame() on the QuantumStarted event, it returns an empty object ({}). When I output to a file instead, I can hear the microphone data during playback.
```js
const deviceInputNodeResult = await audioGraph.createDeviceInputNodeAsync(
  Windows.Media.Devices.AudioDeviceRole.default,
  encodingProperties,
  deviceId
);
```
This creates the input node with PCM encoding properties and my chosen device.
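For reference, the graph, encodingProperties, and deviceId are set up roughly like the sketch below. The sample rate, channel count, bit depth, and device index are placeholder values rather than my exact configuration, and the destructuring just mirrors the Windows projection used in the calls above.

```js
// Sketch only: placeholder setup for the graph, the PCM format, and the capture device id.
const { AudioGraph, AudioGraphSettings } = Windows.Media.Audio;
const { AudioRenderCategory } = Windows.Media.Render;
const { AudioEncodingProperties } = Windows.Media.MediaProperties;
const { DeviceInformation } = Windows.Devices.Enumeration;
const { MediaDevice } = Windows.Media.Devices;

// Create the graph (the render category here is a placeholder choice).
const graphResult = await AudioGraph.createAsync(new AudioGraphSettings(AudioRenderCategory.media));
const audioGraph = graphResult.graph;

// 16-bit mono PCM at 44.1 kHz (placeholder values).
const encodingProperties = AudioEncodingProperties.createPcm(44100, 1, 16);

// Enumerate audio capture devices and take the first one's id (the index is a placeholder).
const captureDevices = await DeviceInformation.findAllAsync(MediaDevice.getAudioCaptureSelector());
const deviceId = captureDevices.getAt(0).id;
```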
```js
const frameOutputNode = await audioGraph.createFrameOutputNode(encodingProperties);
deviceInputNode.addOutgoingConnection(frameOutputNode);
```
I then create the frame output node with the same encoding properties and connect the device input node to it with an outgoing connection (deviceInputNode is unwrapped from the creation result, as sketched below).
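A sketch of that unwrapping, with an illustrative status check; the error handling shown is an assumption, not my exact code:

```js
// Sketch: unwrap the device input node from the creation result and verify it succeeded.
const { AudioDeviceNodeCreationStatus } = Windows.Media.Audio;

if (deviceInputNodeResult.status !== AudioDeviceNodeCreationStatus.success) {
  throw new Error("Device input node creation failed with status " + deviceInputNodeResult.status);
}
const deviceInputNode = deviceInputNodeResult.deviceInputNode;
```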
```js
audioGraph.start();
audioGraph.addEventListener("quantumstarted", () => {
  let frame = frameOutputNode.getFrame();
  console.log(frame);
});
```
This should return an AudioFrame from the frame output node each quantum, but the logged value is just a blank object.
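To show what I mean by "blank", below is the kind of check I can run on the frame. It assumes the projection exposes the WinRT AudioFrame/AudioBuffer members in camelCase like the calls above; how duration and close() come back through the projection is an assumption on my part.

```js
// Sketch: inspect the frame instead of relying on console.log,
// which may not enumerate a projected WinRT object's properties.
const { AudioBufferAccessMode } = Windows.Media;

audioGraph.addEventListener("quantumstarted", () => {
  const frame = frameOutputNode.getFrame();

  // AudioFrame.duration is a nullable TimeSpan in WinRT; how the projection
  // represents it (number, object, or null) is assumed here.
  console.log("duration:", frame.duration);

  // Lock the frame's buffer for reading and report how many bytes it holds.
  const buffer = frame.lockBuffer(AudioBufferAccessMode.read);
  console.log("buffer length (bytes):", buffer.length);

  // AudioBuffer and AudioFrame implement IClosable; close() assumes the
  // projection exposes it under that name.
  buffer.close();
  frame.close();
});
```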
How can I adjust this to be able to pull frames, or is there a different approach to get live microphone data into a buffer?