One of the recurring questions on the NAudio support forums is to do with how you can route different sounds to different outputs in a multi-channel soundcard setup. For example, can you play one MP3 file out of one speaker and a different one out of the other? If you have four outputs, can you route a different signal to each one?

The first issue to deal with is that just because your soundcard has multiple outputs, it doesn't necessarily mean you can open WaveOut with multiple outs. That depends on how the writers of the device driver have chosen to present the card's capabilities to Windows. For example, a four-output card may appear as though it were two separate stereo soundcards. The good news is that if you have an ASIO driver, you ought to be able to open it and address all the outputs.

Having got that out of the way, in NAudio it is possible for audio streams to have any number of channels. The WaveFormat class has a channel count, and though this is normally set at 1 or 2, there is no reason why you can’t set it to 8.
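
For example, the three-argument WaveFormat constructor takes sample rate, bit depth and channel count, so an eight-channel format is just:

var eightChannelFormat = new WaveFormat(44100, 16, 8); // 44.1kHz, 16 bit, 8 channels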

What would be useful is an implementation of IWaveProvider that allows us to connect different inputs to particular outputs, kind of like a virtual patch bay. For example, if you had two Mp3FileReaders, and wanted to connect the left channel of the first to output 1 and the left channel of the second to output 2, this class would let you do that.

So I've created something I've called the MultiplexingWaveProvider (if you can think of a better name, let me know in the comments). In the constructor, you simply provide all the inputs you wish to use and specify the number of output channels you would like. By default the input channels are mapped directly onto the outputs (wrapping around if there are fewer input channels than outputs – so a single mono input would automatically be copied to every output), but these mappings can be changed.

Creating and Configuring MultiplexingWaveProvider

In the following example, we create a new four-channel WaveProvider, so the first two outputs will play the left and right channels of input1, and the second two outputs will play the left and right channels of input2. Note that input1 and input2 must have the same sample rate and bit depth.

var input1 = new Mp3FileReader("test1.mp3");
var input2 = new Mp3FileReader("test2.mp3");
var waveProvider = new MultiplexingWaveProvider(new IWaveProvider[] { input1, input2 }, 4);

Then you can configure the outputs, which is done using ConnectInputToOutput:

waveProvider.ConnectInputToOutput(2, 0); // left channel of input2 to output 0
waveProvider.ConnectInputToOutput(3, 1); // right channel of input2 to output 1
waveProvider.ConnectInputToOutput(1, 2); // right channel of input1 to output 2
waveProvider.ConnectInputToOutput(1, 3); // right channel of input1 to output 3

The numbers used are zero-based, so connecting inputs 2 and 3 to outputs 0 and 1 means that test2.mp3 will now play out of the first two outputs instead of the second two. In this example I have connected input 1 (i.e. the right channel of test1.mp3) to both outputs 2 and 3. So you can copy the same input to multiple output channels, and not all input channels need a mapping.

Implementation of MultiplexingWaveProvider

The bulk of the work is performed in the Read method of MultiplexingWaveProvider. The first task is to work out how many “sample frames” are required. A sample frame is a single sample in a mono signal, a left and right pair in a stereo signal, and so on. Once we know how many sample frames are needed, we attempt to read that many from every one of the input WaveProviders (irrespective of whether they are connected to an output – we want to keep them in sync). Then, using our mappings dictionary, we work out whether any of the channels from this input WaveProvider are needed in the output. Since samples are interleaved in both the input and output WaveProviders, we can't do just one Array.Copy – each sample must be copied across individually into the right place.

A well-behaved Read method should always return count unless it has reached the end of its available data (and from then on it should return 0 on every subsequent call). The way we do this is to work out the maximum number of sample frames read from any of the inputs, and use that to report back the count that was read. This means we keep going until we have reached the end of all of our inputs. Because buffers might be reused, it is important that we zero out the output buffer wherever there was no available input data.

Here’s the implementation as it currently stands:

public int Read(byte[] buffer, int offset, int count)
{
    int sampleFramesRequested = count / (bytesPerSample * outputChannelCount);
    int inputOffset = 0;
    int sampleFramesRead = 0;
    // now we must read from all inputs, even if we don't need their data, so they stay in sync
    foreach (var input in inputs)
    {
        int bytesRequired = sampleFramesRequested * bytesPerSample * input.WaveFormat.Channels;
        byte[] inputBuffer = new byte[bytesRequired];
        int bytesRead = input.Read(inputBuffer, 0, bytesRequired);
        sampleFramesRead = Math.Max(sampleFramesRead, bytesRead / (bytesPerSample * input.WaveFormat.Channels));

        // copy any mapped channels from this input into the output buffer
        for (int n = 0; n < input.WaveFormat.Channels; n++)
        {
            int inputIndex = inputOffset + n;
            for (int outputIndex = 0; outputIndex < outputChannelCount; outputIndex++)
            {
                if (mappings[outputIndex] == inputIndex)
                {
                    int inputBufferOffset = n * bytesPerSample;
                    int outputBufferOffset = offset + outputIndex * bytesPerSample;
                    int sample = 0;
                    while (sample < sampleFramesRequested && inputBufferOffset < bytesRead)
                    {
                        Array.Copy(inputBuffer, inputBufferOffset, buffer, outputBufferOffset, bytesPerSample);
                        outputBufferOffset += bytesPerSample * outputChannelCount;
                        inputBufferOffset += bytesPerSample * input.WaveFormat.Channels;
                        sample++;
                    }
                    // clear the end
                    while (sample < sampleFramesRequested)
                    {
                        Array.Clear(buffer, outputBufferOffset, bytesPerSample);
                        outputBufferOffset += bytesPerSample * outputChannelCount;
                        sample++;
                    }
                }
            }
        }
        inputOffset += input.WaveFormat.Channels;
    }

    return sampleFramesRead * bytesPerSample * outputChannelCount;
}

Performance

Looking at the code above, you will probably notice that it could be made more efficient if we knew in advance whether we were dealing with 16, 24 or 32 bit audio (as it stands, there are lots of calls to Array.Copy that copy just 2, 3 or 4 bytes at a time). I may well make specialised versions of this class at some point so that it performs a bit better. Another weakness in the current design is that a new buffer is allocated on every call to Read, which is something I generally avoid since it creates work for the garbage collector (update – this is fixed in the latest code).
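
For reference, a sketch of what the buffer reuse fix might look like – the field is kept between calls and only grows when a larger read is requested:

private byte[] inputBuffer; // reused across calls to Read to avoid creating garbage

// inside Read, instead of allocating a new array each time:
if (inputBuffer == null || inputBuffer.Length < bytesRequired)
{
    inputBuffer = new byte[bytesRequired];
}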

I have written a full suite of unit tests for this class, so if it does need some performance tuning, there is a safety net to ensure nothing gets broken along the way.

MultiplexingSampleProvider

NAudio 1.5 also has an ISampleProvider interface, which is a much more programmer-friendly way of dealing with 32 bit floating point audio. I have also made a MultiplexingSampleProvider for the next version of NAudio. One interesting possibility would be to build on that to create a kind of bus matrix, where every input can be mixed by different amounts into each of the output channels.
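
Usage is analogous to the WaveProvider version, except the inputs are ISampleProviders. As a rough sketch (here using SampleChannel to adapt the wave inputs to 32 bit float sample providers):

var input1 = new SampleChannel(new Mp3FileReader("test1.mp3"));
var input2 = new SampleChannel(new Mp3FileReader("test2.mp3"));
var sampleProvider = new MultiplexingSampleProvider(new ISampleProvider[] { input1, input2 }, 4);
sampleProvider.ConnectInputToOutput(1, 2); // same zero-based channel mapping as before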

Uses

This class actually has uses beyond supporting soundcards with more than 2 outputs. You could use it to swap left and right channels in a stereo signal, or provide a simple switch that selects between several mono inputs.
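
For example, swapping the channels of a stereo file needs just one input and two crossed mappings:

var reader = new Mp3FileReader("stereo.mp3");
var swapped = new MultiplexingWaveProvider(new IWaveProvider[] { reader }, 2);
swapped.ConnectInputToOutput(1, 0); // input right channel to left output
swapped.ConnectInputToOutput(0, 1); // input left channel to right output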

You also don't need to output to the soundcard. The WaveFileWriter will happily write multi-channel WAV files. However, there are no guarantees about how other programs will deal with WAVs that have more than two channels in them.
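
For example, the four-channel provider created earlier could be written to disk like this (CreateWaveFile keeps reading until the source returns 0, so the inputs should be finite):

WaveFileWriter.CreateWaveFile("four-channel.wav", waveProvider);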

Availability

I’ve already checked in the initial version to the latest codebase, so expect this to be part of NAudio 1.6. The only caution is that I might change the class name if I come up with a better idea.

Want to get up to speed with the fundamental principles of digital audio and how to go about writing audio applications with NAudio? Be sure to check out my Pluralsight courses, Digital Audio Fundamentals and Audio Programming with NAudio.

Comments

Comment by Anonymous

I'm very new to audio programming and I have been trying demos to set up a framework for realtime multi-channel playback from a continuous stream of multi-channel float data at 20 kHz, but I don't know where to start:

At a rate of 20 kHz, a callback is made to an event handler to provide an array of 32 float values. I want to mix the float values in realtime and pass them through an effects library prior to playback.
Please help me out here

Comment by bull

Hey Mark,

Great article!

Could one use this code or NAudio to do per-channel mute/gain on playback of a multi-channel ogg vorbis file?

Comment by Mark H

yes, you could extend this class fairly easily to support muting and adjusting the gain of channels (especially in the ISampleProvider version)
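
One way to get per-input gain and mute without modifying the class at all is to wrap each input in a VolumeSampleProvider before multiplexing – a sketch, where source1 and source2 stand for your own ISampleProvider inputs:

var input1 = new VolumeSampleProvider(source1) { Volume = 0.5f }; // half gain
var input2 = new VolumeSampleProvider(source2) { Volume = 0.0f }; // muted
var mux = new MultiplexingSampleProvider(new ISampleProvider[] { input1, input2 }, 2);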

Comment by Anders

Hi Mark

I have problems with the MultiplexingWaveProvider.
I want to play a sound out of the right channel. I can get it to play out of the left channel if I create a MultiplexingWaveProvider with one output, but when I create it with 2 outputs it plays in both channels no matter how I connect the inputs to outputs.

Here is my code.
var waveProvider = new MultiplexingWaveProvider(new IWaveProvider[] { mainOutputStream }, 2 );

waveProvider.ConnectInputToOutput(0, 1);
waveProvider.ConnectInputToOutput(1, 1);

AsioOut dfd = new AsioOut();
string ffd = dfd.DriverName;
dfd.Init(waveProvider);
dfd.Play();

Thanks
Anders

Comment by Mark H

you can't route two inputs to one output. Also, there is no way to disconnect an output at the moment (something I plan to add later). For now you could make a silence producing WaveProvider (very easy to implement) and route that to the output you want silent.

Comment by Anonymous

Hi Mark!

Can you give any hints how to implement the silence producing wave provider? I'm quite new to sound programming...

Best regards

Comment by Mark H

class SilenceWaveProvider : IWaveProvider
{
    // set this to match the WaveFormat of the other multiplexer inputs
    public WaveFormat WaveFormat { get; set; }

    public int Read(byte[] buffer, int offset, int count)
    {
        for (int n = 0; n < count; n++)
        {
            buffer[n + offset] = 0;
        }
        return count;
    }
}

Comment by Anonymous

Thank you, it worked just fine!

Comment by Anonymous

Hi Mark,

Excellent work on NAudio.

I'm trying to perform something similar to the poster above; I'm trying to play a stereo source out of just one channel.

This would mean converting it to Mono first (is this possible?) and then in theory playing silence over the other channel(s).

In line with the above request, how would I route silence to, say channel 1 but still have an MP3 playing on channel 0 in mono?

Many thanks in advance.

Comment by Mark H

You won't need to convert to mono first, just route the first channel. You would also make a SilenceWaveProvider using the code above, adding that as an input and then routing it to the channels you want silent.

Comment by Anonymous

Mark,

Thanks for the advice. I've spent some time over the weekend looking into things further and I feel I have a much better understanding of what's going on.

I've created a SilenceProducingWaveProvider (spwp) and I'm playing an mp3, with both channels of the mp3 routed to output channel 0 (left) and the spwp routed to 1 (right). This works perfectly and gives me a combined mono output in the left speaker and silence in the right, as seen in the code below.

However, when I try to create the MultiplexingWaveProvider with 6 output channels for a 5.1 setup (or any number other than 2), I receive the following exception: "InvalidParameter calling waveOutOpen". This happens when calling "Dim PlaybackDevice As New WaveOut". Am I missing something? I have tried this on two different machines, both with 6 output channels available.

Many thanks in advance... again.



Dim Mp3FileReader As New Mp3FileReader("some path to mp3")

Dim SilenceProducingWaveProvider As New SilenceProducingWaveProvider

Dim MultiplexingWaveProvider As New MultiplexingWaveProvider(New IWaveProvider() {Mp3FileReader, SilenceProducingWaveProvider}, 2)

With MultiplexingWaveProvider
    .ConnectInputToOutput(0, 0)
    .ConnectInputToOutput(1, 0)
    .ConnectInputToOutput(2, 1)
    .ConnectInputToOutput(3, 1)
End With

Dim PlaybackDevice As New WaveOut

PlaybackDevice.Init(MultiplexingWaveProvider)
PlaybackDevice.Play()

Comment by Mark H

It may well be a limitation of the drivers for your WaveOut device. Just because your soundcard has 6 outputs, doesn't mean it is presented to Windows as a 6 output card. It might be 3 stereo outs. Have you tried using WasapiOut instead, or DirectSoundOut?

Comment by Anonymous

Thanks for the reply. Since posting above I've tried the other providers and they're all returning two channels so it looks like a driver issue.

I'm able to use BASS.Net's mixer to do what I'm trying above and it works except the two rear channel streams bleed into one another. I was hoping that NAudio would not present this problem.

Is there any way with NAudio, as there is with BASS, to tell it to ignore what the driver reports and force the use of the Windows Control Panel driver count?

Comment by Anonymous

Is it possible to change the channel volumes using this provider?

I've had a try of multiplying various volumes by an integer in an effort to reduce the volume, but I'm not having any luck.

Comment by Mark H

I'm afraid I don't know what BASS .NET does and how it lets you open with multi channels. NAudio just wraps the Windows APIs.

Comment by Zxeltor

Mark. Thanks for posting this. This post has been very helpful.

I’m currently using this class to playback audio files on different channels (Left and Right). Sometimes simultaneously.

I'm curious. The DirectSound device I'm using occasionally craps out and I have to kill my app. I noticed when you init the default constructor for the DirectSound device, a guid is assigned. Is it a bad idea to have more than one instance of this output device initialized and performing playback at the same time?

Also… regarding the silence wave provider example given above: if Read never returns zero, the playback complete event will never get triggered when using the silence provider with another audio source. I ended up keeping track of my silence stream's position under the hood so my Read method would eventually return zero. Hopefully this will help any other code monkeys that stumble along.

Comment by Anonymous

Hi Mark.
Is there any way to change the speed of playing?

Comment by Mark H

@Anonymous, not easily. You would need to make your own WaveProvider to provide the audio at a faster rate (e.g. skip every other sample for a very rudimentary approach)
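
A minimal sketch of that rudimentary approach – double speed by dropping alternate sample frames (the class name is illustrative, and without proper resampling it will sound rough):

class DoubleSpeedWaveProvider : IWaveProvider
{
    private readonly IWaveProvider source;
    public DoubleSpeedWaveProvider(IWaveProvider source) { this.source = source; }
    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(byte[] buffer, int offset, int count)
    {
        int frameSize = WaveFormat.BlockAlign;
        var temp = new byte[count * 2]; // read twice as much as we will return
        int bytesRead = source.Read(temp, 0, temp.Length);
        int outBytes = 0;
        // copy every other sample frame into the output
        for (int frame = 0; frame < bytesRead / frameSize; frame += 2)
        {
            Array.Copy(temp, frame * frameSize, buffer, offset + outBytes, frameSize);
            outBytes += frameSize;
        }
        return outBytes;
    }
}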

Comment by m.bagattini

Thanks for the article Mark! I'd like to use this method to play the same track on the left/right channel, with one channel delayed; I'd like to try this since I noticed on some delayed tracks that spoken words are more easily understandable. It's just a theory but I'd like to give it a shot.

So I have my mp3 playing on both earplugs now, do you have any suggestion about how to delay a single channel?

Comment by Peter v E

Hi Mark,

First of all: Great work on the whole project! It's fun to work with and has all the functions one could ask for :)

I do have a question about the MultiplexingWaveProvider. I'm using it to send a WaveChannel32, which contains a stream from the Mp3FileReader, to one channel of my sound card. The other channel gets a silent stream (a stream of zeros). This works and all. But how can I determine if the stream from the Mp3FileReader is finished? The PlaybackStopped event isn't raised because the silence stream keeps giving zeros.

Is there a way to detect if the Mp3FileReader stream is finished?

This is the code I'm using:

Try
    outputStream = CreateInputStream(localFilePath)
    Dim silenceStream As New SilenceWaveProvider
    outDevice = New WasapiOut(device, AudioClientShareMode.Shared, True, 0)
    Try
        Dim multiplex As New MultiplexingWaveProvider(New IWaveProvider() {outputStream, silenceStream}, 2)
        Select Case _channel
            Case 0
                multiplex.ConnectInputToOutput(0, 0)
                multiplex.ConnectInputToOutput(1, 0)
                multiplex.ConnectInputToOutput(2, 1)
            Case 1
                multiplex.ConnectInputToOutput(0, 1)
                multiplex.ConnectInputToOutput(1, 0)
                multiplex.ConnectInputToOutput(2, 0)
        End Select
        outDevice.Init(multiplex)
        outDevice.Play()
    Catch ex As Exception
        WriteLog(LogType.ErrorMessage, "Error playing file " & filename, ex.Message)
    End Try
Catch ex As Exception
    WriteLog(LogType.ErrorMessage, "Error opening audiodevice or creating audio stream", ex.Message)
End Try

Any help would be greatly appreciated!

Comment by Mark H

hi Peter - it's a slightly tricky problem, but I'd do it with a couple of custom Wave/Sample providers. One would simply pass on the audio read through your Mp3FileReader and set a flag when it reaches the end (Read returns 0). Then it would notify your custom silence producing WaveProvider allowing it to stop producing zeros.

Either that, or you customise the multiplexer to be able to stop when the first input reaches its end (rather than waiting for the last, as it currently does)
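
A sketch of the first suggestion – a pass-through provider that records when its source is exhausted (names are illustrative):

class EndNotifyingWaveProvider : IWaveProvider
{
    private readonly IWaveProvider source;
    public bool ReachedEnd { get; private set; }
    public EndNotifyingWaveProvider(IWaveProvider source) { this.source = source; }
    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(byte[] buffer, int offset, int count)
    {
        int bytesRead = source.Read(buffer, offset, count);
        if (bytesRead == 0) ReachedEnd = true; // source exhausted
        return bytesRead;
    }
}

The silence provider can then check ReachedEnd and return 0 once the real input has finished, allowing playback to stop.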

Comment by Mark H

I'd just write a custom Wave/SampleProvider to do this. Left right samples are interleaved, so you'd need to store a bit of history for the channel you wanted to delay, but it would be fairly straightforward bit manipulation
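
A sketch of that idea for the ISampleProvider route – delaying the right channel of an interleaved stereo stream by a fixed number of samples using a circular history buffer (the class name is illustrative, and offset is assumed to be frame-aligned):

class RightChannelDelay : ISampleProvider
{
    private readonly ISampleProvider source; // must be stereo
    private readonly float[] history;        // holds the delayed right-channel samples
    private int pos;

    public RightChannelDelay(ISampleProvider source, int delaySamples)
    {
        this.source = source;
        history = new float[delaySamples]; // assumes delaySamples > 0
    }

    public WaveFormat WaveFormat { get { return source.WaveFormat; } }

    public int Read(float[] buffer, int offset, int count)
    {
        int samplesRead = source.Read(buffer, offset, count);
        for (int n = 1; n < samplesRead; n += 2) // odd indices are the right channel
        {
            float delayed = history[pos];
            history[pos] = buffer[offset + n];
            buffer[offset + n] = delayed;
            pos = (pos + 1) % history.Length;
        }
        return samplesRead;
    }
}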

Comment by Peter van Ekeren

Hi Mark,

I'm sorry for responding this late. I didn't see your response until now.

Thanks for your solution! I did get it to work but I cheated a bit.

I created a second stream of the same source file and set the volume to 0. Then I feed two streams to the multiplexer and voila (:

This works for now, but I'll look into your solution since it's cleaner.

Thanks again!

-code:

firstOutputStream = CreateInputStream(localFilePath, 1.0F)
secondOutputStream = CreateInputStream(localFilePath, 0.0F)
outDevice = New WasapiOut(device, AudioClientShareMode.Shared, True, 0)
Try
    Dim multiplex As MultiplexingWaveProvider
    multiplex = New MultiplexingWaveProvider(New IWaveProvider() {firstOutputStream, secondOutputStream}, 2)
    Select Case _channel
        Case 0
            With multiplex
                .ConnectInputToOutput(0, 0)
                .ConnectInputToOutput(2, 1)
            End With
        Case 1
            With multiplex
                .ConnectInputToOutput(0, 1)
                .ConnectInputToOutput(2, 0)
            End With
    End Select
    outDevice.Init(multiplex)
    outDevice.Play()
Catch ex As Exception

End Try

Comment by John

I'm trying to make a sine wave play in only one ear. No matter what I do it plays in both ears. I feel like I've tried every inputToOutput combination and I still get it output on both sides.

Here's my code.. what am I doing wrong??

NOTE: Both SineWaveProvider and SilentWaveProvider work on their own.
_waveOutput is a class level private object of type WaveOut

public void Play()
{
    var sineWaveProvider = new SineWaveProvider();
    sineWaveProvider.Frequency = Frequency;

    var swp = new SilentWaveProvider();

    var mwp = new MultiplexingWaveProvider(new IWaveProvider[] { swp, sineWaveProvider }, 2);

    mwp.ConnectInputToOutput(0, 0);
    mwp.ConnectInputToOutput(1, 1);

    _waveOutput = new WaveOut();
    _waveOutput.Init(mwp);
    _waveOutput.Play();
}

////////READ FUNCTION OF SineWaveProvider/////////////////
private int _sample;

public override int Read(float[] buffer, int offset, int sampleCount)
{
    int sampleRate = WaveFormat.SampleRate;

    for (int n = 0; n < sampleCount; n++)
    {
        buffer[n + offset] = (float)(Amplitude * Math.Sin((2 * Math.PI * _sample * Frequency) / sampleRate));

        _sample++;
        if (_sample >= sampleRate) _sample = 0;
    }

    return sampleCount;
}

////////READ FUNCTION OF SilentWaveProvider////////////////
private int _sample;

public override int Read(float[] buffer, int offset, int sampleCount)
{
    int sampleRate = WaveFormat.SampleRate;

    for (int n = 0; n < sampleCount; n++)
    {
        buffer[n + offset] = 0.00f;

        _sample++;
        if (_sample >= sampleRate) _sample = 0;
    }

    return sampleCount;
}

Comment by Mark H

MultiplexingWaveProvider does have unit tests that check this scenario works correctly, so I'm surprised you're having a problem. How about writing some audio to a WAV file and checking it in Audacity, to see what you are actually generating?
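
A sketch of capturing a few seconds of the multiplexer output to disk for inspection (assuming mwp from the code above; the sine input never ends, so only a fixed number of buffers are written):

using (var writer = new WaveFileWriter("debug.wav", mwp.WaveFormat))
{
    var buffer = new byte[mwp.WaveFormat.AverageBytesPerSecond];
    for (int n = 0; n < 5; n++) // roughly five seconds
    {
        int bytesRead = mwp.Read(buffer, 0, buffer.Length);
        writer.Write(buffer, 0, bytesRead);
    }
}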

Comment by John

Thanks Mark!
I noticed something strange too. In the read function of the SineWaveProvider I tried this:

buffer[n + offset] = (float)(Amplitude * Math.Sin((2 * Math.PI * _sample * Frequency) / sampleRate));

buffer[n + offset + 1] = 0.00f;

When debugging I noticed that the value in buffer[n + offset + 1] wasn't actually being set to 0.00! (I also tried (float)0; just to be sure it wasn't something dumb like that). I'm thinking it may be 1 of 2 things:
1) The buffer[] is passed in to the function so it may be possible that something else is accessing the buffer[] at the same time
OR
2) Since I'm using the alpha release of Visual Studio 2013 there MAY be a bug with assigning array values or some other obscure bug. I would think that would be a basic compiler function that would have been mastered many years ago, but you never know.

I'll try what you suggested when I get home.
I'll also try it in Visual Studio 2012 and dig into the NAudio source to look for other threads accessing the buffer.

Comment by John

This whole time it was MY SOUND CARD!

I don't yet know exactly why it's my sound card, but when I tried this on a different machine it worked fine...

OMFG DUCKSAUCE!
http://xkcd.com/457/

Comment by vitek

WMA file reader does not see all 6 channels in my input file, just 2! Why? Here is the source:

var wmaStream = new WMAFileReader(fileName);
wmaStream.WaveFormat.Channels gives 2 instead of 6 (Audacity plays it OK and shows all 6 channels!).

Any help appreciated.

Comment by Luc Baeten

Hi Mark
I am trying the following: I want to play a sound file but filter the sound for the left ear with a specific equalizer and filter the sound for the right ear with another equalizer. I am using MultiplexingSampleProvider to realize this but can't get it to work. The sound seems to be the same in both ears. Any idea how to realize this? Many thanks in advance.
Kind regards
Luc

Comment by Mark Heath

you can use MultiplexingSampleProvider, but only if you've split your input into two independent SampleProviders, which I don't think you've got. I'd personally make a custom ISampleProvider that, in its Read method, passes every other sample through an equalizer while leaving the others unchanged.

Comment by Luc Baeten

Thanks Mark, for your response. Yes, I have split the input into two independent providers and am now working with the equalizer settings per provider.
Working on this implementation I have another question. Can you tell me how to deal with volume together with the gain of the equalizer? Should I define the volume on the AudioFileReader and adjust the gain of the equalizer bands, or should I work only with the gain of the bands and leave volume out?
Regards
Luc

Comment by Luc Baeten

Hello Mark, Is it possible to get paid support from you? I am struggling with this multiplexing provider and hoping you could help.
Hope to hear from you soon.
Kind regards
Luc

Comment by Mark Heath

hi Luc, sorry I have been away on holiday. I do occasionally do small bits of paid support for NAudio. Best to email me privately about this and I can discuss terms and what I could offer you. My email is my full name at gmail dot com

Comment by Jacob Stolmeier

Hello Mark,
I have an AD/DA convertor that my parents gave me that has 16 channels in and 16 channels out. With this I have been wanting to create basically a telephone switchboard where my friends and I can make and break connections. To have this ability, I would need to join multiple inputs to a single output, and any of those joined inputs to any output that I would like. Basically a matrix. Also, the end goal is to be able to throw in some ringtone WAV files as well when a person picks up the phone. I see you have the MixingWaveProvider, which gives me the ability to mix my WAV files, and the MultiplexingWaveProvider, which allows me to choose any channel I would like. But I was wondering if there was any way I could join the two? Mix the WAV files, then put them on any channel?

Comment by Jacob Stolmeier

Also I would love to share my final project with you as well. It could be pretty fun to mix any inputs with any outputs.

Comment by Mark Heath

for this case I'd probably make an adapted version of the MultiplexingSampleProvider so you can route input channels to the right output channel, but also mix them together.

Comment by Frank Post

Hello Mark,
I have a brief question regarding compatible multi-channel sound cards that would allow proper input/output mapping using the MultiplexingWaveProvider. I have tried several Sound Blaster 7.1 and 5.1 internal cards using the ASIO4ALL driver with no luck. Either I get no sound or I get sound through all the channels. For testing purposes, I used the sample code described in your article to test each card's compatibility.
Would you be so kind as to let me know which multi-channel audio cards in particular are known to be compatible with NAudio's input/output channel mapping?
Thank you for your time.

Comment by Wilbert Bongers

Hi Mark,
I am writing a program to record and trim audio files. For now it works fine if the same ASIO device is used for both recording and playback. However, it needs to be configurable to use two different ASIO devices. Would this be possible, and what would be the best approach?

Comment by Andrew Rogers

Hi Mark. I was wondering if it should be possible to play a stereo MP3 out of any of the 5.1 channels of my sound card. I have been able to get a 5.1 WAV file to play out of whatever channels I choose, but I can't get a stereo file to play when I specify 6 output channels. I think I recall reading somewhere that the input file needs to have the same number of channels as the output.

Comment by Mark Heath

I don't think it's possible to use two ASIO devices simultaneously unfortunately

Comment by Mark Heath

Unfortunately I don't know how to do this. I assume it's possible, but it's never been something I've needed to do myself.

Comment by Andrew Rogers

Thanks for the response. I couldn't get it to work, but that could have been my incompetence. I ended up using a different library to do it. My use case is a bit weird: I am using it for our security system - I have a 4-channel amp with each channel going off to a different part of the building. Some parts want loud alarms while others don't want any. So being able to send a simple stereo MP3 out to all channels at different volumes is what I needed. I didn't want the hassle of having to convert the files to 5.1.

Comment by Annonymus

Hi Mark,
Is it possible to create a stereo WAV from 2 `RawSourceWaveStream`s?
