
In this post I will explain how to use the WaveFileWriter class that is part of NAudio. I will cover how to use it in NAudio 1.4, and mention some of the changes coming in NAudio 1.5.

The purpose of WaveFileWriter is to allow you to create a standard .WAV file. WAV files are often thought of as containing uncompressed PCM audio data, but actually they can contain any audio compression, and are often used as containers for telephony compression types such as mu-law, ADPCM, G.722 etc.
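For instance, here is a minimal sketch of writing mu-law data into a WAV container. WaveFormat.CreateMuLawFormat is NAudio's factory method for a mu-law format; the muLawData buffer is a hypothetical array of already-encoded mu-law bytes, and the file name is made up:

```csharp
// Sketch: a WAV file whose data chunk holds mu-law rather than PCM
// (muLawData is assumed to already contain mu-law encoded audio)
WaveFormat muLawFormat = WaveFormat.CreateMuLawFormat(8000, 1);
using (WaveFileWriter writer = new WaveFileWriter("mulaw.wav", muLawFormat))
{
    writer.WriteData(muLawData, 0, muLawData.Length);
}
```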

NAudio provides a one-line method to produce a WAV file if you have an existing WaveStream derived class that can provide the data (in NAudio 1.5 it can be an IWaveProvider).

string tempFile = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString() + ".wav");
WaveFormat waveFormat = new WaveFormat(8000, 8, 2);
WaveStream sourceStream = new NullWaveStream(waveFormat, 10000);
WaveFileWriter.CreateWaveFile(tempFile, sourceStream);

In the above example, I am using a simple utility class as my source stream, but in a real application this might be the output of a mixer, or the output from some effects or a synthesizer. The most important thing to note is that the Read method of your source stream MUST eventually return 0, otherwise your file will keep on writing until your disk is full! So beware of classes in NAudio (such as WaveChannel32) that can be configured to always return the number of bytes asked for from the Read method.

For greater control over the data you write, you can simply use the WriteData method (renamed to “Write” in NAudio 1.5, as WaveFileWriter will inherit from Stream). WriteData assumes that you are providing raw data in the correct format and will simply write it directly into the data chunk of the WAV file. This is therefore the most general purpose way of writing to a WaveFileWriter, and can be used for both PCM and compressed formats.

byte[] testSequence = new byte[] { 0x1, 0x2, 0xFF, 0xFE };
using (WaveFileWriter writer = new WaveFileWriter(fileName, waveFormat))
{
    writer.WriteData(testSequence, 0, testSequence.Length);
}

WaveFileWriter has an additional constructor that takes a Stream instead of a filename, allowing you to write to any kind of a stream (for example, a MemoryStream). Be aware though that when you dispose the WaveFileWriter, it disposes the output stream, so use the IgnoreDisposeStream utility class to wrap the output stream if you don’t want that to happen.
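As a minimal sketch of that pattern, the following writes a WAV file into a MemoryStream that survives the writer being disposed (reusing the waveFormat and testSequence variables from the examples above; IgnoreDisposeStream lives in NAudio.Utils):

```csharp
// Sketch: keep the MemoryStream open after the WaveFileWriter is disposed
var memoryStream = new MemoryStream();
using (var writer = new WaveFileWriter(new IgnoreDisposeStream(memoryStream), waveFormat))
{
    writer.WriteData(testSequence, 0, testSequence.Length);
}
// memoryStream is still open here and contains a complete WAV file
memoryStream.Position = 0;
```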

One of the most commonly used bit depths for PCM WAV files is 16 bit, and so NAudio provides another WriteData overload (to be called WriteSamples in NAudio 1.5) that allows you to supply data as an array of shorts (Int16s). Obviously, this only really makes sense if you are writing to a 16 bit WAV file, but the current implementation will also try to scale the sample value for different bit depths.

short[] samples = new short[1000];
// TODO: fill sample buffer with data
waveFileWriter.WriteData(samples, 0, samples.Length);

Another consideration is that very often, after applying various audio effects (even something as simple as changing the volume), the audio samples are stored as 32 bit floating point numbers (float or Single). To make writing these to a WAV file as simple as possible, a WriteSample function is provided, allowing you to write one sample at a time. If the underlying PCM format is a different bit depth (e.g. 16 or 24 bits), then WriteSample will attempt to convert the sample to that bit depth before writing it to the file. NAudio 1.5 will also feature a WriteSamples function to allow arrays of floating point samples to be written. The following example shows one second of a 1kHz sine wave being written to a WAV file using the WriteSample function:

float amplitude = 0.25f;
float frequency = 1000;

for (int n = 0; n < waveFileWriter.WaveFormat.SampleRate; n++)
{
    float sample = (float)(amplitude * Math.Sin((2 * Math.PI * n * frequency) / waveFileWriter.WaveFormat.SampleRate));
    waveFileWriter.WriteSample(sample);
}
Want to get up to speed with the fundamental principles of digital audio and how to go about writing audio applications with NAudio? Be sure to check out my Pluralsight courses, Digital Audio Fundamentals and Audio Programming with NAudio.

Comments

Comment by hen vertis

Hi
First of all I want to congratulate you on the NAudio software package.
My question is: I have two WaveProvider32s that produce sine waves at different amplitudes (like the "Play a Sine Wave" example). I also have a MultiplexingWaveProvider that connects the WaveProvider32 of sine wave 1 to output 0 and the WaveProvider32 of sine wave 2 to output 1 (I don't do anything in MultiplexingWaveProvider.Read). The sound works perfectly.
I want to be able to write both signals to a WAV file. How can I do that?
Should I pass a WaveFileWriter reference to the two instances of WaveProvider32?
Can I get the samples from the MultiplexingWaveProvider and write them?
What about performance? The file I/O could be on another thread - how can I do that?

Comment by Mark H

you can just pass your MultiplexingWaveProvider into the WaveFileWriter.CreateWaveFile function. CreateWaveFile will call Read repeatedly until it reaches the end. You can do this on another thread, no problem. One caveat: your source WaveProviders must return 0 from Read when they reach the end, or you'll create a never-ending WAV file that fills up your hard disk.
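That advice might look something like this (a sketch only; the multiplexingWaveProvider variable and the output file name are assumed from the surrounding discussion):

```csharp
// Sketch: render the multiplexer output to a WAV file on a background thread.
// Note the provider should not also be playing through WaveOut at the same time.
Thread writerThread = new Thread(() =>
    WaveFileWriter.CreateWaveFile("output.wav", multiplexingWaveProvider));
writerThread.Start();
```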

Comment by hen vertis

Hi Mark
What am I doing wrong?


SineWaveProvider32 file

namespace Hello2ChannelsAudio
{
    public class SineWaveProvider32 : IWaveProvider
    {
        private WaveFormat waveFormat;
        float m_fAmplitude;
        float m_fFrequency;
        int m_nSample;

        public SineWaveProvider32(int sampleRate, int channels, float Amplitude, float Frequency)
        {
            m_fAmplitude = Amplitude;
            m_fFrequency = Frequency;
            SetWaveFormat(sampleRate, channels);
        }

        public void SetWaveFormat(int sampleRate, int channels)
        {
            this.waveFormat = WaveFormat.CreateIeeeFloatWaveFormat(sampleRate, channels);
        }

        public int Read(byte[] buffer, int offset, int count)
        {
            WaveBuffer waveBuffer = new WaveBuffer(buffer);
            int samplesRequired = count / 4;
            int samplesRead = Read(waveBuffer.FloatBuffer, offset / 4, samplesRequired);
            return samplesRead * 4;
        }

        public int Read(float[] buffer, int offset, int sampleCount)
        {
            int sampleRate = WaveFormat.SampleRate;
            for (int n = 0; n < sampleCount; n++)
            {
                buffer[n + offset] = (float)(m_fAmplitude * Math.Sin((2 * Math.PI * m_nSample * m_fFrequency) / sampleRate));
                m_nSample++;
                if (m_nSample >= sampleRate)
                    m_nSample = 0;
            }
            return sampleCount;
        }

        public WaveFormat WaveFormat
        {
            get { return waveFormat; }
        }
    }
}


Form file



private void button1_Click(object sender, EventArgs e)
{
    if (waveOut == null)
    {
        input1 = new SineWaveProvider32(10000, 1, 0.25F, 1000F);
        input2 = new SineWaveProvider32(10000, 1, 1.5F, 2000F);

        multiplexingWaveProvider = new MultiplexingWaveProvider(new IWaveProvider[] { input1, input2 }, 2);
        multiplexingWaveProvider.ConnectInputToOutput(0, 0);
        multiplexingWaveProvider.ConnectInputToOutput(1, 1);
        waveOut = new WaveOut();
        waveOut.Init(multiplexingWaveProvider);
        waveOut.Play();
        WaveFileWriter.CreateWaveFile("temp.wav", multiplexingWaveProvider);
    }
    else
    {
        waveOut.Stop();
        waveOut.Dispose();
        waveOut = null;
    }
}

Comment by Mark H

you've got two things trying to read from the same wave provider. You can either play or write a WAV file. If you want to do both, create a new IWaveProvider whose Read method reads from the source, writes to a WAV file, and then passes the data on through.

Comment by hen vertis

Hi Mark
Thanks for the fast response.
So I need to pass the same instance of WaveFileWriter to both wave providers?

Comment by Mark H

no, the new wave provider you make should be the last thing in the pipeline, after the multiplexer. It creates its own WaveFileWriter and writes the data into it as it passes it through in its Read function.

Comment by hen vertis

Hi again
I did what you described and a WAV file is generated, but it doesn't seem to be in sync with what I hear from the player.

Here is my code:

public class SineWaveProvider32 : IWaveProvider
{
    private WaveFormat waveFormat;
    float m_fAmplitude;
    float m_fFrequency;
    int m_nSample;
    WaveFileWriter m_waveFileWriter;

    public SineWaveProvider32(WaveFileWriter waveFileWriter, int sampleRate, int channels, float Amplitude, float Frequency)
    {
        m_waveFileWriter = waveFileWriter;
        m_fAmplitude = Amplitude;
        m_fFrequency = Frequency;
        SetWaveFormat(sampleRate, channels);
    }

    public void SetWaveFormat(int sampleRate, int channels)
    {
        this.waveFormat = WaveFormat.CreateIeeeFloatWaveFormat(sampleRate, channels);
    }

    public int Read(byte[] buffer, int offset, int count)
    {
        WaveBuffer waveBuffer = new WaveBuffer(buffer);
        int samplesRequired = count / 4;
        int samplesRead = Read(waveBuffer.FloatBuffer, offset / 4, samplesRequired);
        return samplesRead * 4;
    }

    public int Read(float[] buffer, int offset, int sampleCount)
    {
        int sampleRate = WaveFormat.SampleRate;
        for (int n = 0; n < sampleCount; n++)
        {
            buffer[n + offset] = (float)(m_fAmplitude * Math.Sin((2 * Math.PI * m_nSample * m_fFrequency) / sampleRate));
            m_nSample++;
            if (m_nSample >= sampleRate)
                m_nSample = 0;
            m_waveFileWriter.WriteSample(buffer[n + offset]);
        }
        return sampleCount;
    }

    public WaveFormat WaveFormat
    {
        get { return waveFormat; }
    }
}

private void button1_Click(object sender, EventArgs e)
{
    if (waveOut == null)
    {
        waveFileWriter = new WaveFileWriter("temp.wav", new WaveFormat(10000, 1));
        input1 = new SineWaveProvider32(waveFileWriter, 10000, 1, 0.25F, 1000F);
        input2 = new SineWaveProvider32(waveFileWriter, 10000, 1, 1.5F, 2000F);

        multiplexingWaveProvider = new MultiplexingWaveProvider(new IWaveProvider[] { input1, input2 }, 2);
        multiplexingWaveProvider.ConnectInputToOutput(0, 0);
        multiplexingWaveProvider.ConnectInputToOutput(1, 1);
        waveOut = new WaveOut();
        waveOut.Init(multiplexingWaveProvider);
        waveOut.Play();

        //WaveFileWriter.CreateWaveFile("temp.wav", multiplexingWaveProvider);
    }
    else
    {
        waveOut.Stop();
        waveOut.Dispose();
        waveOut = null;
        waveFileWriter.Close();
    }
}

Comment by Mark H

no, the WaveFileWriter must be after the multiplexer in the pipeline; you have it before. You need to create a new class that implements IWaveProvider. In its constructor it takes the multiplexing wave provider. In its Read method it reads from the multiplexing wave provider, writes what it gets to a file, and then returns what it read to the caller (which will be WaveOut).

Comment by hen vertis

Hi
You lost me :(
I have 2 WaveProvider32s producing sine waves, followed by a MultiplexingWaveProvider.
Now I need to add another wave provider that gets its data from the MultiplexingWaveProvider?

Comment by Mark H

yes, you'll need to create a helper class if you want to both play and record to WAV at the same time.

Comment by hen vertis

Hi
I think it works now.
I added a new class like you said.
class MultiplexingWaveProvider32Stereo : IWaveProvider
{
    private WaveFormat m_waveFormat;
    WaveFileWriter m_waveFileWriter;
    MultiplexingWaveProvider m_multiplexingWaveProvider;

    public MultiplexingWaveProvider32Stereo(MultiplexingWaveProvider multiplexingWaveProvider, WaveFileWriter waveFileWriter)
    {
        m_waveFileWriter = waveFileWriter;
        m_waveFormat = multiplexingWaveProvider.WaveFormat;
        m_multiplexingWaveProvider = multiplexingWaveProvider;
    }

    #region IWaveProvider Members

    public int Read(byte[] buffer, int offset, int count)
    {
        WaveBuffer waveBuffer = new WaveBuffer(buffer);
        int bytesRead = m_multiplexingWaveProvider.Read(waveBuffer.ByteBuffer, offset, count);
        for (int i = 0; i < bytesRead / 4; i++)
        {
            m_waveFileWriter.WriteSample(waveBuffer.FloatBuffer[i]);
        }
        return bytesRead;
    }

    public WaveFormat WaveFormat
    {
        get { return m_waveFormat; }
    }

    #endregion
}

Thanks


Comment by hen vertis

Hi Mark
I have 2 WaveProvider32s and the MultiplexingWaveProvider32Stereo.
At the beginning of playing and recording at the same time I get a lot of beeps, which disappear after 5-10 seconds; they are also in the file that the WaveFileWriter creates.
What could be the problem?
thanks

Comment by Anonymous

Hi, I just want to resample, but only CreateWaveFile works and I don't want a WAV file.

byte[] data = Convert.FromBase64String("----- string encoded ---");
MemoryStream fs = new MemoryStream(data);
var baseDir = AppDomain.CurrentDomain.BaseDirectory;

using (var wfr = new WaveFileReader(fs))
{
    var outputFormat = new WaveFormat(8000, 16, 1);
    using (var pcmStream = new WaveFormatConversionStream(outputFormat, wfr))
    {
        using (var ms = new MemoryStream())
        {
            var bytesRead = -1;
            while (bytesRead != 0)
            {
                var buffer = new byte[pcmStream.WaveFormat.AverageBytesPerSecond];
                bytesRead = pcmStream.Read(buffer, 0, pcmStream.WaveFormat.AverageBytesPerSecond);
                ms.Write(buffer, 0, bytesRead);
            }

            program.WaveHeaderIN(ms.GetBuffer());
            ms.Position = 0;
            RawSourceWaveStream RawStram = new RawSourceWaveStream(ms, outputFormat);

            System.IO.File.WriteAllBytes(@"Desktop\waveConvertBy.wav", ms.GetBuffer());
            // to make a real wav file...

            ms.Position = 0;
            WaveFileWriter.CreateWaveFile(@"Desktop\output.wav", RawStram);
            Console.WriteLine("wavefile length: " + RawStram.Length);
        }
    }
}
ms.GetBuffer() is not a WAV file when I play it. Is there another way to put a RawSourceWaveStream into a byte array or a MemoryStream?
thanks
