Flash Development for Android: Audio Input via Microphone

by Joseph Labrecque | June 2011

This article discusses how to capture audio from integrated device hardware through Flash-based capture methods.

This article by Joseph Labrecque, author of Flash Development for Android Cookbook, will cover the following recipes:

  • Detecting microphone support
  • Using the device microphone to monitor audio sample data
  • Recording microphone audio sample data


Introduction

Camera and microphone are standard accessories on most mobile devices, and Android devices are no exception. The previous article dealt with Visual Input via Camera; the present article covers capturing raw audio from the device microphone and working with the sample data it provides.

All of the recipes in this article are represented as pure ActionScript 3 classes and are not dependent upon external libraries or the Flex framework. Therefore, we will be able to use these examples in any IDE we wish.

The reader is advised to refer to the first recipe of Flash Development for Android: Visual Input via Camera for detecting microphone support.
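The recipe steps that follow show only the individual methods, so before diving in, here is a minimal, hypothetical document-class sketch of how the pieces fit together. The class name, the stage setup, and the Microphone.isSupported guard are our own assumptions; the real method bodies come from the numbered steps below:

    package {
        import flash.display.Sprite;
        import flash.display.StageAlign;
        import flash.display.StageScaleMode;
        import flash.media.Microphone;
        import flash.text.TextField;
        import flash.text.TextFormat;

        // Hypothetical document class; the name is ours, not from the book
        public class MicrophoneMonitor extends Sprite {
            private var mic:Microphone;
            private var traceField:TextField;
            private var traceFormat:TextFormat;

            public function MicrophoneMonitor() {
                // Typical mobile stage setup
                stage.scaleMode = StageScaleMode.NO_SCALE;
                stage.align = StageAlign.TOP_LEFT;
                setupTextField();
                // Microphone.isSupported is the static flag behind the
                // "detecting microphone support" recipe
                if (Microphone.isSupported) {
                    setupMic();
                    registerListeners();
                } else {
                    traceField.text = "Microphone is not supported!";
                }
            }

            protected function setupTextField():void {
                // body provided in step 3 of the recipes below
            }

            protected function setupMic():void {
                // body provided in step 4 of the recipes below
            }

            protected function registerListeners():void {
                // body provided in step 5 of the recipes below
            }
        }
    }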

Using the device microphone to monitor audio sample data

By monitoring the sample data returned from the Android device microphone through the ActionScript Microphone API, we can gather a great deal of information about the sound being captured and respond to it within our application. Such input can be used in utility applications, learning modules, and even games.

How to do it...

We will set up an event listener to respond to sample data reported through the Microphone API:

  1. First, import the following classes into your project:

    import flash.display.Sprite;
    import flash.display.Stage;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.SampleDataEvent;
    import flash.media.Microphone;
    import flash.text.TextField;
    import flash.text.TextFormat;

  2. Declare a TextField and TextFormat object pair to allow visible output upon the device. A Microphone object must also be declared for this example:

    private var mic:Microphone;
    private var traceField:TextField;
    private var traceFormat:TextFormat;

  3. We will now set up our TextField, apply a TextFormat, and add the TextField to the DisplayList. Here, we create a method to perform all of these actions for us:

    protected function setupTextField():void {
        traceFormat = new TextFormat();
        traceFormat.bold = true;
        traceFormat.font = "_sans";
        traceFormat.size = 44;
        traceFormat.align = "center";
        traceFormat.color = 0x333333;
        traceField = new TextField();
        traceField.defaultTextFormat = traceFormat;
        traceField.selectable = false;
        traceField.mouseEnabled = false;
        traceField.width = stage.stageWidth;
        traceField.height = stage.stageHeight;
        addChild(traceField);
    }

  4. Now, we must instantiate our Microphone object and set it up according to our needs and preferences with adjustments to codec, rate, silenceLevel, and so forth. Here, we use setSilenceLevel() to set the minimum input level our application should consider to be "sound", and we set the rate property to 44, indicating that we will capture audio data at a rate of 44 kHz. Passing false to setLoopBack() keeps the captured audio from being routed through the device speaker:

    protected function setupMic():void {
        mic = Microphone.getMicrophone();
        mic.setSilenceLevel(0);
        mic.rate = 44;
        mic.setLoopBack(false);
    }

  5. Once we have instantiated our Microphone object, we can then register a variety of event listeners. In this example, we'll be monitoring audio sample data from the device microphone, so we will need to register our listener for the SampleDataEvent.SAMPLE_DATA constant:

    protected function registerListeners():void {
        mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onMicData);
    }

  6. As the Microphone API generates sample data from the Android device input, we can now respond to this in a number of ways, as we have access to information about the Microphone object itself, and more importantly, we have access to the sample bytes with which we can perform a number of advanced operations:

    public function onMicData(e:SampleDataEvent):void {
        traceField.text = "";
        traceField.appendText("activityLevel: " + e.target.activityLevel + "\n");
        traceField.appendText("codec: " + e.target.codec + "\n");
        traceField.appendText("gain: " + e.target.gain + "\n");
        traceField.appendText("bytesAvailable: " + e.data.bytesAvailable + "\n");
        traceField.appendText("length: " + e.data.length + "\n");
        traceField.appendText("position: " + e.data.position + "\n");
    }

  7. The output will look something like this. The first three values are taken from the Microphone itself, the second three from the Microphone sample data:

    [Screenshot: the TextField displaying the activityLevel, codec, gain, bytesAvailable, length, and position values]

How it works...

When we instantiate a Microphone object and register a SampleDataEvent.SAMPLE_DATA event listener, we can easily monitor various properties of our Android device microphone and the associated sample data being gathered. We can then respond to that data in many ways. One example would be to move objects across the Stage based upon the Microphone.activityLevel property. Another example would be to write the sample data to a ByteArray for later analysis.
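To make the first idea concrete, here is a hedged sketch: a second SAMPLE_DATA handler that drives a hypothetical Sprite named ball (assumed to already be on the DisplayList; both the Sprite and the handler name are our own, not from the recipe) up and down the Stage with the input level:

    // Registered in addition to onMicData, for example:
    // mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onMicLevel);
    protected function onMicLevel(e:SampleDataEvent):void {
        // activityLevel reports -1 when idle, otherwise 0 (silence) to 100
        var level:Number = Math.max(0, mic.activityLevel) / 100;
        // Louder input moves the hypothetical ball toward the top of the Stage
        ball.y = stage.stageHeight * (1 - level);
    }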

What do all these properties mean?

  • activityLevel: This is a measurement indicating the amount of sound being received
  • codec: This indicates the codec being used: Nellymoser or Speex
  • gain: This is an amount of boosting provided by the microphone to the sound signal
  • bytesAvailable: This is the number of bytes remaining between the present position and the end of our sample data ByteArray
  • length: This is the total length of our sample data ByteArray
  • position: This is the current position, in bytes, within our sample data ByteArray
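If the captured signal proves too quiet, or ambient noise keeps triggering activity, two of these knobs can be adjusted directly on the Microphone object. Here is a small sketch extending the setupMic() method from step 4; the values are illustrative assumptions, not taken from the recipe:

    // Hypothetical additions to setupMic()
    mic.gain = 60;                  // 0-100, default 50; raise to boost a quiet input
    mic.setSilenceLevel(10, 2000);  // ignore input below 10, and wait 2000 ms of
                                    // silence before reporting inactivity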


Recording microphone audio sample data

One of the most fundamental things a developer will want to do with audio sample data gathered from an Android device microphone is to capture that data and use it within an application. This recipe demonstrates how to preserve and play back captured microphone audio sample data.

How to do it...

We will employ an event listener to respond to sample data reported through the Microphone API by writing captured audio data to a ByteArray and then playing it back internally through the Sound object:

  1. First, import the following classes into your project:

    import flash.display.Sprite;
    import flash.display.Stage;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.SampleDataEvent;
    import flash.events.TouchEvent;
    import flash.media.Microphone;
    import flash.media.Sound;
    import flash.media.SoundChannel;
    import flash.utils.ByteArray;
    import flash.ui.Multitouch;
    import flash.ui.MultitouchInputMode;
    import flash.text.TextField;
    import flash.text.TextFormat;

  2. Declare a TextField and TextFormat object pair to allow visible output upon the device. A Microphone object must also be declared for this example. To store and play back the sample data, we will need to declare a ByteArray, along with a Sound and SoundChannel pair:

    private var mic:Microphone;
    private var micRec:ByteArray;
    private var output:Sound;
    private var outputChannel:SoundChannel;
    private var traceField:TextField;
    private var traceFormat:TextFormat;

  3. We will now set up our TextField, apply a TextFormat, and add the TextField to the DisplayList. Here, we create a method to perform all of these actions for us:

    protected function setupTextField():void {
        traceFormat = new TextFormat();
        traceFormat.bold = true;
        traceFormat.font = "_sans";
        traceFormat.size = 44;
        traceFormat.align = "center";
        traceFormat.color = 0x333333;
        traceField = new TextField();
        traceField.defaultTextFormat = traceFormat;
        traceField.selectable = false;
        traceField.mouseEnabled = false;
        traceField.width = stage.stageWidth;
        traceField.height = stage.stageHeight;
        addChild(traceField);
    }

  4. Then, instantiate a Microphone object and set it up according to our needs and preferences with adjustments to codec, rate, silenceLevel, and so forth. As before, we use setSilenceLevel() to set the minimum input level our application should consider to be "sound", and we set the rate property to 44, indicating that we will capture audio data at a rate of 44 kHz. Passing false to setLoopBack() keeps the captured audio from being routed through the device speaker. We'll also instantiate a ByteArray to hold all of our audio samples as they are intercepted:

    protected function setupMic():void {
        mic = Microphone.getMicrophone();
        mic.setSilenceLevel(0);
        mic.rate = 44;
        mic.setLoopBack(false);
        micRec = new ByteArray();
    }

  5. Once we have instantiated our Microphone and ByteArray objects, we can then register an event listener to enable touch interactions. A simple tap will suffice:

    protected function registerListeners():void {
        Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
        stage.addEventListener(TouchEvent.TOUCH_TAP, startRecording);
        traceField.text = "Tap to Record";
    }

  6. Once recording has been invoked by the user, we'll be monitoring audio sample data from the device microphone, so we will need to register our listener for the SampleDataEvent.SAMPLE_DATA constant:

    protected function startRecording(e:TouchEvent):void {
        stage.removeEventListener(TouchEvent.TOUCH_TAP, startRecording);
        stage.addEventListener(TouchEvent.TOUCH_TAP, stopRecording);
        mic.addEventListener(SampleDataEvent.SAMPLE_DATA, onMicData);
        traceField.text = "Recording Audio \nTap to Stop";
    }

  7. As the Microphone API generates sample data from the Android device input, we have access to the audio sample data bytes, which we can write to a ByteArray for later use:

    protected function onMicData(e:SampleDataEvent):void {
        micRec.writeBytes(e.data);
    }

  8. To stop recording, we will need to remove the SampleDataEvent.SAMPLE_DATA event listener from our Microphone object:

    protected function stopRecording(e:TouchEvent):void {
        mic.removeEventListener(SampleDataEvent.SAMPLE_DATA, onMicData);
        stage.removeEventListener(TouchEvent.TOUCH_TAP, stopRecording);
        stage.addEventListener(TouchEvent.TOUCH_TAP, playBackAudio);
        traceField.text = "Tap to Playback";
    }

  9. To prepare for playback, we will instantiate a new Sound object and register a SampleDataEvent.SAMPLE_DATA event listener upon it, just as we had done for the Microphone object previously. We will also instantiate a SoundChannel object and invoke the play() method of our Sound object to play back the captured Microphone audio:

    protected function playBackAudio(e:TouchEvent):void {
        stage.removeEventListener(TouchEvent.TOUCH_TAP, playBackAudio);
        micRec.position = 0;
        output = new Sound();
        output.addEventListener(SampleDataEvent.SAMPLE_DATA, onSampleDataRequest);
        outputChannel = output.play();
        traceField.text = "Playing Audio";
    }

  10. Once we invoke the play() method upon our Sound object, it will begin gathering generated sample data from a method called onSampleDataRequest. We need to create this method now, and allow it to loop over the bytes we previously wrote to our ByteArray object. This is, effectively, the inverse of our capture process.
  11. In order to provide proper playback within our application, we must provide between 2048 and 8192 samples of data with each request; if fewer than 2048 samples are provided, playback will stop. It is recommended to use as many samples as possible, but this will also depend upon the sample frequency.

    Note that we invoke writeFloat() twice within the same loop because we need our data expressed in stereo pairs, one for each channel.

  12. When using writeBytes() in this example, we are actually channeling sound data back out through our SampleDataEvent and through a Sound object, thus enabling the application to produce sound:

    protected function onSampleDataRequest(e:SampleDataEvent):void {
        var out:ByteArray = new ByteArray();
        for (var i:int = 0; i < 8192 && micRec.bytesAvailable; i++) {
            var micsamp:Number = micRec.readFloat();
            // left channel
            out.writeFloat(micsamp);
            // right channel
            out.writeFloat(micsamp);
        }
        e.data.writeBytes(out);
    }

  13. Output to our TextField will change depending upon the current application state:

    [Screenshot: the TextField reading "Tap to Record", "Recording Audio / Tap to Stop", "Tap to Playback", or "Playing Audio", depending upon state]

How it works...

When we instantiate a Microphone object and register a SampleDataEvent.SAMPLE_DATA event listener, we can easily monitor the associated sample data being gathered and write this data to a ByteArray for later playback. As new samples come in, more data is added to the ByteArray, building up the sound data over time.

By registering a SampleDataEvent.SAMPLE_DATA event listener to a Sound object, we instruct it to actively seek audio data generated from a specific method as soon as we invoke play(). In our example, we move through the constructed ByteArray and send audio data back out through this method, effectively playing back the recorded audio through the Sound object and associated SoundChannel.
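One natural extension, sketched here under our own assumptions (the handler name and the choice to clear the ByteArray are ours, not from the recipe), is to listen for Event.SOUND_COMPLETE on the SoundChannel returned by play(), so that the user can record a fresh take once playback finishes:

    // Requires: import flash.events.Event;
    // Register after outputChannel = output.play(); in playBackAudio():
    // outputChannel.addEventListener(Event.SOUND_COMPLETE, onPlaybackComplete);
    protected function onPlaybackComplete(e:Event):void {
        outputChannel.removeEventListener(Event.SOUND_COMPLETE, onPlaybackComplete);
        // Discard the previous take and arm the recording tap handler again
        micRec.clear();
        stage.addEventListener(TouchEvent.TOUCH_TAP, startRecording);
        traceField.text = "Tap to Record";
    }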

See also...

The use of bytes within ActionScript is a complex subject. To read more about this topic, we recommend Thibault Imbert's book "What can you do with bytes?", which is freely available from http://www.bytearray.org/?p=711.

Summary

In this article, we have discussed how to capture audio from integrated device hardware through Flash-based capture methods.


About the Author


Joseph Labrecque

Joseph Labrecque is primarily employed by the University of Denver as Senior Interactive Software Engineer specializing in the Adobe Flash Platform, where he produces innovative academic toolsets for both traditional desktop environments and emerging mobile spaces. Alongside this principal role, he often serves as adjunct faculty communicating upon a variety of Flash Platform solutions and general web design and development subjects.

In addition to his accomplishments in higher education, Joseph is the Proprietor of Fractured Vision Media, LLC, a digital media production company, technical consultancy, and distribution vehicle for his creative works. He is founder and sole abiding member of the dark ambient recording project An Early Morning Letter, Displaced, whose releases have received international award nominations and underground acclaim.

Joseph has contributed to a number of respected community publications as an article writer and video tutorialist. He is also the author of Flash Development for Android Cookbook, Packt Publishing (2011), What's New in Adobe AIR 3, O'Reilly Media (2011), What's New in Flash Player 11, O'Reilly Media (2011), Adobe Edge Quickstart Guide, Packt Publishing (2012) and co-author of Mobile Development with Flash Professional CS5.5 and Flash Builder 4.5: Learn by Video, Adobe Press (2011). He also serves as author on a number of video training publications through video2brain, Adobe Press, and Peachpit Press.

He regularly speaks at user group meetings and industry conferences such as Adobe MAX, FITC, D2W, 360|Flex, and a variety of other educational and technical conferences. In 2010, he received an Adobe Impact Award in recognition of his outstanding contribution to the education community. He has served as an Adobe Education Leader since 2008 and is also an Adobe Community Professional.

Visit him on the Web at http://josephlabrecque.com/.
