Flash Development for Android: Visual Input via Camera

by Joseph Labrecque | June 2011

This article discusses how to capture still images and video from the integrated device hardware, both through Flash-based capture methods and by employing the native camera applications.

This article by Joseph Labrecque, author of Flash Development for Android Cookbook, will cover the following recipes:

  • Detecting camera and microphone support
  • Using the traditional camera API to save a captured image
  • Using the Mobile CameraUI API to save a captured photograph
  • Using the Mobile CameraUI API to save a captured video

 


Introduction

Camera and microphone are standard accessories on most mobile devices, and Android devices are no exception. This article will cover everything from accessing the camera and taking photos to recording video data.

All of the recipes in this article are represented as pure ActionScript 3 classes and are not dependent upon external libraries or the Flex framework. Therefore, we will be able to use these examples in any IDE we wish.
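For reference, here is a minimal sketch of the kind of document class these recipes assume; the class name RecipeExample and the commented method calls are placeholders of our own, not part of the book's code:

    package {
        import flash.display.Sprite;
        import flash.display.StageAlign;
        import flash.display.StageScaleMode;

        public class RecipeExample extends Sprite {
            public function RecipeExample() {
                // Lock the Stage so manually sized elements are not rescaled.
                stage.align = StageAlign.TOP_LEFT;
                stage.scaleMode = StageScaleMode.NO_SCALE;
                // A recipe's setup methods would be invoked here, for example:
                // setupTextField();
                // checkCamera();
            }
        }
    }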

Detecting camera and microphone support

Nearly all Android devices come equipped with camera hardware for capturing still images and video. Many devices now have both front and rear-facing cameras. It is important to know whether the default device camera is usable through our application. We should never assume the availability of certain hardware items, no matter how prevalent across devices.

Similarly, when capturing video or audio data, we will want to be sure we have access to the device microphone as well.

How to do it...

We will determine which audio and video APIs are available to us on our Android device:

  1. First, import the following classes into your project:

    import flash.display.Sprite;
    import flash.display.Stage;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.media.Camera;
    import flash.media.CameraUI;
    import flash.media.Microphone;
    import flash.text.TextField;
    import flash.text.TextFormat;

  2. Declare a TextField and TextFormat object pair to allow visible output upon the device:

    private var traceField:TextField;
    private var traceFormat:TextFormat;

  3. We will now set up our TextField, apply a TextFormat, and add the TextField to the DisplayList. Here, we create a method to perform all of these actions for us:

    protected function setupTextField():void {
        traceFormat = new TextFormat();
        traceFormat.bold = true;
        traceFormat.font = "_sans";
        traceFormat.size = 44;
        traceFormat.align = "center";
        traceFormat.color = 0x333333;
        traceField = new TextField();
        traceField.defaultTextFormat = traceFormat;
        traceField.selectable = false;
        traceField.mouseEnabled = false;
        traceField.width = stage.stageWidth;
        traceField.height = stage.stageHeight;
        addChild(traceField);
    }

  4. Now, we must check the isSupported property of each of these objects. We create a method here to perform this across all three and write results to a TextField:

    protected function checkCamera():void {
        traceField.appendText("Camera: " + Camera.isSupported + "\n");
        traceField.appendText("CameraUI: " + CameraUI.isSupported + "\n");
        traceField.appendText("Microphone: " + Microphone.isSupported + "\n");
    }

  5. We now know the capabilities of video and audio input for a particular device and can react accordingly, as in the sketch below.

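As one example of reacting to these results, a hypothetical handler might prefer the native camera application and fall back to the in-Flash camera feed; setupCameraUI and setupFlashCamera are placeholder methods standing in for the recipes that follow:

    protected function reactToSupport():void {
        if(CameraUI.isSupported) {
            // Prefer the native camera application (see the CameraUI recipes).
            setupCameraUI();
        } else if(Camera.isSupported) {
            // Fall back to the in-Flash Camera feed.
            setupFlashCamera();
        } else {
            traceField.appendText("No camera available on this device.\n");
        }
    }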

How it works...

Each of these three classes has an isSupported property, which we may check at any time to verify support on a particular Android device. The traditional Camera and the mobile-specific CameraUI both refer to the same hardware camera, but they are entirely different classes for handling the interaction between Flash and the camera itself: CameraUI relies upon the default device camera application to do all of the capturing, while Camera works exclusively within the Flash environment.

The traditional Microphone object is also supported in this manner.
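For instance, once Microphone.isSupported returns true, grabbing a reference to the default device microphone takes one call; a small sketch:

    if(Microphone.isSupported) {
        // getMicrophone() returns the default device microphone.
        var mic:Microphone = Microphone.getMicrophone();
        mic.setUseEchoSuppression(true);
        traceField.appendText("Microphone name: " + mic.name + "\n");
    }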

There's more...

It is important to note that even though many Android devices come equipped with more than one camera, only the primary camera (and microphone) will be exposed to our application. Support for multiple cameras and other sensors will likely be added to the platform as Android evolves.
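We can verify this on our own hardware by inspecting the static Camera.names array, which lists every camera the runtime can see:

    // On most current Android devices this loop will report only the
    // primary (rear-facing) camera.
    for(var i:uint = 0; i < Camera.names.length; i++) {
        traceField.appendText("Camera " + i + ": " + Camera.names[i] + "\n");
    }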

Using the traditional camera API to save a captured image

When writing applications for the web through Flash Player, or for the desktop with AIR, we have had access to the Camera class through ActionScript. This allows us to access the different cameras attached to whatever machine we are using. On Android, we can still use the Camera class to access the default camera on the device and use the video stream it provides for all sorts of things. In this example, we will simply grab a still image from the Camera feed and save it to the Android CameraRoll.

How to do it...

We will construct a Video object to bind the Camera stream to, and use BitmapData methods to capture and then save our rendered image using the mobile CameraRoll API:

  1. At a minimum, we need to import the following classes into our project:

    import flash.display.BitmapData;
    import flash.display.Sprite;
    import flash.display.Stage;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.TouchEvent;
    import flash.media.Camera;
    import flash.media.CameraRoll;
    import flash.media.Video;
    import flash.ui.Multitouch;
    import flash.ui.MultitouchInputMode;

  2. Now we must declare the object instances necessary for camera access and file reference:

    private var video:Video;
    private var camera:Camera;
    private var capture:BitmapData;
    private var cameraRoll:CameraRoll;
    private var videoHolder:Sprite;

  3. Initialize a Video object, passing in the desired width and height, and add it to the DisplayList:

    protected function setupVideo():void {
        videoHolder = new Sprite();
        videoHolder.x = stage.stageWidth/2;
        videoHolder.y = stage.stageHeight/2;
        video = new Video(360, 480);
        videoHolder.addChild(video);
        video.x = -180;
        video.y = -240;
        videoHolder.rotation = 90;
        addChild(videoHolder);
    }

  4. Initialize a Camera object and employ setMode to specify width, height, and frames per second before attaching the Camera to our Video on the DisplayList:

    protected function setupCamera():void {
        camera = Camera.getCamera();
        camera.setMode(480, 360, 24);
        video.attachCamera(camera);
    }

  5. We will now register a TouchEvent listener of type TOUCH_TAP to the Stage. This will enable the user to take a snapshot of the camera display by tapping the device screen:

    protected function registerListeners():void {
        Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
        stage.addEventListener(TouchEvent.TOUCH_TAP, saveImage);
    }

  6. To capture an image from the camera feed, we will initialize our BitmapData object, matching the width and height of our Video object, and employ the draw method to translate the Video pixels to BitmapData.
  7. To save our acquired image to the device, we must initialize a CameraRoll object and invoke addBitmapData(), passing in the BitmapData object we have created from the Video object's pixels. We will also determine whether this device supports the addBitmapData() method by verifying that CameraRoll.supportsAddBitmapData is true:

    protected function saveImage(e:TouchEvent):void {
        capture = new BitmapData(360, 480);
        capture.draw(video);
        cameraRoll = new CameraRoll();
        if(CameraRoll.supportsAddBitmapData){
            cameraRoll.addBitmapData(capture);
        }
    }


  8. If we now check our Android Gallery, we will find the saved image.


How it works...

Most of this is performed exactly as it would be with normal Flash Platform development on the desktop. Attach a Camera to a Video, add the Video to the DisplayList, and then do whatever you need for your particular application. In this case, we simply capture what is displayed as BitmapData.

The CameraRoll class, however, is specific to mobile application development, as it always refers to the directory in which the device camera stores the photographs it produces. If we want to save these images to a different directory, we could use a File or FileReference object to do so, but this involves more steps for the user; a sketch of one such approach follows.
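As a rough sketch of that alternative, the following encodes the capture to JPEG and writes it into private application storage using the AIR filesystem API. Note the assumption: BitmapData.encode() and JPEGEncoderOptions only exist in AIR 3.3 and newer, which postdates this article; on earlier runtimes an external encoder such as as3corelib's JPGEncoder would stand in:

    import flash.display.JPEGEncoderOptions;
    import flash.filesystem.File;
    import flash.filesystem.FileMode;
    import flash.filesystem.FileStream;
    import flash.geom.Rectangle;
    import flash.utils.ByteArray;

    protected function saveToAppStorage(bitmap:BitmapData):void {
        // Encode the pixels to JPEG at quality 80 (AIR 3.3+ only).
        var bytes:ByteArray = bitmap.encode(
            new Rectangle(0, 0, bitmap.width, bitmap.height),
            new JPEGEncoderOptions(80));
        // Write the bytes to a file in our private application storage.
        var file:File = File.applicationStorageDirectory.resolvePath("capture.jpg");
        var stream:FileStream = new FileStream();
        stream.open(file, FileMode.WRITE);
        stream.writeBytes(bytes);
        stream.close();
    }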

Note that while using the Camera class, the hardware orientation of the camera is landscape. We can deal with this either by restricting the application to landscape mode, or through rotation and additional manipulation, as we've done in our example class. Here we've applied a 90 degree rotation using videoHolder.rotation to account for this shift when reading in the BitmapData. Depending on how a specific application handles orientation, this may not be necessary.
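An alternative sketch: leave the Video unrotated at the camera's native 480x360 and bake the 90 degree turn into the capture itself with a Matrix, producing portrait-oriented BitmapData directly:

    import flash.geom.Matrix;

    protected function captureRotated():BitmapData {
        // Assumes video was created as new Video(480, 360) and that no
        // DisplayList rotation was applied.
        var rotated:BitmapData = new BitmapData(360, 480);
        var m:Matrix = new Matrix();
        m.rotate(Math.PI/2);   // 90 degrees clockwise
        m.translate(360, 0);   // shift the rotated pixels back on-canvas
        rotated.draw(video, m);
        return rotated;
    }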

There's more...

Other use cases for the traditional Camera object include sending a video stream to Flash Media Server for live broadcast, augmented reality applications, and real-time peer-to-peer chat.
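To illustrate the broadcast case, here is a minimal sketch of publishing the same Camera feed over RTMP; the server URL and stream name are hypothetical, and error handling is omitted:

    import flash.events.NetStatusEvent;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    protected function broadcast():void {
        var nc:NetConnection = new NetConnection();
        nc.addEventListener(NetStatusEvent.NET_STATUS,
            function(e:NetStatusEvent):void {
                if(e.info.code == "NetConnection.Connect.Success") {
                    // Only create the NetStream once the connection is live.
                    var ns:NetStream = new NetStream(nc);
                    ns.attachCamera(camera);
                    ns.publish("androidFeed", "live");
                }
            });
        nc.connect("rtmp://example.com/live");
    }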


Using the Mobile CameraUI API to save a captured photograph

Using the new CameraUI API (available in the mobile AIR SDK), we can perform an alternative capture process to the normal Camera API. The mobile CameraUI class will make use of the default Android camera application, alongside our custom app, to capture a photograph.

How to do it...

We will set up a CameraUI object to invoke the native Android camera to capture a photograph:

  1. First, import the following classes into your project:

    import flash.display.Sprite;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.Event;
    import flash.events.MediaEvent;
    import flash.events.TouchEvent;
    import flash.media.CameraUI;
    import flash.media.MediaType;
    import flash.media.MediaPromise;
    import flash.ui.Multitouch;
    import flash.ui.MultitouchInputMode;
    import flash.text.TextField;
    import flash.text.TextFormat;

  2. Declare a TextField and TextFormat object pair to allow visible output upon the device. A CameraUI object must also be declared for this example:

    private var camera:CameraUI;
    private var traceField:TextField;
    private var traceFormat:TextFormat;

  3. We will now set up our TextField, apply a TextFormat, and add the TextField to the DisplayList. Here, we create a method to perform all of these actions for us:

    protected function setupTextField():void {
        traceFormat = new TextFormat();
        traceFormat.bold = true;
        traceFormat.font = "_sans";
        traceFormat.size = 22;
        traceFormat.align = "center";
        traceFormat.color = 0xFFFFFF;
        traceField = new TextField();
        traceField.defaultTextFormat = traceFormat;
        traceField.selectable = false;
        traceField.mouseEnabled = false;
        traceField.width = stage.stageWidth;
        traceField.height = stage.stageHeight;
        addChild(traceField);
    }

  4. Instantiate a new CameraUI instance, which will be used to launch the device camera application and return file information back to us. If the CameraUI object is not supported on a particular device, a message is output to our TextField indicating this:

    protected function setupCamera():void {
        if(CameraUI.isSupported) {
            camera = new CameraUI();
            registerListeners();
        } else {
            traceField.appendText("CameraUI is not supported...");
        }
    }

  5. Add an event listener to the CameraUI object so that we know when the capture is complete. We will also register a touch event on the Stage to initiate the capture:

    protected function registerListeners():void {
        Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
        camera.addEventListener(MediaEvent.COMPLETE, photoReady);
        stage.addEventListener(TouchEvent.TOUCH_TAP, launchCamera);
    }

  6. To employ the default camera application on our Android device, we will need to invoke the launch method, passing in the MediaType.IMAGE constant to specify that we wish to capture a photograph:

    protected function launchCamera(e:TouchEvent):void {
        camera.launch(MediaType.IMAGE);
    }

  7. Now, the default Android camera will initialize, allowing the user to capture a photograph. Once the user hits OK, focus will return to our application.


  8. Finally, once we complete the capture process, an event of type MediaEvent.COMPLETE will fire, invoking our photoReady method. From this, we can ascertain certain details about our captured photograph.

    protected function photoReady(e:MediaEvent):void {
        var promise:MediaPromise = e.data;
        traceField.appendText("mediaType: " + promise.mediaType + "\n");
        traceField.appendText("relativePath: " + promise.relativePath + "\n");
        traceField.appendText("creationDate: " + promise.file.creationDate + "\n");
        traceField.appendText("extension: " + promise.file.extension + "\n");
        traceField.appendText("name: " + promise.file.name + "\n");
        traceField.appendText("size: " + promise.file.size + "\n");
        traceField.appendText("type: " + promise.file.type + "\n");
        traceField.appendText("nativePath: " + promise.file.nativePath + "\n");
        traceField.appendText("url: " + promise.file.url + "\n");
    }

  9. The file properties of the captured photograph, as reported by the MediaPromise, are then written out to our TextField.


How it works...

Invoking the CameraUI.launch method will request the Android device to open the default camera application and allow the user to take a photograph. Upon completing the capture process and confirming the captured photograph, focus is then returned to our application along with a set of data about the new file contained within the MediaEvent.COMPLETE event object.

At this point, our application can do all sorts of things with the data returned, or even open the file within the application, assuming that the file type can be loaded and displayed by the runtime.
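Because MediaPromise implements IFilePromise, AIR can load the captured image straight into a Loader without our touching the filesystem; a sketch of displaying the photograph in-app (displayPhoto is a placeholder method of our own):

    import flash.display.Loader;
    import flash.events.Event;
    import flash.media.MediaPromise;

    protected function displayPhoto(promise:MediaPromise):void {
        var loader:Loader = new Loader();
        loader.contentLoaderInfo.addEventListener(Event.COMPLETE,
            function(e:Event):void {
                // Show the captured photograph within our application.
                addChild(loader);
            });
        loader.loadFilePromise(promise);
    }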

There's more...

The default camera application will not load if the device does not have a storage card mounted. It is also important to note that if the device becomes low on memory during the capture process, Android may terminate our application before the process is complete.


Using the Mobile CameraUI API to save a captured video

Using the new CameraUI API (available in the mobile AIR SDK), we can perform an alternative capture process to the normal Camera API. The mobile CameraUI class will make use of the default Android camera application, alongside our custom app, to capture a video.

How to do it...

We will set up a CameraUI object to invoke the native Android camera to capture a video:

  1. First, import the following classes into your project:

    import flash.display.Sprite;
    import flash.display.StageAlign;
    import flash.display.StageScaleMode;
    import flash.events.Event;
    import flash.events.MediaEvent;
    import flash.events.TouchEvent;
    import flash.media.CameraUI;
    import flash.media.MediaPromise;
    import flash.media.MediaType;
    import flash.text.TextField;
    import flash.text.TextFormat;
    import flash.ui.Multitouch;
    import flash.ui.MultitouchInputMode;

  2. Declare a TextField and TextFormat object pair to allow visible output upon the device. A CameraUI object must also be declared for this example:

    private var camera:CameraUI;
    private var traceField:TextField;
    private var traceFormat:TextFormat;

  3. We will now set up our TextField, apply a TextFormat, and add the TextField to the DisplayList. Here, we create a method to perform all of these actions for us:

    protected function setupTextField():void {
        traceFormat = new TextFormat();
        traceFormat.bold = true;
        traceFormat.font = "_sans";
        traceFormat.size = 22;
        traceFormat.align = "center";
        traceFormat.color = 0xFFFFFF;
        traceField = new TextField();
        traceField.defaultTextFormat = traceFormat;
        traceField.selectable = false;
        traceField.mouseEnabled = false;
        traceField.width = stage.stageWidth;
        traceField.height = stage.stageHeight;
        addChild(traceField);
    }

  4. Instantiate a new CameraUI instance, which will be used to launch the device camera application and return file information back to us. If the CameraUI object is not supported on a particular device, a message is output to our TextField indicating this.

    protected function setupCamera():void {
        if(CameraUI.isSupported) {
            camera = new CameraUI();
            registerListeners();
        } else {
            traceField.appendText("CameraUI is not supported...");
        }
    }

  5. Add an event listener to the CameraUI object so that we know when the capture is complete. We will also register a touch event on the Stage to initiate the capture:

    protected function registerListeners():void {
        Multitouch.inputMode = MultitouchInputMode.TOUCH_POINT;
        camera.addEventListener(MediaEvent.COMPLETE, videoReady);
        stage.addEventListener(TouchEvent.TOUCH_TAP, launchCamera);
    }

  6. To employ the default camera application on our Android device, we will need to invoke the launch method, passing in the MediaType.VIDEO constant to specify that we wish to capture a video file:

    protected function launchCamera(e:TouchEvent):void {
        camera.launch(MediaType.VIDEO);
    }

  7. Now, the default Android camera will initialize, allowing the user to take some video. Once the user hits OK, focus will return to our application.

  8. Finally, once we complete the capture process, an event of type MediaEvent.COMPLETE will fire, invoking our videoReady method. From this, we can ascertain certain details about our captured video file:

    protected function videoReady(e:MediaEvent):void {
        var promise:MediaPromise = e.data;
        traceField.appendText("mediaType: " + promise.mediaType + "\n");
        traceField.appendText("relativePath: " + promise.relativePath + "\n");
        traceField.appendText("creationDate: " + promise.file.creationDate + "\n");
        traceField.appendText("extension: " + promise.file.extension + "\n");
        traceField.appendText("name: " + promise.file.name + "\n");
        traceField.appendText("size: " + promise.file.size + "\n");
        traceField.appendText("type: " + promise.file.type + "\n");
        traceField.appendText("nativePath: " + promise.file.nativePath + "\n");
        traceField.appendText("url: " + promise.file.url + "\n");
    }

  9. The file properties of the captured video, as reported by the MediaPromise, are then written out to our TextField.


How it works...

Invoking the CameraUI.launch method will request that the Android device open the default camera application and allow the user to capture some video. Upon completing the capture process and confirming the captured video file, focus is then returned to our application along with a set of data about the new file contained within the MediaEvent.COMPLETE event object.

At this point, our application can do all sorts of things with the data returned, or even open the file within the application, assuming that the file type can be loaded and displayed by the runtime. This is very important when it comes to video as certain devices will use a variety of codecs to encode the captured video, not all of them Flash Platform compatible.
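As a sketch of attempting in-app playback, we can hand the returned file URL to a NetStream; whether it actually plays depends on the codec the device chose (playVideo and the 480x360 display size are our own placeholders):

    import flash.media.Video;
    import flash.net.NetConnection;
    import flash.net.NetStream;

    protected function playVideo(promise:MediaPromise):void {
        var nc:NetConnection = new NetConnection();
        nc.connect(null); // local file playback, no server involved
        var ns:NetStream = new NetStream(nc);
        // Provide an onMetaData handler to avoid a reference error.
        ns.client = { onMetaData: function(info:Object):void {} };
        var vid:Video = new Video(480, 360);
        vid.attachNetStream(ns);
        addChild(vid);
        ns.play(promise.file.url);
    }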

There's more...

The default camera application will not load if the device does not have a storage card mounted. It is also important to note that if the device becomes low on memory during the capture process, Android may terminate our application before the process is complete.

Also, there are many other events aside from MediaEvent.COMPLETE that we can use in such a process. For instance, we can register an event listener of type Event.CANCEL in order to react to the user canceling a video save, as sketched below.
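A sketch of wiring up those additional listeners alongside the ones we already registered; CameraUI also dispatches ErrorEvent.ERROR when capture fails, and the handler names here are our own placeholders:

    import flash.events.ErrorEvent;

    protected function registerExtraListeners():void {
        camera.addEventListener(Event.CANCEL, captureCanceled);
        camera.addEventListener(ErrorEvent.ERROR, captureFailed);
    }

    protected function captureCanceled(e:Event):void {
        traceField.appendText("Capture canceled by the user.\n");
    }

    protected function captureFailed(e:ErrorEvent):void {
        traceField.appendText("Capture error: " + e.text + "\n");
    }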

Summary

This article discussed how to capture still images and video from the integrated device hardware, both through Flash-based capture methods and by employing the native camera applications. In the next article, we will explore Flash Development for Android: Audio Input via Microphone.


About the Author


Joseph Labrecque

Joseph Labrecque is primarily employed by the University of Denver as Senior Interactive Software Engineer specializing in the Adobe Flash Platform, where he produces innovative academic toolsets for both traditional desktop environments and emerging mobile spaces. Alongside this principal role, he often serves as adjunct faculty communicating upon a variety of Flash Platform solutions and general web design and development subjects.

In addition to his accomplishments in higher education, Joseph is the Proprietor of Fractured Vision Media, LLC, a digital media production company, technical consultancy, and distribution vehicle for his creative works. He is founder and sole abiding member of the dark ambient recording project An Early Morning Letter, Displaced, whose releases have received international award nominations and underground acclaim.

Joseph has contributed to a number of respected community publications as an article writer and video tutorialist. He is also the author of Flash Development for Android Cookbook, Packt Publishing (2011), What's New in Adobe AIR 3, O'Reilly Media (2011), What's New in Flash Player 11, O'Reilly Media (2011), Adobe Edge Quickstart Guide, Packt Publishing (2012) and co-author of Mobile Development with Flash Professional CS5.5 and Flash Builder 4.5: Learn by Video, Adobe Press (2011). He also serves as author on a number of video training publications through video2brain, Adobe Press, and Peachpit Press.

He regularly speaks at user group meetings and industry conferences such as Adobe MAX, FITC, D2W, 360|Flex, and a variety of other educational and technical conferences. In 2010, he received an Adobe Impact Award in recognition of his outstanding contribution to the education community. He has served as an Adobe Education Leader since 2008 and is also an Adobe Community Professional.

Visit him on the Web at http://josephlabrecque.com/.
