Augmented Reality for Android Application Development

Augmented Reality for Android Application Development: As an Android developer, including Augmented Reality (AR) in your mobile apps could be a profitable new string to your bow. This tutorial takes you through every aspect of AR for Android with lots of hands-on exercises.



Product Details

Publication date: Nov 25, 2013
Length: 134 pages
Edition: 1st Edition
Language: English
ISBN-13: 9781782168553

Augmented Reality for Android Application Development

Chapter 1. Augmented Reality Concepts and Tools

Augmented Reality (AR) offers us a new way to interact with the physical (or real) world. It creates a modified version of our reality, enriched with digital (or virtual) information, on the screen of your desktop computer or mobile device. Merging the virtual and the real can enable a whole new range of user experiences, going beyond what common apps are capable of. Can you imagine playing a first-person shooter in your own neighborhood, with monsters popping up at the corner of your street (as is possible with ARQuake by Bruce Thomas at the University of South Australia, see the left-hand side of the following screenshot)? Wouldn't it be thrilling to go to a natural history museum and see a dusty dinosaur skeleton come virtually alive, flesh and bone, in front of your eyes? Or can you imagine reading a story to your kid and seeing a proud rooster appear and walk over the pages of the book (as is possible with the AR version of The House that Jack Built by Gavin Bishop, see the right-hand side of the following screenshot)? In this book, we show you how to practically implement such experiences on the Android platform.

A decade ago, experienced researchers would have been among the few able to create these types of applications, which were generally limited to demonstration prototypes or ad hoc projects running for a limited period of time. Now, developing AR experiences has become a reality for a wide range of mobile software developers. Over the last few years, we have witnessed great progress in computational power, the miniaturization of sensors, and increasingly accessible and feature-rich multimedia libraries. These advances allow developers to produce AR applications more easily than ever before, and a growing number of AR applications are already flourishing on mobile app stores such as Google Play. While an enthusiastic programmer can easily stitch together some basic code snippets into a facsimile of an AR application, such applications are generally poorly designed, offer limited functionality, and are hardly reusable. To create sophisticated AR applications, one has to understand what Augmented Reality truly is.

In this chapter, we will guide you toward a better understanding of AR. We will describe some of the major concepts of AR. We will then move on from these examples to the foundational software components for AR. Finally, we will introduce the development tools that we will use throughout this book, which will support our journey into creating productive and modular AR software architecture.

Ready to change your reality for Augmented Reality? Let's start.

A quick overview of AR concepts


As AR has become increasingly popular in the media over the last few years, several distorted notions of Augmented Reality have unfortunately evolved. Anything that is somehow related to the real world and involves some computing, such as standing in front of a shop and watching 3D models wear the latest fashions, gets labeled as AR. Augmented Reality emerged from research labs a few decades ago, and different definitions of it have been produced. As more and more research fields (for example, computer vision, computer graphics, human-computer interaction, medicine, humanities, and art) have investigated AR as a technology, application, or concept, multiple overlapping definitions now exist. Rather than providing you with an exhaustive list of definitions, we will present some major concepts present in any AR application.

Sensory augmentation

The term Augmented Reality itself contains the notion of reality. Augmenting generally refers to influencing one of your human sensory systems, such as vision or hearing, with additional information. This information is generally defined as digital or virtual and is produced by a computer. Current technology uses displays to overlay and merge the physical information with the digital information. To augment your hearing, modified headphones or earphones equipped with microphones can mix sound from your surroundings in real time with sound generated by your computer. In this book, we will mainly look at visual augmentation.

Displays

The TV screen at home is the ideal device to perceive virtual content, streamed from broadcasts or played from your DVD. Unfortunately, most common TV screens are not able to capture the real world and augment it. An Augmented Reality display needs to simultaneously show the real and virtual worlds.

One of the first display technologies for AR was built by Ivan Sutherland in 1968 (and nicknamed "The Sword of Damocles"). The system was rigidly mounted on the ceiling and used CRT screens and a transparent display to create the sensation of visually merging the real and virtual.

Since then, we have seen different trends in AR display, going from static to wearable and handheld displays. One of the major trends is the usage of optical see-through (OST) technology. The idea is to still see the real world through a semi-transparent screen and project some virtual content on the screen. The merging of the real and virtual worlds does not happen on the computer screen, but directly on the retina of your eye, as depicted in the following figure:

The other major trend in AR displays is what we call video see-through (VST) technology. Here you perceive the world not directly, but through a video shown on a monitor. The video image is mixed with virtual content (as you would see in a movie) and sent to a standard display, such as your desktop screen, your mobile phone, or the upcoming generation of head-mounted displays, as shown in the following figure:

In this book, we will work on Android-driven mobile phones and, therefore, discuss only VST systems; the video camera used will be the one on the back of your phone.

Registration in 3D

With a display (OST or VST) in your hands, you can already superimpose content on your view of the real world, as you see in TV broadcasts with text banners at the bottom of the screen. However, any virtual content (such as text or images) will remain fixed in its position on the screen. Because the superimposition is static, your display acts as a head-up display (HUD), but is not really AR, as shown in the following figure:

Google Glass is an example of a HUD. While it uses a semitransparent screen like an OST display, the digital content remains in a static position.

AR needs to know more about real and virtual content. It needs to know where things are in space (registration) and follow where they are moving (tracking).

Registration is basically the idea of aligning virtual and real content in the same space. If you are into movies or sports, you will notice that 2D or 3D graphics are quite often superimposed onto scenes of the physical world. In ice hockey, the puck is often highlighted with a colored trail. In movies such as Walt Disney's TRON (1982 version), the real and virtual elements are seamlessly blended. However, AR differs from those effects, as it is based on all of the following aspects (proposed by Ronald T. Azuma in 1997):

  • It's in 3D: In the olden days, some movies were edited manually to merge virtual visual effects with real content. A well-known example is Star Wars, where all the lightsaber effects were painted by hand, frame by frame, by hundreds of artists. Nowadays, more sophisticated techniques support merging digital 3D content (such as characters or cars) with the video image (a process called match moving). AR inherently does this in a 3D space.

  • The registration happens in real time: In a movie, everything is pre-recorded and generated in a studio; you just play the media. In AR, everything happens in real time, so your application needs to merge reality and virtuality at every instant.

  • It's interactive: In a movie, you only look passively at the scene from where it was shot. In AR, you can actively move around, forward and backward, and turn your AR display, and you will still see an alignment between both worlds.
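These three aspects can be made concrete with a little math. A tracked camera pose (a rotation and a translation) maps a virtual, world-anchored point into camera space, and a pinhole projection then yields the screen position where it must be drawn; because the pose changes whenever the user moves, this computation has to be repeated every frame. The following is a minimal, self-contained sketch in plain Java; all names and numbers are illustrative and are not taken from any AR SDK:

```java
public class RegistrationSketch {

    // Transform a world point p into camera coordinates: p' = R * p + t
    static double[] worldToCamera(double[][] r, double[] t, double[] p) {
        double[] out = new double[3];
        for (int i = 0; i < 3; i++) {
            out[i] = r[i][0] * p[0] + r[i][1] * p[1] + r[i][2] * p[2] + t[i];
        }
        return out;
    }

    // Pinhole projection with focal length f and principal point (cx, cy).
    static double[] project(double[] pCam, double f, double cx, double cy) {
        return new double[] { cx + f * pCam[0] / pCam[2],
                              cy + f * pCam[1] / pCam[2] };
    }

    public static void main(String[] args) {
        // Identity rotation, camera 2 m in front of the tracked object.
        double[][] r = { {1, 0, 0}, {0, 1, 0}, {0, 0, 1} };
        double[] t = { 0, 0, 2 };
        double[] virtualPoint = { 0.5, 0, 0 };  // 50 cm right of the marker

        double[] cam = worldToCamera(r, t, virtualPoint);
        double[] px = project(cam, 800, 320, 240);  // VGA-like intrinsics

        System.out.printf("screen position: (%.1f, %.1f)%n", px[0], px[1]);
        // Whenever the user moves, R and t change and this must be redone,
        // which is why AR registration is a real-time, per-frame operation.
    }
}
```

Static screen-space overlays (the HUD case above) skip this step entirely, which is exactly why they do not stay aligned with the world.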

Interaction with the environment

Building a rich AR application needs interaction with the environment; otherwise, you end up with pretty 3D graphics that turn boring quite fast. AR interaction refers to selecting and manipulating digital and physical objects and navigating in the augmented scene. Rich AR applications allow you to use physical objects on your table to move virtual characters, use your hands to select floating virtual objects while walking down the street, or speak to a virtual agent appearing on your watch to arrange a meeting later in the day. In Chapter 6, Make It Interactive – Create the User Experience, we will discuss mobile AR interaction. We will look at how some standard mobile interaction techniques can also be applied to AR, and we will dig into specific techniques involving the manipulation of the real world.

Choose your style – sensor-based and computer vision-based AR

Previously in this chapter, we discussed what AR is and elaborated on display, registration, and interaction. While these notions apply to AR development in general, in this book we will specifically look at mobile AR.

Mobile AR sometimes refers to any transportable, wearable AR system that can be used indoors and outdoors. In this book, we will use the most popular connotation of mobile AR today: AR on handheld mobile devices, such as smartphones or tablets. With the current generation of smartphones, two major approaches to an AR system can be realized. These systems are characterized by their specific registration techniques as well as their interaction range, and each enables a different range of applications. Both systems, sensor-based AR and computer vision-based AR, use a video see-through display, relying on the camera and screen of the mobile phone.

Sensor-based AR

The first type of system is called sensor-based AR and is generally referred to as GPS plus inertial AR (or, sometimes, an outdoor AR system). Sensor-based AR uses the mobile device's location sensor together with its orientation sensors; combining the two delivers the global position and viewing direction of the user in the physical world.

The location sensor is mainly supported by a GNSS (Global Navigation Satellite System) receiver. The most widely used GNSS is GPS (maintained by the USA), which is present on most smartphones.

Note

Other systems are currently (or will soon be) deployed, such as GLONASS (Russia), Galileo (Europe, 2020), or Compass (China, 2020).

There are several orientation sensors available on handheld devices, such as accelerometers, magnetometers, and gyroscopes. The measured position and orientation of your handheld device provide the tracking information that is used for registering virtual objects on the physical scene. The position reported by the GPS module can be inaccurate and is updated more slowly than you move around. This can result in lag: when you make a fast movement, virtual elements seem to trail behind. One of the most popular types of AR applications built with sensor-based systems are AR browsers, which visualize Points of Interest (POIs), that is, simple graphical information about things around you. If you try some of the most popular products, such as Junaio, Layar, or Wikitude, you will probably observe this lag effect.
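The core computation behind such an AR browser can be sketched in a few lines: from the user's GPS fix and a POI's coordinates, derive the distance and compass bearing to the POI, then compare that bearing against the device heading (taken from the orientation sensors) to decide whether the POI falls inside the camera's field of view. Below is a minimal plain-Java sketch; the coordinates, heading, and field of view are made-up example values:

```java
public class PoiSketch {
    static final double EARTH_RADIUS = 6371000; // mean Earth radius, meters

    // Great-circle distance between two lat/lon points (haversine formula).
    static double distance(double lat1, double lon1, double lat2, double lon2) {
        double dLat = Math.toRadians(lat2 - lat1);
        double dLon = Math.toRadians(lon2 - lon1);
        double a = Math.sin(dLat / 2) * Math.sin(dLat / 2)
                 + Math.cos(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.sin(dLon / 2) * Math.sin(dLon / 2);
        return 2 * EARTH_RADIUS * Math.asin(Math.sqrt(a));
    }

    // Initial bearing from the user to the POI, degrees clockwise from north.
    static double bearing(double lat1, double lon1, double lat2, double lon2) {
        double dLon = Math.toRadians(lon2 - lon1);
        double y = Math.sin(dLon) * Math.cos(Math.toRadians(lat2));
        double x = Math.cos(Math.toRadians(lat1)) * Math.sin(Math.toRadians(lat2))
                 - Math.sin(Math.toRadians(lat1)) * Math.cos(Math.toRadians(lat2))
                 * Math.cos(dLon);
        return (Math.toDegrees(Math.atan2(y, x)) + 360) % 360;
    }

    public static void main(String[] args) {
        // Example: user in central Vienna, POI a few hundred meters northeast.
        double userLat = 48.2082, userLon = 16.3738;
        double poiLat = 48.2100, poiLon = 16.3770;

        double d = distance(userLat, userLon, poiLat, poiLon);
        double b = bearing(userLat, userLon, poiLat, poiLon);
        System.out.printf("POI at %.0f m, bearing %.0f deg%n", d, b);

        // The heading would come from the orientation sensors; the POI is
        // drawn only if its bearing lies inside the camera's field of view.
        double heading = 45, halfFov = 30;
        double off = Math.abs(((b - heading + 540) % 360) - 180);
        System.out.println("visible: " + (off <= halfFov));
    }
}
```

Noisy GPS fixes and slowly updating headings feed directly into this computation, which is where the lag described above comes from.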

The advantage of this technique is that sensor-based AR works on a global scale, in practically any outdoor position (whether you are in the middle of the desert or in a city). One of its limitations is that such systems work poorly or not at all indoors or in occluded areas (with no line of sight to the sky, such as in forests or on streets surrounded by high buildings). We will discuss more about this type of mobile AR system in Chapter 4, Locating in the World.

Computer vision-based AR

The other popular type of AR system is computer vision-based AR. The idea here is to leverage the power of the built-in camera for more than just capturing and displaying the physical world (as done in sensor-based AR). This approach generally relies on image processing and computer vision algorithms that analyze the image to detect any object visible to the camera. This analysis can provide information about the position of different objects and, therefore, of the user (more about that in Chapter 5, Same as Hollywood – Virtual on Physical Objects).

The advantage is that things appear perfectly aligned. Current technology allows you to recognize different types of planar pictorial content, such as specifically designed markers (marker-based tracking) or more natural content (markerless tracking). One of the disadvantages is that vision-based AR is computationally heavy and can drain the battery rapidly. Recent generations of smartphones, being better optimized for energy consumption, are more capable of handling this load.

AR architecture concepts

So let's explore how we can support the development of the previously described concepts and the two general AR systems. As in the development of any other application, some well-known concepts of software engineering can be applied in developing an AR application. We will look at the structural aspect of an AR application (software components) followed by the behavioral aspect (control flow).

AR software components

An AR application can be structured in three layers: the application layer, the AR layer, and the OS/Third Party layer.

The application layer corresponds to the domain logic of your application. If you want to develop an AR game, anything related to managing the game assets (characters, scenes, objects) or the game logic will be implemented in this specific layer. The AR layer corresponds to the instantiation of the concepts we've previously described. Each of the AR notions and concepts that we've presented (display, registration, and interaction) can be seen, in terms of software, as a modular element, a component, or a service of the AR layer.

You can note that we have separated tracking from registration in the figure, making tracking one major software component for an AR application. Tracking, which provides spatial information to the registration service, is a complex and computationally intensive process in any AR application. The OS/Third Party layer corresponds to existing tools and libraries which don't provide any AR functionalities, but will enable the AR layer. For example, the Display module for a mobile application will communicate with the OS layer to access the camera to create a view of the physical world. On Android, the Google Android API is part of this layer. Some additional libraries, such as JMonkeyEngine, which handle the graphics, are also part of this layer.
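To make the layering concrete, the AR layer can be sketched as a set of small, modular interfaces that the application layer wires together. The names below are ours, chosen only for illustration; they do not come from any SDK:

```java
// Each concept from this section (display, tracking, registration,
// interaction) becomes one modular component of the AR layer.
interface Display      { void renderFrame(); }        // camera view + overlay
interface Tracker      { float[] currentPose(); }     // spatial information
interface Registration { void align(float[] pose); }  // virtual-on-real alignment
interface Interaction  { void handleInput(); }

public class ArLayerSketch {
    public static void main(String[] args) {
        // Trivial wiring: the tracker feeds the registration service.
        // Real implementations talk to the OS/Third Party layer underneath
        // (camera API, sensors, JMonkeyEngine for graphics).
        Tracker tracker = () -> new float[] { 0, 0, 0, 1 };  // dummy pose
        Registration registration = pose ->
            System.out.println("aligning with pose of length " + pose.length);
        registration.align(tracker.currentPose());
    }
}
```

Keeping these components behind narrow interfaces is what makes an AR application reusable, in contrast to the stitched-together snippets criticized earlier in this chapter.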

In the rest of the book, we will show you how to implement the different modules of the AR layer, which also involves communication with the OS/Third Party layer. The major layers of an AR application (see the right-hand side of the following figure), with their application modules (the left-hand side of the following figure), are depicted in the following figure:

AR control flow

With the concept of software layers and components in mind, we can now look at how information will flow in a typical AR application. We will focus here on describing how each of the components of the AR layer relate to each other over time and what their connections with the OS/Third Party layer are.

Over the last decade, AR researchers and developers have converged toward a well-used method of combining these components using a similar order of execution—the AR control flow. We present here the general AR control flow used by the community and summarized in the following figure:

The preceding figure, read from the bottom up, shows the main activities of an AR application. This sequence is repeated indefinitely in an AR application; it can be seen as the typical AR main loop (please note that we've excluded the domain logic here as well as the OS activities). Each activity corresponds to the same module we've presented before. The structure of the AR layer and AR control flow is, therefore, quite symmetric.

Understand that this control flow is the key to developing an AR application, so we will come back to it and use it in the rest of the book. We will get into more details of each of the components and steps in the next chapter.

So, looking at the preceding figure, the main activities and steps in your application are as follows:

  • Manage the display first: For mobile AR, this means accessing the video camera and showing a captured image on the screen (a view of your physical world). We will discuss that in Chapter 2, Viewing the World. This also involves matching camera parameters between the physical camera and the virtual one that renders your digital objects (Chapter 3, Superimposing the World).

  • Register and track your objects: Analyze the sensors on your mobile (approach 1) or analyze the video image (approach 2) and detect the position of each element of your world (such as camera or objects). We will discuss this aspect in Chapter 4, Locating in the World and Chapter 5, Same as Hollywood – Virtual on Physical Objects.

  • Interact: Once your content is correctly registered, you can start to interact with it, as we will discuss in Chapter 6, Make It Interactive – Create the User Experience.
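The three activities above can be sketched as a single main loop, one pass per frame. This is only an illustration of the order of the steps; the real implementation of each step is the subject of the chapters named above:

```java
public class ArMainLoop {
    // Step 2 stand-in: registration and tracking produce a pose per frame
    // (a dummy value here; Chapters 4 and 5 cover the real approaches).
    static double[] trackPose(int frame) {
        return new double[] { frame * 0.1, 0, 0 };
    }

    public static void main(String[] args) {
        // A real application repeats this loop indefinitely; we run three
        // iterations to show the order of execution.
        for (int frame = 0; frame < 3; frame++) {
            // 1. Manage the display: grab and show a camera image (Ch. 2, 3).
            String image = "cameraFrame" + frame;
            // 2. Register and track your objects (Ch. 4, 5).
            double[] pose = trackPose(frame);
            // 3. Interact with the registered content (Ch. 6).
            System.out.printf("frame %d: showing %s, pose x=%.1f%n",
                              frame, image, pose[0]);
        }
    }
}
```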

System requirements for development and deployment

If you want to develop Augmented Reality applications for Android, you can share the majority of tools with regular Android developers. Specifically, you can leverage the power of the widely supported Google Android Developer Tools Bundle (ADT Bundle). This includes the following:

  • The Eclipse Integrated Development Environment (IDE)

  • The Google Android Developer Tools (ADT) plugin for Eclipse

  • The Android platform for your targeted devices (further platforms can be downloaded)

  • The Android Emulator with the latest system image

Besides this standard package common to many Android development environments, you will need the following:

  • A snapshot of JMonkeyEngine (JME), version 3 or higher

  • Qualcomm® VuforiaTM SDK (VuforiaTM), version 2.6 or higher

  • Android Native Development Kit (Android NDK), version r9 or higher

The JME Java OpenGL® game engine is a free toolkit that brings the 3D graphics in your programs to life. It provides 3D graphics and gaming middleware that frees you from exclusively coding in low-level OpenGL® ES (OpenGL® for Embedded Systems), for example, by providing an asset system for importing models, predefined lighting, and physics and special effects components.

The Qualcomm® VuforiaTM SDK brings state-of-the art computer vision algorithms targeted at recognizing and tracking a wide variety of objects, including fiducials (frame markers), image targets, and even 3D objects. While it is not needed for sensor-based AR, it allows you to conveniently implement computer vision-based AR applications.

The Google Android NDK is a toolset for performance-critical applications. It allows parts of the application to be written in native-code languages (C/C++). While you don't need to code in C or C++, this toolset is required by VuforiaTM SDK.

Of course, you are not bound to a specific IDE and can work with command-line tools as well. The code snippets themselves, which we present in this book, do not rely on the use of a specific IDE. However, within this book, we will give you setup instructions specifically for the popular Eclipse IDE. Furthermore, all development tools can be used on Windows (XP or later), Linux, and Mac OS X (10.7 or later).

On the next pages, we will guide you through the installation processes of the Android Developer Tools Bundle, NDK, JME, and VuforiaTM SDK. While the development tools can be spread throughout the system, we recommend that you use a common base directory for both the development tools and the sample code; let's call it AR4Android (for example, C:/AR4Android under Windows or /opt/AR4Android under Linux or Mac OS X).

Installing the Android Developer Tools Bundle and the Android NDK

You can install the ADT Bundle in two easy steps as follows:

  1. Download the ADT Bundle from http://developer.android.com/sdk/index.html.

  2. After downloading, unzip adt-bundle-<os_platform>.zip into the AR4Android base directory.

You can then start the Eclipse IDE by launching AR4Android/adt-bundle-<os_platform>/eclipse/eclipse(.exe).

Tip

Please note that you might need to install additional system images, depending on the devices you use (for example, version 2.3.5, or 4.0.1). You can follow the instructions given at the following website: http://developer.android.com/tools/help/sdk-manager.html.

For the Android NDK (version r9 or higher), you follow a similar procedure as follows:

  1. Download it from http://developer.android.com/tools/sdk/ndk/index.html.

  2. After downloading, unzip android-ndk-r<version>Y-<os_platform>.(zip|bz2) into the AR4Android base directory.

Installing JMonkeyEngine

JME is a powerful Java-based 3D game engine. It comes with its own development environment (the JME IDE, based on NetBeans), which is targeted at the development of desktop games. While the JME IDE also supports deployment to Android devices, it (at the time of writing) lacks the integration of convenient Android SDK tools such as the Android Debug Bridge (adb), the Dalvik Debug Monitor Server (DDMS) view, and the Android Emulator integration found in the ADT Bundle. So, instead of using the JME IDE, we will integrate the base libraries into our AR projects in Eclipse. The easiest way to obtain the JME libraries is to download the SDK for your operating system from http://jmonkeyengine.org/downloads and install it into the AR4Android base directory (or your own developer directory; just make sure you can easily access it later in your projects). At the time of publication, there are three packages: Windows, GNU/Linux, and Mac OS X.

Tip

You can also obtain most recent versions from http://updates.jmonkeyengine.org/nightly/3.0/engine/

You need only the Java libraries of JME (.jar) for the AR development, using the ADT Bundle. If you work on Windows or Linux, you can include them in any existing Eclipse project by performing the following steps:

  1. Right-click on your AR project (which we will create in the next chapter) or any other project in the Eclipse explorer and go to Build Path | Add External Archives.

  2. In the JAR selection dialog, browse to AR4Android/jmonkeyplatform/jmonkeyplatform/libs.

  3. You can select all JARs in the lib directory and click on Open.

If you work on Mac OS X, you should extract the libraries from jmonkeyplatform.app before applying the same procedure as described for Windows or Linux in the preceding section. To extract the libraries, right-click on jmonkeyplatform.app, select Show Package Contents, and you will find the libraries in /Applications/jmonkeyplatform.app/Contents/Resources/.

Please note that, in the context of this book, we use only a few of these libraries. In the Eclipse projects accompanying the book's source code, you will find the necessary JARs already in the local lib directories, containing the subset of Java libraries needed to run the examples. You can also reference them in your build path.

Note

The reference documentation for using JME with Android can be found at http://hub.jmonkeyengine.org/wiki/doku.php/jme3:android.

Installing VuforiaTM

VuforiaTM is a state-of-the-art library for computer vision recognition and natural feature tracking.

In order to download and install VuforiaTM, you first have to register at https://developer.vuforia.com/user/register. Afterwards, you can download the SDK (for Windows, Linux, or Mac OS X) from https://developer.vuforia.com/resources/sdk/android. Create a folder named AR4Android/ThirdParty. Then create an Eclipse project named ThirdParty by going to File | New | Project ... and choosing the folder AR4Android/ThirdParty as its location (see also the section Creating an Eclipse project in Chapter 2, Viewing the World). Finally, install the VuforiaTM SDK into AR4Android/ThirdParty/vuforia-sdk-android-<VERSION>. For the examples in Chapter 5, Same as Hollywood – Virtual on Physical Objects and Chapter 6, Make It Interactive – Create the User Experience, you will need to reference this ThirdParty Eclipse project.

Which Android devices should you use?

The Augmented Reality applications you will learn to build will run on a wide variety of Android-powered smartphone and tablet devices. However, depending on the specific algorithms we introduce, certain hardware requirements should be met. Specifically, the Android device needs to have the following features:

  • A back-facing camera for all examples in this book

  • A GPS module for the sensor-based AR examples

  • A gyroscope or linear accelerometers for the sensor-based AR examples

Augmented Reality on mobile phones can be challenging, as many integrated sensors have to be active while applications run and computationally demanding algorithms are executed. Therefore, we recommend deploying your apps on a device with a dual-core processor (or more cores) for the best AR experience. The earliest Android version to target should be 2.3.3 (API 10, Gingerbread). This gives your AR app potential outreach to approximately 95 percent of all Android devices.
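These requirements can be declared in your app's AndroidManifest.xml so that app stores filter out unsuitable devices. The following excerpt is only an illustrative sketch of such declarations; adjust the permissions and features to the examples you actually build:

```xml
<!-- Illustrative AndroidManifest.xml excerpt; adapt to your application. -->
<uses-sdk android:minSdkVersion="10" />

<!-- Back-facing camera, needed by all examples in this book -->
<uses-permission android:name="android.permission.CAMERA" />
<uses-feature android:name="android.hardware.camera" />

<!-- GPS and motion sensors, needed by the sensor-based AR examples -->
<uses-permission android:name="android.permission.ACCESS_FINE_LOCATION" />
<uses-feature android:name="android.hardware.location.gps" />
<uses-feature android:name="android.hardware.sensor.gyroscope"
              android:required="false" />
```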

Please make sure to set up your device for development as described at http://developer.android.com/tools/device.html.

In addition, most AR applications, specifically the computer vision-based ones (using VuforiaTM), require sufficient processing power.

Summary


In this chapter, we introduced the foundational background of AR. We've presented some of the main concepts of AR, such as sensory augmentation, dedicated display technology, real-time spatial registration of physical and digital information, and interaction with the content.

We've also presented computer vision-based and sensor-based AR systems, the two major trends of architecture on mobile devices. The basic software architecture blocks of an AR application have also been described and will be used as a guide for the remaining presentation of this book. By now, you should have installed the third-party tools used in the coming chapters. In the next chapter, you will get started with viewing the virtual world and implementing camera access with JME.


Key benefits

  • Understand the main concepts and architectural components of an AR application
  • Step-by-step learning through hands-on programming combined with a background of important mathematical concepts
  • Efficiently and robustly implement some of the main functional AR aspects

Description

Augmented Reality offers the magical effect of blending the physical world with the virtual world, bringing applications from your screen into your hands. AR redefines advertising and gaming, as well as education, and it will soon become a technology that mobile application developers will have to master as a necessity.

Augmented Reality for Android Application Development enables you to implement sensor-based and computer vision-based AR applications on Android devices. You will learn about the theoretical foundations and practical details of implemented AR applications, and you will be provided with hands-on examples that will enable you to quickly develop and deploy novel AR applications on your own.

This book will help you learn the basics of developing mobile AR browsers, show you how to integrate and animate 3D objects easily with the JMonkeyEngine, how to unleash the power of computer vision-based AR using the Vuforia AR SDK, and teach you about popular interaction metaphors. You will get comprehensive knowledge of how to implement a wide variety of AR apps using hands-on examples. It will also make you aware of how to use the AR engine, Android layout, and overlays, and how to use ARToolkit. Finally, you will be able to apply this knowledge to make a stunning AR application.

What you will learn

  • Decide which AR approach is right for you: sensor-based or computer vision-based
  • Get camera access on Android
  • Overlay 3D objects on physical images with the JMonkeyEngine
  • Learn how to use the GPS sensor to locate yourself in the world
  • Master orientation sensors
  • Learn the building blocks of implementing Augmented Reality browsers
  • Understand the power of the Vuforia SDK for computer vision-based AR
  • Enable user interaction with augmented objects


Table of Contents

14 Chapters
Augmented Reality for Android Application Development
Credits
About the Authors
About the Reviewers
www.PacktPub.com
Preface
Augmented Reality Concepts and Tools
Viewing the World
Superimposing the World
Locating in the World
Same as Hollywood – Virtual on Physical Objects
Make It Interactive – Create the User Experience
Further Reading and Tips
Index


FAQs

What is the delivery time and cost of the print book?

Shipping Details

USA:

Economy: Delivery to most addresses in the US within 10-15 business days

Premium: Trackable Delivery to most addresses in the US within 3-8 business days

UK:

Economy: Delivery to most addresses in the U.K. within 7-9 business days.
Shipments are not trackable

Premium: Trackable delivery to most addresses in the U.K. within 3-4 business days.
Add one extra business day for deliveries to Northern Ireland and the Scottish Highlands and islands.

EU:

Premium: Trackable delivery to most EU destinations within 4-9 business days.

Australia:

Economy: Can deliver to P.O. Boxes and private residences.
Trackable service with delivery to addresses in Australia only.
Delivery time ranges from 7-9 business days for VIC and 8-10 business days for interstate metro areas.
Delivery time is up to 15 business days for remote areas of WA, NT & QLD.

Premium: Delivery to addresses in Australia only.
Trackable delivery to most P.O. Boxes and private residences in Australia within 4-5 days of dispatch, depending on the distance to the destination.

India:

Premium: Delivery to most Indian addresses within 5-6 business days

Rest of the World:

Premium: Countries in the American continent: Trackable delivery to most countries within 4-7 business days

Asia:

Premium: Delivery to most Asian addresses within 5-9 business days

Disclaimer:
All orders received before 5 PM U.K. time begin printing the next business day, so the estimated delivery times also start from the next day. Orders received after 5 PM U.K. time (in our internal systems) on a business day, or at any time on a weekend, begin printing on the second business day after receipt. For example, an order placed at 11 AM today will begin printing tomorrow, whereas an order placed at 9 PM tonight will begin printing the day after tomorrow.


Unfortunately, due to several restrictions, we are unable to ship to the following countries:

  1. Afghanistan
  2. American Samoa
  3. Belarus
  4. Brunei Darussalam
  5. Central African Republic
  6. The Democratic Republic of Congo
  7. Eritrea
  8. Guinea-Bissau
  9. Iran
  10. Lebanon
  11. Libyan Arab Jamahiriya
  12. Somalia
  13. Sudan
  14. Russian Federation
  15. Syrian Arab Republic
  16. Ukraine
  17. Venezuela
What is a customs duty/charge?

Customs duties are charges levied on goods when they cross international borders; they are a tax imposed on imported goods. These duties are collected by special authorities and bodies created by local governments and are meant to protect local industries, economies, and businesses.

Do I have to pay customs charges for the print book order?

Orders shipped to countries listed under the EU27 will not incur customs charges; these are paid by Packt as part of the order.

List of EU27 countries: www.gov.uk/eu-eea

A customs duty or localized taxes may be applicable on shipments to countries outside the EU27. These are charged by the recipient country, must be paid by the customer, and are not included in the shipping charges on the order.

How do I know my customs duty charges?

The amount of duty payable varies greatly depending on the imported goods, the country of origin, and several other factors, such as the total invoice amount, dimensions like weight, and other criteria applicable in your country.

For example:

  • If you live in Mexico and the declared value of your ordered items is over $50, you will have to pay an additional import tax of 19%, i.e. $9.50, to the courier service in order to receive the package.
  • Whereas if you live in Turkey and the declared value of your ordered items is over €22, you will have to pay an additional import tax of 18%, i.e. €3.96, to the courier service in order to receive the package.
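The duty in both examples above is simply the import-tax rate applied to the declared value of the order. As a rough sketch (the rates and values here are purely illustrative and vary by country and over time):

```python
def import_duty(declared_value, tax_rate):
    """Duty payable: the import-tax rate applied to the declared value.

    Rates and duty-free thresholds are illustrative only; check your
    country's current customs rules for real figures.
    """
    return round(declared_value * tax_rate, 2)

# The two FAQ examples above:
print(import_duty(50.0, 0.19))  # Mexico: 9.5 (USD)
print(import_duty(22.0, 0.18))  # Turkey: 3.96 (EUR)
```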
How can I cancel my order?

Cancellation Policy for Published Printed Books:

You can cancel any order within 1 hour of placing it. Simply contact customercare@packt.com with your order details or payment transaction ID. If your order has already entered the shipment process, we will do our best to stop it. However, if it is already on its way to you, you can contact us at customercare@packt.com when you receive it and use the returns and refund process.

Please understand that Packt Publishing cannot provide refunds or cancel any order except in the cases described in our Return Policy (i.e. Packt Publishing agrees to replace your printed book because it arrives damaged or has a material defect); otherwise, Packt Publishing will not accept returns.

What is your returns and refunds policy?

Return Policy:

We want you to be happy with your purchase from Packtpub.com. We will not hassle you with returning print books to us. If the print book you receive from us is incorrect, damaged, doesn't work, or is unacceptably late, please contact the Customer Relations Team at customercare@packt.com with the order number and issue details, as explained below:

  1. If you ordered an eBook, Video, or Print Book incorrectly or accidentally, please contact the Customer Relations Team at customercare@packt.com within one hour of placing the order and we will replace or refund you the item cost.
  2. If your eBook or Video file is faulty, or a fault occurs while the eBook or Video is being made available to you (i.e. during download), please contact the Customer Relations Team within 14 days of purchase at customercare@packt.com, who will be able to resolve the issue for you.
  3. You will have a choice of replacement or refund for the problem items (damaged, defective, or incorrect).
  4. Once the Customer Care Team confirms that you will be refunded, you should receive the refund within 10 to 12 working days.
  5. If you are requesting a refund for only one book from a multiple-item order, we will refund the appropriate single item.
  6. Where the items were shipped under a free shipping offer, there will be no shipping costs to refund.

On the off chance your printed book arrives damaged or with a material defect, contact our Customer Relations Team at customercare@packt.com within 14 days of receipt of the book with appropriate evidence of the damage, and we will work with you to secure a replacement copy if necessary. Please note that each printed book you order from us is individually made by Packt's professional book-printing partner on a print-on-demand basis.

What tax is charged?

Currently, no tax is charged on the purchase of any print book (subject to change based on applicable laws and regulations). A localized VAT fee is charged only to our European and UK customers on the eBooks, Videos, and subscriptions that they buy. GST is charged to Indian customers for eBook and Video purchases.

What payment methods can I use?

You can pay with the following card types:

  1. Visa Debit
  2. Visa Credit
  3. MasterCard
  4. PayPal