Practical Predictive Analytics

Product type: Book
Published: June 2017
Publisher: Packt
ISBN-13: 9781785886188
Pages: 576
Edition: 1st
Author: Ralph Winters

Table of Contents (19 chapters)

Title Page
Credits
About the Author
About the Reviewers
www.PacktPub.com
Customer Feedback
Preface
1. Getting Started with Predictive Analytics
2. The Modeling Process
3. Inputting and Exploring Data
4. Introduction to Regression Algorithms
5. Introduction to Decision Trees, Clustering, and SVM
6. Using Survival Analysis to Predict and Analyze Customer Churn
7. Using Market Basket Analysis as a Recommender Engine
8. Exploring Health Care Enrollment Data as a Time Series
9. Introduction to Spark Using R
10. Exploring Large Datasets Using Spark
11. Spark Machine Learning - Regression and Cluster Models
12. Spark Models – Rule-Based Learning

Exponential moving average


For a simple moving average (SMA), equal weight is given to all data points in the window, regardless of how old or how recent they are. An exponential moving average (EMA) gives more weight to recent data, under the assumption that the future is more likely to resemble the recent past than the more distant past.
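
As a quick illustration of that equal weighting (a base R sketch; the sma() helper below is not from the book), a three-period SMA gives each of the last three observations the same weight of 1/3:

# Simple moving average: each of the last n observations gets the same weight, 1/n.
sma <- function(x, n) {
  as.numeric(stats::filter(x, rep(1 / n, n), sides = 1))
}

sma(c(10, 12, 11, 15, 14, 18, 20), n = 3)
# NA NA 11.00 12.67 13.33 15.67 17.33 (approximately)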

The EMA is actually a fairly simple calculation. An EMA begins by calculating a simple moving average. Once it reaches the specified number of lookback periods (n), it computes each new value by assigning different weights to the current observation and to the previous EMA value.
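
In other words, once the first value has been seeded with the simple moving average, each subsequent value follows the recursion EMA[t] = ratio * x[t] + (1 - ratio) * EMA[t - 1]. A minimal base R sketch of that recursion follows (illustrative only, not the book's code; the default of ratio = 2 / (n + 1) is a common convention assumed here):

# Exponential moving average sketch: seed with an SMA over the first n points,
# then weight each new observation by 'ratio' and the previous EMA by (1 - ratio).
ema <- function(x, n, ratio = 2 / (n + 1)) {
  stopifnot(length(x) >= n, ratio >= 0, ratio <= 1)
  out <- rep(NA_real_, length(x))
  out[n] <- mean(x[1:n])                      # seed with the simple moving average
  for (t in seq_along(x)[-seq_len(n)]) {
    out[t] <- ratio * x[t] + (1 - ratio) * out[t - 1]
  }
  out
}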

This weighting is specified by the smoothing (or ratio) factor. When ratio = 1, the predicted value is based entirely upon the last observed value. When ratio = 0, the prediction is based upon the average of the entire lookback period. Therefore, the closer the smoothing factor is to 1, the more weight is given to recent data. If you want to give additional weight to older data, decrease...
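
Using the ema() sketch above, the two extremes behave as described: with ratio = 1 each value simply tracks the most recent observation, while a ratio near 0 barely moves away from the initial average:

x <- c(10, 12, 11, 15, 14, 18, 20)
ema(x, n = 3, ratio = 1)    # after the seed, each value equals the latest observation
ema(x, n = 3, ratio = 0.1)  # a small ratio keeps the estimate close to the starting average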
