Who First Came Up With Moving Averages?
Technical analysts have been using moving averages for several decades. They are so ubiquitous in our work that most of us do not know where they came from.
Statisticians categorize moving averages within a family of tools for "time series analysis". Other members of that family include:
ANOVA, Arithmetic Mean, Correlation Coefficient, Covariance, Difference Table, Least Squares Fitting, Maximum Likelihood, Moving Average, Periodogram, Prediction Theory, Random Variable, Random Walk, Residual, Variance.
You can read more about each of these and their definitions at Wolfram.
Development of the “moving average” dates back to 1901, although the name was applied to it later. From math historian Jeff Miller:
MOVING AVERAGE. This technique for smoothing data points was used for decades before this, or any general term, came into use. In 1909 G. U. Yule (Journal of the Royal Statistical Society, 72, 721-730) described the “instantaneous averages” R. H. Hooker calculated in 1901 as “moving-averages.” Yule did not adopt the term in his textbook, but it entered circulation through W. I. King’s Elements of Statistical Method (1912).
“Moving average” referring to a type of stochastic process is an abbreviation of H. Wold’s “process of moving average” (A Study in the Analysis of Stationary Time Series (1938)). Wold described how special cases of the process had been studied in the 1920s by Yule (in connection with the properties of the variate difference correlation method) and Slutsky [John Aldrich].
From StatSoft Inc. comes this description of Exponential Smoothing, which is one of several techniques for weighting past data differently:
“Exponential smoothing has become very popular as a forecasting method for a wide variety of time series data. Historically, the method was independently developed by [Robert Goodell] Brown and [Charles] Holt. Brown worked for the US Navy during World War II, where his assignment was to design a tracking system for fire-control information to compute the location of submarines. Later, he applied this technique to the forecasting of demand for spare parts (an inventory control problem). He described those ideas in his 1959 book on inventory control. Holt’s research was sponsored by the Office of Naval Research; independently, he developed exponential smoothing models for constant processes, processes with linear trends, and for seasonal data.”
Brown’s book, Smoothing, Forecasting and Prediction of Discrete Time Series, was republished in 2004.
Holt’s paper, “Forecasting Seasonals and Trends by Exponentially Weighted Moving Averages”, was published in 1957 as O.N.R. Research Memorandum 52, Carnegie Institute of Technology. It is not available online for free, but may be accessible through academic paper databases.
See also Forecasting With Exponential Smoothing, by Hyndman, Koehler, Ord, and Snyder.
To our knowledge, P. N. (Pete) Haurlan was the first to use exponential smoothing for tracking stock prices. Haurlan was an actual rocket scientist who worked for JPL in the early 1960s, and thus he had access to a computer. He did not call them “exponential moving averages (EMAs)”, or the mathematically fashionable “exponentially weighted moving averages (EWMAs)”. Instead he called them “Trend Values”, and referred to them by their smoothing constants. Thus, what today is commonly called a 19-day EMA, he called a “10% Trend”. Because his terminology was the original for such use in stock price tracking, we continue to use it in our work.
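The two naming conventions describe the same filter. Under the common convention (an assumption here, not something Haurlan's booklet spells out), an N-day EMA uses smoothing constant alpha = 2 / (N + 1), so a 10% Trend (alpha = 0.10) corresponds to a 19-day EMA. A small Python sketch of the conversion:

```python
def ema_length_to_alpha(n_days):
    """Smoothing constant for an N-day EMA, using the common
    alpha = 2 / (N + 1) convention."""
    return 2.0 / (n_days + 1)

def alpha_to_ema_length(alpha):
    """Equivalent EMA length in days for a given smoothing constant."""
    return 2.0 / alpha - 1.0

# A 19-day EMA has alpha = 2/20 = 0.10 -- Haurlan's "10% Trend".
print(ema_length_to_alpha(19))    # -> 0.1
print(alpha_to_ema_length(0.10))  # -> 19.0 (up to floating-point rounding)
```

So a "5% Trend" corresponds to a 39-day EMA, and a "20% Trend" to a 9-day EMA, under the same convention.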
Haurlan had employed EMAs in designing the tracking systems for rockets, which might, for example, need to intercept a moving object such as a satellite or planet. If the path to the target was off, some input would need to be applied to the steering mechanism, but the engineers did not want to over- or under-apply that input and either make the vehicle unstable or fail to make the turn. Thus, the right sort of smoothing of data inputs was helpful. Haurlan called this “Proportional Control”, meaning that the steering mechanism would not try to adjust out all of the tracking error at once.
EMAs were easier to implement in early analog circuitry than other types of filters because they require only two pieces of variable data: the current input value (e.g. price, position, angle, etc.) and the prior EMA value. The smoothing constant would be hard-wired into the circuitry, so the “memory” would only have to keep track of those two variables. A simple moving average, on the other hand, requires keeping track of every value within the lookback period. A 50-day SMA, for example, means storing 50 data points and averaging them, which ties up far more memory and processing power.
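This difference in state requirements is easy to see in code. The following is a minimal Python sketch (not Haurlan's implementation, which the source does not describe): the EMA update touches only the new value and the prior EMA, while the SMA must retain its entire lookback window.

```python
from collections import deque

def ema_update(price, prev_ema, alpha=0.10):
    """One EMA step. Only two variables are needed: the current input
    and the prior EMA. alpha = 0.10 is a "10% Trend" (a 19-day EMA).
    An equivalent form of the same recursion is:
        prev_ema + alpha * (price - prev_ema)
    which shows the "proportional" adjustment toward the new input."""
    return alpha * price + (1.0 - alpha) * prev_ema

class SimpleMovingAverage:
    """An SMA, by contrast, must store every value in its lookback
    window -- 50 data points for a 50-day SMA."""
    def __init__(self, length):
        self.window = deque(maxlen=length)

    def update(self, price):
        self.window.append(price)
        return sum(self.window) / len(self.window)
```

For example, starting from a prior EMA of 100.0, a new price of 101.0 with a 10% smoothing constant moves the Trend Value only one-tenth of the way, to 100.1.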
See more about EMAs versus Simple Moving Averages (SMAs) at Exponential Versus Simple.
Haurlan founded the Trade Levels newsletter in the 1960s, leaving JPL for that more lucrative work. His newsletter was a sponsor of the Charting The Market TV show on KWHY-TV in Los Angeles, the first-ever TA television show, hosted by Gene Morgan. The work of Haurlan and Morgan was a big part of the inspiration behind Sherman and Marian McClellan’s development of the McClellan Oscillator and Summation Index, which involve exponential smoothing of Advance-Decline data.
You can read Measuring Trend Values, a 1968 booklet published by Haurlan, starting on page 8 of the MTA Award Handout, which we prepared for attendees at the 2004 MTA conference where Sherman and Marian were awarded the MTA’s Lifetime Achievement Award. Haurlan does not list the origin of that mathematical technique, but notes that it had been in use in aerospace engineering for many years.