Time Series Basics

Learn time series fundamentals: trend, seasonality, noise, moving averages, autocorrelation, and why time-ordered data requires special methods.

Data With Memory

Most statistics assumes observations are independent: one data point doesn't affect another. But time series data violates this assumption fundamentally.

Today's stock price depends on yesterday's. This month's sales are correlated with last month's. Temperature now is similar to temperature an hour ago.

Time series analysis handles data where order matters and observations are dependent.

Components of Time Series

Most time series can be decomposed into:

1. Trend: Long-term increase or decrease
2. Seasonality: Regular patterns that repeat (daily, weekly, yearly)
3. Cycles: Irregular fluctuations (business cycles, not fixed period)
4. Noise: Random variation

Retail Sales

Monthly sales at a clothing store:

Trend: Gradual increase over years (business growing)
Seasonality: Spikes in December (holidays), drops in January
Cycles: Economic recessions cause multi-year downturns
Noise: Random day-to-day variation

Understanding these components helps forecast future sales and remove patterns to analyze underlying changes.
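The four components can be illustrated with a toy additive model. This is a minimal sketch with synthetic, invented numbers; real decompositions use tools such as statsmodels' `seasonal_decompose`:

```python
# Toy additive decomposition: series = trend + seasonality + noise.
# All numbers are synthetic, invented for illustration.
import math

months = list(range(36))                       # 3 years of monthly data
trend = [100 + 2 * t for t in months]          # steady growth
seasonal = [20 * math.sin(2 * math.pi * t / 12) for t in months]  # yearly cycle
noise = [((t * 7919) % 13) - 6 for t in months]  # deterministic stand-in for noise

sales = [tr + s + n for tr, s, n in zip(trend, seasonal, noise)]

# Recover the trend with a 12-month moving average: averaging over one
# full seasonal period cancels the seasonal component almost exactly.
est_trend = [sum(sales[t - 6:t + 6]) / 12 for t in range(6, 30)]
```

Subtracting the estimated trend from `sales` leaves the seasonal pattern plus noise, which is exactly how a simple decomposition separates the components.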

Trend

Trend is the long-term direction: upward, downward, or flat.

How to identify:

  • Plot the data over time — is there a clear direction?
  • Calculate moving average to smooth out noise
  • Fit a regression line (linear trend) or curve (non-linear)
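The regression-line approach can be sketched in plain Python with ordinary least squares (the data here is invented for illustration):

```python
# Fit a linear trend y = a + b*t by ordinary least squares (pure Python).
def linear_trend(y):
    n = len(y)
    t_mean = (n - 1) / 2            # mean of t = 0, 1, ..., n-1
    y_mean = sum(y) / n
    # slope = cov(t, y) / var(t)
    num = sum((t - t_mean) * (yt - y_mean) for t, yt in enumerate(y))
    den = sum((t - t_mean) ** 2 for t in range(n))
    b = num / den
    a = y_mean - b * t_mean
    return a, b

# A noiseless upward trend is recovered exactly:
a, b = linear_trend([10, 12, 14, 16, 18])
print(a, b)  # → 10.0 2.0
```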

Why it matters: Separating trend from short-term fluctuations prevents overreacting to noise.

Stock Prices

A stock jumps 5% one day. Is this:

  • Part of an upward trend?
  • Random noise around a flat trend?
  • A reversal of a downward trend?

Without analyzing the trend, you can't tell. Time series analysis separates signal (trend) from noise.

Seasonality

Regular patterns that repeat at fixed intervals:

  • Daily (traffic peaks morning/evening)
  • Weekly (restaurant busy on weekends)
  • Monthly (bills due, paychecks arrive)
  • Yearly (holidays, weather, school schedules)

Ice Cream Sales

Monthly ice cream sales show clear seasonality:

  • Summer: High sales
  • Winter: Low sales
  • This repeats every year

Seasonal adjustment: Remove the seasonal pattern to see underlying trends. If this July's sales are up 10% from last July, is the business growing? Or just seasonal?

Unemployment, GDP, retail sales — most economic data is seasonally adjusted to remove predictable patterns and reveal real changes.
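One simple way to adjust is to estimate each month's seasonal offset as its average deviation from the overall mean, then subtract it. This is only a sketch with invented numbers; official statistics use far more sophisticated methods such as X-13ARIMA-SEATS:

```python
# Seasonal adjustment sketch: subtract each month's average deviation.
# Two years of invented monthly sales with a December spike and January dip.
sales = [100, 80, 95, 100, 105, 100, 95, 100, 105, 100, 110, 150,
         110, 90, 105, 110, 115, 110, 105, 110, 115, 110, 120, 160]

overall = sum(sales) / len(sales)
# Seasonal index for month m: its average deviation across the two years.
seasonal = [(sales[m] + sales[m + 12]) / 2 - overall for m in range(12)]
adjusted = [x - seasonal[t % 12] for t, x in enumerate(sales)]
```

After adjustment the December spike vanishes, and what remains is the real signal: year two runs a flat 10 units above year one, i.e., the business actually grew.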

Moving Averages

Average of the last k observations. As time moves forward, the window slides (hence "moving").

Purpose: Smooth out short-term fluctuations to see the underlying trend.

7-Day Moving Average

Daily website traffic: [100, 150, 120, 200, 110, 130, 140]

7-day MA (centered): average of all 7 = 135.7

Next day, the window slides: [150, 120, 200, 110, 130, 140, 160]; 7-day MA = 144.3

The moving average smooths out daily spikes and drops, revealing the trend more clearly.
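A moving average is a one-liner in Python; this sketch reproduces the two averages from the example above:

```python
# Sliding-window average: one value per full window of k consecutive days.
def moving_average(values, k):
    return [sum(values[i:i + k]) / k for i in range(len(values) - k + 1)]

# The traffic numbers from the example, with day 8 (160) appended.
traffic = [100, 150, 120, 200, 110, 130, 140, 160]
print([round(m, 1) for m in moving_average(traffic, 7)])  # → [135.7, 144.3]
```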

COVID-19 reporting used 7-day MAs to smooth out weekend reporting delays and see true trends.

Choosing k:

  • Larger k: Smoother, but lags current data more
  • Smaller k: More responsive, but noisier
  • k = 7 smooths out day-of-week patterns in daily data; k = 12 smooths out yearly seasonality in monthly data

Autocorrelation

Correlation of a time series with itself at a lagged time.

Lag 1: Correlation between Xₜ and Xₜ₋₁ (today vs yesterday)
Lag 7: Correlation between Xₜ and Xₜ₋₇ (today vs last week)

High autocorrelation means the series has memory — past values predict future values.

Examples:

  • Stock prices: High lag-1 autocorrelation (today's price ≈ yesterday's)
  • Temperature: High autocorrelation (weather changes slowly)
  • Coin flips: Zero autocorrelation (each flip independent)
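Lag-k autocorrelation is just the correlation of the series with a shifted copy of itself; a minimal pure-Python version (with invented example data):

```python
# Lag-k autocorrelation: correlation of x[t] with x[t - lag].
def autocorr(x, lag):
    n = len(x)
    mean = sum(x) / n
    var = sum((v - mean) ** 2 for v in x)
    cov = sum((x[t] - mean) * (x[t - lag] - mean) for t in range(lag, n))
    return cov / var

# A slowly drifting "temperature" series has memory (positive autocorrelation);
# a strictly alternating series is negatively correlated with its own past.
temps = [20, 21, 21, 22, 23, 23, 24, 25, 25, 26]
print(round(autocorr(temps, 1), 2))          # → 0.67
print(round(autocorr([1, -1] * 10, 1), 2))   # → -0.95
```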

Why it matters: Autocorrelation violates independence assumptions of standard tests. Time series models account for this.

Forecasting Basics

Forecasting = predicting future values based on past patterns.

Simple methods:

1. Naive forecast: "Tomorrow will be like today"
Forecast = last observed value

2. Moving average forecast: "Tomorrow will be like recent average"
Forecast = average of last k values

3. Trend extrapolation: "The trend will continue"
Fit trend line, extend into future

4. Seasonal naive: "This December will be like last December"
Use same month from last year

All forecasting assumes the past predicts the future. Structural changes (pandemics, policy shifts, technology disruptions) break this assumption. Forecasts are most reliable for stable processes in the short term.
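The four baseline methods above fit in a few lines each. The sales numbers are invented, and the trend estimate here is a crude endpoint slope rather than a fitted regression line:

```python
# Four baseline forecasts for the next value of a series.
def naive(y):
    return y[-1]                     # "tomorrow will be like today"

def moving_avg(y, k):
    return sum(y[-k:]) / k           # "tomorrow will be like the recent average"

def seasonal_naive(y, period):
    return y[-period]                # "this December will be like last December"

def trend_extrapolate(y):
    step = (y[-1] - y[0]) / (len(y) - 1)  # crude average change per step
    return y[-1] + step              # "the trend will continue"

sales = [100, 104, 108, 112, 116, 120]
print(naive(sales))               # → 120
print(moving_avg(sales, 3))       # → 116.0
print(trend_extrapolate(sales))   # → 124.0
```

Simple as they are, these baselines are the standard yardstick: a fancier model earns its complexity only if it beats them.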

Common Time Series Patterns

  • Stationary: no trend, constant variance, no seasonality (white noise, coin flips)
  • Trending: long-term increase or decrease (stock prices, GDP over time)
  • Seasonal: regular repeating pattern (retail sales, energy use)
  • Cyclic: irregular ups and downs (business cycles, market bubbles)
  • Random walk: each step is unpredictable, yet levels show high autocorrelation (stock prices, a drunkard's walk)

Why Time Series is Different

Standard statistics assumes i.i.d. (independent and identically distributed). Time series violates independence.

Consequences:

  • Standard errors are wrong — autocorrelation makes them too small
  • Hypothesis tests invalid — independence assumption fails
  • Regression spurious — two trending series appear correlated even if unrelated

Solution: Use time series methods (ARIMA, exponential smoothing, state space models) that account for dependence.

Spurious Regression

Regress "number of Nicolas Cage movies" on "people who drowned in pools" (both trending up over time).

Result: a strong correlation (r ≈ 0.67), highly significant!

But: This is nonsense. Both variables trend upward over time independently. The correlation is spurious — an artifact of ignoring time dependence.

Lesson: Never regress one time series on another without accounting for trends and autocorrelation.
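The effect is easy to reproduce. The sketch below uses two deterministic, invented series that share nothing but an upward trend, and shows that differencing (working with period-to-period changes, one standard remedy) makes the illusion collapse:

```python
# Two unrelated series that both trend upward still correlate strongly.
import math

def corr(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

t = range(50)
movies = [v + 3 * math.sin(v) for v in t]          # trending series A
drownings = [2 * v + 5 * math.cos(2 * v) for v in t]  # unrelated trending series B

print(round(corr(movies, drownings), 2))  # near 1 despite no causal link

# Differencing removes the shared trend; the correlation of the
# period-to-period *changes* drops toward zero.
d_m = [movies[i + 1] - movies[i] for i in range(len(movies) - 1)]
d_d = [drownings[i + 1] - drownings[i] for i in range(len(drownings) - 1)]
print(round(corr(d_m, d_d), 2))
```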

Test your knowledge

What are the main components of a time series?