More Granular Measurements
For sports that have lumpy scoring (NFL, NHL, MLB) you might perform a similar analysis using data even more granular than scoring. For example, to remove cluster luck from baseball scoring, you might analyze net base production, or in football you might analyze yards per play or play success rates.
Grading Your Own Predictions
Now let’s say you’ve built a model to come up with your own predictions for games (we’ll cover several ways to do this in our model building section) and you want to assess your predictions against the market’s (or someone else’s). In statistics and machine learning, two common ways of assessing performance are the mean absolute error (“MAE”) and the root mean squared error (“RMSE”) of various models.
Mean Absolute Error
The great thing about these terms is that their names accurately describe their calculations. The mean absolute error is the average (mean) of the absolute value of your model’s prediction error. So if you forecasted a game to be -5 and the game ended -3, the absolute value of your model’s error was 2 points. Do that for every prediction and take the average. Simple enough.
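The calculation above can be sketched in a few lines of Python. The predicted spreads and actual margins below are made-up illustrative numbers, not data from the book:

```python
# Hypothetical predicted spreads and actual margins of victory (illustrative only)
predictions = [-5.0, 3.5, -7.0, 1.0]
actuals = [-3.0, 6.0, -10.0, -2.0]

# MAE: average of the absolute prediction errors
errors = [abs(p - a) for p, a in zip(predictions, actuals)]
mae = sum(errors) / len(errors)
print(mae)  # 2.625
```

The first game mirrors the example in the text: forecast -5, result -3, an absolute error of 2 points.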
Root Mean Squared Error
The root mean squared error is conceptually very similar to MAE, except that you 1) square each error term, then 2) take the average (mean) of the squared error terms, and finally 3) take the square root of that average.
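The three steps above translate directly to code. Using the same hypothetical predictions and results as before (illustrative numbers, not data from the book):

```python
import math

# Hypothetical predicted spreads and actual margins of victory (illustrative only)
predictions = [-5.0, 3.5, -7.0, 1.0]
actuals = [-3.0, 6.0, -10.0, -2.0]

# 1) square each error term
squared_errors = [(p - a) ** 2 for p, a in zip(predictions, actuals)]
# 2) take the mean of the squared errors
mean_squared_error = sum(squared_errors) / len(squared_errors)
# 3) take the square root of that average
rmse = math.sqrt(mean_squared_error)
print(round(rmse, 3))  # 2.658
```

Because squaring penalizes large misses more heavily than small ones, RMSE is always at least as large as MAE on the same set of predictions, and the gap between the two grows as the errors become more uneven.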
We’ve calculated the MAE and RMSE below for the NBA ATS wagers that you made. Naturally, since those wagers had a positive average MOV, we’re not surprised that their prediction error was smaller than the market’s.