Glossary

Error

In machine learning, *error* measures how far a model’s predictions deviate from the true values. It’s typically split into **training error** (how well the model fits the data it was trained on) and **generalization error** (how well it performs on unseen data). High training error usually signals underfitting; low training error combined with high generalization error signals overfitting; and noise in the data sets a floor below which neither error can fall.
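The distinction can be made concrete with a small sketch (toy data and model, chosen only for illustration): fit a deliberately flexible model on a training split and compare its mean squared error on the training data with its error on held-out data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy data: noisy samples of a quadratic function.
x = rng.uniform(-1, 1, 40)
y = x**2 + rng.normal(0, 0.1, 40)

# Hold out part of the data to estimate generalization error.
x_train, y_train = x[:30], y[:30]
x_test, y_test = x[30:], y[30:]

def mse(y_true, y_pred):
    """Mean squared error: one common way to quantify error."""
    return float(np.mean((y_true - y_pred) ** 2))

# A high-degree polynomial is flexible enough to chase the noise,
# so its training error can be much lower than its error on unseen data.
coeffs = np.polyfit(x_train, y_train, deg=9)
train_error = mse(y_train, np.polyval(coeffs, x_train))
test_error = mse(y_test, np.polyval(coeffs, x_test))

print(f"training error:       {train_error:.4f}")
print(f"generalization error: {test_error:.4f}")
```

A large gap between the two printed values is the typical signature of overfitting; lowering the polynomial degree shrinks the gap at the cost of some training error.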

by Frank Zickert