In statistical learning, the Bayes error rate (BER) is a lower bound on the classification error of any predictor (the lowest achievable prediction error). Attaining this minimum is the goal of every prediction method; however, calculating the BER is difficult, and in many classification problems it cannot be obtained in closed form. Even for Gaussian class-conditional distributions, the BER has no closed form when the covariance matrices differ. For nearly a century, researchers have sought ways to calculate or compute the BER, but these efforts have yielded only upper bounds. We calculate the exact BER for any multivariate finite mixture model and compute the BER for any real data set. The present demo computes the BER for real data (dimension < 5, fewer than 300 records, 3 classes) and calculates the BER for binary and 3-class bivariate Gaussian distributions. To illustrate the problem, we consider the simplest case in what follows: a univariate two-component Gaussian mixture model. The two densities, each multiplied by its corresponding weight, are plotted in red and blue, respectively. The means and variances of the Gaussian mixture components can be changed.
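For a univariate mixture, the BER is the integral of the density mass that the Bayes classifier cannot recover: at each point, the total weighted density minus the largest weighted class density. A minimal numerical sketch of this idea (assuming NumPy and SciPy are available; the function name, grid, and parameters are illustrative, not part of the demo's own code):

```python
import numpy as np
from scipy.stats import norm

def bayes_error_rate(means, stds, weights):
    """Numerically compute the Bayes error rate of a univariate Gaussian
    mixture.  The Bayes classifier picks the class with the largest
    weighted density, so the error mass at each point is the total
    weighted density minus the winning one; integrating that mass over
    the real line gives the BER."""
    grid = np.linspace(-10.0, 10.0, 200001)   # fine grid; tails beyond +/-10 are negligible here
    dx = grid[1] - grid[0]
    densities = np.array([w * norm.pdf(grid, m, s)
                          for m, s, w in zip(means, stds, weights)])
    overlap = densities.sum(axis=0) - densities.max(axis=0)
    return float(overlap.sum() * dx)          # rectangle-rule integration
```

For two equally weighted unit-variance Gaussians with means 0 and 2, this recovers the textbook value Φ(−1) ≈ 0.1587, the shaded overlap between the two weighted densities.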
The blue area of the graph shows the BER. To calculate the BER for a new Gaussian mixture, simply adjust the class parameters with the sliders. The decision boundaries are shown as black dashed lines. Please see the references for details on the BER.
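The dashed decision boundaries are the points where the two weighted densities are equal; taking logarithms turns this condition into a quadratic in x (linear when the variances match). A hedged sketch of that computation (the function name and tolerance constants are illustrative assumptions):

```python
import numpy as np

def gaussian_boundaries(m1, s1, w1, m2, s2, w2):
    """Decision boundaries between two weighted univariate Gaussians:
    the x where w1*N(x; m1, s1^2) = w2*N(x; m2, s2^2).  Taking logs of
    both sides yields a*x^2 + b*x + c = 0 with the coefficients below."""
    a = 1.0 / (2 * s2**2) - 1.0 / (2 * s1**2)
    b = m1 / s1**2 - m2 / s2**2
    c = (m2**2 / (2 * s2**2) - m1**2 / (2 * s1**2)
         + np.log(w1 * s2 / (w2 * s1)))
    if abs(a) < 1e-12:                 # equal variances: a single boundary
        return [-c / b]
    roots = np.roots([a, b, c])        # unequal variances: up to two boundaries
    return sorted(r.real for r in roots if abs(r.imag) < 1e-9)
```

With equal weights and unit variances, the boundary falls at the midpoint of the two means; with unequal variances there can be two boundaries, which is why the shaded BER region can split into several pieces.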