Option 2: In-processing
If we decide to mitigate bias at the in-processing stage, we have to modify the model architecture and/or the optimisation process. The amended model then maximises predictive accuracy while simultaneously taking equality into account. In-processing techniques are not model-agnostic, since they require an understanding of the inner workings of the model; many of them are commonly applied to neural networks.
- Exponentiated Gradient Reduction, an algorithm that reduces the problem of classifying without bias to a sequence of cost-sensitive classification tasks. It returns a randomised classifier with the lowest empirical error subject to a constraint on the chosen bias metric (Agarwal et al. 2018).
- Grid Search Reduction, which can be used for both classification and regression. For classification it works much like Exponentiated Gradient Reduction, but among the candidates searched it returns the deterministic classifier with the lowest empirical error subject to a constraint on the chosen bias metric (Agarwal et al. 2018). For regression it uses the same principle to return a deterministic regressor with the lowest empirical error subject to the constraint of bounded group loss (Agarwal et al. 2019).
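To make the reduction idea concrete, the sketch below implements a much-simplified variant of Exponentiated Gradient Reduction for a demographic-parity constraint between two groups. The function name, the data, the multiplier parameterisation, and the use of the unrestricted per-example best response as the "cost-sensitive learner" are all illustrative assumptions for this toy example, not the exact algorithm of Agarwal et al. 2018 (in practice one would use a library implementation such as Fairlearn's).

```python
import numpy as np

def eg_reduction(y, a, n_rounds=200, eta=0.5, bound=2.0):
    """Toy sketch of Exponentiated Gradient Reduction for demographic
    parity with two groups a in {0, 1}. The cost-sensitive "learner"
    here is the unrestricted best response: each example independently
    receives whichever label is cheaper under the current costs."""
    n = len(y)
    n0, n1 = (a == 0).sum(), (a == 1).sum()
    # log-space multipliers for the two one-sided parity constraints
    # (selection rate of group 0 vs. group 1, and vice versa)
    theta = np.zeros(2)
    preds = []
    for _ in range(n_rounds):
        lam = bound * np.exp(theta) / (1.0 + np.exp(theta).sum())
        signed = lam[0] - lam[1]
        # cost of predicting 1 minus cost of predicting 0, per example:
        # misclassification term plus the Lagrangian parity term
        c = np.where(y == 1, -1.0, 1.0) / n
        c = c + np.where(a == 0, signed / n0, -signed / n1)
        h = (c < 0).astype(float)  # cost-sensitive best response
        preds.append(h)
        # exponentiated-gradient step on the constraint violations
        p0, p1 = h[a == 0].mean(), h[a == 1].mean()
        theta += eta * np.array([p0 - p1, p1 - p0])
    # the returned classifier is the uniform mixture over all iterates:
    # per-example probabilities of predicting 1
    return np.mean(preds, axis=0)
```

Note that the output is a randomised classifier (a mixture over the iterates), which is exactly what the method promises: randomisation is what lets it trade a little accuracy for a much smaller gap in selection rates between groups.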
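Grid Search Reduction swaps the adaptive multiplier updates for a fixed grid of candidate multipliers: it fits one cost-sensitive classifier per grid point and keeps the single best deterministic candidate. The sketch below is again a hedged toy for the classification case under a demographic-parity budget; the function name, the `score` input (a model's estimated probability that the label is 1, which the stand-in learner simply thresholds), and the grid bounds are assumptions made for illustration.

```python
import numpy as np

def grid_search_reduction(score, y, a,
                          grid=np.linspace(-0.5, 0.5, 101), eps=0.05):
    """Toy sketch of Grid Search Reduction for demographic parity with
    two groups a in {0, 1}. Each candidate multiplier `lam` shifts the
    decision threshold on `score` in opposite directions for the two
    groups; we keep the lowest-error deterministic classifier whose
    empirical parity gap stays within the budget `eps`."""
    n = len(y)
    n0, n1 = (a == 0).sum(), (a == 1).sum()
    best_h, best_err = None, np.inf
    for lam in grid:
        # cost of predicting 1 minus cost of predicting 0, per example
        c = (0.5 - score) / n + np.where(a == 0, lam / n0, -lam / n1)
        h = (c < 0).astype(float)  # cost-sensitive best response
        err = np.mean(h != y)
        gap = abs(h[a == 0].mean() - h[a == 1].mean())
        if gap <= eps and err < best_err:
            best_h, best_err = h, err
    return best_h, best_err
```

Unlike the exponentiated-gradient mixture, the result here is a single deterministic classifier, which matches the description above: grid search trades the theoretical guarantees of randomisation for a model that is easier to deploy and audit.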