Fairness in Data-Driven Decisions
If you shoplifted a candy bar as a teen, should that determine your access to commerce? Should your name, race, or gender be considered when analyzing your risk profile? Such biases are present in all humans and, as a result, in the algorithms humans create to assist with decision making.
In this presentation, Adyen will contextualize these biases and present methods for tackling them when designing machine learning systems that strive for optimal performance and high ethical standards.
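As a taste of what such methods look like, one widely used fairness check is demographic parity: comparing a model's positive-decision rate across groups. The sketch below is purely illustrative and not taken from the talk; the function name, data, and groups are hypothetical.

```python
# Illustrative sketch (not from the talk): demographic parity compares a
# model's positive-decision rate across groups. All names and example
# data here are hypothetical.

def demographic_parity_gap(decisions, groups):
    """Return the largest difference in positive-decision rate between
    any two groups. 0.0 means all groups are approved at equal rates."""
    counts = {}
    for d, g in zip(decisions, groups):
        n, pos = counts.get(g, (0, 0))
        counts[g] = (n + 1, pos + (1 if d else 0))
    rates = [pos / n for n, pos in counts.values()]
    return max(rates) - min(rates)

# Hypothetical approval decisions for two groups, "A" and "B".
decisions = [1, 1, 0, 1, 0, 0, 1, 0]
groups    = ["A", "A", "A", "A", "B", "B", "B", "B"]
gap = demographic_parity_gap(decisions, groups)
print(f"demographic parity gap: {gap:.2f}")  # A: 3/4 approved, B: 1/4 -> 0.50
```

A large gap flags that the system treats groups unequally; what counts as acceptable, and which fairness criterion to optimize, is exactly the kind of design question the talk addresses.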