Detecting Algorithmic Bias and Skewed Decision Making

When all predictor variables were included, the race attribute among them, the model learned to correlate race with the criminality outcome. By instead building the model only on variables directly related to a person's criminal behavior, rather than on race and appearance, the researchers were able to identify criminal activity more accurately.

The Takeaway

First, this research shows that relying on race as a predictor produces skewed outcomes. Second, it shows how ineffective the police would be if such bias sat at the core of their decisions: visualizations of predicted versus actual data show that locations with high numbers of arrests could be missed entirely under a race-dependent model. How we construct our models, and which variables we use, can directly affect people's opportunities, livelihoods, and overall well-being, so modeling must be handled ethically and responsibly. As data scientists, our philosophy should be built on the pursuit of truth, not on manipulating models to reach the most convenient or profitable results at the cost of our ethics. It is important to include bias assessments as part of the process, so we can be more confident that our models are designed to better our understanding of people and support smarter decisions, not discriminatory ones.
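As a minimal sketch of what such a bias assessment might look like, the snippet below compares the model's positive-prediction ("flagged") rate across demographic groups, a check commonly called demographic parity. The data, group labels, and the 0.2 warning threshold are all illustrative assumptions, not values from this research.

```python
# Hypothetical bias assessment: compare how often the model flags
# individuals in each group. All data and thresholds are synthetic.

def selection_rates(predictions, groups):
    """Fraction of positive (flagged) predictions per group."""
    totals, positives = {}, {}
    for pred, group in zip(predictions, groups):
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + pred
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(predictions, groups):
    """Largest difference in selection rate between any two groups."""
    rates = selection_rates(predictions, groups)
    return max(rates.values()) - min(rates.values())

# Synthetic predictions (1 = flagged) and group membership labels.
preds  = [1, 0, 1, 1, 0, 0, 1, 0]
groups = ["A", "A", "A", "A", "B", "B", "B", "B"]

gap = demographic_parity_gap(preds, groups)
print(f"selection rates: {selection_rates(preds, groups)}")
print(f"demographic parity gap: {gap:.2f}")  # 0.75 - 0.25 = 0.50
if gap > 0.2:  # illustrative threshold, not a standard value
    print("Warning: model decisions are skewed across groups")
```

A large gap does not by itself prove discrimination, but it is a cheap, routine signal that a model's decisions differ systematically by group and deserve closer inspection.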
