{"id":42725,"date":"2023-10-20T15:23:19","date_gmt":"2023-10-20T15:23:19","guid":{"rendered":"http:\/\/startupsmart.test\/2023\/10\/20\/big-data-algorithms-can-discriminate-and-its-not-clear-what-to-do-about-it-startupsmart\/"},"modified":"2023-10-20T15:23:19","modified_gmt":"2023-10-20T15:23:19","slug":"big-data-algorithms-can-discriminate-and-its-not-clear-what-to-do-about-it-startupsmart","status":"publish","type":"post","link":"https:\/\/www.startupsmart.com.au\/uncategorized\/big-data-algorithms-can-discriminate-and-its-not-clear-what-to-do-about-it-startupsmart\/","title":{"rendered":"Big data algorithms can discriminate, and it’s not clear what to do about it – StartupSmart"},"content":{"rendered":"
"This program had absolutely nothing to do with race… but multi-variable equations."

That's what Brett Goldstein, a former policeman for the Chicago Police Department (CPD) and current Urban Science Fellow at the University of Chicago's School for Public Policy, said about a predictive policing algorithm he deployed at the CPD in 2010. His algorithm tells police where to look for criminals based on where people have previously been arrested. It's a "heat map" of Chicago, and the CPD claims it helps them allocate resources more effectively.

Chicago police also recently collaborated with Miles Wernick, a professor of electrical engineering at the Illinois Institute of Technology, to algorithmically generate a "heat list" of 400 individuals it claims have the highest chance of committing a violent crime. In response to criticism, Wernick said the algorithm does not use "any racial, neighborhood, or other such information" and that the approach is "unbiased" and "quantitative." By deferring decisions to poorly understood algorithms, industry professionals effectively shed accountability for any negative effects of their code.

But do these algorithms discriminate, treating low-income and black neighborhoods and their inhabitants unfairly? It's the kind of question many researchers are starting to ask as more and more industries use algorithms to make decisions. It's true that an algorithm itself is quantitative – it boils down to a sequence of arithmetic steps for solving a problem. The danger is that these algorithms, which are trained on data produced by people, may reflect the biases in that data, perpetuating structural racism and negative biases about minority groups.

There are many challenges to figuring out whether an algorithm embodies bias. First and foremost, many practitioners and "computer experts" still don't publicly admit that algorithms can easily discriminate. More and more evidence suggests that this is not only possible but already happening. The law is unclear on the legality of biased algorithms, and even algorithms researchers don't precisely understand what it means for an algorithm to discriminate.
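Neither the CPD heat map nor Wernick's model has been published, so any concrete illustration has to be hypothetical. The sketch below is a minimal toy simulation in Python; the neighborhood names, the numbers and the linear detection model are all assumptions, not details from the article. It shows one way the concern raised above can play out: an allocation rule trained only on past arrest counts keeps reproducing a historical policing disparity, even when the underlying offence rates are identical and race never appears as an input.

```python
# Toy simulation: two neighborhoods, "A" and "B", with IDENTICAL underlying
# offence rates. "A" starts with more recorded arrests only because it was
# historically patrolled more heavily. All numbers are hypothetical.
population        = {"A": 10_000, "B": 10_000}
true_offence_rate = {"A": 0.05, "B": 0.05}   # offences per resident per week
recorded_arrests  = {"A": 120,  "B": 40}     # the biased historical record

TOTAL_PATROLS = 100  # patrol shifts available each week


def allocate_patrols(arrest_history):
    """'Heat map' step: send patrols in proportion to past arrest counts."""
    total = sum(arrest_history.values())
    return {hood: TOTAL_PATROLS * count / total
            for hood, count in arrest_history.items()}


def observe_arrests(patrols):
    """Offences only enter the data when a patrol is there to record them,
    so the *observed* crime level depends on policing intensity."""
    observed = {}
    for hood, n_patrols in patrols.items():
        offences = population[hood] * true_offence_rate[hood]
        detection = n_patrols / TOTAL_PATROLS   # more patrols, more arrests
        observed[hood] = offences * detection
    return observed


for week in range(5):
    patrols = allocate_patrols(recorded_arrests)
    new = observe_arrests(patrols)
    for hood in recorded_arrests:               # retrain on the new data
        recorded_arrests[hood] += new[hood]
    share_a = patrols["A"] / TOTAL_PATROLS
    print(f"week {week}: share of patrols in A = {share_a:.0%}")

# Prints 75% every week: the disparity baked into the historical record is
# reproduced indefinitely, even though the true offence rates are equal.
```

The toy model never sees race or any other demographic attribute; it only sees arrest counts. Whatever disparity the historical record carries, whether it came from past policing decisions or anything else, is fed straight back into the next week's allocation, which is the sense in which a purely "quantitative" procedure can still discriminate.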