Elizabeth Barrette (ysabetwordsmith) wrote,

The Tyranny of Algorithms

Algorithms routinely discriminate against disadvantaged groups, such as women and people of color, because they are written by biased humans and trained on biased data. The result is a form of institutionalized discrimination.

This is a problem because you can't argue with an algorithm. It filters data before humans even see it, which makes correction, or even detection, difficult or impossible. With a human, there's always a chance that they'll find your entry interesting despite traits they dislike, or that face to face you might talk them into being decent.

However, there are things we can do to fix this problem.

* If you code algorithms, test them on demographically representative data. That is, about 51% of examples should be female, about 20% disabled (a mix of visible and invisible disabilities of various types), and so on, matching the demographics of your locale or of wherever the algorithm will be used. A sketch of one such check appears after this list.

* If you buy and use algorithms, test them for bias. Frex, feed them a batch of equivalent resumes that differ only by sex, color, religion, ability, etc., and see whether the proportions remain intact or become skewed. If biased, complain to the programmer and demand an unbiased algorithm, or pay somebody to make you one. The second sketch after this list shows one way to run such an audit.

* Algorithms all share one vulnerability that humans do not: they can be hacked. If you are a hacktivist, break into algorithms and command them to favor disadvantaged groups. Frex, approve all the women, or all the people of color, or both. In the short term, this will help people overcome discrimination. In the long term, corporations will learn that algorithms leave them vulnerable to manipulation, and they may decide to quit using them, or at least make them more equitable and less attractive to hacktivists.

* Other folks can help by speaking up every time they spot a biased algorithm.  That includes filing a complaint with the company, but also panning the algorithm on social media where it is easier to aggregate many reports of the same problem into numbers large enough to have an impact.
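
Here is a minimal Python sketch of the demographic check from the first bullet. Everything in it is an illustrative assumption: the field names ("sex", "disabled"), the target proportions, and the shape of the test pool are stand-ins for whatever your own pipeline actually uses.

```python
from collections import Counter

# Target shares from the post; adjust to your locale (assumption, not a standard).
TARGETS = {"female": 0.51, "disabled": 0.20}
TOLERANCE = 0.02  # allow two percentage points of drift

def check_representation(profiles):
    """Warn when the test pool drifts from the target demographics."""
    if not profiles:
        return
    counts = Counter()
    for p in profiles:
        if p["sex"] == "female":
            counts["female"] += 1
        if p["disabled"]:
            counts["disabled"] += 1
    for group, target in TARGETS.items():
        share = counts[group] / len(profiles)
        if abs(share - target) > TOLERANCE:
            print(f"{group}: {share:.0%} in test pool vs. {target:.0%} target")

# Tiny usage example; here only the disabled share gets flagged as off-target.
sample = [{"sex": "female", "disabled": False}, {"sex": "male", "disabled": True}]
check_representation(sample)
```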
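
And a minimal sketch of the paired-resume audit from the second bullet: identical resumes that differ only in one demographic signal (here, the name, as in published resume-audit studies) are fed to the system under test, and the pass rates are compared. The screen() function is a hypothetical stand-in for whatever black-box algorithm you actually bought.

```python
import random

BASE_RESUME = {"experience_years": 5, "degree": "BS", "skills": "python, sql"}
# Matched name pairs meant to signal different demographics (illustrative).
NAME_PAIRS = [("Emily", "Lakisha"), ("Greg", "Jamal")]

def screen(resume):
    # Stand-in for the vendor's black box; True means "advance the candidate".
    return random.random() > 0.5

def audit(pairs, trials=1000):
    """Return the pass rate for each name on otherwise identical resumes."""
    rates = {}
    for pair in pairs:
        for name in pair:
            resume = dict(BASE_RESUME, name=name)
            passes = sum(screen(resume) for _ in range(trials))
            rates[name] = passes / trials
    return rates

# A large gap within a pair suggests the screen reacts to the name alone.
print(audit(NAME_PAIRS))
```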
Tags: activism, cyberspace theory, ethnic studies, gender studies, safety