Thursday, September 20, 2012

The algorithm decided not to hire you: is that legal?


I spend a lot of time thinking about privacy and algorithms.  

The Wall Street Journal carried an interesting story, "Meet the New Boss: Big Data", about how algorithms are now being widely used to make human resources decisions, like hiring and promotion. The article pointed out that such algorithms could run into trouble under US anti-discrimination laws if they intentionally or unintentionally filter out protected categories of employees, like older employees. But the article didn't discuss a more fundamental legal issue, at least in Europe.

In Europe, "automated individual decisions" are restricted by EU privacy law. Article 15 of the EU Data Protection Directive grants "...the right to every person not to be subject to a decision which produces legal effects concerning him or significantly affects him and which is based solely on automated processing of data intended to evaluate certain personal aspects relating to him, such as his performance at work, creditworthiness, reliability, conduct, etc".

Well, that's about as clear as a law can get. In our age of Big Data, we all know algorithms are being refined and used ever more widely to make decisions about hiring, promotion, and many other matters. But when these decisions are made solely by algorithms, they violate EU privacy law. Period. The only way such algorithms can be used legally is to supplement them with other measures that safeguard the legitimate interests of the person being evaluated, e.g., by allowing him to put his point of view.

I'm a great believer that algorithms can help all of us (governments, businesses, individuals) make better decisions. But when a computer program is making key decisions by itself about whom to hire or fire, or whether to extend credit to someone, it's fair to ask for additional safeguards. The privacy laws in Europe require them. I'm agnostic about whether algorithms are more or less fair than humans at many such decisions. In any case, companies using these algorithms need to consider how to make them comply with European privacy law. When algorithms supplement other evaluation tools, they should be legal. When algorithms make these decisions by themselves, there's a serious risk they would be considered illegal in Europe. Use with care.



2 comments:

Anonymous said...

Peter, see my 2 cents on this here: http://www.concurringopinions.com/archives/2012/09/big-data-for-all.html

john kropf said...

Well said. I noticed that under the proposed EU data protection reform, automated decision making now goes under the name "profiling."