He states: I ran across an interesting article that you may want to share on your blog. It essentially explains that many big data samples are not representative of reality and carry inaccuracies that may violate anti-discrimination laws. The article mentions the Fair Credit Reporting Act, the Equal Credit Opportunity Act, the Americans with Disabilities Act, the Age Discrimination in Employment Act, the Fair Housing Act, the Genetic Information Nondiscrimination Act, the Federal Trade Commission Act, etc. Some of this big data may have hidden biases within it, making accidental discrimination possible. As an example, they describe a smartphone app for reporting potholes in Boston: because people in lower-income communities owned fewer smartphones, many potholes in those neighborhoods went unfixed for quite a while. These kinds of accidental discrimination slips could get people in trouble in the big data world; people labeled by the data could end up facing all sorts of problems in the future. I thought it was a thought-provoking look at the future of big data.
One other article, from the Harvard Business Review, was referenced within this piece; it also stressed the importance of using caution when big data is linked to location or human culture. It was interesting to learn that several cognitive biases can come into play in the interpretation of this big data.
Here is the other referenced article: https://hbr.org/2013/04/the-hidden-biases-in-big-data
So, what do you all think?
Note that we'll be discussing ethical issues toward the end of the course, and we'll totally want to think about these articles then!