Facebook Under Fire #12
4/14/2017

Once again Facebook is coming under fire for what many see as a very serious flaw: the spread of illegal content on its website. Twenty-four hours a day, thousands of Facebook employees known as "moderators" have the sole responsibility of screening content that is added to the site literally every second. At the end of 2016, Facebook had nearly 2 billion users, so one can only imagine how much content the company faces managing on a moment-to-moment basis. Once again, uploaded content is under the spotlight, namely child pornography and terrorist propaganda that was allowed to remain on the site until it was brought to Facebook's attention. The company does recognize it has a problem, and CEO Mark Zuckerberg recently addressed the issue by stating that Facebook is working to develop artificial intelligence programs to combat it. However, he also said that such a program is "years out," and until it exists the company must rely on the work of its moderators. This presents an intense ethical dilemma for the company: trying to preserve the right of free speech while protecting users from being exploited by illegal material. In my opinion, they are doing as much as they possibly can, and they are very outspoken about how they deal with this problem. They certainly do not hide from the issue, and when the CEO steps up and addresses it, I think it is safe to say they are truly committed to finding the right solution.
Adding even more intensity to the situation, several governments have now begun to get involved; Britain and Germany have both been discussing possible ways to prosecute Facebook if it does not control the issue. While it is unknown at this time just how that process may unfold, what is clear is that Facebook must either employ more people to manage the situation now or limit certain content until an AI program can do it for them. Facebook currently uses a program from Microsoft known as PhotoDNA, which scans all uploads for known images of child abuse. Apparently, the program cannot catch material it has not seen before, which is certainly unfortunate. It does seem pretty clear to me that the company really is invested in doing the best for its users, and I would imagine that could be rather complicated. When we consider the different countries, cultures, and societies that exist, and how those differences impact content and morality, it gets quite complicated. For instance, in some countries marrying at a very early age may be acceptable, and what could appear to be child exploitation may in fact be a perfectly acceptable practice to the person who posts about it on Facebook. I will continue to follow news about this subject on a regular basis, as I feel that how Facebook continues to grow and evolve will stay front and center globally, and will set new standards that may impact all of us.
4 Comments
Matt Provo
4/18/2017 03:12:09 pm
I enjoyed reading your post about Facebook. I was not aware that there was such a large problem with people using Facebook to spread illegal content. I was surprised to read that Facebook staff are monitoring uploads and that there are not more sophisticated safeguards in place. It seems that Microsoft's PhotoDNA, which Facebook uses, is not protecting against the most important problems.
Reply