A judge in Milan, Italy, has convicted three Google executives over a video uploaded to YouTube, in a case that could have serious implications for social media and, ultimately, the web in general — at least in Italy. The video, uploaded back in 2006, showed a group of school kids bullying an autistic child. Google says it worked with Italian authorities to help identify the person responsible for uploading it, and the uploader and other participants in the video were sentenced to community service.
Now, in 2010, Google executives David Drummond, Peter Fleischer, and George Reyes (three of the four defendants) have been convicted of "failure to comply with the Italian privacy code." All were found not guilty of criminal defamation.
Should these Google execs be held accountable?
"In essence this ruling means that employees of hosting platforms like Google Video are criminally responsible for content that users upload," writes Matt Sucherman, VP and Deputy General Counsel - Europe, Middle East and Africa on the Google Blog. "We will appeal this astonishing decision because the Google employees on trial had nothing to do with the video in question."
This is a case of a business being held accountable for user-generated content. Isn't the entire web generated by users? What if Google's search engine algorithmically indexed something illegal? Should company execs be penalized, even if they comply with authorities' requests to remove such content? Ask yourself these questions:
- What if YouTube, Facebook, MySpace, Twitter, etc. had to shut down because they couldn't control the things users post?
- What if every blogging platform had to do the same?
- What if you went to jail for comments posted on your blog?
You're not likely to go to jail for comments posted on your blog, but the point is this: by allowing people to post comments, you are allowing user-generated content that you can't control until after it's been posted — unless you hold comments for approval before they go live. Google is being held accountable for content that users uploaded, which was not in its control until after the fact. YouTube users upload 20 hours of video every minute, according to Google.
You can see why this case is much bigger than just the specific instance it involves. The case is subject to appeal, but if it is not overturned, what will this mean for the web?
"The video was totally reprehensible and we took it down within hours of being notified by the Italian police," says Sucherman.
"To be clear, none of the four Googlers charged had anything to do with this video," he says. "They did not appear in it, film it, upload it or review it. None of them know the people involved or were even aware of the video's existence until after it was removed."
He goes on to talk about how the case "attacks the very principles of freedom on which the Internet is built," also mentioning that European Union law dictates that hosting providers have a safe harbor from liability as long as they remove illegal content once they are notified of its existence. "If that principle is swept aside and sites like Blogger, YouTube and indeed every social network and any community bulletin board, are held responsible for vetting every single piece of content that is uploaded to them — every piece of text, every photo, every file, every video — then the Web as we know it will cease to exist, and many of the economic, social, political and technological benefits it brings could disappear," Sucherman says.
If rulings such as the one against these Google execs were to become commonplace, how much do you think that would affect the social media industry? Companies like Google, Facebook, MySpace, etc. couldn't let users upload content, which essentially means social media couldn't exist. User-generated content couldn't exist. How could you blog? How could you leave a status update on Facebook, or upload a family photo to Picasa? There is always the possibility that some user could make a death threat or upload a murder video. If the companies behind the services used to commit such crimes were held accountable, how could their businesses continue?
That's why Google is not only upset about the ruling against its executives, but calls it a "serious threat to the web."
Should Google (or any other site) be held responsible for content that users upload (even when said content is removed)? Share your thoughts.
About the Author:
Chris Crum has been a part of the WebProNews team and the iEntry Network of B2B Publications since 2003. Follow WebProNews on Facebook or Twitter. Twitter: @CCrum237