Under the Digital Millennium Copyright Act (DMCA), Internet service providers (ISPs) are shielded from liability for copyright infringement arising from copyrighted material passing through their systems, provided they take certain steps to police infringement on their site(s).

This is known as a “safe harbor.” As a practical matter, every service that hosts user-generated content must provide a way for copyright owners to notify it that infringing content has been posted.

Requirements for an ISP to Qualify for a Safe Harbor

To qualify for this safe harbor, ISPs must meet several requirements:

  • ISPs must not have actual knowledge of infringing material posted on their servers.
  • If they receive notice that infringing material has been posted on their site, they must act expeditiously to remove it.
  • They must not receive a financial benefit directly attributable to the infringing activity.
  • They must not control the content of the material posted on their servers.
  • They must designate an agent to receive notices of infringement.

Notice and Takedown Procedures

Before a lawsuit can be filed, a copyright owner must send a notice (commonly known as a “DMCA takedown notice”) to the website’s operator explaining that the material is infringing. Many websites have a form users can fill out that includes all the notice requirements.

The website operator should then notify the person who posted the material that it will be removed. There is an option for the person who posted the material to send a counter-notice, arguing that it does not actually violate copyright laws.

A website operator who receives a takedown notice and acts on it promptly is generally not liable for the infringing content.

How Does Moderating Your Website Impact Your Liability for Copyright Infringement?

Platform providers are increasingly seeking ways to curate user-generated content, both to promote good content and to filter out the bad. YouTube has recently taken steps to demonetize channels that do not align with the values of its advertisement buyers. Nearly every online platform is trying to rid its comment sections of internet trolls.

There is intense pressure on social media companies to moderate their content. Trump denounced “fake news” on social media. Social media companies in Germany have been required to remove certain content promptly or face heavy fines. British officials are considering similar measures.

While we would all like to see fewer internet trolls, platform providers’ efforts to moderate content may inadvertently cost them their copyright protections. Both the outward appearance of a social media platform and its internal moderating systems must be carefully crafted. The shift from passive content host to provider of moderated or curated content may make a host ineligible for the DMCA’s copyright infringement safe harbor, exposing it to substantial liability.

According to the Digital Millennium Copyright Act (DMCA), most social media platforms, forums, and other online content hosts are not liable for copyright infringements committed by their users.

Online content hosts are covered by the DMCA in four categories:

  1. Transitory digital network communications
  2. System caching
  3. Information residing on systems or networks at the direction of users
  4. Information location tools

To qualify under the third category, platforms must demonstrate, as a threshold matter, that the information was posted at the direction of the user. Social media sites, forums, and other online platforms typically fall under this category. Platform providers must also establish that (1) they did not have actual or “red flag” knowledge of the infringing material, and (2) they did not receive a “financial benefit directly attributable to the infringing activity” in cases where they had the right and ability to control such activity.

If these conditions are met, platform providers are protected from direct and indirect copyright infringement claims, so long as they disable access to or remove the content in response to a DMCA notice from a copyright owner.

Most platform hosts meet these requirements and are protected. The traditional business model of a platform provider is to offer a forum for its users, and most platforms allow users to upload or create any content that is legal. As platform providers exercise more control over their users’ content, however, the DMCA may no longer offer adequate protection.

In Mavrix Photographs, LLC v. LiveJournal, Inc. (case no. 14-56596), the Ninth Circuit questioned whether a forum moderated by its owner is a passive host protected by the DMCA. Mavrix takes photographs of celebrities. LiveJournal lets community members post photos and stories in its online community journals; ONTD, LiveJournal’s most popular journal, focuses on celebrity news. After ONTD members posted Mavrix’s copyright-protected celebrity photos on LiveJournal, Mavrix sued. LiveJournal moved for summary judgment based on the DMCA safe harbor.

The trial court granted LiveJournal’s motion, finding that LiveJournal qualified for the DMCA safe harbor because the infringing content was “information residing on systems or networks at the direction of users.” Mavrix appealed.

The Ninth Circuit reversed, holding that there was a genuine issue of material fact as to whether LiveJournal’s content was posted at the direction of users, given LiveJournal’s content moderation procedures and the way its authority appeared to users. The court held that a reasonable jury could find an agency relationship between LiveJournal and its moderators because (1) it appeared to users that the moderators acted on LiveJournal’s behalf and (2) LiveJournal actively directed its moderators’ practices.

According to the court, LiveJournal had apparent or actual authority over the moderators based on the following facts:

  • LiveJournal has a hierarchical system of moderators with varying levels of authority to review posts;
  • Except for one moderator, who is a LiveJournal employee, all moderator positions are filled by volunteers;
  • A LiveJournal employee was assigned to build the ONTD journal, and that employee also removes other moderators based on their performance;
  • LiveJournal provides guidelines on how to review content, both to block undesirable content and to select attractive content; and
  • LiveJournal relies heavily on ONTD for revenue.

Platform providers can learn a lot from this case. To avoid losing the DMCA safe harbor, platform providers must monitor both their internal content management policies and how users perceive them. Even if a platform does not actually control its users’ content, its presentation and dissemination strategies may make users believe that the provider is responsible for the content. A platform that promotes selected posts in regular email blasts may give the impression that it, rather than its users, controls the content. A curated feed can likewise give the impression that the content has been vetted.

The effect of algorithmic content selection also remains an open question. Users may assume that a host’s algorithms effectively filter and parse user-generated content, leaving the impression that the content has been systematically vetted by the host. That impression may create copyright infringement liability even if the algorithm is less sophisticated than users believe.

Do I Need to Hire a Lawyer?

If you run a website or own copyrighted material, you should consult a business lawyer to fully understand DMCA liability as it applies to your work.

A lawyer can walk you through the applicable laws. An experienced business lawyer can also represent you in court, if necessary. Use LegalMatch to find the right business lawyer for your needs in your area today.