15. DIGITAL CODE OF CONDUCT

Explains what a digital code of conduct is and what it should contain. The digital code of conduct involves concepts such as netiquette (respectful and appropriate conduct online) and good digital citizenship (safe online behaviour).

1.1 What is a digital code of conduct?

It’s a set of rules pertaining to the use of participatory media. A digital code of conduct forms the basis for user and content moderation. It’s also a valuable tool for educating your users about computer-mediated communications, in addition to helping keep your platform environment safe and respectful.

1.2 What should a digital code of conduct include?

Common topics include the sharing of personal information, disrespect toward other users, inappropriate content and disruptive behaviour (spamming [Backgrounder 11], abuse of the reporting mechanism, etc.). The code of conduct can also serve as a useful reminder of copyright principles to prevent users from inadvertently or deliberately publishing copyrighted works.

Set out in accessible language, a digital code of conduct should list, as exhaustively as possible, users’ rights and responsibilities, the expectations placed on them, and the consequences of breaking the rules, such as the removal of content or the suspension or closure of a user’s account.

1.3 How should the digital code of conduct be displayed?

There is no single recommendation for how or where a digital code of conduct should be displayed. Its importance will vary greatly depending on the type of platform and degree to which the platform is participatory. For example, in a social network where users have frequent opportunities to interact, the digital code of conduct may take the form of a contract that users must agree to before they can participate.

A digital code of conduct is not required by law; however, given that it forms the basis of user and content moderation, it is strongly recommended that you include one. For more information on the regulatory framework that applies to participatory media, click here.

Mobile app stores do not require developers to post a digital code of conduct.

No self-regulation program specifically advocates having a digital code of conduct.

  • Some platforms integrate their digital code of conduct into their terms of use. However, in youth production, you’re best off making it a separate document to show your commitment to your users’ safety.
  • Ensure that parents can access your digital code of conduct at all times.
  • If your audience consists of preschoolers, your code must address their parents and be posted in the parents’ section.
  • If appropriate to your platform, present your digital code of conduct in a playful manner so as to prompt children to put the rules into practice.
  • Avoid legal jargon: your code should adopt a familiar tone and be expressed in plain language adapted to your audience’s level of maturity. For example, use familiar formulations like “When you come here to play, you don’t have the right to . . .”
  • A rule must be stated precisely. As needed, consider using examples to illustrate ideas that are more complex. A statement like “Watch your privacy! Don’t share personal information in the chat room” could be accompanied by a list of what constitutes personal information.
  • Consequences should increase in severity according to the type and number of offences: issuing a warning, removing the content, suspending the account for a day, closing the account.
  • Make participation contingent on your digital code of conduct: have users check a box to signal their acceptance (“I have read and accept…”) before allowing them to access your platform.
[Image. Source: TeenNick © Viacom International Inc.]

14. USER AND CONTENT MODERATION

Addresses products that allow users to contribute content and/or interact with each other or the content.

1.1 What is moderation? On what basis should it be performed?

Moderating an interactive platform means screening content submitted by users and/or user interactions by applying a set of predefined rules to distinguish the acceptable from the unacceptable. In youth production, moderation covers the following:

  • “Inappropriate” speech, behaviour and content (adult users/predators, attempts to arrange real-world encounters, violence, drugs, alcohol, weapons, illegal activities, etc.)
  • Disrespectful behaviour toward other users (stalking, cyberbullying, inappropriate language, racist or sexist remarks, hate speech, etc.)
  • The disclosure of personally identifiable information
  • Disruptive behaviour (abuse of reporting mechanisms, spamming, impersonating platform staff, etc.)
  • Content that breaches intellectual property rights (e.g. posting a link to illegally download a movie or sharing a photo that does not belong to the user)

1.2 What are the possible approaches to moderation?

There are a number of ways to approach moderation:

  • Pre-moderation: when content submitted to a website is placed in a queue to be checked by a moderator before being made public.
  • Post-moderation: when submitted content is displayed immediately but replicated in a queue for a moderator to review and remove if inappropriate.
  • Automated moderation: deploying various technical tools to process user-generated content (UGC) through a set of defined rules. Often used in chatroom scenarios, these tools include (a brief sketch follows this list):
    • White lists: users compose messages only from a predefined set of approved words or phrases; they cannot enter free text.
    • Black lists: users type their own messages, which are filtered to remove or mask any inappropriate words before other users can see them.
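
To make the two automated approaches concrete, here is a minimal sketch in Python. The word lists, function names and masking behaviour are illustrative assumptions, not part of any particular moderation product: a black-list filter masks disallowed words, while a white-list filter only accepts messages composed entirely of approved words.

```python
import re

# Illustrative word lists -- in practice these would be much larger,
# localized, and maintained by moderators.
BLACK_LIST = {"idiot", "loser"}
WHITE_LIST = {"hi", "hello", "good", "game", "play", "again", "bye"}

def black_list_filter(message: str) -> str:
    """Replace any black-listed word with asterisks before display."""
    def mask(match: re.Match) -> str:
        word = match.group(0)
        return "*" * len(word) if word.lower() in BLACK_LIST else word
    return re.sub(r"[A-Za-z']+", mask, message)

def white_list_filter(message: str) -> bool:
    """Accept the message only if every word is on the approved list."""
    words = re.findall(r"[A-Za-z']+", message)
    return bool(words) and all(w.lower() in WHITE_LIST for w in words)

print(black_list_filter("You are an idiot"))        # -> "You are an *****"
print(white_list_filter("good game, play again?"))  # -> True
print(white_list_filter("meet me after school"))    # -> False
```

In practice, both lists need continual maintenance: a static black list is quickly circumvented by misspellings and slang, which is one reason white lists are favoured for the youngest audiences.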

1.3 How should users who violate the rules of good conduct be handled?

Consequences imposed for breaches of conduct should be graduated in line with the gravity and frequency of the alleged violations. For example, the user receives a formal warning after the first offence, their account is suspended for 24 hours after the second, and so on, up to the closure of the account. This graduated approach educates users who act in good faith but who, through inexperience, bypass or break certain rules of conduct. It also gives moderators a means of detecting suspicious behaviour that could help identify a predator (Backgrounder 21, 1.3).
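
The escalation logic described above can be captured in a few lines. The sketch below uses hypothetical sanction names and thresholds purely to illustrate how an offence count maps to an increasingly severe consequence; align the ladder with your own digital code of conduct.

```python
from dataclasses import dataclass

# Hypothetical escalation ladder: the thresholds and sanctions are
# illustrative and should mirror your own digital code of conduct.
SANCTIONS = [
    "formal warning",
    "24-hour account suspension",
    "7-day account suspension",
    "account closure",
]

@dataclass
class UserRecord:
    user_id: str
    offence_count: int = 0

def apply_sanction(user: UserRecord) -> str:
    """Record a new offence and return the consequence to apply."""
    user.offence_count += 1
    # Cap at the most severe sanction once the ladder is exhausted.
    index = min(user.offence_count - 1, len(SANCTIONS) - 1)
    return SANCTIONS[index]

user = UserRecord("kid_42")
print(apply_sanction(user))  # "formal warning"
print(apply_sanction(user))  # "24-hour account suspension"
```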

1.4 Which moderation approach is best?

There is no one universal approach for all platforms. Each approach offers a different level of control over published content and user interactions. Depending on your needs, you can opt for a combined approach. When evaluating your requirements, consider the following:

  • The risk posed by UGC
  • The levels of maturity and autonomy of your users
  • The budget you can allocate to moderation

For more information: How to De-Risk the Creation and Moderation of User-Generated Content

1.5 Flagging inappropriate content/behaviour

Tools that let users anonymously report inappropriate content and/or behaviour are essential to ensuring compliance with certain intellectual property laws, including copyright laws. In general, operators will be absolved of all liability if they act promptly when advised of the presence of infringing content on their platforms.

Inappropriate content or behaviour can be flagged in a number of ways; the important thing is to have a mechanism that’s easy to access and use. One way is to have a clickable “Report” icon everywhere content can be shared or users can interact. You must also establish effective procedures for responding quickly to such alerts.
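
As an illustration of such a mechanism, the sketch below (all field names, reasons and functions are hypothetical) captures who is reporting, which content is targeted and a predefined reason, so that moderators can triage alerts quickly.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Predefined reasons keep reports easy to triage; a free-text field can
# complement them for anything not covered. All names here are illustrative.
REPORT_REASONS = ("personal information", "bullying", "inappropriate content",
                  "spam", "copyright", "other")

@dataclass
class Report:
    reporter_id: str
    content_id: str
    reason: str
    details: str = ""
    created_at: datetime = None

def submit_report(reporter_id: str, content_id: str, reason: str,
                  details: str = "") -> Report:
    """Validate and queue a report for the moderation team."""
    if reason not in REPORT_REASONS:
        raise ValueError(f"Unknown reason: {reason!r}")
    report = Report(reporter_id, content_id, reason, details,
                    created_at=datetime.now(timezone.utc))
    # In a real platform this would be persisted and routed to moderators.
    return report

print(submit_report("user_17", "post_301", "personal information",
                    "message contains a phone number"))
```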

While moderation is not required by law, it is strongly recommended in youth production for security reasons. For more information on the regulatory framework governing participatory media, click here.

UNITED STATES

Children’s Online Privacy Protection Act (COPPA)

Federal law that applies to products that collect personal information from U.S. children under 13 years of age. COPPA considers photos, videos and audio files containing a child’s image or voice to be personal information. If your platform enables content of this kind to be shared, you must:

  1. Screen and remove any content (photos, videos or audio tracks) where children can be seen and/or heard before this content goes online; OR
  2. Obtain verifiable parental consent before allowing children to submit photos, videos and/or audio recordings of themselves.

Under COPPA, certain user interactions — for example, contributing to a forum — do not require verifiable parental consent provided the operator takes reasonable measures (e.g. pre-moderation) to ensure that no personal information is inadvertently disclosed. Staff who have access to personal information (e.g. moderators) must be sufficiently trained to handle sensitive information of this kind.
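
As a simple illustration of option 2 above, the sketch below gates the publication of photos, videos and audio recordings on a record of verifiable parental consent. The consent store and function names are hypothetical, and the sketch is not itself a COPPA compliance implementation.

```python
# Hypothetical in-memory consent store keyed by child account ID.
# In production this would be backed by your verifiable-consent records.
PARENTAL_CONSENT = {"child_001": True, "child_002": False}

MEDIA_TYPES = {"photo", "video", "audio"}

def can_publish_media(child_id: str, media_type: str) -> bool:
    """Allow media that may show or record a child only with prior consent."""
    if media_type not in MEDIA_TYPES:
        return True  # non-media content falls under ordinary moderation
    return PARENTAL_CONSENT.get(child_id, False)

print(can_publish_media("child_001", "photo"))  # True: consent on file
print(can_publish_media("child_002", "video"))  # False: pre-screen or block
```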

Each app store has a rating system based on the presence or absence of various criteria (violence, language level, nudity, pornography, etc.). This applies to all app content, including user-generated content. Apps that allow UGC should employ a moderation method to filter content and provide both a mechanism for reporting inappropriate content and a means of blocking abusive users.

While no specific self-regulation code applies to content moderation, it is strongly recommended for participatory media.

  • Consider your production budget and determine what portion you wish to allocate to moderation, based on your platform’s features, keeping in mind that content filtering and moderating can entail significant operating costs.
  • Moderation should be based on defined rules. These rules of conduct, expressed in accessible language, should be laid out in your digital code of conduct and/or terms of use.
  • Include a clause granting you the right to remove any content that violates your platform’s policies.
  • Post security reminders in “critical” areas of your platform. For example, in the chatroom, remind children not to disclose personal information like phone numbers.
  • If you opt for pre-moderation, remove or anonymize any personal data before publication: blur faces in photographs, strip metadata from documents and so on.
  • Moderators work closely with children. Be sure to select them carefully and train them to manage problem situations as well as work with personal information.
  • Make sure your reporting mechanism is easy to find and use. Design it so that users can include the reason for the complaint (e.g. “Why are you reporting this content?” with checkboxes they can tick and/or a text box), since this will speed up processing.
  • To prevent the exchange of numerical personal data (age, address, etc.), the following are recommended (see the sketch after this list):
    • Block the number keys, making it impossible for the user to enter digits (1, 2, etc.).
    • Use a black-list tool to block numbers written out in words (one, two, etc.).
  • For products where extra precautions are advisable (e.g. for very young audiences), the use of a white list is recommended.
  • Periodically review and update your moderation methods (e.g. the words on your black list).
  • Moderating an interactive platform may raise legal issues: for your own protection, seek professional legal advice.
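
A minimal sketch of the two number-blocking measures recommended above follows. The regular expressions and the list of spelled-out numbers are illustrative assumptions and would need to be extended and localized for a real chat filter.

```python
import re
from typing import Optional

# Spelled-out numbers to block in addition to digits; extend and localize
# this list for a real deployment.
NUMBER_WORDS = {"zero", "one", "two", "three", "four", "five",
                "six", "seven", "eight", "nine", "ten"}

def contains_numeric_data(message: str) -> bool:
    """Return True if the message contains digits or spelled-out numbers."""
    if re.search(r"\d", message):
        return True
    words = re.findall(r"[a-z]+", message.lower())
    return any(word in NUMBER_WORDS for word in words)

def filter_chat_message(message: str) -> Optional[str]:
    """Block the whole message rather than trying to mask the numbers."""
    return None if contains_numeric_data(message) else message

print(filter_chat_message("I live at 221 Baker Street"))  # None (blocked)
print(filter_chat_message("I am nine years old"))         # None (blocked)
print(filter_chat_message("see you in the game later"))   # passes through
```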

13. USER-GENERATED CONTENT (UGC)

Applies to platforms that allow users to publish content they have created or referenced.

1.1 What are the characteristics of user-generated content?

User-generated content (UGC) is broadly defined as “material uploaded to the Internet by website users.” As such, it encompasses both original content and links to other content posted by users. In general, the user is the content’s author. Such content can be created offline and uploaded to your platform (e.g. photos) or created using tools available on your platform (e.g. designing an item for a virtual world or leaving a comment on a forum). Content that users share but have not created, such as newspaper articles or YouTube videos, can also be considered UGC.

Media or platforms that publish UGC are known as interactive or participatory. Such platforms have varying implications with regard to the collection of personal information, intellectual property and user and content moderation.

1.2 What risks to personal information are associated with UGC?

With young audiences, particular attention needs to be paid to the inadvertent collection of personal information, i.e. when children publicly share information allowing them to be personally identified. A child may, for example, divulge his or her home address in a public forum. To ensure the safety of young users, you should therefore consider incorporating content moderation mechanisms.

For further details on the collection of personal information, click here.

1.3 Should a participatory youth platform be moderated?

Allowing users to post content on a given platform implies a certain loss of control: there’s nothing to stop reprehensible conduct like sharing offensive content or infringing upon intellectual property rights. Making interactive youth media “safe” means implementing mechanisms to moderate content and users.

1.4 What are the implications for intellectual property on participatory media?

To avoid intellectual property breaches, including copyright breaches, and to secure adequate rights to use content generated by users, operators of participatory platforms must monitor all content posted. The rules pertaining to intellectual property are set out in the platform’s terms of use, which is a legal document.

User-generated content as it pertains to personal information is addressed specifically in the section on the United States. For other countries, refer to the general information on the collection of personal information. User and content moderation on participatory platforms is addressed in depth in Backgrounder 14. The section below mainly addresses intellectual property.

CANADA

Copyright Act

On January 2, 2015, Canada adopted a new “notice and notice” regime that gives copyright owners a certain amount of control over how their works published online, including UGC, are used. Under the regime, when a copyright owner believes that a user may be infringing their copyright, they can send a notice of alleged infringement to the platform operator. The platform operator must then forward the notice to the user who has allegedly infringed the copyright. An operator who fails to forward the notice may be held responsible for allowing the copyright breach.

Additional information:

Government of Canada, Office of Consumer Affairs: Notice and Notice Regime

Québec: An Act to Establish a Legal Framework for Information Technology

Québec is the only province offering platform operators legal protection with regard to user-generated content. Operators are not liable for the activities of those who use their services unless they are aware that the UGC is serving illicit purposes. At that point, the operator must promptly remove the content. However, operators are not responsible for overseeing content stored or shared through their service nor for investigating whether or not the content is used for illicit purposes.

Additional information:

Copyright Act

Québec – An Act to Establish a Legal Framework for Information Technology

UNITED STATES

Communications Decency Act (CDA) and Digital Millennium Copyright Act (DMCA)

The United States has a legal apparatus that protects participatory media operators fairly well.

Where platform content is created entirely by third parties, the operator may qualify for immunity under the Communications Decency Act. This shields the operator from all liability for disseminating UGC, including in cases of alleged defamation, misrepresentation, negligent, fraudulent or misleading statements, false advertising and other claims. However, the immunity does not cover intellectual property infringement.

The Digital Millennium Copyright Act protects operators from allegations of intellectual property violation: under this law, the operator cannot be held responsible for distributing and/or storing copyright-infringing UGC. To qualify, operators must publish a copyright policy, notice and takedown procedures, and a notice stating that users who repeatedly violate copyright will have their accounts closed.

Children’s Online Privacy Protection Act (COPPA)

Federal law that applies to products that collect personal information from U.S. children under 13 years of age. COPPA considers photos, videos and audio files containing a child’s image or voice to be personal information. If your platform enables content of this kind to be shared, you must obtain verifiable parental consent before allowing children to take part in the activity. You must also guard against collecting personal information inadvertently.

N.B. COPPA applies only to data collected directly from children. For example, if you were to invite an adult (parent, teacher, etc.) to share a photo of a child, COPPA would not apply.

Additional information:

Federal Trade Commission: Complying with COPPA: Frequently Asked Questions

EUROPEAN UNION AND FRANCE

Electronic Commerce Directive

According to this directive, operators are not liable for the illegal activity or information placed on their systems by a user when they are in no way involved in the activity or information. Upon obtaining actual knowledge or awareness of its illegality, the operator must act promptly to remove the content or block access to it.

Additional information:

European Commission, Electronic Commerce Directive

AUSTRALIA

Australian Copyright Act

Australian law offers sparse protection to online operators. This means that when user-generated content on your platform infringes copyright, you can be held partially responsible for allowing the breach. You therefore need to implement measures to monitor UGC on your site (e.g. a system that approves UGC before it is posted online) and remove content that infringes copyright.

Additional information:

Australian Copyright Council

Office of the Australian Information Commissioner

All stores require developers to classify their apps based on their overall content, including UGC and advertisements. Each store has a rating system based on the absence or presence of various criteria (e.g. violence, language level, etc.). Stores also require developers to respect intellectual property rights.

Apple App Store

Apps that allow UGC must provide a method for filtering content, a mechanism for reporting offensive content and a way of blocking abusive users. Apps that allow content to be downloaded from third-party sources (e.g. YouTube, Vimeo, etc.) must have obtained explicit permission from said sources.

Additional information:

Since it is an area well regulated by law, user-generated content is not covered by any self-regulatory programs.

  • If you invite children to share photos and/or personal videos, make sure you filter such content before posting it online. Known as “pre-moderation,” this helps you avoid publicly sharing personally identifiable data.
  • Obtain adequate rights for user-generated content through your terms of use. For example, do you want to own the content or just have a license to use it?
  • Place UGC under license: the rights you grant your users to use UGC posted on your platform must not be more extensive than those you obtained from users who shared the content.
  • Post a notice to distinguish copyrighted works on your platform. For example, if your terms of use specify that your users remain the owners of content they create, your notice might look like this:

© 2015 [your name] and contributors

  • Indicate in your terms of use that users must not post content that breaches intellectual property rights and that you reserve the right to remove any content that constitutes a breach.
  • Develop procedures like a content filtering system to prevent having any UGC that infringes intellectual property rights posted on your platform.
  • In your terms of use, state the procedure for filing a complaint regarding intellectual property. Complaints must be processed promptly (a sketch of this workflow follows the list):
  1. Remove the infringing content.
  2. Inform the user who posted the infringing content that the content has been removed, explain why and refer the user to your intellectual property policy.
  3. If appropriate under your digital code of conduct, consider imposing a sanction on the offending user.
  4. Inform the complainant that the content has been removed.
  • Incorporate a mechanism that lets users easily flag offensive UGC.
  • Participatory media raise a number of legal issues: for your own protection, seek professional legal advice.
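
The complaint-handling procedure above can be expressed as a short workflow. The sketch below uses a hypothetical platform object and notification wording purely to illustrate the order of operations.

```python
def handle_ip_complaint(content_id: str, poster_id: str,
                        complainant_id: str, platform) -> None:
    """Process an intellectual property complaint promptly, step by step.

    `platform` is a hypothetical object exposing remove_content(),
    notify_user() and sanction_user(); adapt these to your own backend.
    """
    # 1. Remove the infringing content.
    platform.remove_content(content_id)

    # 2. Inform the poster, explain why, and point to the IP policy.
    platform.notify_user(
        poster_id,
        "Your content was removed because it appears to infringe "
        "intellectual property rights. Please review our policy.",
    )

    # 3. If your digital code of conduct provides for it, apply a sanction.
    platform.sanction_user(poster_id, reason="intellectual property breach")

    # 4. Inform the complainant that the content has been removed.
    platform.notify_user(complainant_id,
                         "The reported content has been removed.")
```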
