21. SAFEGUARDING CHILDREN FROM PREDATORS

Addresses the subject of predators — users who take advantage of Internet anonymity to make contact with minors — along with preventive and screening measures. 

1.1 What is a “predator”?

Predation wears many faces, but one thing is certain: an online predator is an individual who uses the anonymity of a participatory platform to make contact with children.

A predator exploits the naivety of young people who, for lack of experience, are less likely to be suspicious. Predator tactics range from posing as a child to gain the victim’s trust, to openly presenting themselves as an adult from the start and seducing their victims with attention, kindness and affection. Predators generally set out to commit some form of sexual offence, such as:

  • Asking the child to send explicit content (e.g. nude photos of the child)
  • Encouraging the child to participate in sexual activities via webcam
  • Initiating a sexually oriented conversation via chat
  • Inviting the child to meet in person

1.2 How can predation be prevented?

Preventing predators from using your platform essentially hinges on your moderation strategy. Your digital code of conduct identifies which behaviours are unacceptable; your moderation strategy then ensures that these rules are enforced.

1.3 How can predators be screened?

Predator screening is primarily accomplished through moderation (monitoring online interactions) and by processing complaints received through reporting mechanisms.

Predators know that most youth platforms are closely moderated. Because of this, they rarely risk actions that would be easily detected. Instead, they set out to obtain information that will let them contact the child outside of your platform in a less controlled environment. Pay attention to repeated and insistent requests for personal information (age, address, city of residence, etc.) or suggestions to meet up on another platform (e.g. Facebook or Skype).
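
By way of illustration, here is a minimal sketch in TypeScript of how a chat pipeline might flag such requests for human review. The patterns, the threshold and the in-memory store are assumptions made for the example; a real system needs richer, regularly updated rules with trained moderators behind them.

    // Hypothetical phrases that suggest a request for personal information.
    const PERSONAL_INFO_PATTERNS: RegExp[] = [
      /how old are you/i,
      /where do you live/i,
      /what('| i)s your (address|phone|school)/i,
      /add me on (facebook|skype)/i,
    ];

    // Count of flagged messages per user (illustrative in-memory store).
    const flagCounts = new Map<string, number>();
    const REVIEW_THRESHOLD = 3; // assumed value

    function screenMessage(senderId: string, message: string): void {
      if (!PERSONAL_INFO_PATTERNS.some((p) => p.test(message))) return;
      const count = (flagCounts.get(senderId) ?? 0) + 1;
      flagCounts.set(senderId, count);
      if (count >= REVIEW_THRESHOLD) {
        // Repeated, insistent requests: hand the case to a human moderator.
        console.log(`User ${senderId}: ${count} flagged messages; escalate.`);
      }
    }

    // Example: the third flagged message triggers an escalation.
    screenMessage("user42", "where do you live?");
    screenMessage("user42", "how old are you?");
    screenMessage("user42", "add me on skype!");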

Warning! Violations do not always equal predation. Children also transgress the rules; doing so is “normal” behaviour to an extent. However, predators tend to commit the same violation repeatedly. For example, let’s say that after issuing a number of warnings to a user who repeatedly solicits personal information, you close that user’s account. If the same email address then opens a new account under a new pseudonym and again commits the same type of violation, this could indicate a predator.

Predators can be very patient, building a relationship with a child over time and accumulating information bit by bit. While such an approach is more difficult to detect, cross-referencing your data can reveal behaviour of this kind.
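
To make the cross-referencing idea concrete, the sketch below (TypeScript; the record structure and field names are assumptions) checks whether a new account shares an email address, and a violation type, with an account previously closed for misconduct.

    // Illustrative record of an account closed for violations.
    interface ClosedAccount {
      email: string;
      pseudonym: string;
      violation: string; // e.g. "solicited-personal-info"
    }

    const closedAccounts: ClosedAccount[] = [
      { email: "x@example.com", pseudonym: "coolkid99", violation: "solicited-personal-info" },
    ];

    // The same email committing the same violation under a new pseudonym
    // is the repeat-offender signature described above: escalate it.
    function sameOffenderSignature(email: string, violation: string): boolean {
      return closedAccounts.some((a) => a.email === email && a.violation === violation);
    }

    console.log(sameOffenderSignature("x@example.com", "solicited-personal-info")); // true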

1.4 What if you think you’ve identified a predator on your platform?

Believing you’ve identified a potential predator on your platform is a serious matter. Sexual offences against minors are covered by the criminal codes of all Western countries. Accordingly, you have a moral and legal obligation to report suspicious behaviour towards children.

Keep the account active to enable an investigation and gather information about the suspected predator (username, email address, comments, conversation history, etc.). Forward the information to your local police department or a child protection agency, which will assess the case and tell you how to proceed.

The sexual exploitation of children on the Internet is a criminal act. As a youth platform operator, it is your duty to report suspicious behaviour to the authorities, such as:

  • Your local police force
  • National child protection organizations
  • International bodies mandated to protect children from online sexual exploitation

Whatever authority you contact, your report will be evaluated and contentious cases referred to the local law enforcement agency, which will take the action needed and tell you how to proceed. Below is a list of the major organizations by country:

INTERNATIONAL BODIES

CANADA

UNITED STATES

FRANCE

AUSTRALIA

Mobile app stores do not specifically address the question of sexual predators in their terms of use.
  • It is your responsibility to take steps to prevent predation and protect your audience, which consists of vulnerable users. User safety must be a priority. Consequently, you must provide the financial, human and technical resources needed to support a moderation strategy appropriate to your platform.
  • User and content moderation must be adapted to your platform’s features as well as its audience:
    • The more opportunities for interaction between users, the greater the vigilance required.
    • Studies show that the most at-risk youth are not the youngest, but rather tweens and teens. Take extra precautions with these age groups.
  • Choose your moderators with care and train them to recognize the signs that could indicate a predator.
  • In high-risk areas (e.g. chatrooms, private messaging, etc.), use a blacklist to block the input of numbers, thus preventing personal information like addresses or phone numbers from being shared (see the filtering sketch after this list).
  • Establish effective procedures for dealing promptly with complaints received through the reporting mechanism.
  • If you think you have a predator on your hands, report the situation to a qualified authority or law enforcement agency.
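
The number-blocking suggestion above can be as simple as a text filter applied before a message is posted. The sketch below (TypeScript) masks digit runs and spelled-out digits; the exact patterns, and the choice to mask rather than reject, are assumptions for the example.

    // Mask sequences of digits and spelled-out digits before posting.
    const DIGIT_RUN = /\d{2,}/g;
    const SPELLED_DIGITS = /\b(zero|one|two|three|four|five|six|seven|eight|nine)\b/gi;

    function maskNumbers(message: string): string {
      return message.replace(DIGIT_RUN, "###").replace(SPELLED_DIGITS, "###");
    }

    // Both "555 0199" and "five five five" are masked.
    console.log(maskNumbers("call me at 555 0199, or five five five"));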


20. CYBERBULLYING

Examines the problem of cyberbullying and provides tips on how to prevent or intervene in a situation of online harassment.

1.1 What is cyberbullying?

It’s a situation in which an abuser deliberately and repeatedly attacks a victim online with a view to humiliating, threatening or harassing them. On participatory platforms, this can manifest in a number of ways, such as:

  • Sending unwanted, nasty, threatening or insulting messages
  • Targeting someone by inviting others to make fun of them
  • Pressuring others to exclude the victim from the community
  • Impersonating the victim to issue inappropriate messages that cause others to respond negatively to the victim
  • Sharing the victim’s content without their consent

1.2 How can you prevent cyberbullying on your platform?

Cyberbullying is essentially prevented through moderation. Unacceptable behaviours are set out in the digital code of conduct; your moderation strategy should seek to ensure compliance with the code. You must also be prepared to apply the appropriate sanctions in the event of violations.

Furthermore, you should provide tools that let users notify you if they are bullied on your platform. These include reporting mechanisms as well as blocking features that prevent any future contact from the aggressor.
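
A blocking feature can be very small at its core. Here is a minimal sketch in TypeScript of a per-user block list consulted before any message is delivered; the names and in-memory storage are assumptions for the example.

    // Each user keeps a set of user IDs they have blocked.
    const blockLists = new Map<string, Set<string>>();

    function blockUser(blockerId: string, blockedId: string): void {
      const list = blockLists.get(blockerId) ?? new Set<string>();
      list.add(blockedId);
      blockLists.set(blockerId, list);
    }

    // Checked before delivering any message or friend request.
    function canDeliver(senderId: string, recipientId: string): boolean {
      return !(blockLists.get(recipientId)?.has(senderId) ?? false);
    }

    blockUser("victim01", "bully07");
    console.log(canDeliver("bully07", "victim01")); // false: future contact is cut off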

1.3 When cyberbullying is flagged, how should it be handled?

Cyberbullying is a serious issue and reports of bullying must be taken seriously. As a youth platform, you must promote respect for other users and take prompt action in the event of disrespectful behaviour.

Your moderators should establish procedures for responding to reports of bullying based on your digital code of conduct. Consequences that gradually increase in severity, ranging from a simple warning to closure of the account, must be imposed as needed. Depending on the gravity of the incident, you can also direct the victim to resources offering specialized support.
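
One way to implement graduated consequences is a simple sanction ladder keyed to the number of confirmed violations. The steps and durations in this TypeScript sketch are assumptions; align them with your own digital code of conduct.

    const SANCTIONS = ["warning", "3-day suspension", "30-day suspension", "account closure"];

    // Number of confirmed violations per user.
    const confirmedViolations = new Map<string, number>();

    function nextSanction(userId: string): string {
      const count = (confirmedViolations.get(userId) ?? 0) + 1;
      confirmedViolations.set(userId, count);
      // Each repeat offence moves one step up the ladder, capped at closure.
      return SANCTIONS[Math.min(count, SANCTIONS.length) - 1];
    }

    console.log(nextSanction("user12")); // "warning"
    console.log(nextSanction("user12")); // "3-day suspension"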

Each country regulates cyberbullying through differing laws based on the actions of the aggressor. As operator, your role is to punish the offender and direct the victim toward resources. Below is a list of anti-cyberbullying organizations by country:

CANADA

INTERNATIONAL

UNITED STATES

FRANCE

AUSTRALIA

Mobile app stores make no specific mention of cyberbullying in their terms of use.

The question of cyberbullying is covered by existing legislation.

  • Ensure that your digital code of conduct encourages civil behaviour and denounces disrespectful acts toward other users. Specify that repeated violations of the code will entail temporary or permanent suspension of the user’s account.
  • Design your platform in such a way as to give young people a chance to think before acting. For example, use pop-up windows to display messages like “Are you sure you want to share this?” (a minimal sketch follows this list).
  • Encourage victims to discuss a bullying situation with their parents.
  • If you have the parent’s email address, you can send them a message outlining the situation and the steps you have taken to manage it. This applies to the parents of both the victim and the aggressor.
  • When you intervene with an aggressor, make sure they understand the following: the nature of the misconduct; how this behaviour violates the digital code of conduct; and the consequences that will ensue if the aggressor repeats the offence. For example:

“On our platform, we do not tolerate disrespect toward other users. Your comment “[insert comment]” dated January 1, 2011 addressed to User12 fails to comply with our digital code of conduct [insert link to your code]. This is your first warning. If you display a similar lack of respect toward a user again, your account will be suspended for three days.”
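
The think-before-acting pop-up suggested above needs only a moment of friction before publishing. A minimal browser sketch in TypeScript (the publish callback is a placeholder):

    // window.confirm is a standard DOM API; a custom dialog works equally well.
    function shareWithConfirmation(content: string, publish: (c: string) => void): void {
      // The extra click gives a young user a chance to reconsider.
      if (window.confirm("Are you sure you want to share this?")) {
        publish(content);
      }
    }

    shareWithConfirmation("my drawing", (c) => console.log("shared:", c));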


19. MAINSTREAM SOCIAL MEDIA

Presents the issues related to integrating mainstream social networks (e.g. Facebook, Instagram, Twitter) into a youth platform.

1.1 Social media and youth platforms

Integrating social media is a common marketing practice for mainstream products. However, in youth production this practice becomes problematic. To comply with U.S. legislation on the collection of personal information, most mainstream social networks require users to be 13 years of age or older to open an account. Furthermore, since these platforms are rarely moderated, they expose children to a number of risks, including contact with strangers, cyberbullying and inappropriate content. Incorporating social media into your platform sends the wrong message: you invite children to lie about their age (Backgrounder 7, 1.2) in order to access websites that are off-limits to users under 13 and can expose them to various risks.

You are responsible for creating a safe environment that does not encourage your users to lie to gain access to potentially dangerous websites.

In general, regulators consider mainstream social networks inappropriate for children under the age of 13, since they expose their users to various risks. Children generally do not have the maturity required to handle these risks. The U.S. is the only country to provide specific guidance on the integration of mainstream social media into youth products.

UNITED STATES

Children’s Online Privacy Protection Act (COPPA)

Federal law to protect personal information about children collected online. COPPA applies to products that collect personal information from U.S. children under 13 years of age, including collection by companies based outside the U.S.

Operators are responsible for ensuring that any third parties who collect information through their platforms comply with COPPA rules. This includes social media plug-ins (such as “Follow us on Twitter” or the Facebook “Like” button), which often employ tracking technologies. Given that the information-collection practices of mainstream social media violate COPPA when the user is under 13, and that operators of child-directed websites are strictly liable for any personal information collected through their platforms, you shouldn’t incorporate social media if your platform targets an audience under the age of 13.

Additional information:

Federal Trade Commission: Complying with COPPA: Frequently Asked Questions

Apple App Store

For Kids Category apps, the App Store recommends clearly indicating on the app’s intro page whether it contains social features, i.e. those that put the child in contact with other users or allow information to be shared through the app (e.g. top scores in a game, social media sharing features, etc.).

None of the main Canadian self-regulatory bodies address the topic of social media on youth platforms.
  • If your platform’s target audience is exclusively in the “under 13 years of age” category, avoid integrating mainstream social media.
[Image. Source: Justin’s World]
  • Access to social media integrated into a platform aimed at ages 13 and under should be strictly limited to the parents’ section.
  • If you integrate social media into the parents’ section, place an age-appropriate parental gate to prevent children from accessing the area.
  • If your target audience is partially in the “under 13 years of age” category, install an age filter that limits access to social media to the parts of the site open to users aged 13 and up.
  • If you use an age filter, it’s smarter to ask children to enter their date of birth (DD/MM/YYYY) than to have them tick a box saying “age 13 and over”/“aged under 13,” since a checkbox makes the answer that unlocks access obvious (a minimal sketch follows).
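
Here is a minimal sketch in TypeScript of such an age filter: it parses a DD/MM/YYYY date of birth, computes the user’s current age and gates social features accordingly. The function names are assumptions for the example.

    function ageFromDob(dob: string): number {
      const [day, month, year] = dob.split("/").map(Number);
      const birth = new Date(year, month - 1, day);
      const now = new Date();
      let age = now.getFullYear() - birth.getFullYear();
      // Subtract one if this year's birthday has not yet passed.
      if (
        now.getMonth() < birth.getMonth() ||
        (now.getMonth() === birth.getMonth() && now.getDate() < birth.getDate())
      ) {
        age -= 1;
      }
      return age;
    }

    function canAccessSocialFeatures(dob: string): boolean {
      return ageFromDob(dob) >= 13;
    }

    console.log(canAccessSocialFeatures("01/01/2015")); // false while this user is under 13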
