Digital trust and safety
Objective 3: Foster an inclusive, open, safe and secure digital space that respects, protects and promotes human rights
31 (a) Create a safe and secure online space for all users that ensures their mental health and well-being by defining and adopting common standards, guidelines and industry actions that are in compliance with international law, promote safe civic spaces and address content on digital platforms that causes harm to individuals, taking into account work under way by United Nations entities, regional organizations and multi-stakeholder initiatives;
31 (b) Prioritize the development and implementation of national online child safety policies and standards, in compliance with international human rights law, including the Convention on the Rights of the Child;
31 (c) Establish regular collaboration between national online safety institutions to exchange best practices and develop shared understandings of actions to protect privacy, freedom of expression and access to information while addressing harms;
31 (d) Ensure that laws and regulations on the use of technology in areas such as surveillance and encryption are in compliance with international law;
31 (e) Develop, in consultation with all relevant stakeholders, effective methodologies to measure, monitor and counter all forms of violence and abuse in the digital space;
31 (f) Monitor and review digital platform policies and practices on countering child sexual exploitation and abuse that occurs through, or is amplified by, the use of technology, including the distribution over digital platforms of child sexual abuse or child sexual exploitation material, as well as solicitation or grooming for the purpose of committing a sexual offence against a child.
32 (a) Call on digital technology companies and developers to engage with users of all backgrounds and abilities to incorporate their perspectives and needs into the life cycle of digital technologies;
32 (b) Call on digital technology companies and developers to co-develop industry accountability frameworks, in consultation with Governments and other stakeholders, that increase transparency around their systems and processes, define responsibilities and commit to standards as well as auditable public reports;
32 (c) Call on digital technology companies and social media platforms to provide online safety-related training materials and safeguards to their users, in particular those adapted to children and youth users;
32 (d) Call on social media platforms to establish safe, secure and accessible reporting mechanisms for users and their advocates to report potential policy violations, including special reporting mechanisms adapted to children and persons with disabilities.