CHILD SAFETY POLICY OF
"NOX – THE SOCIAL ALARM CLOCK"
Last updated: April 14, 2025
Version: 1.0
PREAMBLE
This document constitutes the “Child Safety Policy” of the mobile application “NOX – The Social Alarm Clock” (hereinafter referred to as “NOX”), published by the company LAPSUS. It is part of the contractual documentation governing access to and use of the Application, alongside the Legal Notices, the Terms of Use, the General Sales Conditions, the Privacy Policy, the Data Collection Policy, the Illicit Content Reporting and Removal Policy, the Copyright and Intellectual Property Policy, the Advertising and Sponsored Content Policy, the Moderation Policy, the List of Prohibited Content, the Code of Conduct, and the Legal Provisions for the Attention of the Authorities.
This “Child Safety Policy” aims to clearly and accessibly define the rules, prevention mechanisms, reporting procedures, technical settings, and specific commitments implemented by the Publisher regarding the protection of minors, in particular users aged 15 to 17 accessing the Application.
The Publisher reaffirms its commitment to upholding the fundamental rights of children, preventing digital risks, combating harmful or inappropriate content, and cooperating with the competent child protection authorities, in accordance with applicable legislation, including:
- The General Data Protection Regulation (GDPR – EU 2016/679),
- The French Law on Confidence in the Digital Economy (LCEN – France),
- The Digital Services Act (DSA – EU 2022/2065),
- Specific requirements of distribution platforms such as the Google Play Console.
The use of the Application by a minor User, or the interaction of an adult User with a minor within the Application, constitutes full acceptance of this “Child Safety Policy” and binds each individual to its provisions.
Users are encouraged to regularly consult this “Child Safety Policy,” which is available at all times in the “Legal Information” submenu within the “Settings” menu of the Application or via the NOX legal information website: https://www.nox.app/legal/.
USERS ARE STRONGLY ENCOURAGED TO READ THIS “CHILD SAFETY POLICY” CAREFULLY BEFORE USING “NOX.”
INTRODUCTION
“NOX – The Social Alarm Clock” is an innovative social network based on the unique concept of personalized alarms. The Application allows Users to create, send, receive, and watch short videos—called “nox”—designed to make waking up a fun and interactive experience. The core idea of “NOX” is to create a new, global, intergenerational space for creativity and expression that transforms waking up into a social and entertaining moment.
“NOX” exists exclusively as a mobile application designed, developed, and published by the French company LAPSUS, a simplified joint-stock company with a capital of €12,000, headquartered at 10 rue Louis Amiot, 01000 Bourg-en-Bresse, registered with the Bourg-en-Bresse Trade and Companies Register under number 879 161 818, and represented by Mr. Luc NOUGUIER, President of LAPSUS (hereinafter referred to as “the Publisher”).
The “NOX” application is available for download worldwide as of Monday, April 14, 2025, on the following platforms:
- Apple “App Store”
- Google “Play Store”
- Amazon “Amazon Appstore”
- Samsung “Galaxy Store”
- Huawei “AppGallery”
- Xiaomi “Get Apps”
- Oppo “App Market”
- Vivo “V-Appstore”
- Tencent “Appstore”
- Xiaomi “Mi Store App”
- Aptoide “Aptoide”
- “APKPure”
Aware of the challenges related to digital safety, child protection, and the prevention of online risks, LAPSUS, in its role as the Application’s Publisher, has decided to implement a specific policy regulating access, use, and visibility of the Application for minor users aged 15 to 17, in accordance with applicable legislation and the standards of major distribution platforms (Google Play Store, Apple App Store, etc.).
This "Child Safety Policy" applies to any User under the age of 18, as well as any other User interacting, directly or indirectly, with an account identified or likely to belong to a minor.
It aims to:
- define the conditions for access and registration applicable to minors;
- regulate the privacy and display settings specific to their age;
- present the protection mechanisms against sensitive or inappropriate content;
- outline the reporting procedures and enhanced moderation methods in case of suspicious behavior or harmful content involving a minor;
- specify the Publisher's commitments regarding cooperation with the competent authorities.
The Publisher reminds Users that, although the Application is designed for intergenerational and responsible use, it does not implement an automated parental control system, nor identity verification by certified third parties. However, it applies strict technical rules and restricted settings for minors, and prioritizes human moderation in high-risk situations.
The goal of this "Child Safety Policy" is to ensure a balance between freedom of expression, the safety of young users, and the prevention of abuse, within a proactive approach of compliance with the law, digital ethics, and contractual transparency.
1. DEFINITIONS
For the purposes of this "Child Safety Policy", the following terms shall have the following meanings, whether used in the singular or plural:
Contractual Terms: refers to all legal documents applicable to the use of the Application, including the Legal Notices, Terms of Use, General Sales Conditions, Privacy Policy, Data Collection Policy, Illicit Content Reporting and Removal Policy, the Child Safety Policy, the Copyright and Intellectual Property Policy, the Advertising and Sponsored Content Policy, the Moderation Policy, the List of Prohibited Content, the Code of Conduct, and the Legal Provisions for the Attention of the Authorities.
Application: refers to the mobile application "NOX – The Social Alarm Clock," created, developed, and published by the Publisher. The Application is intended to be downloaded and installed on a compatible Device (smartphone or tablet), and allows the use of features offered by the Publisher, such as the creation and reading of "noxs." Subject to compliance with the "Terms of Use" and "Legal Notices" and any technical requirements, the Application is available on the following download platforms: Apple’s "App Store," Google’s "Play Store," Amazon’s "Amazon Appstore," Samsung’s "Galaxy Store," Huawei Technologies' "Huawei AppGallery," Xiaomi's "Xiaomi Get Apps," Oppo’s "Oppo App Market," Vivo's "V-Appstore," Tencent’s "Tencent Appstore," Xiaomi’s "Mi Store App," Aptoide S.A’s "Aptoide," and "APKPure." Access to and updates of the Application, as well as any geographical or technical restrictions, may vary depending on the app stores and the applicable policies.
Publisher: refers to the French company LAPSUS, a simplified joint-stock company (SAS) registered with the Bourg-en-Bresse Trade and Companies Register under number 879 161 818, with its registered office located at 10 rue Louis Amiot, Bourg-en-Bresse. The Publisher is legally responsible for the brand and the "NOX – The Social Alarm Clock" Application. The Publisher is responsible for the design, development, publishing, and availability of the Application, as well as all associated Services. It is also responsible for publishing and maintaining the technical aspects of the Application, including any updates or new features offered to Users.
User: refers to any individual who accesses and uses the Application for private or professional purposes. The User may be a non-professional individual, a consumer as defined by the Consumer Code, or any other person using the Publisher's services. The User may therefore enter into an agreement with the Publisher to acquire content or use features available on the Application, under the conditions defined by the "Terms of Use" of the Application.
Minor: refers to any person under the age of 15. This age corresponds to the minimum required to register on the Application, as set by the Publisher in accordance with applicable legislation, particularly concerning the protection of minors and digital consent, as well as the "Child Safety Policy".
Child: refers to any minor within the scope of the "Child Safety Policy", including those who have accessed the Application in violation of the registration conditions. The rules stated here aim to ensure the safety of children under all circumstances, including in cases of unauthorized or fraudulent access.
Parties: refers collectively to the Publisher of the Application and the User, bound by the acceptance of the applicable contractual documents, including the "Terms of Use", "General Sales Conditions", and "Legal Notices". The Parties acknowledge their mutual commitments, respective responsibilities, and the legal rules governing their relationship in the context of using the Services offered through the Application.
Device: refers to any hardware used by the User to access the Application, download, install, and use its features. This includes, but is not limited to, smartphones, tablets, laptops or desktop computers, and compatible connected objects. The Device must meet the minimum technical requirements set by the Publisher, including operating system, available memory, performance, and connectivity. The User is solely responsible for installing the Application on their own Device, ensuring its compatibility, proper functioning, and the security of their digital environment, including regular system updates, the use of antivirus software, and protection against malware.
Download Platform or Store: refers to any third-party service that allows downloading or updating the Application, such as Apple’s "App Store," Google’s "Play Store," Amazon’s "Amazon Appstore," Samsung’s "Galaxy Store," Huawei Technologies' "Huawei AppGallery," Xiaomi's "Xiaomi Get Apps," Oppo’s "Oppo App Market," Vivo's "V-Appstore," Tencent’s "Tencent Appstore," Xiaomi’s "Mi Store App," Aptoide S.A’s "Aptoide," and "APKPure." Access to these platforms is subject to the general terms and conditions of each provider, over which the Publisher has no control.
Service: refers to all the features and services offered by the Publisher via the Application, including but not limited to:
- Creation, publishing, sending, and reading of "noxs";
- Programming and triggering alarms to automatically play one or more "nox(s)";
- Creation, publishing, sending, and reading of "dreams";
- Management of a personal profile;
- Visiting other Users' personal profiles;
- Textual, visual, and voice communication with Users via messaging;
- Access to subscription plans such as "NOX Premium", which provides additional features or benefits;
- Any other service or content that the Publisher decides to make available to the User, subject to compliance with the "Terms of Use", "Legal Notices", and applicable laws.
The Service is subject to change or enhancement over time, including through updates, fixes, interface modifications, or the addition of new offers, whether free or paid, without creating any obligation for the Publisher beyond what is specified herein or in specific conditions.
Host: refers to the technical service provider responsible for ensuring the storage, accessibility, and operational maintenance of the Application and its digital content. The Host acts as a technical intermediary in accordance with applicable legislation (notably the LCEN and DSA), without intervening in the hosted content. The Host’s contact information is provided in the "Legal Notices", "Terms of Use", and "Privacy Policy", in accordance with the identification requirements imposed on the Publisher.
User Account: refers to the personal space of each User, accessible after a registration or login process via an identifier (email address, phone number, etc.) and a strictly confidential password. From this space, the User can, among other things:
- Add or modify their profile picture;
- Add or modify up to ten cover photos;
- Add or modify their profile description;
- Add content to their profile ("noxs," "renoxs," dream collection);
- Modify the thumbnails of their created "noxs";
- Pin created "noxs";
- Pin "renoxs";
- Delete content from their profile ("noxs," "renoxs," dream collection);
- Access their list of received notifications;
- Manage their subscription list;
- Access their list of followers;
- Manage their list of friends;
- Modify their name;
- Modify their username;
- Modify the email address for accessing their account;
- Modify the phone number for accessing their account;
- Modify the password for accessing their account;
- Send a request for personal data transmission;
- Subscribe to or unsubscribe from the paid service "NOX Premium";
- Send a request for account certification;
- Manage privacy settings;
- Access their liked "noxs";
- Access their viewed "noxs";
- Access the archives of created "dreams";
- Access the archives of created "noxs";
- Modify the application language;
- Modify the application appearance;
- Manage blocked Users;
- Manage sensitive content settings;
- Access the Application's user guide;
- Access the legal information of the Application;
- Access the social media of the Application and the Publisher;
- Delete their account.
The User Account is personal, non-transferable, and protected by confidential identifiers whose security is the sole responsibility of the User. The User commits to immediately inform the Publisher of any suspicion of fraudulent access or compromise of their account and to comply with the applicable "Terms of Use".
Evolving Features: refers to any new feature, tool, or service added later to the Application, whether experimental, promotional, or permanent, subject to the "Terms of Use" and applicable rights.
Social Features: refers to the tools for interaction between Users within the Application, such as messaging, comments, subscriptions, "likes," shares, and any communication, visibility, or community relationship feature integrated by the Publisher.
Inappropriate Behavior: refers to any behavior or interaction, direct or indirect, contrary to the spirit of the "Code of Conduct", which may harm the well-being of other Users or compromise the overall atmosphere of the Application.
Content: refers to any type of digital content accessible, published, or distributed via the Application, including texts, images, videos, sounds, data, or any other element. Content includes, but is not limited to:
- The "noxs" created by Users, intended for other Users or for their own use, which are videos of a maximum duration of two minutes, intended to wake Users of the Application;
- The thumbnails of "noxs," which are images serving as covers for a "nox." They represent the content of the "nox" and are designed to capture attention at first glance in the Application;
- The "dreams" created by Users, which are photos or videos of a maximum duration of two minutes, intended to stay online in the Application for only 24 hours. "Dreams" allow the User to communicate daily with the community of Users who follow them in the Application;
- Collections of "dreams," which are groups of "dreams" organized into folders, directly accessible from the Users' profiles. Each collection is visually represented by a cover image, designed to evocatively illustrate the theme or common universe of the "dreams" it contains;
- Profile photos;
- Cover photos;
- Profile text descriptions;
- Comments posted in the comment section of a "nox";
- Text messages exchanged in messaging conversations;
- Photos exchanged in messaging conversations;
- Videos exchanged in messaging conversations;
- Voice messages exchanged in messaging conversations;
- GIFs exchanged in messaging conversations;
- Any file or information made available by the Publisher or third parties on the Application, provided it complies with copyright, the "Terms of Use", and applicable legislation.
The User declares and guarantees that they hold all the necessary rights for the publication, distribution, or availability of any Content via the Application, including copyright, image rights, or any other intellectual property rights. They agree to hold harmless and indemnify the Publisher against any claim, action, or damage resulting from a violation of applicable law or third-party rights.
Public Content: refers to any content visible to other Users beyond the creator, including public "noxs," public "dreams," comments, dream collections, or profile elements.
Private Content: refers to any content accessible only to one or more designated Users, such as private "noxs," private "dreams," or private messages.
Classic Content: refers to any content published on the Application that is neither sponsored, sensitive, nor prohibited. It may be public or private, depending on the settings chosen by the User at the time of publication, and remains subject to the "Terms of Use" and the "Code of Conduct". Classic Content is the standard, unmarked form of content on the Application and carries no special labeling at the time of distribution.
Organic Content: refers to any content spontaneously published by a User or, where applicable, by the Publisher, without financial compensation, promotional intent, or any link to an advertising campaign, communication effort, or commercial partnership. Organic Content does not carry any commercial or promotional tag and is not subject to specific monetization-related processing. It may be public or private depending on the chosen publication settings and remains subject to the "Terms of Use" and the "Code of Conduct".
Sponsored Content: refers to any "nox" or "dream" published by a User on the Application with the "Sponsored Content" tag, activated at the time of publication to indicate that the content is shared for promotional or advertising purposes. This tag is mandatory whenever the content is published in exchange for financial gain, material benefit, or a visibility agreement with a third party, including the Publisher. It ensures transparency toward other Users and automatically displays the visible mention “Sponsored Content” when the content is played or opened.
The use of this tag, as well as the obligations associated with it, are defined in the "Advertising and Sponsored Content Policy", which all Users are contractually bound to.
Sponsored Content is distinct from commercial advertisements shown within the Application and content published as part of a Paid Partnership managed by the Publisher.
Prohibited Content: refers to any content, regardless of format or medium, that violates the "List of Prohibited Content", contravenes the contractual rules applicable to the Application (such as the "Terms of Use", the "Code of Conduct", or the "Moderation Policy"), or breaches applicable laws and regulations, particularly those concerning the protection of minors, public safety, fundamental rights, and intellectual property.
"Prohibited Content" triggers immediate moderation measures, contractual sanctions, or reports to the competent authorities, without the need for prior warning.
Sensitive Content: refers to any content published on the Application with the "Sensitive Content" tag, activated by the User at the time of publication due to its potentially shocking, disturbing, violent, sexual, or emotionally difficult nature. This tag is mandatory for such content to be tolerated on the Application. It is intended to restrict access, warn viewers, and protect Users—especially minors. Content that may offend sensitivities but is tolerated under certain conditions is defined in the "List of Prohibited Content". If published without the "Sensitive Content" tag, such content will be considered non-compliant and may be subject to moderation measures, including permanent and irreversible removal.
Sensitive Content Management Setting: refers to the system integrated into the Application that allows for the display or blocking of sensitive content based on the User’s age and declared preferences. This setting is automatic and non-editable for minors and customizable for adult Users.
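For illustration only, the following sketch shows how such an age-dependent setting could be resolved. It assumes, as a simplification, that minor Users are held to the most restrictive option; the names and the exact rule are hypothetical and do not describe the Application’s actual implementation.

```kotlin
// Illustrative only. Assumes minors are locked to the most restrictive
// option; names and rule are hypothetical, not NOX's actual code.
enum class SensitiveContentMode { HIDE, BLUR_WITH_WARNING, SHOW }

fun effectiveMode(declaredAge: Int, requestedMode: SensitiveContentMode): SensitiveContentMode =
    if (declaredAge < 18) SensitiveContentMode.HIDE // automatic and non-editable for minors
    else requestedMode                              // customizable for adult Users
```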
Nox: refers to a video with a maximum duration of two minutes, primarily designed to wake a User of the Application in a fun and interactive way.
Renox: refers to a "nox" — a video of no more than two minutes designed to wake a User in a fun and interactive way — that has been republished by a User on their own profile.
Dream: refers to a photo or video that disappears after 24 hours, used to communicate with one’s community of Users.
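For illustration, the two-minute cap on "noxs" and the 24-hour lifetime of "dreams" described in the definitions above amount to simple publish-time and expiry checks. The sketch below uses hypothetical names and does not describe the Application’s actual code.

```kotlin
import java.time.Duration

// Hypothetical checks mirroring the limits defined above: "noxs" (and
// video "dreams") are capped at two minutes, and "dreams" remain online
// for 24 hours. Names are illustrative, not NOX's API.
object ContentLimits {
    val MAX_VIDEO_DURATION: Duration = Duration.ofMinutes(2)
    val DREAM_LIFETIME: Duration = Duration.ofHours(24)
}

// True when a video respects the two-minute cap.
fun fitsDurationCap(videoDuration: Duration): Boolean =
    videoDuration <= ContentLimits.MAX_VIDEO_DURATION

// True once a "dream" has outlived its 24-hour availability window.
fun isDreamExpired(ageSincePublication: Duration): Boolean =
    ageSincePublication >= ContentLimits.DREAM_LIFETIME
```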
Alarm: refers to the feature of the Application that allows the User to schedule or receive a "nox" at a specific time, for a personalized, social, and interactive wake-up experience.
Messaging: refers to the communication system integrated into the Application, allowing Users to exchange text messages, voice messages, photos, GIFs, and other files, either in private or group conversations.
Notification: refers to any message or alert sent to the User via the Application or through an external channel (push notification, email, etc.), for the purpose of providing information, reminders, security alerts, or promotional content, in accordance with the User’s communication preferences.
Subscription: refers to a contract concluded between the User and the Publisher, involving the payment of a fee in exchange for prolonged access to certain features, services, or exclusive content on the Application, for a fixed or renewable period.
NOX Premium: refers to the paid monthly subscription plan offered by the Publisher, providing the User with an enhanced experience of the Application, including the removal of unsolicited ads, advanced features, customization options, and exclusive benefits. Sponsored content and paid partnerships remain visible. The details and terms of the offer are defined in the "Terms of Use" and the "General Sales Conditions".
Personal Data or Personal Information: as defined by the General Data Protection Regulation (GDPR – EU 2016/679) and any equivalent or complementary national legislation, refers to any information relating to an identified or identifiable natural person. An “identifiable natural person” is one who can be recognized, directly or indirectly, in particular by reference to an identifier such as a name, identification number, location data, online identifier, or to one or more specific elements related to their physical, physiological, genetic, psychological, economic, cultural, or social identity.
The processing of such Personal Data, carried out as part of the use of the Application and associated Services, is governed by the “Publisher's Privacy Policy”, available at https://www.nox.app/legal/privacy-policy.
The Publisher undertakes to comply with all applicable regulations concerning the protection of personal data, privacy, and the rights of data subjects.
Consent: refers to the free, informed, specific, and unambiguous expression by which the User agrees, through a clear affirmative action or declaration, to the processing of their personal data for one or more specific purposes. When data is collected through trackers, consent must be obtained before any placement or activation of non-essential trackers, in accordance with applicable regulations and the "Data Collection Policy". Consent can be withdrawn at any time via the Application's settings, without retroactive effect.
Privacy Settings: refers to all options available within the Application that allow the User to personalize how their personal data is managed, how visible their content is, or how exposed they are to certain types of sensitive content.
Data Collection: refers to any operation carried out by the Publisher, its third-party partners, or subcontractors involving the collection of personal or non-personal data related to Users of NOX. This may occur actively (e.g., through user input) or passively (e.g., through trackers or Application features), to ensure the proper functioning, continuous improvement, security, and personalization of the provided services. Data collected may include registration data, published or viewed content, user-defined preferences, and technical data related to browsing or Application usage.
Browsing Data: refers to information automatically collected during the use of the Application, related to how the User accesses, uses, and interacts with features or content. This data may include, but is not limited to, the type and model of the Device used, operating system, interface language, session durations, pages or content viewed, actions performed, errors encountered, and technical performance or log data.
Browsing Data may be used for technical, statistical, analytical, or personalization purposes, in accordance with the "Data Collection Policy".
Advertising Network: refers to any entity, internal or external to the Publisher, responsible for the delivery, management, optimization, or performance measurement of advertising content distributed within the Application. Advertising Networks may use technologies such as trackers, SDKs, or advertising identifiers to adapt, target, or analyze ads. They may act on behalf of the Publisher or third parties, under the terms defined by the "Data Collection Policy" and the "Advertising and Sponsored Content Policy".
Third-Party Partner: refers to any external service provider, technology vendor, or collaborator with whom the Publisher works for the functioning, analysis, optimization, or monetization of the Application, and who may, in that capacity, collect, receive, process, or access User-related data. This includes, in particular, partners involved in the deployment or activation of trackers (e.g., Google Ads, Firebase Analytics), those responsible for hosting, audience measurement, performance management, ad distribution, security, or data analysis. The terms governing the relationship between the Publisher and each Third-Party Partner are outlined in the "Data Collection Policy".
Processing: refers to any operation or set of operations performed on Personal Data, such as collection, recording, organization, structuring, storage, adaptation, modification, retrieval, consultation, use, disclosure, restriction, deletion, or destruction.
Tracker: refers to any technological device that enables the reading, writing, or collection of information on the User’s Device, either directly or indirectly, from the Application. This includes, but is not limited to, cookies, SDKs, web beacons, tags, advertising identifiers (e.g., IDFA, GAID), or any similar mechanism. Trackers may be used for technical, functional, statistical, audience measurement, or advertising purposes, and may require prior User consent depending on their nature, in accordance with the "Data Collection Policy" and applicable regulations.
Essential Functional Trackers: refers to tracking technologies strictly necessary for the technical functioning of the Application or the provision of a service expressly requested by the User. These trackers do not require prior User consent, as they help ensure service security, session management, content synchronization, alarm triggering, and the proper functioning of essential Application settings. Their use is governed by the "Data Collection Policy" and complies with applicable legal requirements regarding privacy.
Consent-Based Trackers: refers to trackers or similar technologies used for non-essential purposes, including advertising, statistics, personalization, audience measurement, or profiling, which require the User’s free, informed, specific, and unambiguous consent. Consent must be obtained before any data is stored or read on the Device, either during Application installation or via Privacy Settings. The User may withdraw consent at any time, without retroactive effect, in accordance with the "Data Collection Policy".
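For illustration, the distinction between the two tracker classes above reduces to a simple gating rule: essential trackers may always run, while consent-based trackers require a prior, withdrawable grant. The sketch below is hypothetical and does not describe the Application’s actual consent mechanism.

```kotlin
// A minimal sketch of the gating rule described above: essential functional
// trackers run without consent, while consent-based trackers are activated
// only after an explicit, withdrawable grant. All names are hypothetical.
enum class TrackerPurpose { ESSENTIAL, ADVERTISING, ANALYTICS, PERSONALIZATION }

class ConsentRegistry {
    private val granted = mutableSetOf<TrackerPurpose>()

    fun grant(purpose: TrackerPurpose) { granted += purpose }

    // Withdrawal only affects future activation (no retroactive effect).
    fun withdraw(purpose: TrackerPurpose) { granted -= purpose }

    fun mayActivate(purpose: TrackerPurpose): Boolean =
        purpose == TrackerPurpose.ESSENTIAL || purpose in granted
}
```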
SDK (Software Development Kit): refers to a third-party software module integrated into the Application, enabling the addition of technical features, interfacing with external services, or collecting behavioral data about Users. SDKs may be used for audience analysis, performance measurement, marketing attribution, ad management, security, or authentication. Some SDKs may include consent-based trackers, especially when they access advertising identifiers or perform profiling. Their use is governed by the "Data Collection Policy".
Tag: refers to a piece of code embedded in the Application, used to trigger a tracking event or record a user interaction, typically for measurement, targeting, or marketing performance purposes. A tag may be used, for example, to track an impression or click, trigger a conversion in an ad campaign, or transmit data to an analytics tool or advertising network.
Tracking Pixel or Spy Pixel: refers to a snippet of code or invisible image (usually 1x1 pixel in size) embedded in the Application interface, enabling the tracking of content display, loading, or interaction by the User for analytics, audience measurement, or advertising performance purposes. These pixels may transmit information to third-party tools or advertising networks, particularly for retargeting or conversion campaigns, and are subject to the User's prior consent when involving the processing of personal data.
Advertising Identifier: refers to a unique, resettable identifier assigned by the operating system of the Device (e.g., IDFA on iOS or GAID on Android), allowing the tracking of the User’s advertising preferences and the delivery of personalized or targeted ads. This identifier may be used by the Publisher or third-party partners for performance measurement, retargeting, or behavioral analysis. Its use is subject to the User’s prior consent, in accordance with the "Data Collection Policy" and applicable privacy regulations.
Advertisement: refers to any content displayed within the Application for promotional or commercial purposes, intended to promote a product, service, brand, or entity. Advertising may be delivered by the Publisher, third-party advertising networks, or Users in the context of sponsored content, and may take various forms (banners, interstitials, videos, tagged organic posts, etc.). It is distinct from editorial or personal content by its commercial purpose.
External Advertising: refers to any form of advertising inserted into the Application via a third-party advertising network, such as Google Ads. It may be automatically displayed in the interface (banners, interstitials, videos, etc.), based on distribution settings defined by the network and the User’s preferences. The display of External Advertising is subject to the User’s consent, especially when it involves trackers or advertising identifiers (IDFA, GAID).
Internal Advertising: refers to any advertisement displayed within the Application by the Publisher, either directly or via its own advertising system. It may promote the Publisher’s services, content, or offers, or those of selected partners within the context of promotional agreements. Internal Advertising may appear as visuals, videos, suggestions, notifications, or dedicated placements within the interface. It may be contextual or targeted, depending on the User’s settings and consent when personal data processing is required.
Targeted Advertising: refers to any form of advertisement personalized based on the User’s data, such as browsing behavior, profile, interactions, or preferences. It may be delivered by a third-party network (as part of External Advertising) or by the Publisher (as part of Internal Advertising), subject to the User’s prior consent in accordance with applicable regulations. Activating Targeted Advertising involves the use of advertising identifiers or trackers for profiling purposes.
Paid Partnership: refers to any agreement between the Publisher and a third-party entity resulting in the display of promotional or advertising content within the Application. These contents are published directly by the Publisher in an organic format and are clearly labeled as “Paid Partnership.” This type of content is the sole responsibility of the Publisher and is distinct from Sponsored Content (published by Users) and advertisements delivered by internal or external ad networks. The terms applicable to Paid Partnerships are defined in the "Advertising and Sponsored Content Policy".
Creator: refers to a User who publishes original or creative content on the Application, for which they are either the author or the holder of the necessary distribution rights. The Creator is responsible for ensuring compliance with applicable intellectual property rights and agrees to only publish content that complies with the contractual rules of the Application.
Protected Work: refers to any original intellectual creation expressed in a perceptible form and meeting the conditions for protection under the French Intellectual Property Code. This includes, but is not limited to, literary, artistic, musical, audiovisual, graphic, visual, photographic, software, digital, or multimedia creations.
Intellectual Property: refers to all exclusive rights granted by law to individuals or legal entities over their intellectual, artistic, technical, or commercial creations. This includes copyright and related rights of performers, producers, and publishers; rights over databases; and industrial property rights such as trademarks, designs, models, and patents. These rights give their holders control over the use, reproduction, performance, or distribution of their works, creations, or distinctive signs, in accordance with the provisions of the Intellectual Property Code and applicable international agreements.
License of Use: refers to the authorization granted by a rights holder, allowing the Publisher or a User to use a protected work or content within the limits defined by the agreed conditions. The license may cover various rights (reproduction, representation, adaptation, etc.) and specify the duration, geographic scope, authorized uses, and any applicable restrictions. It does not transfer intellectual property but provides a legal framework for lawful exploitation under applicable provisions.
Licensed Content: refers to music, works, or protected elements made available by the Publisher within the Application, for which a valid license has been obtained. This license allows, under the conditions defined by the Publisher, Users to incorporate such content into their own creations published on the Application. The use of Licensed Content remains subject to compliance with associated rights, any limitations set out in the license, and applicable contractual rules.
Infringement: refers to any use, reproduction, performance, distribution, modification, or adaptation of a protected work without the prior authorization of the rights holder, constituting a violation of their intellectual property rights. Infringement may apply to literary, artistic, musical, audiovisual, software, or any other protected content and may result in civil and criminal liability in accordance with the Intellectual Property Code and applicable laws.
Third Party: refers to any individual or legal entity not party to the contractual relationship between the Publisher and the User, and who is neither acting as the Publisher nor as a User of the Application. A Third Party may, for instance, be a rights holder of published content or be affected by a creation, publication, data processing, or use within the Application without directly participating in it.
Moderation: refers to all actions carried out by the Publisher’s dedicated human moderation team to monitor, assess, classify, process, restrict, or remove content or behavior reported or identified as contrary to the Application’s contractual rules, the law, or the ethical standards defined by the Publisher. Moderation may be reactive (following a report) or proactive (based on internal checks), and aims to ensure a compliant, safe, and respectful environment for all Users.
Report: refers to the feature available to Users that allows them to notify the Publisher, via the Application, of any content, behavior, or profile potentially violating the present contractual policies, applicable laws, or the rights or safety of an individual, particularly a child. A report may trigger a priority review by the human moderation team, in accordance with procedures outlined in the "Moderation Policy" and the "Child Safety Policy".
Report Category: refers to the descriptive category selected by the User when submitting a report, used to classify the presumed nature of the violation (e.g., hate speech, harassment, identity theft, inappropriate content, child endangerment, etc.). The Report Category guides how the moderation team processes the report and may, in some cases, automatically trigger the display of specific forms adapted to the severity or nature of the content in question.
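For illustration only, the routing effect of a Report Category can be sketched as follows; the category names and queue labels are assumptions, not the Application’s actual schema.

```kotlin
// Illustrative routing only: a report whose category touches child safety
// is escalated ahead of the standard queue. Category names and queue
// labels are assumptions, not NOX's actual schema.
enum class ReportCategory {
    HATE_SPEECH, HARASSMENT, IDENTITY_THEFT,
    INAPPROPRIATE_CONTENT, CHILD_ENDANGERMENT
}

fun reviewQueue(category: ReportCategory): String =
    when (category) {
        ReportCategory.CHILD_ENDANGERMENT -> "priority-human-review"
        else -> "standard-human-review"
    }
```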
Content Author: refers to the User who originally published, shared, or made content available on the Application, including content that becomes the subject of a report. The Content Author is legally responsible for complying with contractual rules, applicable rights, and legal obligations related to that content.
Complainant: refers to the User who initiated a report by notifying the Publisher of content, behavior, or a profile they believe violates the contractual rules or applicable law. The identity of the Complainant remains strictly confidential and is never disclosed to the Content Author, unless required by law or judicial order.
Sanction: refers to any action taken by the Publisher against content, behavior, or a User that violates the contractual rules, current legislation, or the standards defined by the Publisher. Sanctions may include, but are not limited to: reclassification or removal of content; temporary or permanent restriction of functionalities; suspension or deletion of the User’s account; and reporting to competent authorities in the case of serious misconduct or suspected criminal offense. The application of sanctions is at the sole discretion of the Publisher, in accordance with its "Moderation Policy".
Competent Authorities: refers to all judicial, administrative, law enforcement, or institutional authorities legally empowered to receive, process, or request information, reports, or evidence related to the protection of minors, the prevention of criminal acts, the fight against illicit content, and the suppression of dangerous behavior. This includes, but is not limited to, police or gendarmerie services, judicial authorities, independent authorities (such as regulatory agencies), and any entity legally designated for this purpose under applicable law.
Legal Notices: refers to the document prepared in accordance with current French and European legislation intended to inform the User of key aspects related to the use of the service. It outlines the identification details of the Publisher, hosting information, intellectual property rights, data processing terms, and legal responsibilities of both the User and the Publisher, particularly under the French Law for Confidence in the Digital Economy (LCEN), the General Data Protection Regulation (GDPR), and the Digital Services Act (DSA). This document is available at the following address: https://www.nox.app/legal/legal-notice.
Terms of Use (CGU): refers to the contractual document governing the conditions of access, use, and operation of the Application, as well as the respective rights and obligations of the Publisher and the User. This document specifically defines the rules of User behavior, the conditions for content publication, the liability limits of each Party, the moderation procedures implemented, and the applicable sanctions in the event of a breach. The "Terms of Use" constitute a legally binding contract that must be accepted in order to access the services offered through the Application. This document is available at the following address: https://www.nox.app/legal/terms-of-use
General Sales Conditions (CGV): refers to the contractual document governing commercial relations between the Publisher and the User in the context of paid offers proposed on the Application. The "General Sales Conditions" detail subscription conditions, payment terms, commitment periods, cancellation procedures, as well as applicable rights regarding withdrawal or refund. They are legally binding for any individual subscribing to a paid service on the Application. This document is available at the following address: https://www.nox.app/legal/terms-of-sale
Privacy Policy: refers to the document outlining the Publisher’s commitments regarding the protection of personal data processed in the context of using the Application. It defines the purposes of data processing, legal bases, recipients, retention periods, Users’ rights (access, rectification, deletion, objection, portability, restriction), as well as the contact methods for exercising those rights.
It is directly linked to the "Data Collection Policy", which constitutes a complementary technical component detailing how information is actually collected via the Application. This document is available at the following address: https://www.nox.app/legal/privacy-policy
Data Collection Policy: refers to the document that details the technical modalities for data collection within the Application, whether data is voluntarily entered by the User or collected automatically (identifiers, logs, trackers, SDKs, etc.). It distinguishes the types of data collected, their sources, the tools used, and the contexts of collection.
It serves as the technical extension of the "Privacy Policy", supplementing it by informing Users precisely about how data processing is carried out, in accordance with the GDPR. This document is available at the following address: https://www.nox.app/legal/data-collection-policy
Illicit Content Reporting and Removal Policy: refers to the document that defines the procedure through which Users or any third party can report content that is illegal or contrary to current laws, as well as the methods for handling, removing, notifying, and contesting such reports. This policy is established in compliance with the obligations set out in the LCEN, the Digital Services Act (DSA), and the DMCA. This document is available at the following address: https://www.nox.app/legal/reporting-policy
Child Safety Policy: refers to the document that outlines the specific measures implemented by the Publisher to ensure the protection of minors on the Application. It defines age restrictions, parental control mechanisms, enhanced moderation rules, and the Publisher's commitments to preventing risks related to sensitive content or inappropriate interactions. This document is available at the following address: https://www.nox.app/legal/child-safety-standards-policy
Copyright and Intellectual Property Policy: refers to the document that defines the rules applicable to the use, publication, and protection of content protected by copyright or other intellectual property rights within the Application. It details the Publisher’s rights, the obligations of Users, procedures for reporting infringement, and the conditions for content removal or sanction in the event of a violation. This document is available at the following address: https://www.nox.app/legal/copyright-policy
Advertising and Sponsored Content Policy: refers to the document governing the distribution of advertising, promotional, or sponsored content on the Application. It outlines transparency obligations, rules for identifying sponsored content, legal compliance requirements, applicable restrictions, and the Publisher’s commitments to commercial ethics and User protection. This document is available at the following address: https://www.nox.app/legal/advertising-policy
Moderation Policy: refers to the document describing the principles, criteria, and moderation procedures applied within the Application. It governs the monitoring of published content, removal, suspension or alert actions, and the recourse available to Users. It aims to ensure a respectful, safe, and legally compliant environment. This document is available at the following address: https://www.nox.app/legal/moderation-policy
List of Prohibited Content: refers to the document that provides a non-exhaustive list of content types that are either tolerated or prohibited on the Application, such as violent, hateful, discriminatory, sexual, illegal, or misleading content. This list serves to inform Users of the boundaries not to be crossed and is used as a reference for moderation and sanction actions. This document is available at the following address: https://www.nox.app/legal/prohibited-content
Code of Conduct: refers to the document establishing the behavioral expectations for Users within the Application. It promotes kindness, respect, civility, inclusion, and responsible use of social features. It serves as an ethical guide that complements the legal and contractual obligations outlined in the other policy documents. This document is available at the following address: https://www.nox.app/legal/code-of-conduct
Legal Provisions for the Attention of the Authorities: refers to the reference document established by the Publisher, intended exclusively for administrative, judicial, law enforcement, or regulatory authorities. Its purpose is to structure, formalize, and facilitate legal and technical cooperation with such authorities. This document defines the procedures for communication, reporting, requisition, information sharing, or content removal, in compliance with the applicable legal framework, including the provisions of the LCEN, the GDPR, the Digital Services Act, and the Code of Criminal Procedure. It is available at the following address: https://www.nox.app/legal/provisions-for-authorities
2. PURPOSE AND SCOPE OF THE CHILD SAFETY POLICY
This “Child Safety Policy” aims to define the principles, rules, protection mechanisms, handling procedures, and contractual commitments implemented by the Publisher of the NOX – The Social Alarm Clock Application regarding the protection of minors, particularly users aged 15 to 17.
Its objective is to ensure a digital environment that complies with legal requirements, respects the sensitivities of younger users, and is secured against the risks of inappropriate content, abuse, harassment, or exploitation.
More specifically, this "Child Safety Policy" governs:
- The conditions of access to the Application for minors and the applicable age restrictions;
- The operation of the Sensitive Content Management Setting, automatically enabled for minors;
- The rights and obligations of minor Users when using social features;
- The reporting procedures specific to child safety violations;
- The priority moderation procedures applied to situations involving minors;
- The Publisher’s cooperation with Competent Authorities in cases of danger, criminal offenses, or illegal content affecting a child.
This “Child Safety Policy” applies, without restriction or reservation:
- To any User of the Application under the age of eighteen (18), as soon as they access the Application, create an account, or use any of its features;
- To any content published, shared, or distributed via the Application, regardless of its format (text, image, video, audio, message, interaction);
- To any interaction between Users, particularly when it involves or targets a minor directly or indirectly;
- To all individuals or entities using the Application, including adult Users who may interact with minors via social features.
This “Child Safety Policy” is established in accordance with French law, notably the Law for Confidence in the Digital Economy (LCEN) of June 21, 2004, as well as Regulation (EU) 2022/2065 on a Single Market for Digital Services (Digital Services Act), in force across the European Union.
In the event of any conflict between this “Child Safety Policy” and other contractual documents of the Publisher, the provisions most protective of minors shall prevail.
3. ACCEPTANCE AND ENFORCEABILITY
Access to the Application, the creation of a User Account, as well as the use of the services, features, or content provided, constitutes full, complete, and unconditional acceptance of this “Child Safety Policy”.
This acceptance is deemed acquired upon the User’s first use of the Application, whether the User is a minor or an adult, registered or not, and constitutes a binding contractual commitment between the Parties.
All Users acknowledge that they have read this “Child Safety Policy” and agree to comply strictly with its provisions. Failing that, they must immediately cease using the Application.
A. EXPRESS AND IMPLIED ACCEPTANCE
The User acknowledges that it is their responsibility to read all the information contained in this “Child Safety Policy”, that they possess the legal capacity required to access the Application, and that they agree to strictly abide by its terms throughout their use. Any navigation or interaction within the Application constitutes full and unconditional acceptance of this “Child Safety Policy”.
B. MANDATORY NATURE AND ENFORCEABILITY
This “Child Safety Policy” is binding on all Users of the Application as soon as they access, view, or interact with content published on NOX.
By accessing the Application, the User acknowledges that they have read this “Child Safety Policy” and agrees to comply with its terms. This acceptance is implicit but legally binding. It also applies to anyone submitting a report, even if they do not have a NOX account.
Acceptance is deemed acquired upon first use of the Application by the User, whether minor or adult, registered or not, and constitutes a binding contractual commitment between the Parties.
This “Child Safety Policy” holds the same contractual value as the other legal documents governing the use of the Application. It is binding on all Users, including adults who interact with minor accounts.
The Publisher may enforce this Policy against the User at any time, especially in the event of a report, unlawful behavior, inappropriate content, or a situation posing a risk to a minor.
By accessing the Application, the User irrevocably agrees to the obligations set forth herein, as well as to the applicable legal framework, both at the national level (French law, LCEN) and European level (GDPR, DSA). The User acknowledges that failure to comply with these provisions may result in restricted access or even civil, contractual, or criminal liability.
C. REFUSAL OF ACCEPTANCE
If the User does not accept this “Child Safety Policy” or objects to it, they must immediately stop using the Application. The Publisher reserves the right to refuse, suspend, or revoke access to anyone who does not comply with this Policy.
D. ACCESSIBILITY AND RETENTION
This “Child Safety Policy” is accessible at any time from the “Legal Information” submenu within the “Settings” menu of the Application and on the website: https://www.nox.app/legal/child-safety-standards-policy. The User may keep a copy or request one at any time via the dedicated contact service: legal@nox.app.
E. AMENDMENTS AND UPDATES
The Publisher reserves the right to modify this “Child Safety Policy” at any time in order to account for:
- Changes in applicable legislative or regulatory frameworks (LCEN, GDPR, DSA, etc.);
- Developments in features or content formats offered on NOX;
- Feedback from the User community or competent authorities.
In the event of a material change, Users will be informed by any appropriate means. The current version of the Policy is the one published at: https://www.nox.app/legal/child-safety-standards-policy.
Any material change to this “Child Safety Policy” will be communicated to Users in advance through an appropriate method (notification, email, in-app message, etc.). A material change refers to any amendment that significantly impacts the User’s rights or obligations. In such cases, the User must explicitly accept the new version via a formal validation (e.g., checkbox, electronic signature, in-app confirmation). Without such acceptance, access to the Application will be suspended until express agreement is provided.
Non-material changes may simply be notified and will take effect without requiring formal acceptance, provided the information is made accessible to the User.
F. USER COMMITMENT
By accepting this “Child Safety Policy”, the User agrees to:
- Respect all the rules, principles, and obligations set forth herein;
- Use the Application in accordance with its intended purpose, without attempting to alter its functioning, access it fraudulently, or misuse its features for purposes contrary to public order or decency;
- Comply with the Code of Conduct;
- Always behave respectfully, courteously, and responsibly toward other Users, the Publisher, and third parties;
- Use the Application in alignment with its community spirit, social objective, and respect for fundamental rights;
- Use the Application fairly, ethically, responsibly, and without abuse—both technically and socially;
- Refrain from any action that could harm the proper functioning of the Application, the security of its systems, or the experience of other Users;
- Report any manifestly illegal content or behavior contrary to this Policy using the tools provided for that purpose;
- Not attempt to circumvent moderation, reporting, or sensitive content management systems implemented by the Publisher;
- Not use the Application for fraudulent, unauthorized commercial, manipulative, or unlawful purposes;
- Accept that any violation of this “Child Safety Policy” may result in sanctions, including content removal, account suspension, or account termination.
4. MINIMUM REQUIRED AGE AND ACCESS CONDITIONS
This section outlines the minimum age required to use the NOX – The Social Alarm Clock Application, the access conditions in the absence of parental control, and the mechanisms implemented by the Publisher to detect and remove accounts potentially belonging to underage individuals.
A. MINIMUM REQUIRED AGE
Access to the NOX – The Social Alarm Clock Application is strictly reserved for individuals who are at least fifteen (15) years old at the time of registration.
In accordance with Article 8 of the General Data Protection Regulation (GDPR – EU 2016/679) and current French legislation, registration is not permitted for individuals under the age of 15, unless otherwise provided for by the User's country of residence.
At present, the Publisher does not offer any parental control system that would allow authorization or supervision of the Application’s use by minors under the age of 15.
Any attempt to register by a User who has not reached the minimum required age constitutes a violation of these provisions and may result in the immediate and unannounced deletion of the corresponding User Account.
B. AGE DECLARATION DURING REGISTRATION
The User’s age is self-declared during registration via the Account creation form. The User agrees to provide information that is truthful, accurate, and complete.
Currently, the Publisher does not implement an automated system or third-party identity verification service to confirm the User's actual age. However, in line with its monitoring and due diligence obligations, the Publisher reserves the right to request additional verification and, where appropriate, suspend or delete any account suspected of belonging to a User under the age of 15.
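For illustration, the self-declared age gate described above amounts to a simple check on the declared date of birth at registration. The sketch below is hypothetical and does not describe the Application’s actual code.

```kotlin
import java.time.LocalDate
import java.time.Period

// A minimal sketch of the self-declared age gate: the declared date of
// birth must yield an age of at least fifteen at registration time.
// The names are illustrative, not the Application's actual code.
const val MINIMUM_AGE_YEARS = 15

fun meetsMinimumAge(
    declaredBirthDate: LocalDate,
    today: LocalDate = LocalDate.now()
): Boolean = Period.between(declaredBirthDate, today).years >= MINIMUM_AGE_YEARS
```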
C. RIGHT TO CLOSE A SUSPICIOUS ACCOUNT
As part of its prevention policy—and without constituting a general monitoring obligation—the Publisher may, on its own initiative, delete any User Account where there is reasonable evidence to believe it was created in violation of the age requirements set out in this “Child Safety Policy”.
Account deletion may occur without notice and without compensation, particularly when:
- The content published or interactions clearly reveal the User’s actual age;
- The account’s behavior raises serious doubt about the User's age;
- An alert, report, or complaint is submitted regarding the User's age.
D. RESPONSIBILITY OF LEGAL REPRESENTATIVES
A minor User aged 15 to 17 remains, in all cases, under the responsibility of their legal guardians. These guardians are fully responsible for their child’s use of the Application, including the content they publish or view, the interactions they engage in, and any harm they may suffer or cause.
The Publisher recommends that parents and legal guardians:
- Review this “Child Safety Policy” in full;
- Educate the minors under their care about the risks associated with digital interactions;
- Supervise the use of the Application—especially during the initial weeks following registration.
5. SENSITIVE CONTENT AND AUTOMATIC RESTRICTIONS FOR MINORS
This section describes how the “Sensitive Content” classification system works, the display settings available based on the User’s age, and the automatic restrictions imposed on minors to limit their exposure to inappropriate content.
A. DEFINITION OF SENSITIVE CONTENT
In the context of using the NOX – The Social Alarm Clock Application, “Sensitive Content” refers to any content classified as such in the “List of Prohibited Content”. In other words, it is content likely to be harmful or distressing to a minor audience due to its suggestive, explicit, disturbing, or shocking nature. Content that must be marked as “Sensitive Content” within the Application includes, in particular:
- Visual or audio elements inappropriate for minors;
- Nudity, even partial or artistic;
- Shocking images or visuals that may cause emotional or visual discomfort;
- Any other visual representation deemed inappropriate for Users under the age of eighteen (18).
This classification is part of the Publisher’s commitment to protecting minors and preventing digital risks, in accordance with platform distribution standards and applicable legal requirements.
B. USER RESPONSIBILITY FOR CLASSIFICATION
When creating a public nox, a public nox thumbnail, a private nox, a private nox thumbnail, a public dream, or a private dream, the User is required to complete the Sensitive Content declaration.
For example, when publishing a "nox" (public or private), the following question is asked:
“Sensitive Content
Does your NOX contain nudity, violence, or any other sensitive content?
Yes – No”
The User agrees to respond honestly, accurately, and responsibly, understanding the impact this classification will have on content visibility.
In the event of concealment, false declaration, or clear abuse, the Publisher may:
- Reclassify the content as "Sensitive Content";
- Temporarily remove the content;
- Permanently delete the content;
- Sanction the User Account in accordance with the List of Prohibited Content.
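For illustration only, the creator's declaration and the Publisher's power of reclassification can be modeled as a flag set at publication that a later moderator decision always overrides. The following Kotlin sketch is purely indicative; ContentItem and its fields are hypothetical names, not the Publisher's actual data model.

    // A published item carries its creator's Sensitive Content declaration;
    // a later moderator decision (reclassification) always takes precedence.
    data class ContentItem(
        val id: String,
        val declaredSensitive: Boolean,          // answer to the declaration question
        val moderatorOverride: Boolean? = null   // set by the Publisher on reclassification
    ) {
        val isSensitive: Boolean
            get() = moderatorOverride ?: declaredSensitive
    }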
C. PUBLISHER’S RIGHT TO RECLASSIFY
The Publisher reserves the right, at any time and at its sole discretion, to:
- Mark content as "Sensitive Content" if it was not declared as such by its creator;
- Modify the existing classification in case of a clear error or report;
- Restrict, hide, or delete any “Sensitive Content,” particularly if it is accessible to a minor audience.
Such reclassification may result from:
- Proactive moderation by the dedicated team;
- A report submitted by another User;
- Routine checks conducted by the Publisher.
D. SENSITIVE CONTENT DISPLAY SETTINGS
Within the “Sensitive Content” submenu of the Application's “Settings” menu, three display options are available:
- “Hide sensitive content”:
Sensitive Content is completely hidden and does not appear anywhere in the Application interface for the concerned User. It is inaccessible, even through video players, user profiles, or messaging.
- “Display sensitive content blurred, with a prior warning”:
Sensitive Content is visible but blurred and preceded by a warning message before opening. The User must manually confirm their intention to view it.
- “Display sensitive content without blurring or warning”:
Sensitive Content is displayed without restriction or any visual indicator. This setting is reserved for Users who are of legal age and have explicitly activated this preference.
E. AUTOMATIC SETTINGS FOR MINORS
For all Users under the age of eighteen (18), the automatically applied setting is:
“Hide sensitive content.”
This setting:
- Is enabled by default upon registration based on the declared date of birth;
- Is locked and cannot be changed by the minor User;
- Fully disables the display of any content classified as “Sensitive Content,” regardless of its source.
As such, no “Sensitive Content” is accessible to minors within the Application.
F. SETTINGS FOR ADULT USERS
For Users who are eighteen (18) years of age or older, the default setting is:
“Display sensitive content blurred, with a prior warning.”
These Users may manually activate the following option via the “Sensitive Content” submenu in their account’s Privacy Settings:
“Display sensitive content without blurring or warning.”
This setting may be changed at any time and constitutes explicit consent, recorded in the “User Account” settings.
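For illustration only, the interaction between the three display options, the User's declared age, and the lock applied to minors can be summarized in the following hypothetical Kotlin sketch; SensitiveContentSetting and the function names are illustrative and do not describe the Publisher's actual implementation.

    enum class SensitiveContentSetting { HIDE, BLUR_WITH_WARNING, SHOW }

    // Default applied at registration, based on the declared date of birth.
    fun defaultSetting(age: Int): SensitiveContentSetting =
        if (age < 18) SensitiveContentSetting.HIDE
        else SensitiveContentSetting.BLUR_WITH_WARNING

    // Minors cannot change the setting; adults may, including the unrestricted
    // option, whose activation constitutes the explicit consent recorded in the
    // "User Account" settings.
    fun applySettingChange(age: Int, requested: SensitiveContentSetting): SensitiveContentSetting =
        if (age < 18) SensitiveContentSetting.HIDE else requested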
6. PRIORITY MODERATION AND ENHANCED PROTECTION OF MINORS
This section outlines the Publisher’s commitments regarding the priority handling of reports concerning the safety of minors, as well as the specific moderation rules applied in this context.
A. PRIORITY HANDLING OF REPORTS INVOLVING A MINOR
The Publisher enforces a priority and enhanced moderation policy for any report, content, or behavior that may involve a minor User, either directly or indirectly.
Reports involving any of the following grounds are automatically considered high priority:
- Child endangerment;
- Child abuse;
- Sexualization of minors;
- Child exploitation or abuse.
These reports are immediately escalated internally to the team in charge of moderating “Prohibited Content”, with no automated processing and no involvement of external service providers. A direct human review is conducted by authorized members of the Publisher’s team.
B. INTERNAL HUMAN MODERATION TEAM
All moderation within the Application is carried out exclusively by human moderators employed directly by the Publisher. No external service provider is involved in the final decision-making process regarding the removal or classification of content involving a minor.
The Publisher ensures that members of the moderation team:
- Are trained in, and made aware of, the issues surrounding online child safety;
- Have the tools necessary to detect, evaluate, and address inappropriate content;
- Operate strictly within the applicable legal framework and the contractual policies of the Application.
C. SANCTIONS IN CASE OF HARM TO MINOR SAFETY
As part of its duty of care, the Publisher may apply one or more of the following sanctions, depending on the severity of the situation and the available information:
- Retroactive classification of the content as “Sensitive Content,” with immediate restriction of visibility;
- Temporary removal of the content;
- Permanent and irreversible deletion of the content;
- Temporary suspension of the responsible User Account;
- Permanent deletion of the responsible User Account;
- Permanent banning of the responsible User Account, without notice or recourse;
- Direct reporting to Competent Authorities, especially in the event of suspected criminal activity.
These measures may be taken without prior notice when a child’s safety is at risk. However, the affected User will be informed of the sanction applied—except in cases involving reports to judicial or law enforcement authorities.
D. NO RIGHT TO REPLY OR APPEAL
No right of reply, appeal, or request for justification may be exercised by a User who has been sanctioned for behavior endangering child safety.
This provision exists to preserve the effectiveness of protective measures, ensure the confidentiality of procedures, and protect potentially affected minors from retaliation or re-exposure to illicit content.
7. ZERO TOLERANCE FOR CHILD SEXUAL ABUSE AND EXPLOITATION (CSAE)
This section formalizes the Publisher’s zero tolerance policy toward any behavior or content involving CSAE, by detailing applicable prohibitions, content removal measures, account bans, and reports to Competent Authorities.
A. ZERO TOLERANCE PRINCIPLE
The Publisher applies a strict and non-negotiable zero tolerance policy regarding any form of sexual abuse, endangerment, exploitation, or violence involving a child, regardless of its nature, severity, or context.
This includes, but is not limited to:
- The distribution, sharing, promotion, or possession of child sexual abuse material (also referred to as CSAM – Child Sexual Abuse Material);
- Grooming, i.e., the deliberate approach or manipulation of a minor with the intent to exploit them sexually or psychologically;
- Any form of solicitation, incitement, conversation, or sexual behavior toward a minor, including humorous, indirect, or simulated forms;
- The depiction, representation, or explicit suggestion of sexual violence or degrading practices involving children, even if fictional or AI-generated.
No circumstance, artistic intent, or appeal to freedom of expression can justify the presence or dissemination of such content within the Application.
B. REINFORCED CONTRACTUAL PROHIBITIONS
The following are expressly prohibited and constitute serious violations of the Application’s contractual documents (including the "Terms of Use", the "Moderation Policy", and this "Child Safety Policy"):
- Any attempt to publish, host, or link to CSAM-related content;
- Any use of the Application to initiate suspicious, inappropriate, or risky contact with minors;
- Any behavior aimed at targeting, manipulating, trapping, or exposing a minor to sensitive or dangerous content;
- Any attempt to bypass sensitive content filters with the intent to reach a minor audience.
These prohibitions apply to all features of the Application.
C. ALIGNMENT WITH INTERNATIONAL STANDARDS
The Publisher adheres to international best practices for combating online child sexual abuse and exploitation, including those promoted by:
- The Tech Coalition, a global alliance of digital actors fighting CSAE;
- The National Center for Missing and Exploited Children (NCMEC) in the United States;
- The European Centre for Missing and Sexually Exploited Children (eNCMEC – EU);
- The Digital Services Act (DSA) guidelines of the European Union and applicable ISO standards.
In line with these frameworks:
- All reports related to abuse risks are automatically escalated as highest priority;
- The Publisher immediately deletes any content suspected of CSAE, without requiring prior validation;
- Any implicated User Account is permanently banned with no possibility of appeal.
D. REPORTING TO COMPETENT AUTHORITIES
Whenever content, activity, or interaction is reasonably suspected of constituting sexual abuse, child pornography, exploitation, or endangerment of a child, the Publisher:
- Immediately reports the matter to the appropriate judicial, law enforcement, or administrative authorities (such as specialized police units, prosecutors, or certified reporting platforms);
- Retains all technical data, metadata, logs, and IP addresses required for case processing;
- Cooperates fully with authorities, in compliance with applicable national and international legal frameworks.
No information regarding such reporting is disclosed to the implicated User, either before or after transmission to authorities.
8. IN-APP REPORTING MECHANISMS
This section outlines the reporting tools integrated within the Application, the types of content concerned, the available report categories for Users, as well as the confidentiality, prioritization, and processing rules applied by the Publisher.
A. REPORTABLE CONTENT AND ELEMENTS IN THE APPLICATION
Any content accessible through the Application may be reported if it is suspected to violate the law, the “Terms of Use”, the “Code of Conduct”, the “List of Prohibited Content”, or any of NOX’s contractual policies.
The types of reportable content include:
- “nox”;
- “nox” thumbnails;
- “dream”;
- “dream” collection titles;
- “dream” collection thumbnails;
- Comments;
- Profile pictures;
- Cover photos;
- Textual profile descriptions;
- Text messages, voice messages, photos, or GIFs sent via messaging.
The Publisher guarantees that all of the above can be reported at any time via a clearly accessible "Report" button or command within a few clicks.
The Publisher reminds Users that, regardless of whether content is public or private, it is subject to the same legal and contractual obligations. Content shared through private messaging may also be reported and moderated if it violates applicable rules.
This system is designed to allow for rapid, targeted, and confidential responses to inappropriate or potentially harmful content, especially in cases involving child safety.
B. AVAILABLE REPORT CATEGORIES AND ASSOCIATED FORMS
To ensure precise and fair moderation, the Application provides a fixed list of report categories. Each type of reportable content can be associated with one or more predefined reasons selected by the User during the reporting process.
General report categories:
- Spam or scam;
- Prohibited sales;
- Copyright infringement;
- Identity theft;
- Misinformation or fake news;
- Pornography or sexually explicit content;
- Hate speech;
- Privacy violation;
- Harassment or threats;
- Terrorism incitement;
- Violence or dangerous behavior.
Child safety-specific report categories:
- Child endangerment;
- Child abuse;
- Child sexualization;
- Child exploitation or abuse.
Depending on the selected report category, the form will vary to collect the relevant information required for proper assessment:
- No additional form (immediate report submission):
For obvious or widely documented violations, Users are taken directly to the report validation step, with no extra text fields or file uploads required.
This applies to:
- Spam or scam;
- Prohibited sales;
- Pornography or sexually explicit content;
- Hate speech;
- Terrorism incitement;
- Violence or dangerous behavior;
- Child endangerment;
- Child abuse;
- Child sexualization.
- Open text form (explanation required):
Users must provide detailed contextual information.
This applies to:
- Identity theft;
- Misinformation or fake news;
- Privacy violation;
- Harassment or threats;
- Child exploitation or abuse.
- Structured multi-field form:
Users are required to provide precise legal justification for the report.
This applies exclusively to:
- Copyright infringement.
The Publisher reserves the right to update this list in response to changes in the legal landscape, detected misuse, or evolving moderation needs. Any updates will be reflected in this document or via Application updates.
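For illustration only, the correspondence between report categories and forms described above can be represented as a fixed mapping. The following Kotlin sketch simply restates the lists in this subsection; ReportForm and reportFormFor are hypothetical names.

    enum class ReportForm { NONE, OPEN_TEXT, STRUCTURED }

    // Fixed mapping of report categories to the form presented to the reporting User.
    val reportFormFor: Map<String, ReportForm> = mapOf(
        "Spam or scam" to ReportForm.NONE,
        "Prohibited sales" to ReportForm.NONE,
        "Pornography or sexually explicit content" to ReportForm.NONE,
        "Hate speech" to ReportForm.NONE,
        "Terrorism incitement" to ReportForm.NONE,
        "Violence or dangerous behavior" to ReportForm.NONE,
        "Child endangerment" to ReportForm.NONE,
        "Child abuse" to ReportForm.NONE,
        "Child sexualization" to ReportForm.NONE,
        "Identity theft" to ReportForm.OPEN_TEXT,
        "Misinformation or fake news" to ReportForm.OPEN_TEXT,
        "Privacy violation" to ReportForm.OPEN_TEXT,
        "Harassment or threats" to ReportForm.OPEN_TEXT,
        "Child exploitation or abuse" to ReportForm.OPEN_TEXT,
        "Copyright infringement" to ReportForm.STRUCTURED
    )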
C. REPORT PROCESSING AND PRIORITIZATION
All reports are forwarded to the internal moderation team, with no reliance on external service providers or autonomous AI moderation systems.
Reports involving child safety are prioritized through a specific response workflow. Although no guaranteed response time is provided, these reports receive enhanced attention and are reviewed by authorized human moderators, in compliance with applicable laws and the Publisher’s internal procedures.
To ensure continuous and responsive moderation, operations are conducted 24/7 from three main locations: Bourg-en-Bresse, France; Montreal, Canada; Ho Chi Minh City, Vietnam.
This international structure ensures real-time content review and efficient prioritization of cases involving minors.
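For illustration only, the prioritization described above can be pictured as a review queue in which reports falling under the child safety categories are always placed ahead of all others, and otherwise handled oldest first. The following Kotlin sketch is hypothetical; Report and reviewQueue are illustrative names.

    import java.util.PriorityQueue

    // Child safety categories are always reviewed first (see Section 6.A).
    val childSafetyCategories = setOf(
        "Child endangerment", "Child abuse",
        "Child sexualization", "Child exploitation or abuse"
    )

    data class Report(val category: String, val receivedAtEpochMs: Long)

    // Review order: child safety reports first, then oldest first.
    val reviewQueue = PriorityQueue(
        compareByDescending<Report> { it.category in childSafetyCategories }
            .thenBy { it.receivedAtEpochMs }
    )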
D. CONFIDENTIALITY GUARANTEES
The reporting process offers the following guarantees:
- The reporting User remains anonymous to the reported User;
- The reported content is reviewed and stored confidentially by the internal team;
- Good faith reports will never lead to any sanction against the reporter, even if the reported content is ultimately deemed compliant.
However, abusive or malicious use of the reporting system (e.g., repeated unfounded reports, slander, harassment via reporting) may result in suspension of the responsible User Account, in accordance with the “Moderation Policy” and the “Terms of Use”.
9. OFFICIAL CONTACT POINT AND ASSISTANCE
This section identifies the dedicated email address for child safety concerns, outlines its purpose, the types of requests that can be submitted, and the Publisher’s confidentiality commitments toward those who use it.
A. DEDICATED CONTACT ADDRESS FOR CHILD SAFETY
The Publisher of the NOX – The Social Alarm Clock Application provides an official contact point dedicated to child safety, which can be reached at the following address: childsafety@nox.app
This email address may be used by any individual—User, parent, authority, journalist, organization, or other concerned party—to:
- Report inappropriate content or behavior that may endanger a minor;
- Request clarification on this policy or on a situation involving a child’s safety;
- Alert the Publisher to an emergency, abuse, or verified risk affecting a minor User;
- Submit a formal report, with supporting documents or evidence (screenshots, witness statements, external reports, etc.);
- Request formal cooperation in the context of a judicial or administrative procedure.
B. RECEIPT AND HANDLING OF COMMUNICATIONS
All communications sent to childsafety@nox.app are:
- Received and reviewed by an authorized member of the Publisher’s safety or compliance team;
- Logged in a dedicated security register, in compliance with GDPR and the Digital Services Act;
- Treated with priority, subject to the operational availability of the team;
- When necessary, forwarded without delay to the Competent Authorities if the content of the message warrants an emergency response or legal processing.
No automatic acknowledgment of receipt is sent; however, any valid or serious message will be processed formally and confidentially.
C. CONFIDENTIALITY AND LEGAL FRAMEWORK
The information communicated via this channel is:
- Strictly confidential, unless required by law to be transmitted to the authorities;
- Securely stored, for a period proportional to the severity of the incident and the Publisher’s duty to cooperate;
- Not used for commercial, statistical, or analytical purposes;
- Protected by applicable data protection laws, as outlined in the Publisher’s “Privacy Policy”.
Using this channel does not constitute a formal legal complaint, but may lead to judicial cooperation, content removal, or immediate precautionary action.
10. EDUCATION AND PREVENTIVE TECHNOLOGIES
This section describes the awareness initiatives implemented for Users, as well as the technological solutions that the Publisher is exploring or deploying to prevent dangerous or inappropriate behavior toward minors.
A. PUBLISHER’S EDUCATIONAL COMMITMENT
The Publisher of the NOX – The Social Alarm Clock Application recognizes that protecting children online requires not only moderation and enforcement but also a continuous effort to educate, inform, and prevent.
To that end, the Publisher commits to:
- Clearly informing Users about the existence and risks of Sensitive Content from their first use of the Application;
- Reiterating the rules related to minor safety in all contractual documents;
- Explicitly stating the prohibitions related to CSAM, grooming, and the sexualization of minors within reporting interfaces and content publication forms;
- Promoting responsible behavior by integrating educational reminders at key moments in the user experience.
B. DETECTION AND EXPLORATION OF TECHNOLOGICAL SOLUTIONS
The Publisher is actively exploring the integration of automated prevention technologies to enhance child safety within the Application. These tools may include:
- Automatic blurring or masking systems for potentially sensitive or disturbing visual content;
- Smart alert mechanisms to detect risky terms, content, or intentions in messages or public posts;
- Auto-blocking of publication in cases where content has been repeatedly flagged by the community for endangering minors;
- Increased monitoring of repetitive or suspicious behaviors, particularly interactions directed at minor accounts.
No such technology will be implemented without respecting:
- Fundamental User rights (privacy, freedom of expression, transparency);
- The requirements of the General Data Protection Regulation (GDPR);
- The standards of the Digital Services Act (DSA) and best practices promoted by the Tech Coalition.
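For illustration only, and subject to the safeguards listed above, a "smart alert" mechanism could take the form of a simple detector that flags content for human review without ever blocking or deleting it on its own, consistent with the human-only final decision-making described in Section 6.B. The following Kotlin sketch is purely hypothetical, including its placeholder term list.

    // Hypothetical risky-term detector: flags for review, never acts autonomously,
    // so that an authorized human moderator makes the final decision.
    val riskyTerms = setOf("placeholder-term-1", "placeholder-term-2")

    fun flagForHumanReview(text: String): Boolean =
        riskyTerms.any { term -> text.contains(term, ignoreCase = true) }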
C. COLLABORATION WITH EXPERT THIRD PARTIES
The Publisher reserves the right to establish partnerships with organizations specialized in preventing violence against children, in order to:
- Provide training for internal moderation teams;
- Acquire educational materials for the community;
- Improve the Application’s compliance with international child safeguarding standards.
Such partnerships may include nonprofits, foundations, government agencies, or EU institutions dedicated to child protection and cybersecurity.
11. SANCTIONS, PROHIBITIONS, AND STRICT ENFORCEMENT OF THE POLICY
This section details the sanctions applicable in case of violations of this Policy, the strictly prohibited behaviors, the lack of recourse for offenders, and the consistent and uncompromising enforcement of disciplinary measures by the Publisher.
A. APPLICABLE DISCIPLINARY MEASURES
Any violation—confirmed or reasonably suspected—of this “Child Safety Policy” will result in immediate measures, proportionate to the severity of the facts, and without the need for prior notice.
Depending on the nature of the offense, the Publisher may apply one or more of the following sanctions:
- Retroactive classification of content as “Sensitive Content,” with immediate display restriction, without prior notice or right of appeal;
- Temporary removal of content, without prior notice or right of appeal;
- Permanent and irreversible deletion of content, without prior notice or right of appeal;
- Temporary suspension of the responsible User Account, without prior notice or right of appeal;
- Permanent deletion of the responsible User Account, without prior notice or right of appeal;
- Permanent banning of the responsible User Account, without prior notice or right of appeal;
- Direct report to Competent Authorities, especially in case of suspected criminal offense.
No compensation, reimbursement, or account reinstatement may be claimed by a User sanctioned under this Policy.
B. INAPPLICABILITY OF EXCEPTIONS OR JUSTIFICATIONS
The Publisher reminds Users that freedom of expression, humor, fiction, parody, artistic expression, or provocation cannot be used to justify:
- The publication of content depicting or suggesting child abuse;
- Inappropriate, sexualized, or suggestive interactions with minors;
- Intentions clearly contrary to public order and the protection of children.
No form of contextualization, relativization, or comedic framing will be accepted as a mitigating factor.
C. INADMISSIBILITY OF CONTESTATION AND LACK OF RIGHT TO APPEAL
In matters concerning the protection of minors and the fight against CSAE, no appeal, claim, or right of reply may be exercised by the sanctioned User.
This rule also applies to:
- Content removals for material considered dangerous to a child;
- Account blocks related to severe violations or suspicious behavior;
- Reports to authorities or to European coordination platforms (under the DSA).
The Publisher reserves the right not to justify its decisions in cases involving the protection of minors, to preserve the confidentiality of procedures and ensure the safety of those involved.
D. ZERO-TOLERANCE ENFORCEMENT OF THE POLICY
This Policy applies uniformly to all Users, regardless of age, status, follower count, or length of use of the Application.
No exceptions, no favoritism, and no special treatment will be granted in the enforcement of this Policy.
The Publisher acts in strict accordance with its legal and contractual obligations, and in the best interest of child protection.
12. IMMEDIATE DANGER SITUATIONS – CALL TO LOCAL AUTHORITIES
This section emphasizes the imperative duty to contact local authorities without delay in situations where a child is at immediate risk. It reminds Users that the Publisher does not replace emergency services, and outlines the judicial cooperation measures implemented in such cases.
A. SHARED RESPONSIBILITY AND DUTY TO ALERT
In any situation where a child is suspected of being in immediate danger—whether the threat is physical, psychological, sexual, or emotional—the Publisher explicitly urges every User to contact the relevant local authorities first, without waiting for or relying solely on the internal reporting mechanisms.
NOX – The Social Alarm Clock is not an emergency service, nor is it a judicial or law enforcement authority. The Publisher cannot intervene directly outside the scope of its technical and contractual responsibilities.
B. GENERAL EMERGENCY CONTACTS
In the event of a critical situation, Users are encouraged to immediately contact:
- Local police or gendarmerie services (e.g., emergency number 17 in France, or the equivalent in the User’s country of residence);
- National platforms for reporting illegal content or sexual violence (in France: www.internet-signalement.gouv.fr);
- Social services, child protection hotlines, or nonprofit organizations specialized in safeguarding minors.
The Publisher strongly advises any person who becomes aware of a serious and imminent risk to:
- Not rely solely on the Application’s reporting feature;
- Take swift action to protect the potential victim by contacting the appropriate authorities directly.
C. COMMITMENT TO IMMEDIATE COOPERATION WITH LAW ENFORCEMENT
Upon receipt of a formal request from a competent authority (police, public prosecutor, cybercrime unit, child protection agency), the Publisher commits to:
- Immediately transmitting relevant content, metadata, or information required for the investigation;
- Blocking access to the identified accounts or content upon receipt of the request;
- Preserving evidence in accordance with applicable legal provisions (LCEN, DSA, French Code of Criminal Procedure);
- Actively cooperating with the requesting authorities, in full compliance with the applicable legal framework.
This cooperation is carried out without any requirement to notify the concerned User and without unjustified delay.
13. ENTRY INTO FORCE AND ACCESSIBILITY OF THE POLICY
This section specifies the effective date of this “Child Safety Policy”, its enforceability for Users, and how it can be accessed at any time. It aims to ensure transparency and permanent accessibility of the behavioral standards expected on NOX.
A. EFFECTIVE DATE
This “Child Safety Policy” enters into force on April 14, 2025, the official launch date of the NOX – The Social Alarm Clock Application. It applies from the User’s first use of the Application, whether the User is registered or simply a visitor.
B. ENFORCEABILITY
The “Child Safety Policy” is binding on all Users, regardless of status, location, or length of registration. It constitutes a contractual document supplementary to the “Terms of Use”, clarifying specific behavioral commitments.
Any person who accesses the Application or uses its features is deemed to have read, understood, and accepted the “Child Safety Policy” in full.
C. OFFICIAL LANGUAGE
The French version of the “Child Safety Policy” is the authoritative legal version between the Parties. In the event of a translation, only the French version is legally binding.
14. POLICY EVOLUTION AND UPDATES
This “Child Safety Policy” is a living document, subject to updates in order to adapt to changes in the Application, the community, safety standards, and the applicable legal framework.
A. UPDATES BY THE PUBLISHER
The Publisher reserves the right to modify, supplement, or update the “Child Safety Policy” at any time, particularly in response to:
- Changes in French, European, or international legislation;
- Substantial changes to the Application’s features;
- Shifts in moderation or safety priorities;
- Feedback and experiences from the NOX community.
B. USER NOTIFICATION
Users are informed of any substantial update to the “Child Safety Policy”:
- Through an in-app notification or message;
- Via an announcement published on the official NOX website: https://nox.app.
In some cases, the Publisher may require explicit acceptance of the new version of the “Child Safety Policy” before allowing continued use of the services.
C. PERMANENT ACCESS
The current version of the “Child Safety Policy” is available at all times:
- In the “Legal Information” submenu of the Application’s Settings;
- On NOX’s legal website: https://www.nox.app/legal.
It may be freely consulted, downloaded, or archived by the User.
Users are encouraged to regularly review the “Child Safety Policy” to ensure they remain in compliance with its most recent provisions.
15. GOVERNING LAW AND JURISDICTION
This “Child Safety Policy” is governed by, interpreted under, and enforced in accordance with French law, subject to the application of any mandatory provisions of European Union law or any more protective local legislation applicable to the User in their country of residence.
In the event of a dispute concerning the interpretation, validity, execution, or termination of this policy, or the use of the Application, the competent courts shall be those under the jurisdiction of the Publisher’s registered office, unless otherwise required by mandatory legal provisions, particularly in matters of consumer protection.
In accordance with Articles L.611-1 et seq. of the French Consumer Code, the User is informed that they may submit any dispute with the Publisher to a consumer mediator, free of charge, to seek amicable resolution.
The consumer mediator proposed by the Publisher is: MEDICYS, 73 boulevard de Clichy, 75009 Paris, Telephone: +33 (0)1 49 70 15 93, Email: contact@medicys.fr, Website: https://www.cc-mediateurconso-bfc.fr
Nothing in this policy shall limit the User’s mandatory legal rights under the applicable laws of their country of residence.