
DSA Compliance Page (Digital Services Act)

Last updated: 02.04.2026 · Version: 1.2

This page explains WIN's ("Platform") compliance approach under the European Union Digital Services Act (Regulation (EU) 2022/2065, "DSA"), including points of contact, illegal content notices (notice & action), and objection/complaint and transparency mechanisms.

⚠️
  • This page has been prepared within the DSA framework for users/recipients located in the EU and for EU authorities.
  • For users outside the EU, mandatory local law (e.g., in Turkiye: Law No. 5651, KVKK, TCK, etc.) may also apply.
  • DSA compliance becomes relevant where the Platform provides services in, or targets, the EU and thereby falls within DSA scope. This page is designed as a preparedness and transparency text covering both current and planned EU activity.
  • Entry into the EU market is still in the approval process and no definite date has been set; as a general target, Q4 2026 is foreseen.

Related documents

Under the DSA, this page should be read together with the following WIN documents:


Section 1: Service provider information and DSA scope

This section sets out the framework regarding the platform operator, the nature of the service, and when the DSA applies.

1.1 Operator (Provider)

The e-mail address (support@whoisnextapp.com) and the official notification address (Section 2.1) are published as the Art. 11 DSA point of contact.

1.2 Nature of the service (DSA terminology)

As a platform where users create profiles, generate content (profile text/photo), match with other users and communicate, WIN is generally considered in DSA terminology as:

  • Intermediary service provider, and
  • Hosting service in terms of content hosting/making content available, and
  • Online platform in terms of matching/distributing user content to other users

1.3 Applicability of the DSA (EU "offering" criterion)

The DSA applies where services are offered in the EU, regardless of whether the service provider is established in the EU. In the DSA text, the existence of a "substantial connection with the EU" may be assessed through factors such as establishment in the EU, significance of the number of recipients in the EU, or targeting activity to the EU (DSA recitals).

If services are offered in the EU, the DSA imposes obligations concerning points of contact (Art. 11-12), a legal representative (Art. 13), content moderation transparency (Art. 14-15), illegal content notices (Art. 16), statements of reasons (Art. 17), objections/complaints (Art. 20-21), and more.


Section 2: DSA contact points and EU legal representative

This section explains the points of contact mandated by the DSA (Art. 11-12) and the legal representative requirement (Art. 13) for providers not established in the EU.

2.1 Point of contact for EU authorities (Art. 11 DSA)

Pursuant to DSA Article 11, WIN designates and publicly discloses a single point of contact for direct communication by electronic means with EU Member State authorities, the European Commission and the Board for Digital Services.

  • Art. 11 contact e-mail: support@whoisnextapp.com
  • Official notification address: Acibadem Mah. Asafbey Sk. Imer Apt. No: 7 A, Kadikoy / Istanbul (WIN TECH Bilisim Organizasyon ve Ticaret A.S.)
  • KEP / UETS: There is currently no KEP or UETS address; this page will be updated when one is obtained.
  • Accepted languages: Turkish, English
  • Who can use it? Only EU Member State authorities, the Commission and the Board.

2.2 Point of contact for users (Art. 12 DSA)

Pursuant to DSA Article 12, WIN provides a point of contact enabling users to communicate in a user-friendly, swift and direct manner. This channel cannot rely solely on automated tools (human support must be available).

WIN's user communication channels (including DSA scope):

In DSA-related submissions, including the term "DSA" and, where possible, the relevant content/account information accelerates the assessment.

2.3 EU legal representative (Art. 13 DSA) — providers not established in the EU

If WIN is not established in the EU and offers services in the EU, under DSA Article 13, WIN must appoint in writing an EU-based legal representative.

WIN currently does not have an EU-established legal representative. The legal representative appointment will be made together with entry into the EU market. Trigger: Commencement of services to citizens in at least 5 EU Member States (or similar meaningful market presence). The Member State where the representative will be appointed and the exact date have not yet been determined; this page will be updated when clarified.


Section 3: Illegal content notice (Notice & Action) — Art. 16 DSA

This section explains how users or third parties can report allegations of illegal content and how WIN processes these notices (Art. 16 DSA).

3.1 What is "illegal content"?

In the DSA context, "illegal content" is information/content considered illegal under EU law or the law of the relevant Member State. Examples (non-exhaustive): child sexual abuse material, non-consensual intimate content, threats/blackmail, fraud, encouragement of hate crimes, unlawful disclosure of personal data, etc.

Content may violate Community Guidelines or Terms of Use; however, not every violation is "illegal." Nevertheless, WIN may also apply content/service restrictions under contractual rules (without prejudice to Art. 17 DSA statement of reasons).

3.2 Notice channels

Under Art. 16, WIN accepts notices via fully electronic means. Channels:

  • Report button on profile screens: A button on the screen where a person's profile is viewed redirects to the "Report" page, where the relevant category can be selected (fastest and preferred method).
  • Support request via in-app Help/Contact screen or e-mail.
  • Notice through website communication channels.
  • For written and structured notices by private individuals, the Complaint and Notice Form may be used. For content-focused moderation notices, the Content Violation Form may also be used.

Official bodies and competent authorities (law enforcement, prosecution office, court, administrative authority, etc.) may use the form on the Illegal Content Notice Form page to submit illegal content notices in a standard and duly compliant way; details are aligned with the Law Enforcement Guide.

3.3 Minimum information required in a notice (Art. 16/2)

Under DSA Article 16/2, for notices to be "sufficiently precise and adequately substantiated," they should include the following elements:

  • (a) Statement of reasons: an explanation of why the content is illegal. Example: "This profile includes a non-consensual nude photo."
  • (b) Location information: the "exact location" of the content. Example: content ID, screen name, date/time; URL if possible.
  • (c) Name + e-mail: the identity of the notifier (exception: child sexual abuse offences). Example: "Name Surname / e-mail".
  • (d) Good faith statement: a declaration that the information is believed to be true and complete. Example: "My declaration is true and complete."
⚠️

Under DSA Article 16/2-(c), in certain offence types involving child sexual abuse, the notifier name/e-mail obligation may not apply. WIN prioritizes such notices and may direct them to competent authorities where necessary.
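To make the Art. 16/2 minimum elements concrete, the sketch below models a completeness check for a notice. The field and function names are illustrative assumptions, not WIN's actual notice schema; the Art. 16/2-(c) identity waiver for child sexual abuse offences is reflected in the check.

```python
# Illustrative sketch only; field names are assumptions, not WIN's real schema.
from dataclasses import dataclass
from typing import Optional

@dataclass
class IllegalContentNotice:
    reasons: str                   # (a) why the content is considered illegal
    content_location: str          # (b) content ID, screen name, URL if available
    notifier_name: Optional[str]   # (c) may be omitted for CSAE-related notices
    notifier_email: Optional[str]
    good_faith_declared: bool      # (d) "true and complete" declaration
    csae_related: bool = False

def is_sufficiently_precise(n: IllegalContentNotice) -> bool:
    """Checks the Art. 16/2 minimum elements; identity is waived for CSAE."""
    if not (n.reasons and n.content_location and n.good_faith_declared):
        return False
    if n.csae_related:
        return True  # Art. 16/2-(c) exception: notifier identity not required
    return bool(n.notifier_name and n.notifier_email)
```

A notice failing this kind of check would typically prompt a request for the missing elements rather than outright rejection.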

3.4 Notice receipt acknowledgment and decision notification (Art. 16/4-5)

  • Receipt acknowledgment: If e-mail/electronic contact information is included in the notice, WIN sends a receipt acknowledgment without undue delay (Art. 16/4).
  • Decision notification: WIN notifies the notifier of its decision regarding the noticed content and the application/objection remedies (Art. 16/5).

3.5 Assessment standard and automation (Art. 16/6)

WIN assesses notices in a timely, diligent, non-arbitrary and objective manner. If automated tools are used, the user is informed in the notice response about such use (Art. 16/6).

Under DSA Article 16/3, if a notice is clear and substantiated enough to allow a diligent hosting provider to identify illegality of the reported content without detailed legal analysis, it may create "actual knowledge/awareness" for the relevant content. Therefore, WIN aims to design the notice form to be as structured and explanatory as possible.


Section 4: Content moderation and "Statement of Reasons" — Art. 14, 15, 17 DSA

This section explains disclosure of moderation policies in terms (Art. 14), transparency reports (Art. 15), and "statement of reasons" processes for content/account restrictions (Art. 17).

4.1 Moderation sources (automation + human review)

For security and community integrity, WIN uses automated systems, user reports and human moderation processes together. In line with the Community Guidelines and Terms of Use:

  • Automated visual/text screening: Photos and profile texts are scanned through Google Cloud Vision (Safe Search) and MeiliSearch infrastructures; visuals caught by obscenity/nudity filters are rejected.
  • Selfie verification (liveness): Live selfie matching and pose verification with profile photo are performed with Google Gemini 2.5 Flash; inappropriate content screening on selfies is supported by Google Cloud Vision (Safe Search); fake account risk is reduced through these processes.
  • Report-based review: Following user notices, content is reviewed by the moderation team.
  • Human review: Complex cases, objections and high-priority safety reports (violence, CSAE, etc.) are manually assessed by expert moderators.
  • Pursuant to DSA Article 14, the policies/tools used in content moderation and the functioning of the internal complaint system are presented clearly, understandably and accessibly in the Terms of Use and relevant policies.
  • Under DSA Article 14/2, material changes to the terms may be notified to users.
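As a rough illustration of how likelihood-based visual screening can work, the sketch below applies a threshold over a SafeSearch-style likelihood scale. The scale values mirror the public SafeSearch likelihood levels, but the threshold and decision rule here are assumptions for illustration, not WIN's production configuration, and the code calls no real API.

```python
# Illustrative threshold check over a SafeSearch-style likelihood scale.
# Threshold and decision rule are assumptions, not WIN's production values.
LIKELIHOOD = ["UNKNOWN", "VERY_UNLIKELY", "UNLIKELY",
              "POSSIBLE", "LIKELY", "VERY_LIKELY"]

def reject_image(adult: str, racy: str, threshold: str = "LIKELY") -> bool:
    """Reject when either the adult or racy likelihood reaches the threshold."""
    limit = LIKELIHOOD.index(threshold)
    return any(LIKELIHOOD.index(v) >= limit for v in (adult, racy))
```

In practice such a rule would sit in front of human review: clear rejections are automatic, while borderline ("POSSIBLE") results can be routed to moderators.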

4.2 Restriction types (Art. 17/1)

Upon identifying illegal content or contractual violations, WIN may apply the following measures in accordance with the terms and proportionately:

  • Removal of content / blocking access / reducing visibility (demotion)
  • Restricting specific functions (e.g., photo upload, messaging)
  • Suspending premium subscription benefits (WIN currently uses subscription-only model; no virtual currency/gifts)
  • Temporary suspension or permanent closure of account

4.3 Statement of Reasons (Art. 17)

Under DSA Article 17, when WIN applies a content/service restriction, it provides the affected user with a clear and specific statement of reasons.

At minimum, the statement of reasons includes the following (Art. 17/3):

  • What happened? (removal/access block/demotion/suspension/closure, etc.), duration and geographical scope
  • Which facts and circumstances were relied on? (user notice or own-initiative review)
  • Was automation used? (automated detection/review information)
  • If illegality is alleged: Legal basis + why it is considered illegal
  • If rule violation exists: Contractual basis + why it is considered a violation
  • Remedies: Internal complaint/objection, out-of-court resolution, judicial remedy

Under DSA Article 20, users may apply via the internal complaint/objection system for at least 6 months. For quick dispute resolution, applying within 30 days is recommended where possible.

4.4 Risk to life/physical safety and suspicion of crime (Art. 18)

Under DSA Article 18, when WIN becomes aware of information giving rise to suspicion of a crime involving a threat to life or physical safety, it may notify law enforcement/judicial authorities of the relevant Member State(s) without undue delay.

For operational processes related to this heading, see Law Enforcement Guide.


Section 5: Internal complaint/objection system — Art. 20 DSA

This section explains the free, electronic and effective internal complaint/objection mechanism for decisions such as content removal/account restriction (Art. 20 DSA). Persons subject to sanctions may submit their objections by e-mail within 6 months from notification of the decision.

5.1 Which decisions can be objected to?

Under DSA Article 20, users (and notifying persons) can object to the following decisions for at least 6 months:

  • Decisions to remove content / block access / limit visibility
  • Partial/full suspension or termination of service
  • Account suspension or closure
  • Premium subscription restrictions (WIN currently uses subscription-only model)

5.2 How to object (practical flow)

Step 1: Access the notification

The restriction decision may be delivered to you via the in-app notification center and/or e-mail.

Step 2: Use the "Object" button

You are directed to the relevant form via the "Object" button in the decision notice. This function is provided in-app as part of DSA Art. 20 compliance.

Step 3: Write your reasoning and add evidence

Explain why the decision subject to objection is incorrect; add screenshot, link/ID, etc.

Step 4: Review and result

The objection is reviewed with human oversight and finalized within 30 days at the latest. Where necessary, the decision is withdrawn or corrected (Art. 20/4).

This 30-day period may exceptionally be extended due to operational intensity or extraordinary circumstances; in that case, the applicant is informed.

In addition to this internal complaint/objection mechanism under the DSA, where an offline or written and structured application is required, the Objection Form may also be used.

5.3 Human oversight (Art. 20/6)

WIN does not rely solely on automated systems when deciding objections; oversight by appropriately qualified personnel is provided (Art. 20/6).


Section 6: Out-of-court dispute resolution — Art. 21 DSA

This section explains access to certified out-of-court dispute settlement bodies in the EU for disputes that cannot be resolved through internal objection (Art. 21 DSA).

Under DSA Article 21, for disputes relating to decisions under Art. 20, the user may choose a certified out-of-court dispute settlement body in the EU. This process:

  • Does not eliminate the internal objection route (internal objection is recommended first),
  • Does not prevent court action,
  • Does not necessarily result in a decision that is binding on the parties.

The EU Commission publishes the list of certified out-of-court dispute settlement bodies: Out-of-court dispute settlement bodies under the DSA (opens in a new tab).

Users located in the EU may also complain to the Digital Services Coordinator of their Member State. Official Commission list page: Digital Services Coordinators (opens in a new tab).


Section 7: Trusted flagger — Art. 22 DSA

Under DSA Article 22, notices submitted through the Art. 16 mechanism by entities with trusted flagger status are handled as priority and processed without undue delay.

To verify trusted flagger status, WIN may request identity/institution and DSA coordinator information in the application and may process only through verified channels.


Section 8: Measures against misuse — Art. 23 DSA

Under DSA Article 23, WIN:

  • May suspend for a reasonable period, after at least one warning, the service of users who frequently provide manifestly illegal content. Suspension periods: 30 days after first violation, 90 days for repeated violations, or permanent closure; in severe violations, direct permanent closure may be applied without warning.
  • May restrict for 90 days after at least one warning the notice/complaint function of persons who frequently submit manifestly unfounded notices/complaints.
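The graduated suspension scheme described above (30 days after a first violation, 90 days for repeats, permanent closure thereafter or immediately in severe cases) can be expressed as a small decision function. The function name and shape are illustrative only.

```python
# Illustrative sketch of the graduated Art. 23 suspension scheme above;
# the function shape is an assumption, not WIN's actual enforcement code.
def suspension_for(prior_violations: int, severe: bool = False) -> str:
    """Return the suspension applied to a user with the given history."""
    if severe:
        return "permanent"   # severe violations: permanent closure, no warning
    if prior_violations == 0:
        return "30 days"     # first violation (after at least one warning)
    if prior_violations == 1:
        return "90 days"     # repeated violation
    return "permanent"       # persistent abuse
```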

This policy is regulated clearly and with examples in the Terms of Use and related policies.


Section 9: Transparency reports and metrics — Art. 15 and Art. 24 DSA

This section explains at what intervals and with what scope content moderation and DSA compliance metrics will be published.

9.1 Annual content moderation report (Art. 15)

Under DSA Article 15 (without prejudice to applicability and exceptions), WIN aims to publish reports at least once a year covering the following headings:

  • Authority orders (removal/information requests) and response times
  • Number of Art. 16 notices (by category), processing times, use of automation
  • Own-initiative moderation activities and restriction types
  • Internal complaint/objection statistics and outcomes
  • Purposes, error rates and safety measures of automated systems

WIN meets the DSA micro enterprise definition (fewer than 50 employees; annual turnover/balance sheet total not exceeding EUR 10 million), and this status is not expected to change in the foreseeable period. Under DSA Art. 15/2 and Art. 19, the company will benefit from the micro enterprise exception regarding certain transparency and online platform obligations. Nevertheless, with the goal of user safety and accountability, WIN will continue to implement a substantial part of these standards through voluntary compliance.

9.2 Additional reporting for online platforms (Art. 24)

As an online platform (Art. 24), WIN additionally reports items such as:

  • Number/outcome/duration of out-of-court disputes under Art. 21
  • Number of suspensions under Art. 23 (by type)

9.3 EU average monthly active recipients (Art. 24/2)

Under DSA Article 24/2, for each platform, the number of average monthly active recipients within the EU is published at least every six months as the average of the previous 6 months.

Since the Platform is not yet live in the EU market, this metric is currently reported as 0 (zero) / not applicable. Once services start being offered in the EU, it will be updated at least every six months as the average of the previous 6 months. Methodology: calculated according to DSA Art. 24/2 and the related delegated methodology; the published figure does not include personal data.
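The six-month averaging rule above can be sketched as follows; the monthly counts and the rounding choice are illustrative assumptions, not the published delegated methodology.

```python
# Illustrative sketch of the Art. 24/2 six-month average; rounding choice
# is an assumption, not the official delegated methodology.
def avg_monthly_active_recipients(monthly_counts: list[int]) -> int:
    """Average of the last six monthly active-recipient counts."""
    window = monthly_counts[-6:]
    return round(sum(window) / len(window)) if window else 0
```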

9.4 Transfer of statements of reasons to the EU database (Art. 24/5)

Under DSA Article 24/5, online platforms may be obliged to transmit certain decisions and statements of reasons under Art. 17 to the public database managed by the Commission. WIN will comply with this obligation when applicability is clarified; transmissions will observe the principle of not sharing personal data. Procedures will be updated as the Commission's technical specification and mandatory fields are clarified.

9.5 Place where reports will be published

Transparency reports are published on the Transparency Reports page.


Section 10: Interface design (dark pattern prohibition) — Art. 25 DSA

Under DSA Article 25, WIN aims not to design interfaces in a way that would deceive/manipulate users' ability to make free and informed decisions.

Applied practices:

  • Subscription cancellation: Cancellation can be done through the Apple App Store or Google Play store account where the purchase was made, with the same ease as subscription initiation (Regulation on Subscription Agreements Art. 22-25). In-app store redirection is provided.
  • User preferences: Permissions, notifications and cookie/tracking preferences are not repeatedly requested through pressure; they can be managed in settings.

For related consumer/subscription processes, see Subscription and Purchase Terms (especially Section 4) and Preliminary Information Form.


Section 11: Advertising transparency and sensitive data — Art. 26 DSA + GDPR/KVKK

This section applies where WIN displays ads in the app interface or where a need arises to mark user content as "commercial communication" (Art. 26 DSA).

WIN uses the Google AdMob ad network within the application. Within DSA Article 26 requirements:

  • Ad labeling: All ads are marked clearly in a way distinguishable from organic content.
  • Targeting information: "Why this ad" / targeting parameter information provided by the ad network is presented to the user as far as possible; DSA-compliant interfaces are used.
  • Ad network change: WIN reserves the right to change ad model, provider, or integration; DSA compliance is observed in changes.

Under DSA Article 26/2, content that users provide via the Platform (messages, profiles) is not automatically monitored; however, if content is identified as "commercial communication" following a notice/complaint, restrictions may be applied.

⚠️

WIN does not use users' sensitive data such as religious belief, political opinion, sexual orientation, ethnic origin, health information, etc. for ad targeting/profiling. Such data is processed only for clearly specified purposes (e.g., matching/filter preferences); no profiling based on special categories under GDPR Art. 9 is performed in ad targeting.


Section 12: Recommendation system (matching) transparency — Art. 27 DSA

Under DSA Article 27, WIN explains the main parameters in the recommendation/matching system and the user's options to influence/change them within Section 8 of the Terms of Use.

Main parameters:

  • Location/distance: Geographic proximity is the most important parameter determining matching likelihood; location is obtained in real time when permission is granted, and location history is not stored.
  • Preferences and filters: Age range, distance, basic matching preferences (gender, etc.).
  • Interaction history: Like/reject behaviors, messaging interactions.
  • Safety signals: Blocking/complaint relations, verification statuses.
  • Profile quality/consistency: Missing fields, repetitive behavior patterns.
  • Cooldown: Profiles that appeared in your deck but were not selected are not shown again for a certain period.
  • Premium effects: Seeing 20 decks per day (Standard: 5), x2 visibility in decks, deck filtering, profile hiding, read receipts in messages, seeing online/offline status, score display.
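Purely for illustration, parameters like those listed above could feed a weighted score along the following lines. The weights, the distance decay and the function itself are invented for this sketch and do not describe WIN's actual matching algorithm.

```python
# Purely illustrative; weights and distance decay are invented and do not
# describe WIN's actual recommendation/matching algorithm.
def match_score(distance_km: float, preference_fit: float,
                interaction_affinity: float, safety_penalty: float) -> float:
    """Toy weighted score; inputs except distance_km expected in [0, 1]."""
    proximity = 1.0 / (1.0 + distance_km / 10.0)  # closer profiles rank higher
    score = 0.5 * proximity + 0.3 * preference_fit + 0.2 * interaction_affinity
    return max(0.0, score - safety_penalty)       # safety signals only demote
```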

User control: Preferences and filters can be changed in Settings; location permission is managed in device settings. Recommendation parameters are presented in in-app "Settings" and relevant screens.

The recommendation system currently operates at a basic level due to low user volume. As the number of users increases, the algorithm may be improved and ranking/recommendation options may be added. In such changes, this page and the Terms of Use will be updated.


Section 13: 18+ age policy and protection of minors — Art. 28 DSA

WIN is not intended for users under 18. The DSA Article 28 approach regarding protection of minors is taken into account in terms of product safety.

Registration and age verification (aligned with Section 3 of the Terms of Use):

  • Google Sign-In (OAuth): During registration, the name, e-mail and profile photo from the Google account are shared; if birth date data from the store/account source exists, it is used as the basis.
  • Phone login (SMS/OTP): Phone-based identity verification may be used; e-mail and phone verification are completed during profile creation process.
  • User declaration: When date of birth data cannot be obtained, a screen where the user selects age is presented and confirmation of the checkbox "I declare that I am over 18 years old" is requested.
  • Additional control: Upon suspicion or notice of being under 18, accounts understood to belong to minors are immediately suspended or closed. If age verification cannot be completed, the account is not included in the discovery flow.
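The 18+ gate described above (birth date used where available, the explicit user declaration as fallback) can be sketched as a minimal check; the function is illustrative, not WIN's actual registration code.

```python
# Minimal illustrative sketch of the 18+ gate; not WIN's actual code.
from datetime import date
from typing import Optional

def is_adult(birth_date: Optional[date], declared_over_18: bool,
             today: date) -> bool:
    """Birth date takes precedence; declaration is only a fallback."""
    if birth_date is None:
        return declared_over_18  # "I declare that I am over 18" checkbox
    years = today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day))
    return years >= 18
```

Note that when a birth date is available, the declaration is ignored, so a minor cannot pass the gate simply by ticking the checkbox.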

In cases where WIN knows with reasonable certainty that the user is a minor, it does not present profiling-based ads using minors' personal data (Art. 28/2). This obligation does not create a requirement for "additional personal data processing" to determine whether someone is a minor (Art. 28/3).


Section 14: Updates and version management

WIN may update this page in line with legal requirements, product and security needs. The current text is published in the in-app "Legal" section and at whoisnextapp.com (opens in a new tab).

Mechanisms explained on this page are operated as a consistent compliance set together with Terms of Use, Community Guidelines and Transparency Reports.