Lawyers
Markus von Fuchs advises on intellectual property law, in particular competition, patent, and trademark law as well as the protection of know-how. He advises companies on protecting and commercially exploiting intellectual property, for example through licensing, sales, R&D, and cooperation agreements. He also focuses on the judicial and extrajudicial defense of intellectual property rights in interim injunction and main proceedings. He further advises on border seizure procedures, initiates and advises on criminal measures relating to product and brand piracy, and advises on the infringement of business and trade secrets. Markus von Fuchs also advises many companies on developing and introducing new technologies and business models. He has particular expertise in the optical and medical technology sectors.
Dr. Oliver Hornung advises national and international IT service providers and users in the legal structuring and negotiation of IT, project, and outsourcing contracts, as well as in matters of copyright and licensing. He is also regularly involved in distressed projects (dispute management) and advises clients in conciliation and arbitration proceedings and, where necessary, in litigation.
The regulatory environment for the use of data and corresponding technologies is complex, and new legal acts are constantly being added by the European Commission. In this dynamic environment, Dr. Oliver Hornung advises his clients on all related legal issues, with a particular focus on AI compliance, the Data Act, NIS-2, cyber security, cloud computing and data law.
Another focus of his legal advice is data protection with a focus on digital health and the EU's Digital Decade. If necessary, Dr. Oliver Hornung and his team defend the rights of his clients before supervisory authorities or in court.
Finally, Dr. Oliver Hornung advises start-ups on all questions relating to IT law and data protection law. In addition to his extensive practical work, Dr. Oliver Hornung is also a frequently requested lecturer in IT law and data protection law.
Norbert Klingner specializes in national and international movie/TV and advertising film production, financing, insurance, and distribution. He represents well-known producers, distributors, global distributors, and movie financing entities. His expertise ranges from negotiating and drafting contracts at the outset of material development, through all matters of production and financing, to the strategically sound exploitation and licensing. A selection of the film productions in which Mr. Klingner was involved can be found on the Internet Movie Database IMDb.
Margret Knitter advises her clients on all matters of intellectual property and competition law. This includes not only strategic advice, but also legal disputes. Her practice focuses on the development and defense of trademark and design portfolios, border seizure proceedings and advice on developing marketing campaigns. She advises on labelling obligations, packaging design, marketing strategies and regulatory questions, in particular for cosmetics, detergents, toys, foodstuffs and cannabis. She represents her clients vis-à-vis authorities, courts and the public prosecutor's office.
In the field of media and entertainment, she mainly advises on questions of advertising law, in particular product placement, branded entertainment and influencer marketing. She is a member of the board of the Branded Content Marketing Association (BCMA) for the DACH region and member of the INTA Non-Traditional Marks Committee.
Dr. Matthias Nordmann advises international groups, mid-cap companies, investors and entrepreneurs on company, commercial and corporate law, in particular on structuring and mergers & acquisitions. He has a special focus on transactions in IP/IT-driven industries as well as real estate.
Dr. Andreas Peschel-Mehner has provided legal counsel to all forms of digital business since the inception of the world wide web. His advisory work spans start-ups, multi-channel offerings and international internet companies and covers all applicable legal fields, with a particular emphasis on data protection and data usage, terms and conditions, consumer protection, compliance, advertising, gaming and competition law, among others. He also commands broad expertise in media and entertainment law, in particular issues relating to the film and television industry, media production finance and its global exploitation; advising digital media clients on changing utilization models, revenue streams and video-on-demand platforms forms a significant part of his counsel.
An excerpt of the projects Dr. Andreas Peschel-Mehner has accompanied can be found on the Internet Movie Database IMDb. His advisory expertise is augmented by decades of involvement with and counsel of national and international computer game publishers and studios. Finally, the development and use of AI technologies across all his areas of expertise have become a strategic element of his practice.
News
SKW Schwarz again recognized in the JUVE ranking “Succession, Wealth, Foundations”
The latest edition of the JUVE Handbook of Commercial Law Firms once again confirms the strong position of SKW Schwarz in the field of Succession, Wealth, Foundations. The editors particularly highlight the breadth and depth of expertise within our Private Clients practice.
Key strengths noted in the ranking include:
- our specialized expertise in family law, particularly in divorce matters, asset division and marital agreements,
- our comprehensive advisory capabilities regarding succession and wealth structuring, covering foundation law, inheritance law, corporate law and tax law,
- our extensive experience in business succession planning, gift and exit taxation, as well as disputes over compulsory portions,
- our capabilities in establishing and advising family offices, foundations established during the founder's lifetime and complex wealth structures,
- our strong notarial and transaction-related know-how in finance and real estate matters.
This renewed recognition underscores JUVE’s assessment of the continued development of our practice and the exceptional commitment of our Private Clients team. We are grateful for the trust our clients place in us, which makes achievements like this possible.
The full JUVE entry is available here.
Digital Omnibus – part of the European Commission's new digital package
The EU Commission recently adopted a new digital package. The digital package is intended to help companies in the EU – from start-ups to industrial enterprises – reduce compliance and administrative burdens so they can focus more on innovation and growth. At the heart of the package is the Omnibus Regulation (‘Digital Omnibus’), which is primarily intended to simplify rules for artificial intelligence, cybersecurity and data. Below, we provide an overview of the most relevant rules and changes.
After numerous EU digital regulations have gradually come into force in recent years as part of the Digital Decade and are already being applied in some cases (an overview of the status of the laws can be found on our Digital Decade landing page), the EU now wants to move into a phase of consolidation and simplification – primarily in response to pressure from industry, which is facing increasingly significant compliance costs and, in some cases, overlapping obligations. The package is intended to address precisely this issue by better harmonising existing regulations, reducing duplicate requirements and making application and implementation more practical for businesses.
The Digital Omnibus, which primarily aims to consolidate regulations on artificial intelligence, cybersecurity and data, is complemented by the Data Union Strategy, which aims to facilitate access to high-quality data for AI, and by the European Business Wallets, which provide companies with a single digital identity.
Below, we would like to provide an initial overview of the changes in the Digital Omnibus:
What are the key changes to EU data law?
With the Digital Omnibus, the EU is pursuing a consolidated further development of data law. The aim is to simplify regulations, reduce administrative burdens and create a clearer framework for data-driven innovation. The focus is on amendments to the Data Act and selective changes to the GDPR – all data rules are to be consolidated in these two main pieces of legislation.
The following adjustments are planned for the Data Act:
- Consolidation of previous legal acts: Several previously coexisting legal acts, including the Open Data Directive, the Free Flow of Non-Personal Data Regulation and the Data Governance Act, are to be integrated into the Data Act in order to create a uniform set of rules for non-personal data.
- Data intermediation services: Mandatory registration and the EU label for data intermediaries are to be abolished, significantly streamlining the regulatory framework as a whole. New intermediation models should be able to be offered more quickly and with less red tape as barriers to market entry are lowered.
- Data altruism: The legal framework for public interest data sharing will be simplified to reduce the complexity of existing structures and requirements. Organisations should be able to make data available more easily for research, health or sustainability purposes without having to comply with extensive administrative processes.
- Public sector data sets: Existing requirements for public data sets are to be consolidated and harmonised to eliminate existing fragmentation. It should be easier for companies to understand which public data can be used under which conditions in order to strengthen innovation in the internal market.
- Business-to-government access (B2G): Access by government agencies to company data should be clearly limited to genuine emergencies and crises such as natural disasters or pandemics. Outside of such situations, companies should not be subject to additional or unclear disclosure requirements.
- Relief through bureaucracy reduction and harmonisation: The regulatory framework in the areas of data, data protection, cybersecurity and AI should be streamlined and harmonised by centralising reporting systems and reducing information requirements.
The following changes are planned for the scope of cloud switching obligations:
The Omnibus proposal specifically realigns the scope of the Data Act's cloud switching rules. The basic principle of easier switchability between cloud, edge and data processing services remains in place, but is made more precise and placed on a more proportionate and risk-based basis. The most important adjustments are summarised below:
- Restriction of the scope of application for SMEs and micro-enterprises: The switching obligations shall only apply if they are technically feasible and economically reasonable for these providers. This is intended to relieve smaller market participants of disproportionate regulatory requirements.
- Exemption for customer-specific data processing services: Individually developed data processing solutions that are provided exclusively for a single customer will no longer be subject to the full cloud switching obligations. The reason for this is that interoperability and standardised data portability are often neither technically feasible nor practicable in such tailor-made architectures.
- Emphasis on the technical and economic feasibility of switching: The omnibus clarifies that the requirements should only apply if they can be met at reasonable cost. This clarification reduces existing legal uncertainties and prevents providers from finding themselves in situations where compliance would be virtually impossible or disproportionately expensive.
- Strengthening existing industry standards: The reform makes it clear that service providers do not have to develop new proprietary interfaces. The use of industry-standard data formats and protocols should be sufficient to meet the requirements. This reduces development effort and integration costs, especially for smaller providers.
- More user-friendly switching framework: The EU remains committed to reducing switching costs and giving users real opportunities to switch providers. At the same time, the reform aims to ensure that specialised or smaller providers are not squeezed out by excessive compliance burdens.
Overall, the aim is to create a more differentiated, proportionate and risk-based switching framework. The cloud switching regime of the Data Act will remain functional, but will focus more clearly on standardisable services and on providers for whom the implementation of the obligations is realistic and economically viable.
The following adjustments are planned for the GDPR and the rules on cookies:
- New approach to cookie banners and consent management: Until now, regulation has been based on a two-tier structure: access to end devices fell under the ePrivacy Directive, while the subsequent processing of personal data was subject to the GDPR. The Commission's new proposal ends this dual system. In future, cookies and similar tracking technologies will be fully integrated into the GDPR, resulting in a harmonised legal framework with common principles, enforcement mechanisms and sanctions. The Commission recognises an existing problem: consent management often works poorly in practice. Users are confronted with complex pop-ups, and many reflexively click ‘Accept all’ to continue browsing. This hardly represents the informed consent originally intended by the legislator. The aim of the reform is therefore to make consent a functional and credible legal basis again. Among other things, the proposal stipulates that cookie banners must offer a genuine ‘one-click option’ to reject all non-essential cookies – visible, equivalent and as easily accessible as the ‘Accept all’ option. A rejection must be valid for at least six months.
- Central system for data protection preferences: The planned rules on technical preference signals are even more far-reaching: users should be able to set data protection decisions once (e.g. in their browser or operating system). Websites and apps must automatically respect these machine-readable signals in future. Companies must therefore design their consent mechanisms in such a way that these signals can be processed (a simplified code sketch of such a check follows after this list).
- Differentiation between high-risk tracking and low-risk uses: The proposal introduces a ‘whitelist’ of certain privacy-friendly types of use, for example for statistical analyses or aggregated audience measurements. If the specified conditions are met, companies may process device data for narrowly defined purposes without consent and without cookie banners. For companies that primarily perform performance analyses or service optimisations, this means fewer banners, less compliance overhead and a more user-friendly experience.
- Stricter enforcement, but more legal certainty: Through integration into the GDPR, violations of rules on end device access will in future be subject to the existing framework of sanctions. At the same time, the reform aims to increase legal certainty by reducing fragmentation and clarifying protection standards.
- Clarifications regarding the definition of personal data: The proposal implements current ECJ case law. Data is not considered personal to a recipient if the recipient has no realistic possibility of re-identification. However, the original controller who pseudonymised the data retains all obligations under the GDPR.
- Technical guidelines via implementing acts: The Commission is given the power to lay down technical criteria and methods for pseudonymisation and the assessment of re-identification risks. This is intended to provide companies with clearer assessment criteria and practical guidance in future.
- Changes to the GDPR: The ‘Digital Omnibus’ does not change the basic structure of the GDPR, but addresses specifically identified problem areas:
- Innovation and AI: The proposal clarifies that the development and operation of AI systems and models can be based on the legal basis of ‘legitimate interest’ as long as the processing meets all the requirements of the GDPR and is not prohibited by other EU or national regulations or subject to consent. If special categories of personal data appear only residually in training or test data sets and are not the subject of the collection, a narrow exception to the usual processing prohibition is introduced. Controllers must implement appropriate safeguards throughout the AI lifecycle, remove such data as soon as it is identified, and ensure that it is not used to derive results or made available to third parties. Data subjects retain an unrestricted right to object to the processing of their personal data for these AI purposes.
- Simplification of everyday compliance obligations: Information obligations do not apply if there are legitimate reasons to believe that data subjects already have the information and the processing does not pose a high risk. This benefits smaller companies with limited data usage.
In addition, the right of access is protected against misuse: controllers can respond to manifestly unfounded requests with a refusal or a reasonable fee, with a lower burden of proof than today for showing that a request is excessive.
- Data protection impact assessments are harmonised through EU-wide uniform lists, both for types of processing that always require a DPIA and for those that do not. This is supplemented by a uniform methodology and template.
- Notifications of data breaches to supervisory authorities will in future be aligned with the ‘high risk’ threshold – the same threshold at which notification of the data subjects is already required. Notification will be made centrally via a single point of contact linked to other digital and cybersecurity-related regulations. For companies, this means fewer reports with little benefit, a more predictable risk assessment and more efficient communication with supervisory authorities.
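To illustrate how the consent rules sketched above could be operationalised in practice, the following is a minimal, purely illustrative TypeScript sketch. It assumes a GPC-style browser signal exposed as navigator.globalPrivacyControl and a locally stored banner decision with a six-month validity; the Digital Omnibus proposal itself does not prescribe any specific API, signal or storage mechanism, so all names below are assumptions.

```typescript
// Illustrative only – the storage key and the GPC-style signal are assumptions,
// not requirements of the Digital Omnibus proposal.

type ConsentDecision = "granted" | "rejected";

interface StoredDecision {
  decision: ConsentDecision;
  timestamp: number; // when the user made the choice
}

const STORAGE_KEY = "consent-decision";
const SIX_MONTHS_MS = 6 * 30 * 24 * 60 * 60 * 1000; // rough six-month validity period

function readStoredDecision(): StoredDecision | null {
  const raw = localStorage.getItem(STORAGE_KEY);
  return raw ? (JSON.parse(raw) as StoredDecision) : null;
}

export function storeDecision(decision: ConsentDecision): void {
  // Called from the banner buttons; "Reject all" must be as prominent as "Accept all".
  localStorage.setItem(STORAGE_KEY, JSON.stringify({ decision, timestamp: Date.now() }));
}

export function mayRunNonEssentialTracking(): boolean {
  // 1. A machine-readable opt-out signal set once in the browser is respected automatically.
  const gpc = (navigator as Navigator & { globalPrivacyControl?: boolean }).globalPrivacyControl;
  if (gpc) {
    return false;
  }
  // 2. A stored decision is honoured; a rejection remains valid for at least six months.
  const stored = readStoredDecision();
  if (stored && Date.now() - stored.timestamp < SIX_MONTHS_MS) {
    return stored.decision === "granted";
  }
  // 3. No valid decision yet: show the banner and run no non-essential tracking by default.
  return false;
}
```

Under the proposal, whitelisted low-risk uses such as aggregated audience measurement would not need to pass through such a consent check at all.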
What does this mean for companies? For many businesses, the immediate headline will be the prospect of fewer and simpler cookie banners. But the real change runs deeper: all device‑based data access is drawn into a single GDPR‑based regime, augmented with central preference signals, a privacy‑friendly whitelist for low‑risk uses, and tougher expectations around consent design. At the same time, long‑running ambiguities around pseudonymized data, AI training, access requests, information duties, DPIAs and breach notifications are addressed through targeted legislative clarifications and mechanisms for future technical guidance. In practical terms, businesses that invest early in mapping their cookie and tracking practices to the new whitelist, in re‑engineering consent flows around “one‑click” choice and central signals, and in aligning AI and analytics projects with the clarified legitimate‑interest and pseudonymization framework will be best placed to benefit from the promised simplification – and to avoid becoming the test cases for the strengthened enforcement system that comes with it.
What changes will the Digital Omnibus bring to cyber security law?
Simplified reporting of cyber security incidents: Under current law, companies must comply with various legal reporting obligations under different legal acts in the event of a cyber security incident (e.g. Art. 32 GDPR, Art. 23 NIS-2 Directive, Art. 14 CRA and many other sector-specific reporting obligations such as Art. 73 AI Act for high-risk AI systems, Art. 19 DORA in the financial sector, etc.). Each of these reporting obligations is subject to different content requirements and different reporting deadlines and is addressed to different authorities. The EU Commission's proposal aims to simplify reporting under cybersecurity law and consolidate it in a single point of contact at the European Agency for Cybersecurity (ENISA). A central reporting portal is to be set up at ENISA, where affected companies can submit their mandatory reports on cybersecurity incidents in a collective manner. These reports will then be processed centrally by ENISA and forwarded to the relevant authorities. The exchange of reported information between authorities is also to be facilitated. The reporting platform for vulnerabilities established under Article 16 CRA is to be used to implement the changes. The EU Commission expects that this will reduce the annual costs associated with reporting cybersecurity incidents by up to 50%.
The Commission's draft specifically addresses, inter alia, existing reporting obligations under NIS-2, the GDPR, the eIDAS Regulation and DORA. The substantive requirements for the individual reporting obligations and the respective competent supervisory authorities remain largely unaffected by the proposed amendments. The proposal does, however, contain some substantive adjustments: for example, the deadline for reporting data protection incidents under Art. 33 GDPR is to be extended from 72 to 96 hours and will in future apply only to breaches posing a high risk to data subjects.
What are the main changes in the area of artificial intelligence and the AI Act?
The AI Act came into force in August 2024 and is being implemented in stages: some provisions, such as certain prohibitions, the AI literacy requirements and rules for general-purpose AI models, are already in force. The remaining provisions are to become binding from 2 August 2026. The European Commission identified several challenges during the 2025 stakeholder consultations and is now proposing the following adjustments:
- New timetable for high-risk AI systems: The application of the rules will be linked to the availability of standards and support tools. Once the Commission has confirmed that these are sufficiently available, the rules will enter into force after a transition period.
- Annex III AI systems: 6 months after the Commission's decision or by 2 December 2027 at the latest.
- Annex I systems: 12 months after the Commission's decision or by 2 August 2028 at the latest.
- AI literacy: The obligation for companies to ensure an adequate level of AI literacy is removed. Instead, the Commission and Member States are to encourage providers and users to ensure sufficient AI literacy.
- Processing of special categories of personal data: Providers and users of AI systems may process special categories of personal data for bias detection and correction, provided that appropriate safeguards are in place.
- Registration of high-risk AI systems: Systems used in high-risk areas for tasks that are not themselves considered high-risk no longer need to be registered.
- Expansion of the use of AI regulatory sandboxes and real-world testing: From 2028, an EU-wide regulatory sandbox is to be established, among other things.
- Abolition of the requirement for a harmonised post-market monitoring plan.
- Extension of simplified compliance rules to small and medium-sized enterprises (SMEs): For example, simplified rules for the technical documentation required for AI systems are to apply to SMEs.
- Centralisation of supervision of AI systems based on general-purpose models: Supervision will be bundled at the AI Office to reduce governance fragmentation. AI in very large online platforms and search engines will also be supervised at EU level.
- Clarification of interaction with other EU legislation: Procedures will be simplified to ensure the timely availability of conformity assessment bodies.
Unpleasant surprises regarding presumed rights of representation between spouses/registered civil partners
In consulting practice, one often finds the assumption that spouses/registered civil partners are allowed to make comprehensive decisions for each other if the other becomes incapacitated due to illness or accident.
This is not the case, which can lead to unpleasant surprises.
On 1 January 2023, the reform of guardianship law (Betreuungsrecht) introduced the so-called emergency representation right for spouses (Notvertretungsrecht, Section 1358 of the German Civil Code (BGB)) for the first time. Previously, statutory power of representation existed only for transactions covering basic living expenses (Section 1357 (1) BGB). Put simply, this made it possible to represent the other spouse ‘within the scope of the weekly shopping’. There was no authorisation beyond that.
Even with the introduction of the emergency representation right for spouses (Notvertretungsrecht), this has not changed significantly: representation is now possible, but only within the scope of health care. In detail:
‘If one spouse is legally unable to manage their health care affairs due to unconsciousness or illness,’ the other spouse (Section 1358 (1) BGB) may, for a maximum period of six months, essentially:
- consent to or refuse examinations, medical treatment and surgical procedures (with restrictions under Section 1358(6) in conjunction with Section 1829 of the German Civil Code (BGB))
- conclude and enforce treatment/hospital contracts or contracts for urgent rehabilitation measures,
- decide on measures that may deprive the patient of their physical liberty (such as bed rails) to a limited extent, and
- assert claims against third parties (e.g. social security institutions) on the basis of the illness.
During this period, the treating physicians are also released from their duty of medical confidentiality vis-à-vis the representing spouse.
The spouse/partner must exercise the right of representation in accordance with the wishes or presumed will of the patient.
However, this right of emergency representation does not apply if the partners are separated, if a power of attorney has been granted that covers the aforementioned matters, or if guardianship (Betreuung) has been established for this area of responsibility. Representation by one's spouse may also be objected to in advance, and such an objection may be entered in the Zentrales Vorsorgeregister (Central Register of Lasting Powers of Attorney). In some cases, the approval of the Betreuungsgericht (guardianship court) is still required.
In order to exercise the right of emergency representation, a doctor must also confirm in writing that the requirements are met (cf. Section 1358 (4) BGB).
These explanations make it clear that representation in other matters, especially financial matters, is not covered.
Extensive representation rights for spouses can only be achieved by expressly granting them power of attorney before an emergency arises. In addition to bank power of attorney, it is generally advisable to grant power of attorney (Vorsorgevollmacht) in order to ensure that your spouse/partner is able to act on your behalf. Further important information can be found here.
In the power of attorney (Vorsorgevollmacht), ideally in combination with a living will (Patientenverfügung), it is possible to regulate comprehensively and in detail which rights the partner (or even a third party) should have in the event of incapacity and which treatment (or non-treatment) is desired.
We would be happy to advise you comprehensively on the issue of emergency representation rights as well as on questions regarding powers of attorney and living wills.
Selection process with regional exceptions in Saxony – Focus on quality and resilience
The latest award decision by the South West Saxony Rescue Services Association (RettZV SWS) for nine rescue stations in the Vogtlandkreis and Zwickau districts (2026–2033) marks a milestone: for the first time in years, a tender procedure under the GWB (German Act against Restraints of Competition) was dispensed with in favour of a traditional selection procedure. In consultation with the health insurance funds, an administrative selection procedure was used. The basis for this is the amendment to the Saxon Rescue Service Act, which has made it possible to apply the area exemption for hazard prevention.
Instead of rigid price offers, the selection process focused on quality and resilience (49%) and economic efficiency (51%). Among other things, concepts for disaster control, mass casualties, school medical services, youth work and hygiene were evaluated. Cost-effectiveness was assessed on the basis of personnel security and special services such as mountain rescue and rescue dog teams – but without binding calculations.
This development confirms SKW Schwarz's position: Long-term price calculations in the rescue service are not sustainable in view of the shortage of skilled workers, wage increases, rising inflation and changing requirements. Instead, procurement procedures must focus on the performance, quality and resilience of civil protection. Negotiations between the responsible bodies and health insurance funds, supported by the service providers, ensure that no money is wasted.
The new procedure enables flexible, annual planning and permanent controlling instead of long-term risk premiums. There are incentives for quality and voluntary work in civil protection. Special termination rights and audits by the rescue services association provide additional security.
Our recommendation: Financing and procurement in the emergency medical services should be based on regional structures and actual performance – not on costs over many years. This is the only way to ensure that rescue services remain crisis-proof, quality-oriented, economical and future-proof.
How the upcoming reform of emergency care at the federal level will affect rescue services and civil protection remains a big open question, especially with regard to refinancing.
We will also be discussing this at the DVNWforum on 25 February 2026 in Berlin (more information).
SKW Schwarz – your partner for sustainable and resilient solutions in emergency medical services and civil protection.
German Bundestag passes NIS 2 Implementation Act – companies should become NIS 2 compliant now
The German NIS 2 Implementation Act was finally passed by the Bundestag yesterday, 13 November 2025 (see here). The next step is for the Bundesrat to deal with the law. Promulgation in the Federal Law Gazette and the entry into force of the law are expected by the beginning of 2026 at the latest (!).
Companies that have so far relied on the slow pace of the German legislative process (a delay which, incidentally, is contrary to European law) should therefore address NIS 2 now at the latest and begin implementation.
With our team, which has supported numerous NIS 2 implementation projects in recent months, we are available to assist companies with all legal issues relating to IT security. We also have a network of technical service providers with whom we can provide interdisciplinary support and implementation for your NIS 2 project as required. We would also like to refer you to our comprehensive white paper, which is available here.
Three points we would like to point out to companies once again:
- Companies must check for themselves whether they are affected by the law; contrary to what has been widely reported, the BSI will not publish a list of affected companies. The specific applicability check is a legal matter and is essential for management to make the right decisions and avoid liability. Our NIS 2 impact tool, which is available here, can serve as an initial assessment.
- There are no transition periods. Once the law comes into force, all obligations apply, in particular those relating to the implementation of cyber risk management measures, registration with the BSI and the reporting of relevant security incidents to authorities and other bodies.
- Even if your company is not directly subject to the NIS 2 rules, there is a significant probability that one or more of your customers are subject to them and therefore must require their suppliers to document that they are fulfilling their part of cyber resilience. Very few companies are prepared for such requests, even though they are highly relevant for competitiveness.
For further information, please feel free to contact our team at any time.
Artificial Intelligence in Transition: How Germany Compares on the Global Stage
How are different countries addressing the opportunities and challenges of Artificial Intelligence? The new Lexology Guide “Panoramic Next: Artificial Intelligence” provides a global overview and highlights how legal and regulatory frameworks are evolving around the world.
Our experts Moritz Mehner and Dr Christoph Krück contributed the Germany chapter, analysing how the EU AI Act, national initiatives such as the proposed AI Market Surveillance and Innovation Promotion Act (KIMÜG), and questions of data protection and ethics are shaping the regulatory landscape and creating new opportunities for innovation and business.
The guide offers an international comparison covering jurisdictions including the United States, China, Japan, India, France, Belgium and Switzerland. It shows how approaches to Artificial Intelligence differ across regions and where Germany stands in the global context.
With this contribution, SKW Schwarz reinforces its leading position at the intersection of law, technology and innovation. We support companies in implementing Artificial Intelligence in a responsible, compliant and forward-looking way in line with European and international standards.
The full guide is available on the Lexology website.
Munich I Regional Court: No Damages under the GDPR in Case of Contradictory User Behaviour
On 27 August 2025, the Munich I Regional Court issued an interesting decision on a claim for damages under Art. 82 GDPR (Ref. 33 O 635/25; see here).
The court dismissed the claim brought by a user of a US social media platform, among other reasons, because the plaintiff had acted inconsistently (Sec. 242 of the German Civil Code).
The plaintiff, who had used the platform from within the EU, argued that his personal data had been unlawfully transferred to the US (see para. 43 et seq.).
In the court’s view, a person who knowingly uses a provider’s communication service despite being aware of an alleged legal violation, and then claims damages from that same provider precisely for offering the service, is acting in bad faith.
Background
Pursuant to Art. 82(1) GDPR, any person who has suffered material or non-material damage as a result of an infringement of the GDPR shall receive compensation from the controller or processor for the damage suffered. An infringement of the GDPR and, in consequence, a right to compensation may arise if the data processing is unlawful (in this case: if the transfer of personal data from the EU to the US does not comply with the requirements for international data transfers under Art. 44 et seq. GDPR).
In the judgment of 16 July 2020, Schrems II (C‑311/18), the ECJ declared the EU-US Privacy Shield invalid. Until the EU-US Data Privacy Framework came into force on 11 July 2023, data could therefore no longer be transferred to the US on the basis of Art. 45(1) GDPR.
In the plaintiff's opinion, data transfers from a European subsidiary to the US parent company during this period (2020 to 2023) were therefore unlawful; due to the US authorities’ ability to access the transferred data, the plaintiff suffered a significant loss of control and thus damage within the meaning of Art. 82 GDPR.
Key Findings
1. Lawfulness of Data Transfers to Third Countries under Standard Contractual Clauses
A data transfer to a third country may still be lawful without an adequacy decision within the meaning of Art. 45 GDPR if standard contractual clauses have been concluded between the controller or processor and the recipient, and if effective legal remedies are available (Art. 46(1), (2)(c) GDPR).
2. No Right to Data Processing Only in Europe
Social networks that are ‘globally designed’ (see para. 41) technically require the international exchange of personal data. Users of such platforms are well aware of this fact. There is also no claim against the provider of such a network to operate the service as a ‘purely European platform’:
‘The business decision [...] to offer a global network [...] and to process data in the United States must be accepted by users who voluntarily choose to use it.’
3. No Compensation for Contradictory User Behaviour
In the court’s view, anyone who consciously uses a globally operating US social media platform cannot claim compensation on the abovementioned grounds, as it is common knowledge that data is transferred to the US and that US intelligence services may be able to access this data under certain circumstances. Such conduct violates the principle of good faith.
Outlook
The decision of the Munich I Regional Court is to be welcomed.
With this ruling, the court has taken a clear stance against the wave of ‘largely template-based’ mass claims for damages under Art. 82 GDPR, in which actual impairment is often doubtful and users/plaintiffs act inconsistently – for example, by continuing to use a comparable service from the same provider while claiming serious impairment.
Publication in GRUR-Prax: Preferential Treatment of Partner Pharmacies on Cannabis Telemedicine Platforms Violates Patients’ Right to Free Pharmacy Choice
In the latest issue of GRUR-Prax (19/2025), Margret Knitter, attorney-at-law and partner at SKW Schwarz, analyses a recent decision of the Higher Regional Court (OLG) Frankfurt a.M., Germany, dated August 14, 2025 - 6 W 108/25, concerning the design of telemedicine platforms for the distribution of medical cannabis.
The court examined whether certain platform structures and ordering processes may unduly influence patients’ freedom to choose their pharmacy, particularly when cooperation pharmacies of the platform operator are favored.
In her commentary, Margret Knitter discusses the legal boundaries of commercial cooperation between pharmacies and digital health platforms under German law, and explains the implications for transparent and non-discriminatory ordering processes in compliance with the German Pharmacy Act (Apothekengesetz). The article offers practical insights for companies active in telemedicine, e-prescriptions, and digital health, illustrating key compliance considerations for platform operators in Germany.
GRUR-Prax is a professional journal published by the German Association for the Protection of Intellectual Property (Deutsche Vereinigung für gewerblichen Rechtsschutz und Urheberrecht – GRUR), focusing on the practical application of intellectual property and competition law.
Read the full article (available in German only):
Margret Knitter, “Privilegierung von Kooperationsapotheken auf Cannabis-Telemedizin-Plattform verletzt freies Apothekenwahlrecht,” GRUR-Prax 2025, Issue 19, pp. 682–683.
View on beck-online
Data Act in Force: New Whitepaper by ITK Engineering & SKW Schwarz
ITK Engineering and SKW Schwarz have jointly published the second whitepaper on the Data Act. Since September 12, 2025, the new EU regulations have been in effect—bringing extensive obligations, but also new opportunities for manufacturers of connected products and companies leveraging data-driven business models. The whitepaper demonstrates how organizations can meet regulatory requirements and turn them into innovation and competitive advantage. It focuses on practical action paths, compliance tips, and proven governance frameworks.
Download the new whitepaper now for free – available via the ITK website!
For those who want deeper insights, the first whitepaper (“Data Act – The EU Revolutionizes the Data Market”) provides the key foundations from the outset of the regulation. Together, both publications offer valuable guidance for a sustainable data strategy. (Free download via the ITK website.)
Online Banking Fraud in Connection with Sales on “Kleinanzeigen”: Schleswig-Holstein Higher Regional Court Rejects Appeal
By decision dated 29 September 2025 (Case No. 5 U 27/25), the Schleswig-Holstein Higher Regional Court dismissed the appeal of a bank customer who had sought reimbursement from his payment service provider after unauthorized credit card transactions. The proceedings concerned the same facts I had already reported on in my blog post of 5 February 2025 regarding the dismissal of the claim by the Regional Court of Itzehoe (judgment of 28 January 2025 – 7 O 114/24; available at No Monitoring Duty for Banks – Recent Judgment of the Regional Court of Itzehoe in the Context of Online Banking Fraud Cases).
1. Gross Negligence of the Customer
The Senate confirmed the lower court’s assessment that the claimant had acted with gross negligence in several respects. Decisive was that he followed a link sent outside the “Kleinanzeigen” communication system and entered personal credit card details there, despite being in the role of payment recipient. This alone should have raised strong suspicion of fraud.
In addition, he registered his credit card in the S-ID-Check procedure using Face ID/PushTAN. According to the court, the claimant ignored clear warnings that pointed to the misuse of his data. Disclosing sensitive authentication credentials under such circumstances constituted an objectively serious and subjectively inexcusable breach of the duty of care under § 675l (1) BGB as well as of the relevant contractual online banking terms and conditions.
2. No Exclusion of Liability under § 675v (4) BGB
The Senate also denied an exclusion of liability under § 675v (4) No. 1 BGB. Contrary to the claimant’s view, the savings bank had required strong customer authentication for the transaction. This was carried out—in conformity with EU law—based on two-factor authentication comprising knowledge (online banking credentials), possession (credit card data), and inherence (Face ID). Accordingly, the requirements for a liability exclusion were not met.
The claimant’s argument that there was a dispute between the parties as to whether strong customer authentication was required merely for logging into online banking was deemed immaterial by the Senate and therefore disregarded.
3. No Contributory Negligence on the Part of the Bank
Finally, the Higher Regional Court rejected any reduction of the claim due to contributory negligence of the defendant pursuant to § 254 BGB. There were neither indications of inadequate system security nor had any contractual protective or warning duties been breached. According to the consistent case law of the Federal Court of Justice, banks only have warning obligations in exceptional circumstances, e.g., where objectively obvious indications of misuse are present. No such exceptional case existed here.
Conclusion
With its decision, the Schleswig-Holstein Higher Regional Court confirmed the first-instance assessment that the claimant’s conduct must be classified as grossly negligent, thereby excluding his claims for reimbursement. The ruling underscores that bank customers bear a high degree of personal responsibility when disclosing security credentials, whereas banks are not required to scrutinize every potentially suspicious transaction individually.
It is also noteworthy that the Senate considered the claimant’s disputed allegation—that strong customer authentication had already been required for mere login to online banking—to be irrelevant to the decision and therefore disregarded it (cf. the related discussion in OLG Dresden, judgment of 5 May 2025 – 8 U 1482/24, BKR 2025, 850 with my annotation, and most recently BGH, judgment of 22 July 2025 – XI ZR 107/24, BKR 2025, 843).
Product Liability and Artificial Intelligence
The reform of product liability law in light of the government draft of 11 September 2025
On 8 December 2024, Directive (EU) 2024/2853 on liability for defective products (ProdHaftRL) came into force, replacing the almost 40-year-old Directive 85/374/EEC, on which the Product Liability Act (ProdHaftG) is also based. Already on 11 September 2025 – and thus comparatively early – the Federal Ministry of Justice and Consumer Protection presented the draft bill for the implementation of the ProdHaftRL (ProdHaftG-E). The law is to come into force at the end of the implementation period on 9 December 2026.
The background to the comprehensive modernisation of product liability law includes developments in connection with new technologies, including Artificial Intelligence (AI). The application of the previous product liability law had led to inconsistencies and legal uncertainties with regard to the interpretation of the term ‘product’. In addition, it is often difficult for injured parties to assert claims for damages in view of the increasing technical complexity of products.
The ProdHaftG-E now aims to strike a balance between promoting the development of new technologies on the one hand and ensuring effective legal protection for injured parties on the other. A key element in this regard is the inclusion of software – and thus also AI systems – within the scope of the ProdHaftG. Against this backdrop, companies are faced with the question: Who is responsible for ‘defective’ software – the manufacturer or another player along the value chain? What obligations do market participants have and how can they protect themselves against liability risks?
The following analysis examines the key changes to the ProdHaftG according to the government draft and highlights the consequences, particularly for companies that develop, distribute or use AI systems.
SKW Schwarz has already reported on the new ProdHaftRL.
Key changes in product liability law
The proposed Product Liability Act (ProdHaftG-E) introduces a number of changes compared to the previous Product Liability Act (ProdHaftG):
1) Software & AI systems as products
In future, software will be included in product liability regardless of how it is provided or used, i.e. regardless of whether it is embodied in or connected to physical objects and thus also regardless of whether the software is used ‘on-premise’ or accessed via the cloud, for example (Section 2 No. 3 ProdHaftG-E).
AI systems are also to be covered by the term ‘software’ (cf. Recital 13 of the ProdHaftRL), which is to be understood as technology-neutral and deliberately not legally defined.
Free and open-source software plays a special role: it is generally excluded from the scope of product liability law (Section 2 No. 3 ProdHaftG-E, second half-sentence), but only if it is developed or provided outside of a commercial activity. If, on the other hand, it is provided in return for payment or for personal data that is used for purposes other than solely improving the security, compatibility or interoperability of the software, this constitutes a commercial activity and the exemption does not apply. This also means that if open-source software that was originally provided outside of a commercial activity is integrated into a product by a manufacturer as a component within the scope of its commercial activity, that manufacturer is liable for damage caused by defects in the software – but not the original manufacturer of the open-source software (see Recitals 14 and 15 of the ProdHaftRL and Begr. RefE, p. 26).
2) Types of compensable damage
In addition to damage resulting from death, bodily injury, damage to health or property damage, damage resulting from the destruction or damage of data not used for professional purposes will also be compensable in future (Section 1 (1) No. 3 ProdHaftG-E). Conversely, this means that damage to data that is used – at least in part – for professional purposes is not compensable under the ProdHaftG-E (see Recital 22 of the ProdHaftRL).
The claim under Section 1 ProdHaftG-E differs from the claim for damages under data protection law under Article 82 GDPR in that it does not require processing of personal data in violation of data protection law and places the obligation on the manufacturer (rather than the data controller).
3) Adjustment of the concept of defect
Section 7 sentence 1 ProdHaftG-E lays down the principle that a product is defective if it does not provide the safety required by law or that may reasonably be expected.
Section 7 (2) nos. 1–8 ProdHaftG-E lists circumstances to be taken into account when assessing defectiveness, including the reasonably foreseeable use (no. 2), the effects of the product's ability to learn (no. 3), interactions with other products (no. 4) and cybersecurity requirements (no. 5).
4) Expansion of the group of liable parties
The central liable party under the ProdHaftG-E remains the manufacturer, including those acting as manufacturers (so-called quasi-manufacturers). In this respect, the legal definition in Section 3 ProdHaftG-E essentially corresponds to that of the provider under Article 3(3) of the AI Act.
Furthermore, Sections 10–13 of the ProdHaftG-E provide for a cascade of liability which, in addition to the manufacturer (or supplier), also covers importers, agents, fulfilment service providers, suppliers and providers of an online platform within the meaning of Art. 3 lit. i) DSA as liable parties under certain conditions.
In this context, it should be noted that in the event of product defects caused by a faulty component, both the manufacturer of the product and the manufacturer of the component may be liable (Section 4 ProdHaftG-E). Components also include ‘connected services’ such as the temperature monitoring service that monitors and regulates the temperature of a smart refrigerator (see Recital 17 ProdHaftRL).
5) Shift in the burden of proof and disclosure
Section 19 ProdHaftG-E provides for a rule on the disclosure of evidence in court proceedings, modelled on the US-style disclosure of evidence ('discovery'). This is intended to ensure that the plaintiff and defendant have comparable knowledge.
Finally, Section 20 ProdHaftG-E contains presumptions regarding the existence of a defect and its causality for the infringement of a right or legal interest within the meaning of Section 1 (1) ProdHaftG-E.
Implications for companies
For companies that develop or distribute AI systems or other software, the ProdHaftG-E results in a significant increase in liability risks, not least because there is sometimes a considerable gap between the legal requirements and what can actually be implemented. Where the ProdHaftG-E refers to the ‘state of the art’ (Section 9 (1) No. 3), this presupposes corresponding norms or standards that not only offer practical assistance but also define a minimum threshold – yet these are simply not available across the board. Manufacturers are thus faced with the challenge of designing products and components, without such guidelines, in a way that avoids ‘expectable’, ‘reasonably foreseeable’ or even – in the case of self-learning products – ‘unexpected’ negative effects (Section 7 (2) No. 3 ProdHaftG-E).
In addition, other parties besides manufacturers may now also be potentially liable. The liability cascade provided for in Sections 10–13 ProdHaftG-E is intended to ensure that injured parties always have a defendant based in the European Union, even if the manufacturer itself is based outside the EU. This means that importers and agents, fulfilment service providers, suppliers and online platform providers may also be liable if the upstream party in the (supply) chain cannot be held accountable because it is not based in the EU.
On the other hand, the scope of protection of the law remains limited due to the link to certain rights and legal interests in Section 1 (1) ProdHaftG-E and, in particular, the exclusion of pure financial losses.
Recommendations for internal implementation and best practices
In order to avoid liability cases, it is advisable to implement the following general and actor-specific recommendations. In particular, there is a comprehensive need for action on the part of manufacturers as the central liable parties under the ProdHaftG-E.
1) Recommendations for all parties involved
- Product liability insurance: Companies should check whether existing insurance policies cover damage caused by software and AI systems.
- Review of agreements with third parties: Contracts with suppliers, for example, should be reviewed to ensure that liability risk is distributed. Recourse clauses may need to be introduced.
- Compliance systems: Providers must ensure that their systems meet legal requirements and that risks are minimised.
- Documentation, including of supply chains.
2) For manufacturers
- Security by design: When developing a product, relevant safety aspects, such as the risks arising from the use of the product, its learning ability or its interactions with other products, must be taken into account.
- Furthermore, codified technical knowledge in the form of harmonised norms and standards must be taken into account as far as possible in order to enable an exclusion of liability (Section 9 (1) No. 3 ProdHaftG-E).
- Documentation: Regardless of the risk classification of the AI system, compliance with the requirements of the AI Act is recommended, or at least complete documentation of the development and of ‘AI lifecycle management’, in order to be able to demonstrate proper development and operation in liability proceedings or to rebut the presumptions in Section 20 ProdHaftG-E. Companies may also be required to disclose how the AI system works.
- In addition, cybersecurity requirements in particular must be taken into account (SKW Schwarz has already reported on the CRA and the NIS 2 Directive here and here).
- Update management: Manufacturers should provide products that have been placed on the market/put into service with the necessary security updates (section 9 (2) sentence 2 ProdHaftG-E); appropriate internal processes must be established for this purpose.
3) For suppliers and providers of online platforms
- Information management: Suppliers in particular should document supply chains transparently and set up systems so that they can name a primarily liable party to a creditor within one month of being requested to do so (Section 12 (1) ProdHaftG-E). The same applies to providers of online platforms through which consumers can conclude contracts with businesses. The obligations of suppliers apply to them accordingly.
- Clear labelling: Providers of online platforms should make clear that products not provided by the platform itself or by a user acting under its authority or control originate from the respective manufacturer or seller, in order to avoid liability (cf. Section 13 No. 2 ProdHaftG-E in conjunction with Art. 6 (3) DSA).
Conclusion and outlook
The modernisation of product liability law marks a turning point in the handling of AI systems. For the first time, software is comprehensively recognised as a product, meaning that AI applications are also subject to product liability. Although it will be some time before the law is expected to come into force, companies should prepare for the upcoming changes in good time in view of the significant increase in liability risks.
ECJ Confirms the Concept of Relative Personal Data
Pseudonymous Data May Be Anonymous for Third Parties Without (Additional) Knowledge
On 4 September 2025, the Court of Justice (ECJ) delivered its landmark judgment in European Data Protection Supervisor v. Single Resolution Board (Case C-413/23 P). In that judgment, the ECJ clarified the conditions under which data must be regarded as personal in nature and, consequently, when its processing falls within the scope of data protection law. The full text of the judgment is available here.
In particular, the ECJ held that the question of whether data relates to an identifiable natural person must be assessed from the perspective of the controller and at the time the data is collected. Further, the ECJ ruled that pseudonymisation may, depending on the circumstances of the case, effectively prevent a third party (a person other than the controller) from identifying the data subject. If a third party receives (a subset of) pseudonymized data and does not have additional information that would enable it to be attributed to a particular person, that data is generally to be regarded as anonymized for the third party within the meaning of EU data protection law.
The ECJ Ruling
According to the ECJ, pseudonymized data transferred by a controller to a third party is not, in principle, to be regarded as personal data for that third party, provided that:
- the third party does not have access to the additional information enabling the identification of the data subjects, and
- the technical and organizational measures taken effectively prevent such identification.
SKW Schwarz previously published an article on the (overturned) judgment of the General Court of 26 April 2023 (Case T-557/20) in CR 2023, p. 532 et seq. We also contributed to the discussion paper "Anonymization in Data Protection as an Opportunity for Business and Innovation" by the Industry 4.0 Platform on the position paper of the Federal Commissioner for Data Protection and Freedom of Information (BfDI) on “Anonymization Under the GDPR With Special Consideration of the Telecommunications Industry”.
A. The Background
Following the resolution of Banco Popular Español, S.A., the Single Resolution Board (SRB), whose processing of personal data is governed by Regulation (EU) 2018/1725, collected personal information from the affected shareholders and creditors to verify their legal status and, in addition, obtained their written comments through an online form. Subsequently, the SRB separated the comments from the identifying information of the respondents and pseudonymized the comments by assigning to each a unique alphanumeric code. Only the pseudonymized comments, together with the corresponding codes, were transmitted to the third-party recipient (Deloitte). Deloitte had no means of linking the alphanumeric code to the author of the comment.
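The workflow described above can be pictured with a minimal sketch, assuming a simple record structure; the field names, the code format and the helper function are illustrative assumptions only, and the judgment does not prescribe or endorse any particular technique.

```python
import secrets

def pseudonymize(records):
    """Split records into a shareable data set and a key table kept only by the controller.

    Each record is a dict such as {"name": ..., "email": ..., "comment": ...}.
    """
    shared, key_table = [], {}
    for record in records:
        code = secrets.token_hex(8)                       # random alphanumeric code
        key_table[code] = {k: v for k, v in record.items() if k != "comment"}
        shared.append({"code": code, "comment": record["comment"]})
    return shared, key_table

comments = [{"name": "A. Example", "email": "a@example.org", "comment": "No objection."}]
shared, key_table = pseudonymize(comments)

# Only `shared` is transmitted to the recipient; without `key_table`,
# the recipient has no means of linking a code to the comment's author.
print(shared)
```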
Some data subjects lodged complaints with the European Data Protection Supervisor (EDPS), which found that the SRB had infringed its information obligations under Article 15(1)(d) of Regulation (EU) 2018/1725 by not mentioning Deloitte in its privacy statement as a potential recipient of the personal data collected. Since this provision mirrors Articles 13(1)(e) and 14(1)(e) GDPR, the judgment has direct implications for the interpretation of the GDPR.
Initially, the General Court annulled the EDPS's decision (Case T-557/20). On appeal, however, the ECJ overturned that judgment, holding that “the General Court disregarded the objective nature of the condition relating to the ‘identifiable’ nature of the data subject, by holding […] that the EDPS should have examined whether the comments transmitted to Deloitte constituted, from Deloitte’s point of view, personal data”.
In particular, the ECJ ruled that – with regard to the data protection information obligations and the assessment of whether data is personal in nature at the time of collection – the relevant perspective is that of the controller (here, the SRB) rather than that of a subsequent third-party recipient. From the SRB's perspective, the data at issue constituted personal data, which triggered the information obligation, including disclosure of Deloitte as a potential recipient.
Consequently, the ECJ referred the case back to the General Court for a new decision in accordance with this ruling.
B. Key Legal Findings on the Concept of Personal Data
1. Interpretation of the Concept of Personal Data
First, the ECJ emphasized that the definition of the concept of "personal data" set out in Article 3(1) of Regulation (EU) 2018/1725 and Article 4(1) GDPR must be interpreted broadly.
In using the expression “any information” to define the concept of “personal data,” the European legislator intended to assign a wide scope to that concept, which potentially encompasses all kinds of information, not only objective but also subjective, in the form of opinions and assessments, provided that it “relates” to the data subject.
2. Relative Nature of Personal Data
In a first step, the ECJ noted that, where the controller holds additional information enabling the pseudonymized data transmitted to a third party to be attributed to the data subject (as is usually the case for controllers who have pseudonymized the data themselves), such data, despite pseudonymisation, remains personal in nature for the controller.
In a second step, the ECJ clarified that pseudonymized data transmitted by the controller to a third party who does not have additional information to attribute it to the data subject does not constitute personal data for that third party. Rather, for the third party, such data is considered anonymous.
According to the fifth sentence of Recital 26 GDPR, the principles of data protection should not apply to anonymous information, namely information that does not relate to an identified or identifiable natural person or to personal data rendered anonymous in such a manner that the data subject is not or no longer identifiable.
However, that presupposes that the third party cannot reverse the technical and organizational measures underlying the pseudonymisation. Those measures must be sufficient to prevent the third party from attributing the data to the data subject, including by recourse to other means of identification such as cross-checking with other factors, so that, from the third party’s perspective, the person concerned is not, or is no longer, identifiable.
According to the third sentence of Recital 26 GDPR, when assessing whether a natural person is identifiable, "all the means" reasonably likely to be used — either by the controller or by another person (e.g., a third party) to identify the natural person directly or indirectly — must be considered.
In this regard, the ECJ has already ruled, in particular in Breyer (19 October 2016, Case C‑582/14) and IAB Europe (7 March 2024, Case C‑604/22; commentary by SKW Schwarz here), that a means of identifying a natural person is not “reasonably likely to be used” if, in light of general experience, the risk of identification appears to be de facto negligible. This may be the case, for example, where identification is prohibited by law or would require a disproportionate amount of time, cost, or personnel.
In line with its prior case law, the ECJ confirms that the mere existence of additional information enabling identification does not, by itself, mean that pseudonymized data must be regarded as personal data for the purposes of Regulation (EU) 2018/1725 (or the GDPR) in every case and for every person.
Finally, the ECJ reiterated that a controller that has means reasonably likely to be used to identify a data subject cannot escape its obligations by arguing that the additional information is held by a third party: such a division of knowledge does not negate identifiability from the controller’s perspective, so the data subject remains identifiable to the controller even if the controller does not itself hold the additional information.
3. Information Obligations – In Particular from the Perspective of the Controller
Lastly, the ECJ emphasized that the obligation to provide information under Article 15 of Regulation (EU) 2018/1725 and Articles 13 and 14 GDPR rests with the controller. Accordingly, the SRB should have disclosed Deloitte as a potential recipient of the personal data, because, from the controller's perspective, the data remain personal in nature and are therefore subject to the information obligation – irrespective of whether they were personal in nature from Deloitte's perspective.
A third party that cannot establish any link to an individual cannot fulfill data protection information duties or facilitate data subject rights in relation to those data. By contrast, the controller can – and must – provide the required information (immediately, i.e., at the time of collection) and ensure the exercise of data subject rights.
Since the obligation to provide information applies only if the data remains personal for the controller, the controller is not required to disclose information about recipients if the data is fully anonymized from the outset (for example, when incorporated into statistical analyses).
Practical Relevance
With its judgment in EDPS v. SRB, the ECJ strengthens the position of controllers and third parties in the anonymization of personal data, while also clarifying the obligation to inform data subjects.
Although the assessment depends on the individual case, the ECJ has provided guidelines that also apply to the European data protection authorities. Through appropriate technical and/or organizational measures, a data record that is “personal” in nature for one party may be “anonymous” for another. This can encourage companies to make greater use of pseudonymisation and anonymization in order to develop new business models and improve their data analysis. It can also help ensure compliance with the EU Data Act by avoiding the disclosure of personal data to third parties where, for example, there is no legal basis for such disclosure under data protection law.
Even though the ECJ referred the final decision back to the General Court, it confirmed that data sets can be regarded as de facto anonymized data if the recipient has no means of (re-)identification or if there is no sufficient likelihood that the data could be linked with additional information to identify individuals, for example, if the recipient has no legal access to the additional information (cf. Schweinoch/Peintinger, CR 2023, 532 (538 et seq.)).
It is important to note that the ECJ requires a case-by-case assessment. In the case of complex or large data sets, it must be carefully examined whether identification of individuals from the data set itself is possible. In such cases, additional measures (e.g., aggregation of data) must be applied to make identification of the data subjects significantly more difficult or effectively impossible.
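As one example of such an additional measure, the following sketch aggregates individual-level records into group counts and suppresses small groups. The attribute, the threshold and the helper function are hypothetical; whether a concrete data set is thereby rendered effectively anonymous still requires the case-by-case assessment described above.

```python
from collections import Counter

def aggregate_counts(records, attribute, min_group_size=5):
    """Publish only group-level counts and suppress groups below a minimum size.

    Suppressing small groups is one common way of making it significantly
    harder to single out individuals from the published figures.
    """
    counts = Counter(record[attribute] for record in records)
    return {group: n for group, n in counts.items() if n >= min_group_size}

records = [{"postcode": "80331"}] * 7 + [{"postcode": "80333"}] * 2
print(aggregate_counts(records, "postcode"))   # {'80331': 7}; the group of 2 is suppressed
```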
From the controller's perspective, the obligation to provide information to data subjects can be particularly challenging when a transfer to third parties is not yet concretely planned at the time of data collection. In addition, recipients of pseudonymized data sets should be documented so that potential information requests can be answered.