February 26, 2026

EU Digital Omnibus: What the Proposed Reforms Mean for Pharma and MedTech

Advisory

On November 19, 2025, the European Commission introduced two proposals aimed at simplifying and streamlining various digital and data laws across the European Union (EU). The Digital Omnibus on AI Regulation Proposal sets out amendments to the EU AI Act (Regulation (EU) 2024/1689), while the more comprehensive Digital Omnibus Regulation Proposal includes changes to several EU regulations, including but not limited to the GDPR (Regulation (EU) 2016/679), ePrivacy Directive (2002/58/EC), EU Data Act (Regulation (EU) 2023/2854), and NIS2 Directive ((EU) 2022/2555) (together, the Proposals).

Following the publication of the Proposals, Arnold & Porter shared an initial blog post highlighting key considerations for Life Sciences companies. In this more in-depth contribution, we take a comprehensive look at the proposed amendments across the EU AI, digital, and data regulatory framework. This advisory analyzes the practical implications for pharmaceutical and MedTech companies and assesses the views expressed by the EDPB and EDPS in their Joint Opinions (on the Digital Omnibus on AI Regulation Proposal and the broader Digital Omnibus Regulation Proposal). We also offer practical recommendations to help companies navigate what comes next.

EU AI Act

Through the Digital Omnibus on AI Regulation Proposal (AI Omnibus Proposal or Proposal), the European Commission proposes to simplify and streamline several procedures. These include:

  • Facilitating compliance with data protection laws by allowing AI providers and deployers to process special categories of personal data for bias detection and correction, subject to data protection laws and safeguards
  • The AI Omnibus Proposal introduces a new Article 4a to the EU AI Act, permitting, on an exceptional basis, the processing of special category personal data (which includes data concerning health) for bias detection and mitigation within the context of high-risk AI systems. Such processing is contingent upon stringent requirements and safeguards, including demonstrating that the intended objective cannot reasonably be accomplished with alternative data, implementing robust technical and organizational security measures, and ensuring the deletion of the data once the bias-related purposes have been achieved.

    The Proposal also contemplates extending this provision, under specified conditions, to other AI systems beyond those designated as high-risk.

    In their Joint Opinion, the EDPB and EDPS support the establishment of a dedicated legal basis for bias detection and mitigation in principle, yet express reservations regarding the scope of extension to AI systems outside the high-risk category. They underscore the importance of adhering to a strict necessity and proportionality standard.

    Practical impact: For Life Sciences companies, where reliance on health data may be essential to identify and remediate bias in AI systems, the Proposal could provide a clearer legal basis for limited bias-related processing activities involving special categories of personal data. Nonetheless, this provision is likely to be interpreted narrowly and applied under rigorous scrutiny by the competent EU authorities, necessitating careful justification and the implementation of comprehensive safeguards.

  • Targeted changes clarifying the interplay between the EU AI Act and other EU legislation and adjusting the EU AI Act’s procedures to improve its overall implementation and operation
  • The AI Omnibus Proposal clarifies how the EU AI Act is intended to apply alongside existing EU product legislation, in particular the frameworks governing medical devices and in vitro diagnostic devices (IVDs) under the Medical Device Regulation (MDR) and In Vitro Diagnostic Regulation (IVDR). It confirms that EU AI Act requirements for high-risk AI systems that are also regulated products should be applied within the existing conformity assessment procedures, rather than through separate or parallel certification processes (please also review our advisory on proposed changes to the MDR and IVDR).

    The Proposal also streamlines the designation of notified bodies by allowing conformity assessment bodies to submit a single application and undergo a single assessment for designation under both the EU AI Act and the relevant Annex I product legislation. This is intended to simplify procedures and support more efficient implementation of the EU AI Act in regulated sectors.

    Practical impact: For manufacturers of AI-enabled medical devices and IVDs, these changes should reduce duplication and regulatory complexity by enabling a single, integrated conformity assessment (please note that further simplification for high-risk AI systems regulated under the MDR and IVDR is also expected, as further detailed in our advisory above). This may also help accelerate the availability of suitably designated notified bodies, making early standards planning and engagement with notified bodies even more important.

  • Expanding AI regulatory sandboxes and real-world testing to support key European industries
  • The AI Omnibus Proposal broadens the framework for testing AI systems by strengthening regulatory sandboxes and expanding opportunities for testing in real-world conditions. In particular, it enables the AI Office to establish EU-level sandboxes and introduces additional governance clarifications.

    The Proposal also extends the possibility of conducting real-world testing outside formal sandbox environments, including for high-risk AI systems subject to Annex I product legislation, such as medical devices. Under the current framework, real-world testing is mainly available for the Annex III use cases.

    In their Joint Opinion, the EDPB and EDPS support the development of EU-level sandboxes but emphasize the need for greater legal certainty, including clear involvement of data protection authorities and well-defined cooperation mechanisms under the GDPR.

    Practical impact: For regulated and safety-critical sectors, including the Life Sciences sector, these changes could provide a clearer and more flexible legal framework for piloting, validating and refining AI systems in real operational environments. In particular, the extension of real-world testing to Annex I products may facilitate performance validation, bias assessment, data collection and iterative improvement of clinically relevant AI models before or alongside market deployment. However, access to these testing pathways is likely to remain conditional on robust governance arrangements, with data protection compliance and regulator engagement acting as key prerequisites.

  • Linking high-risk AI obligations timelines to the availability of standards
  • The AI Omnibus Proposal introduces flexibility in the timing of the EU AI Act’s high-risk obligations by linking their application to the availability of key support tools, such as European harmonized standards, common specifications, or European Commission guidance. Instead of automatically applying from August 2, 2026, the Chapter III requirements would start to apply only once the European Commission confirms that these measures are in place, followed by a transitional period of six months for Annex III systems and twelve months for Annex I systems integrated into regulated products, including medical devices. Long-stop dates apply in any event, with compliance required by December 2, 2027, for Annex III systems and by August 2, 2028, for Annex I systems.

    The Proposal also clarifies the treatment of legacy high-risk AI systems. Where at least one unit1 has been lawfully placed on the EU market before the relevant cut-off date, additional units of the same type and model may continue to be placed on the EU market without new conformity assessment, provided the design remains unchanged. Any significant design modification would trigger full EU AI Act compliance.

    In their Joint Opinion, the EDPB and EDPS caution that delaying the application of high-risk obligations may affect fundamental rights protection and suggest that certain requirements, such as transparency obligations, should remain subject to the original timelines as currently set out in the EU AI Act.

    Practical impact: These changes may ease short-term compliance pressure for legacy AI systems and align obligations more closely with the availability of standards. However, the flexibility is time-limited, excludes significant design changes, and may be narrowed during negotiations, meaning companies should continue preparing for full EU AI Act compliance well ahead of the final deadlines.

  • Extending regulatory simplifications from SMEs to SMCs
  • The AI Omnibus Proposal extends several regulatory simplifications currently available to small and medium-sized enterprises (SMEs) to small mid-cap companies (SMCs). These include, in particular, simplified technical documentation requirements and more tailored consideration of company size and economic capacity when determining administrative penalties.

    In their Joint Opinion, the EDPB and EDPS caution that company size should not dilute protections where AI systems pose significant risks.

    Practical impact: This extension is particularly relevant for scale-up companies operating in the AI and Life Sciences space, which may fall outside the SME definition but still face comparable resource constraints for compliance.

  • Reframing AI literacy obligations
  • The Proposal revises the approach to AI literacy by shifting responsibility from an open-ended obligation on AI providers and deployers to a requirement for the European Commission and EU Member States to foster AI literacy through appropriate measures.

    In their Joint Opinion, the EDPB and EDPS stress that AI system providers and deployers should not be released from their obligation to ensure that their staff have a sufficient level of AI literacy, as it helps raise ethical and social awareness of AI benefits and risks.

    Practical impact: This change reduces legal uncertainty around the scope of AI literacy obligations for providers and deployers, while preserving targeted training requirements where AI systems present higher risks. In practice, companies should still expect AI literacy to be treated as a governance expectation, particularly in regulated or high-risk contexts.

  • Greater flexibility in post-market monitoring
  • The AI Omnibus Proposal introduces additional flexibility in post-market monitoring by removing the requirement to follow a prescribed harmonized post-market monitoring plan. Providers would retain responsibility for monitoring system performance and risks, but with more discretion as to how this obligation is operationalized.

    Practical impact: This may allow companies to design post-market monitoring processes that are better aligned with existing quality management and vigilance systems, particularly in regulated sectors such as medical devices. However, robust monitoring will remain essential, and regulators are likely to scrutinize whether alternative approaches provide equivalent oversight.

  • Reducing registration obligations for certain Annex III use cases
  • The Proposal reduces registration burdens for AI systems used in areas listed in Annex III where the provider has concluded, based on a documented assessment, that the system does not qualify as high-risk because it is used only for narrow, procedural or preparatory tasks.

    Practical impact: This change may alleviate administrative burdens for providers of low-impact AI tools operating in high-risk sectors, such as decision-support or workflow optimization systems. At the same time, providers will need to maintain clear and well-reasoned internal documentation to substantiate non-high-risk classifications if challenged by regulators.

Changes to Data Protection and Cybersecurity (GDPR, ePrivacy and NIS2)

The Digital Omnibus Regulation Proposal (Digital Omnibus Proposal or Proposal) proposes several changes to the GDPR, ePrivacy Directive, and NIS2 Directive. These include:

  • Intention to clarify the notion of personal data
  • The Digital Omnibus Proposal intends to clarify when data should be considered personal data for a given company, focusing on whether identification of individuals is reasonably likely in light of the means available to that specific company. Under the Proposal, information would not be considered personal data for a particular company where that company cannot identify the individual, taking into account the means reasonably likely to be used by it. The Proposal further specifies that such information would not become personal data for that company merely because a potential subsequent recipient has the means reasonably likely to identify the individual. The clarification is presented as codifying CJEU case law, in particular the EDPS v. SRB judgment (see our blog on this judgment for more)2, and as aiming to reduce over-classification of data as personal data where identification is only theoretical or would require disproportionate effort.

    In their Joint Opinion, the EDPB and EDPS express serious reservations, considering that the proposed wording goes beyond a targeted codification of case law and would significantly narrow the concept of personal data. They emphasize that the definition lies at the core of EU data protection law and caution that selectively incorporating elements of a single judgment, while omitting the broader CJEU jurisprudence, risks undermining rather than enhancing legal certainty. They also warn that the “negative” formulation (defining what is not considered personal data) may create confusion and potentially incentivize attempts to circumvent the scope of the GDPR. For these reasons, they urge the co-legislators not to adopt the proposed amendment.

    Practical impact: For Life Sciences and MedTech companies handling large datasets (e.g., clinical, real-world, or sensor data), the proposed clarification could, if adopted, strengthen arguments that certain pseudonymized or indirectly identifiable datasets fall outside the GDPR in the hands of companies that lack reasonably likely means of re-identification. This may reduce compliance burdens in low-risk scenarios and support more flexible data-sharing models within research and innovation ecosystems. However, in light of the strong reservations expressed by the EDPB and EDPS and the likely continued interpretative debate (including forthcoming guidance on pseudonymization and anonymization), the extent to which the Proposal would deliver greater legal certainty remains unclear.

  • Targeted flexibility for scientific research of personal data
  • The Proposal introduces a harmonized definition of “scientific research” to address national fragmentation and legal uncertainty. For an activity to fall within the scope of this definition, it must contribute to existing knowledge or apply it in novel ways, aim to advance society’s general knowledge and wellbeing, and adhere to relevant ethical standards. The Proposal clarifies that research does not lose its qualification merely because it also pursues commercial interests. The Proposal also confirms that further processing for scientific research purposes is to be considered compatible with the initial purpose, with the aim of ensuring more consistent application of research-related derogations across the EU.

    In addition, the Proposal introduces a new transparency derogation for scientific research. It extends the existing exemption (applicable where data are not obtained directly from the data subject) to certain cases of direct collection, where providing information proves impossible or would involve a disproportionate effort, or would seriously impair the research objectives, subject to the safeguards of Article 89(1) GDPR.

    In their Joint Opinion, the EDPB and EDPS welcome the objective of harmonization but call for a more precise delineation of the concept. In their view, scientific research should explicitly require a methodological and systematic approach, autonomy and independence, and verifiable and transparent results. They recommend moving references to innovation and commercial interests to the recitals. They also stress that compatibility of further processing must be clearly distinguished from the separate requirement of a valid legal basis under Article 6 GDPR.

    Practical impact: These clarifications may benefit pharmaceutical and MedTech companies engaged in clinical research, post-market surveillance, and real-world evidence generation by reducing fragmentation and facilitating secondary use of data across the EU. The confirmation of compatibility for further research processing creates clearer pathways for data reuse, provided safeguards under Article 89 GDPR are respected and a valid legal basis under Article 6 (and, where applicable, Article 9(2)) GDPR is identified. The new transparency exemption may also be relevant in secondary research or certain clinical trial contexts, for example, where contact details are no longer retained or where individual notification would seriously impair the research, subject to strict conditions and safeguards.

  • Legitimate interest and AI-related processing (including residual special category data)
  • The Digital Omnibus Proposal addresses the use of legitimate interest as a legal basis in the context of the development and operation of AI systems. Proposed Article 88c GDPR clarifies that processing for the development and operation of AI models or systems may, in principle, be pursued on the basis of legitimate interest under Article 6(1)(f) GDPR. The Proposal also introduces specific safeguards, including an “unconditional” right to object and enhanced transparency obligations. In addition, it provides for a new derogation under Article 9 GDPR for the incidental and residual processing of special categories of data in the context of AI development and operation, subject to conditions and safeguards.

    In their Joint Opinion, the EDPB and EDPS acknowledge that legitimate interest may already serve as a legal basis for AI-related processing under the current GDPR and question the need for a specific new provision. They stress that controllers must continue to carry out the full three-step legitimate interest test and caution that the proposed wording may not enhance legal certainty. They also call for clearer drafting of the “unconditional” right to object and the transparency requirements.

    As regards the new derogation for incidental and residual processing of special categories of data, they generally welcome the objective but recommend clarifying that it applies only where such processing is genuinely incidental, alongside stronger safeguards and clearer limits.

    Practical impact: For companies developing or deploying AI-driven tools (e.g., analytics or decision-support systems), the Proposal could offer greater flexibility in relying on legitimate interest, particularly where consent is impractical. The new derogation for incidental and residual processing of special categories of data may also reduce legal risk during AI training and testing phases. However, reliance on legitimate interest will continue to require a documented and case-specific balancing test, meaningful transparency, and effective objection mechanisms. Where health or other special category data are involved, strict safeguards and careful assessment of the applicability and limits of the new exemption, if adopted, will remain essential.

  • Exemption to allow the processing of biometric data
  • The Digital Omnibus Proposal introduces a new derogation under Article 9 GDPR permitting the processing of biometric data for the sole purpose of confirming a data subject’s claimed identity. The exemption applies only where the biometric data, and the means necessary for verification, remain under the sole control of the data subject — for example, where biometric templates are stored locally on a device or protected by a key exclusively held by the individual. The stated objective is to facilitate secure and privacy-enhancing authentication mechanisms while maintaining appropriate safeguards.

    In their Joint Opinion, the EDPB and EDPS welcome the limited scope of the exemption and its focus on verification (rather than identification) scenarios. However, they emphasize that processing of biometric data remains inherently high-risk and must comply with the principles of necessity and proportionality. They underline that less intrusive alternatives should be used where available and that appropriate technical and organizational safeguards must be implemented. The authorities also caution against characterizing such processing as inherently low-risk and stress that data protection risk assessments will remain necessary in practice.

    Practical impact: For pharmaceutical and MedTech companies, the exemption may support the deployment of biometric authentication tools in secure digital health platforms, clinical trial portals, or professional access systems, particularly where authentication mechanisms are designed to keep biometric data under the user’s control. However, given the sensitivity of biometric and health-related data, companies will still need to conduct careful necessity assessments, implement robust security safeguards, and ensure alignment with broader GDPR obligations, including risk assessments and data minimization requirements.

  • Streamlining Data Protection Impact Assessments
  • The Proposal seeks to further harmonize Data Protection Impact Assessment (DPIA) requirements across the EU. In particular, it introduces the development of common EDPB DPIA lists (covering both processing operations that require a DPIA and those that do not), as well as a common template and methodology for conducting DPIAs. The objective is to reduce divergent national practices and enhance legal certainty for companies operating across multiple EU Member States.

    In their Joint Opinion, the EDPB and EDPS welcome the aim of increased harmonization and simplification, noting that common EU-level DPIA lists and a shared template could significantly reduce compliance burdens and promote consistency. However, they express reservations regarding the proposed governance mechanism. In particular, they caution against granting the European Commission unilateral power to modify DPIA lists and templates prepared by the EDPB through implementing acts. They recommend that the EDPB be exclusively entrusted with the preparation and approval of common DPIA lists and templates, in order to preserve institutional independence and supervisory expertise.

    They also emphasize that the common methodology for DPIAs should be understood in a broad and practical sense — as a structured process and set of principles — rather than as a rigid checklist. This is intended to preserve flexibility across sectors and use cases, while enabling continued reliance on established risk assessment frameworks and tools.

    Practical impact: For multinational pharmaceutical and MedTech companies, EU-wide DPIA lists and templates could significantly streamline compliance by reducing the need to navigate divergent national guidance and approaches. This is particularly relevant for high-risk AI systems, clinical data processing, large-scale health data analytics, and post-market surveillance activities. However, companies should anticipate that DPIAs will remain a central risk-governance instrument, subject to continued regulatory scrutiny.

  • Simplifying personal data breach notification obligations and establishing a single-entry point for reporting incidents
  • The Proposal introduces several procedural simplifications regarding personal data breach notifications. Most notably, it raises the threshold for notifying supervisory authorities from breaches “likely to result in a risk” to those “likely to result in a high risk” to individuals’ rights and freedoms. It also extends the notification deadline from 72 to 96 hours and provides for the development of a common EU-level breach notification template and a harmonized list of circumstances indicating a high-risk breach. In addition, the Proposal introduces a new Article 23a to the NIS2 Directive, setting up a “single-entry point” (SEP) to streamline notification processes under several regulatory frameworks (e.g., GDPR, NIS2, DORA, and the CER Directive).

    In their Joint Opinion, the EDPB and EDPS broadly support these changes. They consider that increasing the notification threshold and extending the deadline are unlikely to materially reduce the level of protection for data subjects, while potentially alleviating administrative burdens — especially for smaller companies. They note that supervisory authorities currently receive a very high volume of breach notifications, including minor incidents, and that a higher threshold may enable authorities to focus resources on more serious cases. At the same time, they stress that controllers must continue to document all personal data breaches — so supervisory authorities can verify compliance with the notification regime — and to implement appropriate measures to mitigate or remedy any adverse effects of the breach.

    However, the EDPB and EDPS again raise institutional concerns regarding the proposed role of the European Commission in reviewing and modifying the EDPB’s common notification template and risk lists. They recommend that the EDPB retain primary responsibility for preparing and approving these instruments. Finally, they welcome the use of a SEP for reporting incidents across multiple regulatory regimes, while calling for greater harmonization between GDPR breach notification timelines and parallel reporting obligations under NIS2, DORA, and other sectoral frameworks.

    Practical impact: For pharmaceutical and MedTech companies — which often operate in complex IT environments and are subject to overlapping incident reporting obligations — these changes could reduce the frequency of reportable GDPR breaches and allow more time to conduct internal forensic assessments before notification. The introduction of a common EU template may also facilitate more consistent reporting across EU Member States. Nevertheless, internal detection, documentation, and remediation obligations remain unchanged, and parallel NIS2 or sector-specific reporting deadlines (often shorter) will continue to require careful coordination.

  • Clarifying the scope of automated decision-making
  • The Proposal seeks to clarify Article 22 GDPR by reframing the current “right not to be subject to” fully automated decision-making into a provision laying down an exhaustive list of cases where such automated decisions are legitimate. The Proposal also clarifies the “necessity” test under the contractual legal basis for automated decision-making, stating that automated decision-making may be considered necessary even where a human alternative theoretically exists.

    In their Joint Opinion, the EDPB and EDPS stress that Article 22 GDPR must remain a prohibition in principle, as confirmed by recent CJEU case law.3 They recommend maintaining wording that clearly reflects a general ban on solely automated decisions producing legal or similarly significant effects, subject only to narrowly interpreted exceptions. They caution against any formulation that could be read as permitting automated decision-making by default in contractual contexts and stress that the decision should be genuinely required for entering into or performing a contract. In particular, they emphasize that “necessity” must continue to be assessed strictly and in conjunction with the data minimization principle, requiring controllers to consider whether equally effective but less intrusive alternatives exist.

    Practical impact: For Life Sciences companies, the practical benefit of the proposed revision of Article 22 GDPR would mainly arise in relation to non-special-category data. In such cases, reliance on the contractual legal basis for automated decision-making — where genuinely necessary for entering into or performing a contract — may provide operational flexibility, for example, in distributor management or supply chain processes. By contrast, many core sector activities involve health or other special category data. In those cases, automated decision-making may only be permitted where explicit consent has been obtained or where it is necessary for reasons of substantial public interest, and must in any event be accompanied by appropriate safeguards in accordance with Article 22(4) GDPR.

  • Data subject rights: limitation to data subject access requests and exemption to transparency obligation
  • The Proposal introduces targeted amendments to certain data subject rights, with the stated aim of reducing administrative burdens and preventing misuse. In particular, it proposes to clarify Article 12(5) GDPR by further defining when access requests may be considered “manifestly unfounded” or “excessive,” and by linking the notion of abusive requests to situations where the right of access is exercised for purposes not related to data protection. The Proposal also introduces a more structured exemption from transparency obligations under Article 13 GDPR where personal data is collected directly from the data subject.

    With regard to access requests, the Proposal seeks to provide controllers with greater legal certainty when refusing or charging a reasonable fee for requests deemed abusive. By suggesting that requests not linked to data protection purposes may qualify as abusive, it attempts to address concerns about strategic or litigation-driven requests.

    In their Joint Opinion, the EDPB and EDPS support the objective of clarifying abusive requests but caution against linking the notion of abuse to the data subject’s motives or to the exercise of rights for purposes other than data protection. They recall that the right of access serves broader fundamental rights and must not be restricted merely because it is exercised in a broader legal or commercial context. They recommend that “abuse” be tied to demonstrable abusive intent and that any refusal of a request be based on an objective and properly documented assessment. They also emphasize that supervisory authorities should enjoy the same level of discretion as controllers when handling complaints.

    As regards transparency, the Proposal introduces a new exemption allowing controllers not to provide certain information where the data are collected directly from the data subject and there are reasonable grounds to assume that the data subject already has the information, unless the controller transmits the data to other recipients or categories of recipients, transfers the data to a third country, carries out automated decision-making, or the processing is likely to pose a high risk to the data subject’s rights. To rely on this exemption, the personal data should be collected in the context of a clear and circumscribed relationship between the data subject and a controller whose activity is not data-intensive.

    The EDPB and EDPS broadly welcome efforts to simplify transparency obligations but stress that exemptions must remain narrowly framed and clearly defined. They express concern that concepts such as “not data-intensive activity” or a “clear and circumscribed relationship” lack precision and could lead to divergent interpretation. They recommend ensuring that data subjects retain the ability to obtain full information upon request and that any reliance on the exemption be based on objective criteria and careful documentation.

    Practical impact: For pharmaceutical and MedTech companies — often subject to high volumes of access requests in litigation, employment, pharmacovigilance, and clinical research contexts — the clarification of abusive requests may provide limited relief where requests are clearly excessive or strategically abusive. However, the threshold for refusal is likely to remain high, requiring careful internal assessment and documentation.

    With respect to transparency, companies operating patient support programs, digital platforms, or healthcare professional portals may, in principle, benefit from reduced duplication of information in low-risk settings where the relevant criteria are met. However, in the Life Sciences sector, processing frequently involves health data or other special categories of personal data, which is often regarded as likely to give rise to a “high risk” to data subjects’ rights and freedoms. In such circumstances, the proposed exemption would be difficult to rely on. Given the narrow interpretation advocated by the EDPB and EDPS, companies should therefore approach any use of the transparency exemption with caution and ensure that governance frameworks preserve the ability to provide full information where required.

  • Changes relating to the ePrivacy Directive
  • The Digital Omnibus Proposal introduces significant amendments to the ePrivacy Directive, primarily by integrating certain rules on the protection of terminal equipment into the GDPR. In particular, it creates a new Article 88a GDPR governing the storage of and access to personal data in terminal equipment, accompanied by additional exceptions to the consent requirement. These include broader exemptions for services explicitly requested by the user, audience measurement conducted solely by the provider for its own use, and certain security-related processing. The Proposal also introduces rules on automated and machine-readable indications of user preferences (e.g., browser-based signals), with the aim of addressing “consent fatigue” and reducing repetitive cookie banners.

    In their Joint Opinion, the EDPB and EDPS strongly support the objective of simplifying the rules and addressing consent fatigue, as well as entrusting supervision to data protection authorities to ensure regulatory consistency. However, they express concerns that splitting the regime between the GDPR (for personal data) and the ePrivacy Directive (for non-personal data) may create legal uncertainty. They recommend clearly delimiting the scope of the new consent exceptions, particularly for audience measurement and security purposes, to ensure they remain limited to what is strictly necessary and proportionate. The authorities also emphasize that automated preference signals must not default to consent, that standards should apply uniformly across relevant actors (including browsers and operating systems), and that effective enforcement powers must be ensured.

    Practical impact: For pharmaceutical and MedTech companies operating patient portals, digital health applications, professional platforms, or corporate websites, the proposed changes could reduce administrative burdens associated with cookie consent management, particularly through machine-readable preference signals and certain limited consent exemptions. However, many sector activities involve the processing of health data, meaning consent will often remain necessary and any reliance on exemptions is likely to be interpreted restrictively. Companies should therefore monitor the final scope of the consent exceptions and ensure that any reliance on them is carefully assessed, particularly where tracking technologies interact with health-related or other special category data.

EU Data Act and broader Data Acquis reforms

In addition to proposing amendments to the GDPR, ePrivacy Directive, and NIS2 Directive, the Digital Omnibus Proposal introduces significant changes to the EU Data Act and the broader “Data Acquis.” The Proposal seeks to streamline overlapping instruments and eliminate outdated or duplicative provisions—including by repealing and consolidating elements of the Data Governance Act (Regulation (EU) 2022/868) (DGA) and the Open Data Directive (Directive (EU) 2019/1024) (ODD) into the revised EU Data Act. The overarching objective is to enhance legal certainty, reduce fragmentation, and simplify compliance obligations across the EU data regulatory framework.

The most relevant changes for Life Sciences companies include:

  • Enhanced protection for trade secrets in international access scenarios
  • The Proposal introduces an additional ground under the EU Data Act for data holders to refuse access requests involving trade secrets. In addition to the existing possibility to deny access where disclosure would be likely to cause serious economic harm, data holders would be entitled to refuse a request where there is a high risk that the trade secrets could be unlawfully accessed, used, or disclosed in a third country lacking adequate or effectively enforced legal protections. Importantly, this assessment is not limited to a formal review of the legal framework on paper, but extends to situations where a jurisdiction’s rules may appear robust but are not meaningfully enforced in practice.

    Practical impact: For pharmaceutical and MedTech companies, this amendment is particularly significant. These sectors rely heavily on proprietary data, including manufacturing processes, formulation details, device specifications, algorithms, and clinical development strategies, much of which could qualify as trade secrets and underpins long-term R&D investment. Given the global nature of clinical research, manufacturing, supply chains, and partnerships, access requests may entail onward transfers or exposure in jurisdictions with uneven trade secret protection. The new refusal ground provides a more realistic and operationally workable safeguard, enabling companies to better protect sensitive technical and commercial know-how from misappropriation risks outside the EU. At the same time, it will require companies to conduct and document jurisdiction-specific risk assessments when evaluating access requests, integrating trade secret protection more closely into their EU Data Act compliance processes.

  • Making data available in case of public emergency
  • The Proposal amends the EU Data Act rules governing the obligation of data holders to make data available to public sector bodies in situations of “exceptional need,” including public emergencies. Notably, it removes the current requirement that personal data may only be shared in pseudonymized form and only where non-personal data is insufficient. Under the proposed text, personal data may be requested and, “where possible,” provided in pseudonymized form.

    In their Joint Opinion, the EDPB and EDPS express concern about the removal of the explicit pseudonymization safeguard. They recommend maintaining the current structure whereby non-personal data should be shared by default and personal data — limited to pseudonymized form — only where strictly necessary. They also call for clearer delineation of the circumstances under which personal data can be requested, as well as strengthened safeguards and supervisory involvement.

    Practical impact: For pharmaceutical and MedTech companies holding large volumes of clinical, trial, or real-world data, the revised provisions could expand the scope of public authority access in emergency scenarios. However, given the EDPB and EDPS’ position, companies should expect continued emphasis on data minimization and pseudonymization. Robust internal procedures for assessing data access requests and documenting compliance will remain essential.

  • Re-use of data held by public sector bodies
  • The Proposal integrates rules from the DGA and the ODD into the EU Data Act, creating a consolidated regime for the re-use of data and documents held by public sector bodies. It aims to clarify the relationship between access regimes and reduce fragmentation.

    In their Joint Opinion, the EDPB and EDPS welcome the streamlining objective but recommend maintaining explicit provisions clarifying that the EU Data Act does not itself oblige public bodies to grant access to personal data, nor does it provide a legal basis for processing.

    Practical impact: For Life Sciences companies relying on access to public health datasets, regulatory data, or public research repositories, consolidation under the EU Data Act may enhance procedural clarity. However, the re-use of personal data will remain subject to GDPR constraints, and companies should not assume that EU Data Act access equates to lawful processing without an appropriate legal basis under GDPR.

  • Data intermediation services and data altruism organizations
  • The Proposal modifies the framework for data intermediation services and recognized data altruism organizations, including replacing mandatory prior notification with voluntary registration and relaxing certain governance and record-keeping requirements.

    The EDPB and EDPS acknowledge the intention to reduce administrative burden but caution that reduced transparency and oversight may undermine trust. They recommend maintaining certain safeguards, including clearer accountability, record-keeping obligations, and supervisory visibility, particularly where personal data processing is likely to pose high risks.

    Practical impact: For pharmaceutical and MedTech companies participating in data-sharing ecosystems, research consortia, or patient-driven data altruism initiatives, the revised framework may reduce procedural formalities. However, due diligence obligations when engaging with intermediaries will remain critical, especially where sensitive health data is involved. Companies should verify that data-sharing partners maintain adequate safeguards, even if formal regulatory requirements are simplified.

  • Enforcement and cooperation mechanisms
  • As part of the broader integration of the Data Acquis into the EU Data Act, the Proposal extends the enforcement provisions in the EU Data Act horizontally and modifies cooperation mechanisms between competent authorities and data protection supervisory authorities.

    In their Joint Opinion, the EDPB and EDPS stress the need for clear delineation of responsibilities between EU Data Act authorities and GDPR supervisory authorities. They recommend explicit legal bases for information exchange and cooperation across regulatory domains to ensure consistent enforcement, particularly where personal data processing is concerned.

    Practical impact: For Life Sciences companies operating cross-border data infrastructures, enforcement under the EU Data Act may increasingly intersect with GDPR supervision. Clear internal governance structures and coordinated engagement strategies with multiple authorities will be important, particularly where product data, health data, and commercial data intersect.

What to Expect Next and How to Prepare

The Proposals are now firmly in the hands of the EU co-legislators, and early negotiating positions are beginning to take shape. The European Parliament has published its first rounds of amendments to the Digital Omnibus on AI Regulation Proposal — signalling parliamentary priorities across areas such as prohibited AI practices, high-risk classification, conformity assessment, and transparency obligations. On the broader Digital Omnibus Regulation Proposal, the Council's early position is beginning to emerge through reports of internal discussions. According to draft compromise texts circulated among EU governments, the Cyprus Presidency is considering the removal of the European Commission's proposed amendment to the definition of personal data in Article 4 GDPR from the legislative text itself, in response to criticism that it would unduly narrow the scope of EU data protection law. Instead, the underlying “entity-specific identifiability” concept would reportedly be preserved in a revised recital rather than operative text — a more cautious approach that, if confirmed, would reflect the influence of the EDPB and EDPS Joint Opinions. These positions have not yet been formally published and remain subject to change as EU Member State discussions continue.

Formal compromise texts from both institutions are still awaited, and once each has consolidated its position, trilogue negotiations can begin — a process expected to be contentious, particularly on provisions touching fundamental rights and the scope of simplification measures.

For pharmaceutical and MedTech companies, the message is one of cautious preparation rather than anticipatory compliance. While the Proposals could deliver meaningful benefits, many of the most significant provisions remain contested and may be substantially revised, as the Council’s early retreat on the personal data definition already illustrates. Companies should continue building robust compliance frameworks under the current rules, monitor developments closely, and prioritize early engagement with notified bodies and regulators regardless of how negotiations conclude.

© Arnold & Porter Kaye Scholer LLP 2026 All Rights Reserved. This Advisory is intended to be a general summary of the law and does not constitute legal advice. You should consult with counsel to determine applicable legal requirements in a specific fact situation.

  1. “Unit” refers to a single, identifiable instance of an AI system of a given type and model, that is placed on the market or put into service as an individual product, whether supplied as standalone software, embedded in hardware, or made available through digital means.

  2. Judgment of the Court of Justice of 4 September 2025, EDPB v SRB, C-413/23 P, ECLI:EU:C:2025:645.

  3. Judgment of the Court of Justice of 7 December 2023, C-634/21, SCHUFA Holding, ECLI:EU:C:2023:957, paragraph 52.