
This digest covers key virtual and digital health regulatory and public policy developments during December 2023 from the United States, United Kingdom, and European Union.

In this issue, you will find the following:

U.S. News

EU and UK News

U.S. News

FDA Regulatory Updates

CDRH's Annual Report Highlights FDA’s Continued Efforts to Foster Digital Health Innovation. In 2023, FDA’s Digital Health Center of Excellence (DHCoE) continued to foster innovation for new and emerging digital health technologies. But just how busy was DHCoE? As reported in the 2023 CDRH Annual Report, DHCoE responded to more than 900 inquiries during the year. Digital health innovation-related accomplishments highlighted in the report include publication of a draft guidance on marketing submissions for predetermined change control plans (PCCPs) for AI/ML-enabled devices, release of guiding principles for PCCPs for ML devices, issuance of final guidance on premarket submissions for device software functions, creation of a digital health advisory committee, and release of new resources on augmented reality and virtual reality devices. Another notable update from the report is that the number of FDA-authorized AI/ML-enabled medical devices now exceeds 700, with more under development.

FDA Issues Final Guidance on Digital Health Technologies for Clinical Investigations. On December 21, 2023, FDA issued final guidance titled “Digital Health Technologies for Remote Data Acquisition in Clinical Investigations” (Final Guidance). The guidance provides recommendations for ensuring that remote data acquisition digital health technologies (DHTs) are fit-for-purpose (i.e., that the level of validation associated with the DHT is sufficient to support its use, including the interpretability of its data in the clinical investigation), which involves considerations of both the DHT’s form (i.e., design) and function(s) (i.e., distinct purpose(s) within an investigation). The guidance defines a DHT as “a system that uses computing platforms, connectivity, software, and/or sensors, for health care and related uses.” The guidance outlines recommendations intended to facilitate the use of DHTs in clinical investigations, addressing (1) selection of DHTs that are suitable for use in clinical investigations, (2) information to include in regulatory applications, (3) verification and validation of DHTs for use in clinical investigations, (4) use of DHTs to collect data for trial endpoints, (5) identification and management of risks associated with the use of DHTs during clinical investigations, and (6) retention and protection of data collected by DHTs. Although the question of whether a DHT meets the definition of a medical device is outside the scope of the guidance, certain recommendations (e.g., those on verification and validation) apply regardless of whether a DHT is a device.

FDA Issues Warning Letter to Sponsor of Brain Feedback Devices. As reported in prior issues of our digest, over the past year, FDA has issued a number of enforcement letters to sponsors of digital health devices. In the latest example, the agency issued a Warning Letter to Deymed Diagnostics s.r.o. (Deymed), a manufacturer of a variety of Class II neurofeedback hardware and software devices. Among other quality system violations, FDA asserts that Deymed’s Brain Feedback Pro device has yet to be subject to the company’s design control process despite being distributed since 2015. The Warning Letter also enumerates various other deficiencies, including ones relating to complaint handling and device history records.

For more information on prior Warning Letters involving software-based digital health technologies, please see the June, July, and August 2023 issues of Arnold & Porter’s Virtual and Digital Health Digest.

Corporate Transactions Updates

Are Brain Implants Digital Health’s New Financing Darling? Neuralink, Elon Musk's company developing brain-computer interface (BCI) technology to treat conditions such as obesity, autism, depression, and schizophrenia, has raised more than US$323 million over the past six months. Neuralink’s goal is to enable paralyzed people to control a cursor or keyboard with just their thoughts. In September 2023, Neuralink announced that it would start recruiting volunteers for a clinical trial to test its device, which would enable individuals suffering from paralysis to use computers to communicate and perform other basic tasks.

The race to bring a BCI implant into the digital health market continues to heat up, and Neuralink is not the only entrant. This past year, Synchron, which has raised over US$145 million in funding to date, made strides by demonstrating that its device safely transmitted neural signals from inside a blood vessel in the brain for a year without any serious side effects. Another competitor, Precision Neuroscience, recently conducted pilot studies in which a thin film array was placed temporarily on the surface of the brain while people were undergoing tumor surgery to record and map the brain’s activity.

The global BCI market is expected to grow at a compound annual growth rate of 10.32% over the forecast period, reaching approximately US$2.37 billion by 2027.
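As a rough illustration, the compound-annual-growth-rate arithmetic behind projections like this can be sketched as follows. Only the 10.32% rate and the approximate US$2.37 billion 2027 endpoint come from the figures above; the 2022 base year is a hypothetical assumption for illustration.

```python
# Sketch of the compound-annual-growth-rate (CAGR) arithmetic implied by the
# projection above. Only the 10.32% rate and the ~US$2.37B 2027 endpoint come
# from the text; the 2022 base year is a hypothetical assumption.

def implied_base(end_value: float, cagr: float, years: int) -> float:
    """Discount a projected end value back at a fixed annual growth rate."""
    return end_value / (1 + cagr) ** years

end_2027 = 2.3705e9  # projected market size, ~US$2.37 billion
rate = 0.1032        # 10.32% compound annual growth rate
base = implied_base(end_2027, rate, years=5)  # hypothetical 2022 base year
print(f"Implied 2022 market size: US${base / 1e9:.2f}B")
```

Under these assumptions, the projection implies a base-year market of roughly US$1.45 billion.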

Provider Reimbursement Updates

Telehealth Prescribing Reforms Stall in Congress. On December 12, the House passed its version of the SUPPORT Reauthorization Act, which seeks to reauthorize key federal programs for patients with substance use disorder (SUD). The Senate Health, Education, Labor, and Pensions (HELP) Committee advanced its version of the reauthorization bill the same day, but it failed to include a provision that would allow providers to prescribe SUD medications via telehealth without an in-person visit.

The Senate version of the SUPPORT Reauthorization Act, however, includes a separate provision that would require the Drug Enforcement Administration (DEA) to issue final regulations within one year outlining a “special registration” pathway. This special registration would enable certain practitioners to prescribe controlled substances via telemedicine without an in-person visit. The agency was first instructed to establish the special registration pathway more than 15 years ago in the Ryan Haight Act of 2008, but it has not yet done so.

As discussed in the December 2023 digest, the DEA allowed physicians to prescribe controlled substances without an in-person visit during the public health emergency (PHE). In February 2023, anticipating the end of the PHE, the agency proposed two rules (88 Fed. Reg. 12875 and 88 Fed. Reg. 12890) that, if finalized, would have significantly curtailed the telemedicine flexibilities permitted during the PHE. The agency also declined to establish a special registration pathway, on the grounds that it would be “potentially burdensome for both telehealth providers and patients.” 88 Fed. Reg. 12875, 12883.

After receiving thousands of comments in opposition to the proposals, the agency issued a temporary rule extending the PHE telemedicine flexibilities through December 31, 2024. 88 Fed. Reg. 69879. The DEA also stated it is “open to considering” implementing the special registration pathway for telemedicine prescribing of certain controlled substances. 88 Fed. Reg. 52210, 52212. Depending on the final version of the SUPPORT Reauthorization Act passed by Congress, the DEA may soon face increased pressure to establish the special registration pathway in 2024.

The DEA is expected to propose new rules governing telehealth prescriptions of controlled substances later this year.

Policy Updates

Federal Spending Deadlines Approach, and House Passes Large Health Package. The current “laddered” continuing resolution (CR) extends federal funding through January 19 and February 2. Funding for federal health programs (including Community Health Centers, the National Health Service Corps, Graduate Medical Education programs, and the Pandemic and All-Hazards Preparedness Act) is set to expire on January 19, while other health-related investments made via the Labor, Health and Human Services, Education, and Related Agencies appropriations bill will need to be addressed ahead of the second CR deadline on February 2. In December, the House passed sweeping health reforms in the Lower Costs, More Transparency Act (H.R. 5378), as amended on September 8, 2023, by a vote of 320-71. Following House passage of H.R. 5378, Sens. Mike Braun (R-IN), Bernie Sanders (I-VT), Tina Smith (D-MN), and John Hickenlooper (D-CO) introduced the Health Care Prices Revealed and Information to Consumers Explained Transparency Act 2.0 (S. 3548), which contains provisions similar to those included in the Lower Costs, More Transparency Act.

House Committee Holds Hearing on AI. On December 13, the House Energy & Commerce Committee held a hearing titled, “Leveraging Agency Expertise to Foster American AI Leadership and Innovation.” The committee hosted government officials from the U.S. Department of Health and Human Services (HHS), the Department of Commerce, and the Department of Energy to discuss ways AI can assist health providers, bolster the workforce, and strengthen clinical drug development.

GAO Releases Report on AI Implementation. On December 12, the Government Accountability Office released a report titled, “Artificial Intelligence: Agencies Have Begun Implementation but Need to Complete Key Requirements.” The report includes 35 recommendations to 19 federal agencies, including directing the Office of Management and Budget to fully implement new federal AI requirements.

HHS Releases Guiding Principles for AI-Enabled Technologies. On December 15, HHS’ Agency for Health Care Research and Quality released five “guiding principles” for health providers to follow when using AI-enabled technologies: (1) promoting health and health care equity during all health care algorithm life cycle phases; (2) ensuring health care algorithms and their uses are transparent and explainable; (3) authentically engaging patients and communities during all health care algorithm life cycle phases and earning trustworthiness; (4) explicitly identifying health care algorithmic fairness issues and tradeoffs; and (5) establishing accountability for equity and fairness in outcomes from health care algorithms.

Senator Markey Urges More Diversity in FDA’s Digital Health Advisory Committee. On December 19, Senate Health, Education, Labor, and Pensions Subcommittee on Primary Health and Retirement Security Chair Ed Markey (D-MA) led a letter to FDA Commissioner Robert Califf asking that individuals with backgrounds in civil rights, medical ethics, and disability rights be added to FDA’s Digital Health Advisory Committee, which advises on the development, regulation, and implementation of DHTs, such as AI-enabled technologies, telehealth, and wearable medical devices. The letter was also signed by Senate HELP Chair Bernie Sanders (I-VT) and Sens. Bob Casey (D-PA), Amy Klobuchar (D-MN), Tammy Duckworth (D-IL), and Alex Padilla (D-CA).

Privacy Updates

ONC Publishes Final Rule Addressing Health Data Algorithm Transparency. On January 9, the HHS Office of the National Coordinator for Health Information Technology (ONC) published in the Federal Register its final rule titled “Health Data, Technology, and Interoperability: Certification Program Updates, Algorithm Transparency, and Information Sharing” (the Final Rule). The Final Rule is designed to enhance the access to, exchange, and use of electronic health information while also advancing equity, innovation, and interoperability in health information technology (HIT). In addition to imposing detailed new requirements on HIT developers, the Final Rule makes significant changes to access and support for source attributes and transparency in the use of predictive decision support interventions (DSIs).

Under the Final Rule, the existing clinical decision support (CDS) tools certification criteria are modified “to reflect an array of contemporary functionalities, data elements, and software applications that certified Health IT Modules support to aid decision-making in healthcare.” Developers of Health IT Modules will need to submit their CDS tools to real-world testing and provide test results demonstrating the real-world use of each type of DSI.

The Final Rule also establishes a new intervention type, referred to as “Predictive DSI.” This includes, for example, algebraic equations, machine learning, and natural language processing. Per ONC, predictive DSI will likely encompass models that are trained on relationships in large data sets and that predict, for example, whether a given image contains a malignant tumor or whether a given patient is at risk for sepsis. Large language models (LLMs) and other forms of generative AI would, as ONC states, likely meet the definition of Predictive DSI to the extent the LLMs are used to support decision-making.

ONC expects that increased transparency surrounding the development and use of AI in health care will allow users to make better-informed decisions about whether and how to use emerging software.

See more Arnold & Porter coverage of ONC’s Final Rule in our recent Advisory.

EU and UK News

Regulatory Updates

Provisional Agreement Reached on the EU AI Act. On December 9, the European Parliament (EP) and European Council announced that they had reached a provisional agreement on the text of the EU AI Act that seeks to harmonize rules on AI (see our November 2022, January 2023, and June 2023 digests, as well as our July 2023 Advisory, for details of how the negotiations have progressed). As we have previously set out, all AI medical devices will be classified as high risk, meaning they will be assessed before being put on the market and also throughout their lifecycle. In addition, some compromises that could be relevant to digital health companies are:

  • All high-risk AI systems (which includes all health care applications) will be subject to a mandatory fundamental rights impact assessment prior to their use.
  • AI systems can be tested in real-world conditions subject to specific conditions and safeguards being fulfilled.

Now, the EP and European Council must formally adopt the text, with the act expected to become law early in 2024 and to apply two years after its entry into force, except for some specific provisions which will apply earlier. Furthermore, the European Commission has announced it will be launching an AI Pact to encourage AI developers to implement key obligations of the AI Act ahead of the legislative deadlines.

The HMA and EMA Publish Their Joint AI Workplan 2023-2028. On December 18, the joint Heads of Medicines Agencies and European Medicines Agency Big Data Steering Group (BDSG) published the 2023-2028 AI workplan, on behalf of the European medicines regulatory network (EMRN). The plan aims to harness the benefits of AI in the regulation of medicines while managing the associated challenges and risks. The workplan is divided into four arms:

  1. Guidance, policy, and product support. The EMRN will consider feedback from the public consultation on the AI reflection paper (discussed in our August 2023 digest) and use that to provide continued support on the development and evaluation of AI in the medicines lifecycle, including releasing guidance on specific parts of the lifecycle (e.g., pharmacovigilance). The EMRN will also start preparation for implementation of the AI Act in mid-2024.
  2. Tools and technologies. The EMRN aims to develop and implement knowledge mining tools, start a phased roll-out of large language models, and publish an AI tools policy for open and collaborative AI development.
  3. Collaboration and change management. The EMRN will continue to collaborate with international partners to share knowledge and keep abreast of the evolving field of AI.
  4. Experimentation. The EMRN will conduct six-monthly experimentation cycles and undertake “technical deep dives” to gain new insights from AI.

The BDSG states that the workplan will be regularly updated to reflect the evolving technologies and policies, and that key stakeholders will be involved in the implementation of the plan.

Privacy Updates

European Council and European Parliament Adopt Positions on the Regulation Creating a European Health Data Space. On December 6, the Council of the European Union adopted its position on the text of the regulation on creating a European Health Data Space, which was followed by the adoption of a position by the members of the European Parliament on December 13. The European Council and the EP must now reach a common provisional agreement on the regulation before it can be formally adopted.

On December 1, the European Federation of Pharmaceutical Industries and Associations (EFPIA) reacted to the position of the members of the European Parliament on the regulation creating a European Health Data Space (EHDS), expressing particular concern over the inclusion of opt-in and opt-out mechanisms. According to EFPIA, for certain categories of data, such as data concerning rare disease patients, including such mechanisms would discourage the submission of data to the EHDS. Less data means less information, which would hinder the development of personalized treatments. In addition, EFPIA calls for clarification on how provisions on IP and trade secret data will coexist with the EHDS regulation, and on each of the types of data and their intended scope.

Product Liability Updates

Provisional Agreement Reached on the New EU Product Liability Directive. On December 14, a provisional political agreement was reached between the European Parliament and European Council on the commission’s proposal for a revised Product Liability Directive (revised PLD) to replace the existing Product Liability Directive (85/374/EEC) and expand its scope to include AI systems and software. Both the EP and the European Council have previously set out their positions (see our November 2022, July 2023, and November 2023 digests) and now a compromise text has been announced. Key compromises include:

  • The definition of “product” will include digital manufacturing files and software, but the rules will not apply to free and open-source software developed or supplied outside of a commercial activity.
  • A company that makes substantial modifications to a product can be held liable as a manufacturer, as can importers, authorized representatives, and fulfilment service providers where the manufacturer is established outside of the EU.
  • The burden of proof for claimants has been simplified such that defectiveness may be presumed in certain circumstances, for example where the claimant faces excessive difficulties in proving defectiveness due to technical or scientific complexity.
  • An extended liability period of 25 years will apply in exceptional cases.

The revised PLD is set to enter into force once formally approved by the EP and the European Council, with the new rules applying to products on the market 24 months after entry into force. The AI Liability Directive is still being examined by the EP and the European Council.

Intellectual Property Updates

UK Supreme Court Rules That AI Cannot Be Patent Inventor. On December 20, the UK Supreme Court ruled that an AI system cannot be the inventor of a patent within the meaning of the Patents Act 1977. The case concerned an AI system called DABUS, created and owned by Dr. Stephen Thaler. Thaler filed two patent applications listing DABUS as the inventor. He argued that the AI devised new inventions independently of him, including a food container and a flashing beacon. The UK Supreme Court unanimously rejected Thaler’s appeal, holding that an inventor must be a natural person, and an AI system does not meet this definition. The UK Supreme Court also held that Thaler himself was not entitled to apply for the patents; the patent applications did not have a natural person inventor, and Thaler had not acquired any rights to the inventions through his ownership of DABUS. The case has brought increased certainty to the application of the Patents Act 1977 to AI outputs and highlights a clear barrier to obtaining patent protection for AI outputs in the UK.

*The following individuals contributed to this Newsletter:

Amanda Cassidy is employed as a Senior Health Policy Advisor at Arnold & Porter’s Washington, D.C. office. Amanda is not admitted to the practice of law.
Eugenia Pierson is employed as a Senior Health Policy Advisor at Arnold & Porter’s Washington, D.C. office. Eugenia is not admitted to the practice of law.
Mickayla Stogsdill is employed as a Senior Policy Specialist at Arnold & Porter’s Washington, D.C. office. Mickayla is not admitted to the practice of law.
Katie Brown is employed as a Policy Advisor at Arnold & Porter’s Washington, D.C. office. Katie is not admitted to the practice of law.
Heba Jalil is employed as a Trainee Solicitor at Arnold & Porter's London office. Heba is not admitted to the practice of law.

© Arnold & Porter Kaye Scholer LLP 2024 All Rights Reserved. This Advisory is intended to be a general summary of the law and does not constitute legal advice. You should consult with counsel to determine applicable legal requirements in a specific fact situation.