
    Where's the infrastructure for precision medicine?

    The remarkable advances over the past 15 years that began with the first draft sequence of the human genome can’t be brought to full fruition without an infrastructure to support biomarker tests for molecularly targeted therapies, according to a new report from the National Academy of Medicine (NAM).

    See also: Pharmacogenomics: Right med, right dose, every time

    To establish that infrastructure, major changes are needed in policy and practice, said the expert panel that wrote the report, titled “Biomarker tests for molecularly targeted therapies: Key to unlocking precision medicine.”

    Those changes range from clearer consensus on what constitutes evidence of a test’s clinical utility to electronic health records that can capture those tests and allow for “rapid learning.”

    Common standards

    The panel called on the Department of Health and Human Services to facilitate the development of common evidentiary standards for tests, perhaps by convening multi-stakeholder bodies. The current lack of such standards is a significant barrier to getting tests into clinical practice, and biomarker tests with limited supporting evidence could lead to patient harm, the report said.

    Evidentiary standards and study approaches should be developed to simultaneously accommodate various types of decisions, including clinical, regulatory, coverage, and reimbursement decisions, as well as guideline recommendations, the panel said.

    Partly because of the variety of considerations, an extensive range of stakeholders should be involved in utility studies, the report suggested, with participants to include patients, public and private payers, federal agencies, guideline developers, test developers, pharmaceutical companies, molecular pathologists, clinical laboratory geneticists, and research funders.

    See also: Pharmacogenomics could boost community pharmacy sales

    Evidence over time

    Critical to the process, the panel asserted, is recognizing that evidence for a test’s utility evolves over time, something that is not always taken into account.

    The report envisions a rapid learning system that “would systematically collect and analyze data on biomarker tests, molecularly targeted therapies, patient management, and outcomes.”

    Consistent process

    David Veenstra, PharmD, PhD, professor in the Pharmaceutical Outcomes Research and Policy Program at the University of Washington, called the proliferation of these tests quite impressive. The great challenge, he noted, is that there is no way to generate gold-standard evidence for everything, “so we need some kind of process by which we can make reasonably consistent decisions about when we can move forward with implementation vs. evidence generation.”

    The fact that this must be an ongoing process is one of the most important considerations. What that will look like with so many stakeholders involved is the question many people are struggling to answer, he noted.

    “I think that it is really about identifying some common processes and concepts, frankly, at a pretty high level,” said Veenstra.

    Important to the process, he said, is the concept of “risk sharing,” in which a decision is made to move forward with a technology while those involved continue to generate evidence as it is used.
