Department of Health and Human Services

NATIONAL COMMITTEE ON VITAL AND HEALTH STATISTICS

Subcommittee on Standards and Security

May 20-22, 2003

Arlington, VA


Meeting Minutes

The Subcommittee on Standards and Security of the National Committee on Vital and Health Statistics (NCVHS) held hearings on May 20-22, 2003, at the Crowne Plaza Hotel in Arlington, VA.  The meeting was open to the public.  Present:

Subcommittee members

Staff and Liaisons

Others

EXECUTIVE SUMMARY

May 20-22, 2003

The Subcommittee on Standards and Security held hearings May 20-22, 2003.  The Committee is the main public advisory committee to the Department of Health and Human Services (HHS) on national health information policy.  Over three days, the Subcommittee heard 40 presentations on the status of Health Insurance Portability and Accountability Act (HIPAA) implementation, including the administrative and financial transactions final rule, the Consolidated Health Informatics initiative, healthcare terminologies and clinical issues, a federally funded ICD cost impact study undertaken by RAND, and the interim enforcement rule. 

HIPAA Update

Ms. Trudel provided a brief update on regulations. 

Compliance Readiness Update:  What We Know So Far

Mr. Cunningham said the practices had taken first steps towards transactions compliance and were poised for next steps.  But he cautioned they couldn’t do much until their trading partners, particularly vendors, were ready.  He noted the need for a clear-cut action plan and a set of steps. 

Mr. Rishel detailed a Gartner survey of physicians’ practices dividing HIPAA preparation into: awareness, risk assessment, gap analysis, planning, implementation, testing and rolling out education.  Responses indicated many payers and providers hadn’t finished planning. 

Overview of WEDI/HIPAA Contingency Planning Proposal

Mr. Jones reported on the Workgroup for Electronic Data Interchange’s (WEDI) April letter to the Secretary addressing HIPAA and implementation issues.  Concerned that half the covered entities wouldn’t be prepared by October, WEDI recommended that compliant covered entities continue to accept non-standard transaction communications. 

Provider Reactor Panel

Mr. Cunningham conveyed the American College of Physicians’ (ACP) concern that practices risked missing the testing deadline.  He cautioned about the disastrous consequences of cutting off payment of claims, noting it would be unfair and irresponsible to blame practices.  He recommended linking claims payment cutoffs to enforcement and applying WEDI’s provisions to Medicare payments. 

Expressing hospitals’ concern about harm from disruption of claim submission and payment cycles, Mr. Arges of the American Hospital Association (AHA) urged HHS to issue industry guidance directing plans to have ready a comprehensive contingency plan guaranteeing appropriate cash flow to providers. 

The Medical Group Management Association (MGMA) supported WEDI’s recommendations and urged the government to initiate an industry survey looking at what could drive policy.  MGMA said health plans should be allowed to accept proprietary claim formats for six months, reporting back missing data elements.  Mr. Tennant advised plans to prepare for additional paper claims and an influx of providers testing at the last minute. 

Mr. Bechtel conveyed the Association for Electronic Health Care Transactions’ (AFEHCT) concern that the industry and HIPAA hadn’t made enough progress.  He noted three problems: testing, data content, and overall project management.  Mr. Bechtel supported WEDI’s proposal.  AFEHCT believed operational compliance would help resolve data content issues and emphasized the need to sequence and plan with other transactions.

Mr. Daley cautioned that up to 30 percent of Blue Cross/Blue Shield Association’s (BCBSA) institutional and 40 percent of their professional providers wouldn’t be ready.  He noted concern that some clearinghouses were near capacity and might not accept new customers.  BCBSA considered WEDI’s suggestion an effort and expense that wouldn’t be viable on a trading-partner-by-trading-partner basis.  He urged HHS to announce contingency action.

Mr. Wilder expressed the American Association of Health Plans’ (AAHP) concern that covered entities wouldn’t be ready in October and about how entities could work together.  Even if CMS had authority to order plans to set up a payment system, which AAHP doubted, AAHP didn’t support AHA’s approach. 

Consolidated Health Informatics Initiative Roundtable:  Outreach Plan and Portfolio

Ms. Adair said Consolidated Health Informatics (CHI) identified a target portfolio of 24 messaging and vocabulary domains and adopted four messaging and one vocabulary standard government-wide.  Six cross-agency vocabulary teams defined domains and identified candidates for standards to build into the enterprise’s IT architectures.  CHI sought industry input and agreement about their portfolio’s domains, prioritization, and approach. 

Consolidated Health Informatics Initiative Roundtable:  Specific Standards

Five CHI workgroups (i.e., laboratory result content team, medications group, demographics workgroup, immunizations group, and the interventions and procedures group) shared background and preliminary analyses, highlighting challenges CHI faced with harmonization. 

- Day Two -

Panel 1 - Terminology Users: Healthcare Providers

Dr. Oliver noted that criteria for the National Drug File Reference Terminology (NDF-RT) used by PharmGKB included: low cost, few proprietary constraints, generic names, trade names, and a rational classification structure.  She said NDF-RT satisfied their requirements. 

Dr. Larsen said the prime purpose of Intermountain Health Care’s (IHC) dictionary and vocabularies was a robust vocabulary.  IHC created concepts within its health data dictionary (HDD) by manual creation, by importing available vocabularies, and by combining both methods.  NDDF was a stable and consistent source of knowledge bases and vocabulary.  LOINC was used to generate and build concepts in the HDD for lab ordering and clinical observations. 

Dr. Guay set two requirements for clinicians: systems recording information must clearly record essentials, and the patient record must be the sole source of administrative and clinical data.  He said SNODENT was rich enough that it didn’t need interpretation or extrapolation of other codes and allowed easy data entry and manipulation.  SNODENT’s broad, rich terminology allowed distinctions related to specific diagnoses. 

Panel 2 - Terminology Users: Healthcare Vendors

Dr. Levy noted SNOMED was an ideal core for a language engine.  It followed basic principles of concept-based terminologies, and its concepts and hierarchies served as a hub to which new terminologies could be mapped.  Dr. Levy shared his perspectives on multiple terminologies, noting it was important to subset terminologies. 

Dr. Chuang identified issues Cerner faced providing terminologies to end-users.  SNOMED CT was an excellent source of definitional knowledge for data interoperability, comparability, aggregation abilities and quality.  He said terminologies alone didn’t solve issues of comparability and data quality.

Dr. Faughnan said McKesson had used NDDF Plus extensively for many years with good results.  McKesson planned to use SNOMED CT for conditions, diagnoses, procedures, findings, and interventions, and for lab results in collaboration with lab LOINC.  McKesson chose SNOMED because it was architected for growth, evolution, and maintenance.  McKesson liked SNOMED’s capacity to map to the administrative code sets and its healthcare coverage, particularly for nursing and allied healthcare.  SNOMED enabled McKesson to have a reference terminology, yet deploy in its applications the richness of the nursing terminologies and integrate them.  Dr. Faughnan urged the government to encourage collaboration and interoperability between SNOMED and RxNorm. 

Dr. Lau discussed lab LOINC, which provided observation ID for a Health Level Seven (HL7) transaction.  She said 3M Healthcare Data Dictionary (3M HDD) was known for mapping and used the LOINC dictionary as an intermediary.  Observing that everyone had been focused on building, she said it was time to pay attention to using.  She emphasized the challenge of sustaining interoperability with multiple people in multiple places determining codes.

Panel 3 - Terminology Users: Healthcare Providers

Dr. Zinder said the Department of Defense (DOD) formally tested and deployed Medcin on a limited basis in its computerized medical record.  He emphasized its framework, which he viewed as a nomenclature focused on patient/provider interaction, noting Medcin balanced being expressive, usable, and tractable.  He said Medcin’s power was its ability to do symptom surveillance.  Medcin was useful at the point of care for capturing clinical documentation, but SNOMED was a more granular, atomic and post-coordinated tool and better at the database layer, even without a good interface.  Dr. Zinder emphasized distinguishing between tools, content, and processes. 

Dr. Peterson said Auraria Campus’s Student Health Center had used Medcin in a paperless system for about a year.  Physicians easily used Medcin at the point of care, and multiple people documented the clinical encounter from the front desk to checkout.  Medcin allowed real-time, effortless gathering of clinically relevant data without the healthcare provider investing extra time. 

Dr. Warren told how the University of Kansas School of Nursing (KUSN) turned a live patient information system into an educational tool with a corporate sponsor: the Simulated E-hEalth Delivery System (SEEDS).  She said SNOMED RT and CT put together a suite of languages and terminologies nurses needed; no nursing terminologies captured findings this well.  SNOMED CT provided the foundation for bringing in all the concepts used by the disciplines.

Dr. Madden said SNOMED was the standard terminology for anatomic pathology for years.  However, coding a report with multiple diagnoses required other terminologies from which SNOMED could take ideas and extensions.  Dr. Madden said HL7 architecture was an additional important ancillary technology to permit referencing medical statements in context. 

Dr. Dash said LOINC was explicit and specific about generating codes and provided true post-coordination.  SNOMED provided a hierarchy on top of LOINC that allowed expression of laboratory data in a common, universal manner.  LOINC provided post-coordination; SNOMED provided aggregability.  Dr. Dash recommended promoting SNOMED CT as a base standard, mapping to domain-specific standards, and developing specific post-coordination requirements for SNOMED CT to make the standard usable for messaging and communication via HL7.

Panel 4 - Terminology Users: Healthcare Vendors

Dr. Mays said Apelon was familiar with the NCI Thesaurus, NDF-RT, and SNOMED CT.  All had, or were adopting, features Apelon found useful in software and content delivery: rich hierarchies, compositional definitions, change management, and the ability for clinicians to navigate the terminology.  Dr. Mays said NDF-RT was interesting but unproven.  He suggested the Committee recognize a few core, non-overlapping clinical terminologies.  He encouraged the government to promote harmonization of core models and common reference taxonomies. 

Dr. Brandt said SNOMED’s ontology was useful in having a term; in modeling system intelligence and knowledge at the term level; and in having this drive alerts, reminders, and construction of order sets and documentation templates.  Because providers had to specify clinical knowledge at a granular level that was impractical for vendors to model, a hierarchical ontology was required, allowing knowledge to be modeled at a more general level and applied to patients with more specific conditions.  Also, patients often had multiple diagnoses, which hindered constructing an interface.  Dr. Brandt urged the federal government to establish scope for each effort, make implementation practical and affordable by implementing national standards and dealing with licensure, and determine priorities for modeling terms.  He advocated point-of-care decision support and creating incentives for suppliers to fill gaps.

Mr. Wood said the Defense Medical Logistics Standard Support System (DMLSS) was developed and deployed to hospitals within DOD.  During Desert Storm, the three services used different systems that couldn’t communicate with one another; today they use one.  DMLSS adopted ECRI’s manufacturer and nomenclature systems.  Users seeking equipment went through a review and approval process.  Guidance was automated.  Users could view equipment and quality assurance records.  Mr. Wood said standard nomenclature enabled them to tie the system together.

Organizational Activities Relative to Clinical Terminologies

Speaking on behalf of the Markle Foundation and Connecting for Health, Dr. Cimino presented 16 preliminary consensus statements about terminology and process requirements for developing integrated healthcare and terminology models. 

Mr. Larsen presented the Healthcare Information and Management Systems Society’s (HIMSS) summary analysis of the terminology questionnaires.  Even if the core terminology group described in Dr. Sujansky’s analysis was technically accurate and met the needs, Mr. Larsen doubted that it would move them closer to achieving a universal health record.  He considered EHR terminology the most global, but pointed out that moving forward would be difficult until they’d defined the EHR, decided feasibility, and phased it.  He said they’d welcome standardization if flexibility allowed for incorporating it into a system, an EHR mandate backed it, and it was implemented as part of an international standard.

- Day Three -

Panel 5 - Terminology Users: Health Care Providers & Vendors

Dr. Butler said EPIC used LOINC and Medcin and had begun using SNOMED.  Donald Chiolus explained that LOINC gave EPIC’s customers capability without disrupting existing work codes and test knowledge.  EPIC’s customers mostly used Medcin to link to medical documentation and intended to use SNOMED terminology in the EPIC application to document clinical information.  He said granularity was a major advantage with SNOMED, and he hoped it became the national standard.  EPIC recommended that NCVHS recognize and adopt one or more clinically specific terminologies to serve as the core of the national PMRI standards. 

Dr. Dolin said Kaiser Permanente was committed to the continued use of SNOMED CT and laboratory LOINC.  SNOMED CT, LOINC, and Kaiser’s First Data Bank drug terminology for pharmacy were classified together as one cohesive core.  CMT, Kaiser’s Convergent Medical Terminology, was their lingua franca of interoperability. 

Ms. Meadows said the Southwest Oncology Group (SWOG) worked with PMRI terminology in the context of cancer clinical trial data collection and management.  The National Cancer Institute (NCI) Thesaurus was developed by NCI to provide a centralized reference terminology based on working vocabularies used by NCI and collaborating groups.  She expressed two concerns about structuring a vocabulary to facilitate clinical research: maintaining the flexibility to respond to the dynamics essential for research, and offering timely inclusion of new terms in the vocabulary without creating obstacles for the research process.

Dr. Pittelkow said SNOMED CT was becoming more widely adopted and utilized in the Mayo practice.  SNOMED CT was used extensively for indexing clinical notes with automated data abstraction, and in dermatology to index Web pages.  Mayo was developing pilot projects to automate the assignment of E and M codes, in fraud and abuse auditing, and in various quality assurance programs.

Discussion on Previous Terminology Testimony

The Subcommittee reviewed trends regarding SNOMED CT, LOINC, Medcin, First Data Bank, and NDDF.  Dr. Sujansky said the most important next step was refining their set of candidate terminologies into a preliminary recommendation for a core terminology group.  Other steps were the analysis of questionnaire responses and a letter of recommendations to the Secretary. 

ICD-10 Impact Study

Dr. Libicki said the RAND Science and Technology Policy Institute expected to present its impact study at the August Subcommittee meeting.  RAND was studying three questions: the costs and benefits of switching from ICD-9-CM diagnostic codes to ICD-10-CM, the costs and benefits of switching from ICD-9-CM procedure codes to ICD-10-PCS, and, if it was advisable to switch to both, whether the switches should be made sequentially or simultaneously.

Enforcement Rule

Dr. Cohn noted the Department issued an interim enforcement rule that would expire on September 16, 2004.  The Subcommittee discussed topics for the next full Committee meeting on June 24-25, including a letter related to their PMRI testimony, early recommendations and communications with the Secretary; a brief update on the claims attachment rule and changes to that standard; the annual HIPAA report; work underway on the NCVHS Web site; and the ASCA report. 

DETAILED MINUTES

- Day One -

HIPAA Update

Ms. Trudel reported working through the process of clearing the national provider identifier final rule and the health plan identifier proposed rule.  The claims attachment proposed rule was in a holding pattern so HL7 and X-12 could perfect and ballot a version with another architecture.  The enforcement procedural rule setting out legal processes was published, but she said the substantive rule and its definitions would be extensive, probably receive considerable comments, and wouldn’t be published until the next fiscal year as a proposed rule.

Compliance Readiness Update: What We Know So Far

Mr. Cunningham said 70 percent of practicing physicians were in small practices (20 or fewer physicians), half in practices of one to five physicians, and a third in one- and two-doctor practices.  Smaller settings didn’t retain financial reserves, and administrative functions were left to an office manager who struggled to sustain patient flow, put out fires, and tend to basic business essentials.

Mr. Cunningham said everything physicians did tried to minimize distractions in their business operations, and their first reaction to the complex privacy rule was rage followed by denial, but he emphasized that they did it.  He depicted their reaction to the transactions rule as an almost eerie calm.  Physicians didn’t know they had a role and had to act internally.  They still expected that transactions rule updates would be taken care of by their practice management system vendor who wasn’t a covered entity or compelled by law to meet a deadline. 

At ACP’s annual meeting in April, 92 respondents filled out a survey form.  Most had determined whether they were covered entities.  A sizable percentage had discussed implementing the transactions rule with their vendor.  The majority said they’d verified they could send an 837.  A high percentage indicated they’d updated their software.

Mr. Cunningham doubted vendors could have updated over half the practices’ software.  More people reported testing than had updated.  And those who’d taken a HIPAA course had the lowest percentages, suggesting an inverse relationship between knowledge and implementation that seemed counterintuitive.  This feeling was reinforced by a show of hands at an ACP seminar on HIPAA where most attendees were staff.  Few had done any investigating of the required content, educated their staff, or tested.  Only 18 percent of the office managers in a March survey by the Professional Association of Health Care Office Management had begun testing.

Mr. Cunningham said the practices had taken first steps towards transactions compliance.  With privacy behind them, they were poised for the next steps, but he cautioned that they couldn’t do much until their trading partners (particularly vendors) were ready.  Recommending that, together, they educate and prod, he noted the need for a clear-cut action plan and a set of steps. 

Compliance Readiness Update: What We Know So Far

Mr. Rishel described a Gartner three-year random-sample survey of practices of 30 or more physicians dividing HIPAA preparation into: awareness, risk assessment, gap analysis, planning, implementation, testing and rolling out education.  Responses indicated planning wasn’t always as far along as implementing; many payers and more providers hadn’t finished planning by March.  Mr. Rishel cautioned that those who reported testing might only have tested one application, noting rework often was needed because a gap analysis wasn’t done before remediation.  Some 55 percent of health plans and 39 percent of providers had completed a detailed implementation plan.  Almost all reported working on one.  Application-specific decisions were lowest for providers.  Only 23 percent of plans and half as many providers had completed vendor contractual commitments.  Noting fewer entities (particularly providers) reported completing a cost benefit analysis, Mr. Rishel suggested it was increasingly seen as a mandate. 

Mr. Rishel reported that only 68 percent of health plans and 61 percent of providers identified the employee training methodology--a synecdoche for privacy.  Reassignment of need-to-know classifications was down 28 percent; Mr. Rishel suggested many organizations felt job classifications already were in place.  Eight percent of plans and two percent of providers had implemented system changes, interfaces and conversions.  Some 94 percent of health plans and 77 percent of providers reported starting; 32 percent of the plans and 29 percent of providers were finished.  Few reported implementing security components. 

Some 78 percent of the plans and 52 percent of providers reported internal testing.  About 67 and 26 percent, respectively, had a trading partner for testing.  Remittance advice, eligibility, claims status, and referrals were lower; premium payment and enrollment/disenrollment were higher.  About 14 percent of providers said they’d begun internal testing of premium; 31 percent reported beginning internal testing of enrollment/disenrollment.  He noted that, as employers, provider organizations were interested in using EDI components to improve personnel policies. 

Responses showed completion up through level four (implementation) but only the start of level five.  Mr. Rishel noted steady progress with finishes for level three.  Some 55 percent of the plans and 48 percent of providers had begun level four.  He cautioned that focusing on those who’d finished level three and on providers who’d begun level five (testing) conveyed a picture more optimistic than random sampling.

About 55-61 percent of the respondents said less than ten percent of their trading partners were in contact; 24 percent of the providers with under a billion dollars of annual revenue said no one contacted them.  Mr. Rishel noted that more providers would be able to test after updates to provider systems were entered and tested at leading-edge sites, problems found, and corrections made and distributed.  Anecdotal information indicated that testing showed moderately good results conforming to the format, but relatively poor results in having sufficient data present for payment.  Citing the adage that “Design and coding comprised 90 percent of a project and testing the other 90 percent,” Mr. Rishel said the other 90 percent had barely begun.

Overview of WEDI/HIPAA Contingency Planning Proposal

Mr. Jones said WEDI’s 6,500 members participated in the Strategic National Implementation Process initiative that addressed HIPAA and implementation issues.  Dave Miller, director of a task group formed in response to problems with the October implementation of the transactions and code sets and charged with looking at contingency planning and transitional issues, noted an April letter to the Secretary about WEDI’s deliberations. 

Concerned about impact on the industry without 95 percent compliance, the group gleaned that half the covered entities (particularly providers) wouldn’t be prepared, and considered alternatives.  Their first recommendation dealt with data content: providers could achieve compliance through a clearinghouse, but were concerned about not having certain data elements.  Cautioning that some covered entities couldn’t receive transactions and resulting paper claims would impact providers and payers, increasing administrative expense and delaying processing, Mr. Jones recommended that compliant covered entities continue to accept non-standard transaction communications.  Emphasizing that without resolution by mid-June, it would be difficult for the community to prepare for October implementation, the group asked the Secretary to respond within 60 days.

Steve Lazarus, WEDI’s immediate past chair, noted the industry had to deal with a family of transactions in various formats required to be supportive of legislation and regulations.  Although WEDI didn’t reach consensus about the duration of the transition, they agreed it would be less than a year.  Mr. Jones reiterated that the intent was to ensure a smooth, brief transition enabling implementation and testing while avoiding further delays. 

Provider Reactor Panel

Mr. Cunningham conveyed ACP’s belief that limited internal resources, scant understanding of the technical task, uncertainty about trading partner’s readiness to test, and only a five-month window meant some practices risked missing the testing deadline.  He cautioned about the disastrous consequences of cutting off payment of claims because of a missed deadline: it could take months to replace noncompliant practice management systems and disruption in cash flow would send small practices into insolvency. 

Pointing out that delays might result from vendors’ and health plans’ inability to test, Mr. Cunningham said it would be unfair and irresponsible to blame practices for procrastination.  Noting WEDI outlined severe potential systemic consequences, he contended this was the result of a conscious federal government decision over which HHS exercised discretionary control.  ACP supported both WEDI recommendations and asked HHS to ameliorate those consequences. 

Mr. Cunningham recommended two modifications to WEDI’s proposal.  Noting HHS announced and followed a reasonable, progressive enforcement process on privacy and intended the same with transactions, he advised linking claims payment cutoffs to the enforcement process.  Payment should continue so long as a practice met those steps.  ACP also recommended that WEDI’s provisions apply to Medicare payments, which represented about 40 percent of most internal and general/internal medicine practices’ receipts.  Mr. Cunningham cautioned that cutting off receipts would deal a devastating blow; many practices, whose only recourse would be paper claims, would be stymied by the Administrative Simplification Compliance Act, which stated that any entity above a certain size had to file electronically.  Clarification on that waiver was needed. 

Provider Reactor Panel

Mr. Arges said hospitals’ greatest concern about implementing HIPAA transaction standards was the harm that could occur from disruption of claim submission and payment cycles.  AHA urged CMS to focus on developing a system-wide implementation plan outlining corrective actions that plans must be ready to execute so providers had adequate cash flow.  Mr. Arges urged plans to develop a rigorous testing schedule that included end-to-end testing directly with providers and gave notice and time to correct reporting and formatting deficiencies.  AHA also urged CMS to recommend routine use of various X-12 acknowledgement transactions as an adjunct to the submission of standardized transactions.

Mr. Arges said HHS had to issue industry guidance to ensure that all stakeholders, health plans, and providers were prepared for processing failures.  Guidance should direct plans to have ready before the deadline a comprehensive contingency plan guaranteeing appropriate cash flow to all providers with whom the plan did business.  Mr. Arges said creating this safety net as providers worked with plans and clearinghouses to iron out bugs would minimize the incentive for providers to abandon electronic claims submission. 

AHA believed only HHS was in a position to provide the leadership necessary for development of a system-wide implementation plan.  AHA said the plan must communicate clearly to providers the course of action each plan would adopt to prevent disruptions to the historical payment cycle.  The communication should: (1) identify individuals or departments responsible for coordinating continuation of payment volumes and capable of applying corrective measures to restore pre-HIPAA volumes and (2) describe documentation or information providers must supply to receive contingency payment.  Every health plan must establish a baseline for each provider, including the average number of claims processed per day for the year prior to the deadline and the average daily payment.  Providers would be eligible for contingency payment if their daily payment or volume of claims processed/received fell more than five percent below the baseline.  Providers submitting through a corrected mechanism would be eligible to receive payment.  HHS would require contingency payments to providers for six months after the compliance date.  Thereafter HHS should use its enforcement discretion to ensure that plans or providers with recurring contingency-triggering events addressed deficiencies.  Plans and clearinghouses must notify providers of missing data elements so corrections could be made. 

Provider Reactor Panel

Mr. Tennant reported that MGMA’s recent survey supported much that they’d heard.  Providers weren’t ready, identified vendors as the bulk of the problem, and hadn’t identified potential missing data elements that were a roadblock. 

MGMA supported WEDI’s recommendations and urged the government to initiate a survey of what was happening in the industry that could drive policy.  Mr. Tennant recommended optional operational compliance.  So long as a HIPAA-formatted claim could be adjudicated, MGMA believed a plan should be able to accept it, while excluding certain data elements. 

MGMA concurred that health plans should be permitted to accept proprietary claim formats for six months, reporting back missing data elements.  MGMA suggested that plans be able to keep non-compliant claims in a way station while reporting missing data so providers could complete and resubmit claims.  Noting some plans offered clearinghouse software so providers could submit non-compliant claims that the clearinghouse converted to X-12 format for adjudication, Mr. Tennant urged providers that knew they wouldn’t be compliant in time to adopt that practice and develop a method of interim payments.

Mr. Tennant advised health plans to prepare for additional paper claims and an influx of providers trying to test at the last minute.  He recommended that plans not in compliance by the deadline not be permitted to impose clearinghouse fees on providers submitting X-12 formatted claims.  MGMA also recommended an expansion of CMS educational activities aimed at small and rural providers and outreach to vendors.

Discussion

Mr. Arges reported that some providers had submitted claims for testing and initially many items failed.  He pointed out that new items required in the transaction standard weren’t a routine part of the adjudication system.  For instance, the insured’s birthday on a UB92 hadn’t been routinely collected and the clearinghouse probably couldn’t fill that in.  He said the Department needed to give guidance to health plans that they could relax this edit because the data wasn’t needed for adjudication.  Mr. Tennant noted an operational problem getting patient encounter data from multi-specialty clinics with a central billing office.  He said MGMA’s survey indicated that the majority of members hadn’t received information from health plans regarding testing schedules. 

Mr. Arges said AHA supported using the 4010-A for submission, rather than non-standard transactions.  He said it might be appropriate for plans to adjudicate deficient claims as though they were doing so pre-HIPAA.  The important thing was that providers used the 4010 and that plans identified deficiencies so they could be corrected.  Mr. Tennant said MGMA’s position was that the system had to flow smoothly and that cash flow disruptions must not impact patient care.

Recalling that the earlier Gartner presentation suggested activity in other transactions and that they’d heard that the payment component made 835 and 837 transactions most important, participants discussed whether there wasn’t a priority for everybody to be doing 835 and 837 transactions first and tabling other transactions, at least from a testing perspective.  Then they could look at other transactions in a tiered order (e.g., eligibility, then enrollment/disenrollment).

Mr. Arges said guidance was needed about other problems with different transactions (e.g., when providers did an eligibility inquiry they needed to know the type of deductibles and co-pays).  While agreeing that the focus had to be on the claim, Mr. Tennant said they didn’t want to penalize providers who’d adopted other standards, but encourage that as quickly as possible.

Industry Reactor Panel

Mr. Bechtel conveyed AFEHCT’s concern that the industry hadn’t made enough progress on HIPAA.  He noted three problems: testing, data content, and overall project management.

Mr. Bechtel concurred that trading partners weren’t ready to test.  Association for Electronic Health Care Transactions (AFEHCT) vendors and clearinghouses had remediated their systems and were ready, but some vendors on the practice management side reported problems with the lack of data content and struggled with remediation.  He noted that their development work depended on the addenda and companion guides that had been delayed.  Mr. Bechtel suggested that many didn’t realize these were content issues and still relied on clearinghouses to do the job.

Mr. Bechtel said re-identifying providers to trading partners, particularly health plans, so they could be re-enrolled in new systems took administrative time and further delayed testing.  He emphasized that they were testing changing technology as well as transactions, content, and systems performance with remediations. 

Coordination of benefits (COB) was an issue for vendors.  Mr. Bechtel noted a need for quick answers, so vendors could make transactions flow.  Those supporting new systems had to learn other methodologies, many supporting older systems were in transition, and resource issues were a problem. 

Mr. Bechtel said timetables and milestones were essential for monitoring and managing the process.  Noting the statistics they’d heard indicated they’d only seen a small sample and didn’t have a grip on the problem, he suggested that they were alpha testing as they went along.

Mr. Bechtel supported WEDI’s proposal allowing ongoing use of existing transactions until the transaction could be completed.  AFEHCT believed operational compliance would help resolve data content issues and emphasized the need to sequence and plan with other transactions.

Industry Reactor Panel

Mr. Daley said BCBSA would be compliant by October, but he expressed concern that up to 30 percent of their institutional and 40 percent of their professional providers wouldn’t be ready.  Mr. Daley reported that BCBSA and CMS did outreach and that BCBSSC was releasing a HIPAA Transactions and Code Sets tool kit to help providers understand the regulations and steps.  He noted concern that some clearinghouses were reaching capacity and might not accept more customers. 

BCBSA considered WEDI’s suggestion to permit compliant entities to accept non-standard transaction communications an additional effort and expense that wouldn’t be viable on a trading-partner-by-trading-partner basis.  Noting there might be flaws in the transactions, but that the system got claims in, payments out, and the cash flowed, Mr. Daley supported permitting covered entities that tried but couldn’t become compliant by October to accept current transactions for a brief period.  He urged HHS to announce the contingency action soon.

Industry Reactor Panel

Mr. Wilder expressed AAHP’s concern about whether covered entities would be ready in October and how to work together.  Noting that electronic transactions were quicker to process, less expensive, more accurate, and that providers and health plans had made progress in transitioning to electronic commerce, Mr. Wilder considered reverting to paper a step backwards.  AAHP believed there were significant advantages to HIPAA standards and didn’t support a delay.  Members wanted flexibility to process transactions and do business after October 16.  Even if CMS had authority to order plans to set up a payment system, which AAHP doubted, they didn’t support AHA’s approach. 

Discussion

Noting WEDI’s proposals for flexibility of relaxed edits or continuing to accept legacy formats only applied to compliant covered entities and some providers trying to be compliant wouldn’t be able to substantiate that by the deadline, participants emphasized that solutions were needed for all entities.  Members observed that, in addition to what the government could do to set strict time deadlines and encourage compliance, there was an economic model (non-compliance triggers a higher charge) that applied incentives rather than solely government regulation.

Responding to the common view that the industry wouldn’t be completely ready by mid-October with all trading partners equally competent at all transactions, participants discussed a requirement for a transition period in which everyone at least be in testing by October 15, as well as WEDI’s proposal.  Mr. Daley noted that continuing with the current system provided built-in cash flow. 

Noting health plans mostly expected to be ready and that AHA didn’t want a delay, participants discussed targeting small ambulatory and rural providers for additional testing.  Mr. Bechtel pointed out that, even though health plans and large institutional providers were ready, they needed time to complete the process.  Ms. Trudel cautioned about putting extra administrative burden on plans, recalling that in implementing ASCA legislation they consistently heard that providers and plans had too much on their plates to request a waiver.

The sense of the Subcommittee was that testing was a fundamental bar one had to cross before there could be allowance for anything non-standard after October 15.  Somebody testing at least should be able to send the previous transactions in their old formats until they’d completed testing, were certified, and everything was copacetic and working.  On the basis of individual trading partner relationships, there could be decisions about sending standard format without a complete set of data elements.  Members had heard that testing could go on for probably four to six months.  Participants considered whether other transactions needed a legacy allowance.

Remarking that they’d heard a lot about perfection versus turn-arounds for certain transaction errors, the Subcommittee discussed asking HHS and CMS to clarify the meaning of a compliant transaction and what should be done.  Members also noted the industry needed guidance on how to deal with claims already in the pipeline that weren’t processed by October 16.  After being reviewed by the Office of General Counsel, the draft letter will be revisited in a conference call and draft discussion with the full Committee in June. 

Consolidated Health Informatics Initiative Roundtable:  Outreach Plan and Portfolio

Ms. Adair said CHI enabled health data interoperability across the federal healthcare enterprise.  CHI identified a target portfolio of 24 messaging and vocabulary domains and adopted four messaging standards and one vocabulary standard government-wide.  Six cross-agency vocabulary teams defined domains (demographics, medications, interventions/procedures, immunizations, lab result contents, and clinical encounters) and identified candidates for standards to build into the enterprise’s IT architectures and use as systems were modified and new ones deployed.  CHI sought ongoing input and clarification through NCVHS partnership and public hearing support.  Upcoming teams will focus on diagnosis/problem list, history and physical, text-based reports, population health, physiology, multimedia, nursing, supplies, and genes and proteins.  Other teams will validate/harmonize terminologies used by other processes including financial/billing (HIPAA).  Over 20 agencies had joined CHI’s lead agencies: the Department of Veterans Affairs (VA), DOD, and HHS. 

CHI sought industry input and agreement about their portfolio’s domains, prioritization, and approach.  Mr. Christopherson noted the process’s voluntary nature, its leadership in adopting standards and demonstrating they were reasonable and doable, and that partnership relationships with many organizations in many places tipped the scale toward national standards.  (Similarly, he said it was in everyone’s interest for systems to be in harmony internationally.)  In these ways, CHI provided a foundation for the National Health Information Infrastructure (NHII). 

Domains were selected to address messaging and vocabulary needs, most critical domains, and develop the federal standards adoption process at multiple levels (adoption, acquisition, nudging, and new starts).  Determining domains and identifying standards also identified SDOs to maintain and foster them.  Ms. Adair emphasized evergreening and harmonization. 

Participants discussed the assumption that new systems would incorporate adopted standards as government projects unfolded and that implementation guides and conformance tests were critical for interoperability.  Remarking that standards were a tool and the goal was optimizing healthcare delivery and health outcomes, Mr. Christopherson asked how they might move from episodic care to person-orientation.  He noted they had to look at populations, caring for well and custodial patients, and delivering clinical and other healthcare practices that maximized accessibility, quality, affordability, “healthability,” and satisfaction.  He said standards, electronic health record systems and resolving exchange issues would help achieve the 2010 timeframe. 

Mr. Christopherson emphasized the role of the personal health record system as a tool to store, move and share trusted information and support services.  Standards enabled exchanging information in a standardized format that made electronic health records possible.  Mr. Christopherson said the significance of the Secretary’s announcement wasn’t the standards but agreement about the process triggering them and a tipping point for national standards.  He noted the Office of Management and Budget (OMB) endorsed the concept. 

Remarking on the convergence of national standards and high-performance systems (e.g., DOD’s CHI II, Kaiser’s Epic) and all the organizations looking at their next generation of systems and how they fit in, Mr. Christopherson emphasized priming the provider community to be more involved.  Noting fiscal concerns, he cited the Blackford Middleton Study that indicated a three-for-one payoff for investing IT dollars in sophisticated systems.  Mr. Christopherson expressed confidence in the path and practical implementation of their goal, the probability of an end-game of 2010, and possibility by then of being primarily paperless. 

Consolidated Health Informatics Initiative Roundtable:  Specific Standards

Five CHI workgroups shared background and preliminary analyses, highlighting challenges CHI faced with harmonization.  Steve Steindel presented preliminary results of the laboratory result content team, noting that result values encompassed both human and non-human domains.  The team identified laboratory result components (number associated, ordinal, cut-off value, alphanumeric or descriptive result).  Reports included reference or normal ranges and unit.  Abbreviations and comments were common.  A description of the testing method made the result understandable.  SNOMED CT was a candidate for terminologies.  HL7 was a contender for units.  Research lacked a harmonized specialty terminology.  The non-human area needed work. 

The medications group narrowed its scope to: active ingredients, clinical drugs, dosage forms, packaged products, subpopulations relevant to FDA’s needs, drug classes, and section headers for labeling.  Candidates for active ingredients included CAS numbers, IUPAC-specified chemical names, and U.S. and international approved and non-proprietary names.  VA product names, RxNorm names, and products of commercial vendors (including FTV, Micromedex, and Medi-Span) were candidates for clinical drugs.  HL7, the Center for Drug Evaluation and Research Data Standards Manual, and approved drug products from the FDA’s Orange Book were candidates for dosage forms.  NDC, Drug Facts and Comparisons, and knowledge-based vendors contended for the drug product level.  Steve Brown said special populations would be narrow in scope, addressing FDA needs and integrating with the demographics group.

The demographics workgroup defined scope by how patient care was delivered, relating it to patients, patient medical records, and clinical care.  Specifics were set for collecting and storing specific patient demographic data used primarily for patient care and identification.  Dr. Beth Franchi noted that patient demographic information could be an important component in focusing on reactions to specific healthcare regimens and clinical trials, identifying health trends within population segments, and tracking elimination of health disparities.  Overlapping subdomains included financial, billing, legal and regulatory concerns, research, and communication.  The team was considering HL7 2.4, X-12, and ASTM.  Participants noted that all three had inadequacies, but HL7 already was adopted as a message standard; the challenge was getting HL7 to harmonize.  They discussed a hierarchy that could be used at different levels and contexts, providing a compatible subset and deeper granularity. 

The immunizations group noted it was a bi-phasic domain: messages and terminology.  Terminology had clinical and product sides.  The clinical area had both normal descriptive information (e.g., vaccine site and mode) and abnormal information (i.e., regarding adverse reactions to an immunization event).  Candidates included immunization messages CDC and HL7 developed.  Dr. Steindel noted that both immunizations and medications might need support and information for adverse event reporting messaging.  MedDRA was being considered as a terminology.  Dr. Steindel noted that domains sometimes intermixed with messaging and recommendations were needed in that area; some domains overlapped with others in adverse event reporting, and there was a need to determine how to resolve domain-specific issues.

Evaluating code sets, terminologies, and vocabularies, the interventions and procedures group weighed whether work done at the code set and clinical terminology levels supported a clinical or electronic health record and, if so, how to consider criteria that couldn’t support a code set for billing purposes.  In looking for a clinical terminology that was standard in terms of technical criteria (e.g., concept orientation and permanence, non-ambiguity) and enabled clinical users to invoke control terms corresponding to formal concepts organized by a classification schema, Dr. Jorge Ferrer said the team excluded existing code sets used for administrative, financial or regulatory functions.  Candidates narrowed down to MEDCIN and SNOMED CT. 

Asked if care providers were incorporated in the demographics discussion and about linkage to interventions and procedures, Cynthia Wark said the sixth team deployed did ongoing work in the clinical encounters area.  Noting that providers and signatures were areas where there would probably be cross-cutting, she said the encounter group was asked to wait until encounters could be reassessed in the context of the other clinical domains.

- Day Two -

Panel 1 - Terminology Users: Healthcare Providers

Dr. Oliver noted criteria for NDF-RT used by PharmGKB included: low cost, few proprietary constraints, primarily generic names, and a rational classification structure.  PharmGKB collected experimental data and sought capability to index data sets and store indexed pharmacogenetic articles.  PharmGKB looked for mechanism of action, physiologic effects, and conditions treated in classification structure.  NDF-RT features included ingredients and generic and trade names.  Forms, strength, manufacturer, package size or type weren’t used.  NDF-RT hadn’t yet used UMLS, Concept Unique Identifier or MeSH definitions.

NDF-RT was delivered as a text file in XML and had six categories.  Definitions for concepts and information and property values for drugs were in the file.  Because it was in XML and organized in a frame-based system, a relational format was initially sought.  But the structure was more frame knowledge-based and was loaded into Protégé.  One could search by term, bringing up mapped information, or navigate the hierarchy and click down to preparation.  Additional trade names could be added by mapping generic terms from NDF-RT to the FDA Orange Book.  The PharmGKB user interface had a simple search box.  One could search for drugs as alphabetized in NDF-RT or based on drugs containing primary experimental data.

Dr. Oliver said NDF-RT satisfied their requirements.  Organized as an inheritance hierarchy of concepts with roles and properties, it required a tool (e.g., Protégé) to view it, but data could be converted into relational tables for their database back-end. 

Panel 1 - Terminology Users: Healthcare Providers

Dr. Larsen said IHC measured dictionaries and vocabularies by their purpose, and the first requirement was a robust vocabulary.  IHC created concepts within their health data dictionary by: (1) manual creation, (2) importing available vocabularies, and (3) both importing vocabularies and creating relationships manually.  He said NDDF Plus’s strength was adoption of recognized vocabulary practices (e.g., use of numeric identifiers without meaning, use of domains in relationships, hierarchies expressed as relationships).  Dr. Larsen differentiated between permanent storable and transitory concepts.  He noted there were multiple therapeutic classifications, primitives that built into composite concepts, and knowledge bases.  He said NDDF was stable and a consistent source of knowledge bases and vocabulary.  IHC had service experience and received a monthly update from FirstDataBank, while others updated weekly.

He noted holes in the system, particularly in the primitives and coded units.  Text, rather than a numeric identifier, made them difficult to use.  They also lacked the concept of a strength associated directly with an ingredient, though they could indirectly make that association.  Dr. Larsen said the only issues with interoperability were units and ingredient strengths.  Concentration units were bundled in composite units, and could be broken out in the system.

Dr. Larsen reported using LOINC to generate and build concepts in the HDD.  The code and definitional name were used as surface forms for those concepts.  He noted interoperability issues, not present on the pharmacy side, related to how they communicated with a regional lab system.  LOINC also was used for lab ordering and clinical observations.  Dr. Larsen said defined uniqueness criteria in LOINC were well managed.  When submitting, vocabulary tools helped to find and map concepts to LOINC.  Coverage for the lab concepts was good.  Turnaround time for a new concept was about two weeks.  Dr. Larsen said adoption of lab LOINC concepts helped IHC communicate outside.  And he noted that LOINC was free.

Panel 1 - Terminology Users: Healthcare Providers

Dr. Guay set two requirements for clinicians: systems recording information must clearly record essentials, and the patient record must be the sole source of administrative and clinical data.

He said the two tooth numbering systems, the National Universal System and ISO, could be mapped to each other.  The ISO system was clearer, unambiguous, and easily used by clinicians.  It differentiated between left and right, upper and lower, and permanent versus deciduous.  The data included could clearly and easily populate the data elements HIPAA required.  The tooth designation system was simple and straightforward.

Dr. Guay said a diagnostic code had never been required in adjudicating dental claims and no one used the SNODENT diagnostic coding system.  Rarely, when a dental procedure was reported for reimbursement (primarily for oral surgical procedures), ICD-9 diagnostic codes were used.

Dr. Guay told how the profession approved of SNOMED’s logical arrangement ten years ago but found it deficient in dental terminologies and developed the SNODENT diagnostic coding system.  SNODENT diagnostic codes were mapped to ICD-9, but the reverse wasn’t possible.  Dr. Guay said SNODENT was rich enough that it didn’t need interpretation or extrapolation of other codes and allowed easy data entry and manipulation.  Noting that research required definitive descriptions, Dr. Guay said SNODENT allowed distinction of a broad, rich terminology related to specific diagnoses.  He praised the cooperation between SNOMED and the profession.

Discussion

Dr. Guay explained that no one used SNODENT because adjudication of claims didn’t have an administrative need for a diagnostic coding system.  He believed that if someone licensed a SNOMED system they’d already licensed SNODENT.  SNODENT on its own only had the optional cost of the codebook.  Dr. Larsen said NDDF was normally sold to vendors of systems; there was a cost to the vendor and secondary costs to users.  Dr. Oliver explained that she’d selected NDF-RT because it was free and they liked the classification structure and lack of proprietary constraints.  Asked what needs Multum wouldn’t address, Dr. Brown explained that they had been using their own NDF (largely equivalent to Multum) and improving their internal practice.  Dr. Oliver added that her group felt that features in NDF-RT might potentially provide search opportunities.  Asked about the best level of abstraction for clinical decision support and interoperability for a control terminology, Dr. Larsen said the routed generic was sufficient for most decision support.  Addressing the timeliness of updates required for their applications from the terminology developer, Dr. Oliver said that, once a new medication went into clinical practice, updates had to happen quickly.  Ultimately, updated terminology had to be available as soon as a medication was on the market.

Dr. McDonald said that whether products were free to the user was an important dimension in a standard and should be weighted.  Dr. Cohn said another issue was a sustainable business model that was maintained, updated, and usable.

Dr. Larsen explained NDDF’s concept about the minimum necessary for drug ordering.  The fact that a drug was routed could be used as an ordering concept.   They could indicate the drug, “oral,” and collect an order dose.  Asked if SNODENT always should be required for dentists in billing, Dr. Guay said that if insurance payers believed there was a need for a diagnostic code to fairly, efficiently, and quickly administer dental claims they’d ask for them.  Historically, there hadn’t been a need.

Asked to clarify how there was a use for SNODENT, yet no one used it, Dr. Guay explained that with increased costs, there was a need to look at outcomes across the population rather than on an individual patient basis.  It was easier to do this electronically.

Dr. Guay clarified that there was no charge for dentists to use SNODENT.  There was a small fee for those who used the system to generate income (e.g., a practice-management software developer) and insurance carriers who used the data to develop customary fee ranges in geographic variation.  Dr. Guay said with SNODENT and SNOMED combined there was enough content for a dental electronic medical record system.

Panel 2 - Terminology Users: Healthcare Vendors

Dr. Levy depicted SNOMED CT within a language engine (i.e., a software tool or product that allowed maintenance, storage, and updating of a multiple number of terminologies within a single mechanism).  Dr. Levy noted SNOMED was an ideal core for a language engine.  It followed basic principles of concept-based terminologies, and those concepts and hierarchies served as a hub from which new terminologies (either other standards or proprietary terminologies) mapped.

The concept base in SNOMED provided the framework within the language engine from which additional terms could be added as well as additional hierarchies and relations.  SNOMED’s object or concept permanence made it easier to store.  Dr. Levy considered the integrity of the updates excellent.  However, he noted that maintenance and upkeep of terminologies required software tools for processing and considerable planning.

He said users of SNOMED CT would likely add local additions (such as local terms or concepts) in between SNOMED update cycles.  As vendors extended the model, Dr. Levy urged users to continue updating, so versions didn’t semantically drift from the core.

Dr. Levy noted that vendors using SNOMED should understand the use cases and rules behind the SNOMED mappings to other terminologies.  He reported that SNOMED had formed a clinical terminology workgroup that addressed standardizing these mappings by defining what a given map could and should be used for, rules for creating them, and how the industry might validate maps.

Dr. Levy shared his perspectives on multiple terminologies, noting it was important to subset terminologies used.  Vendors had begun to break up SNOMED into different hierarchical content structures.  Dr. Levy pointed out that this required considerable software and development effort.

Panel 2 - Terminology Users: Healthcare Vendors

Dr. Chuang identified issues Cerner Corporation faced providing terminologies to client end-users.  Ultimately, terminology content had to work within an information system structure so subsets, synonyms, preference settings, and world-based preferences were accommodated to handle preferences for terminology.  Dr. Chuang emphasized that no reference terminology was wholly adequate on its own. 

He said Cerner’s clinical reference terminology, SNOMED CT, had excellent breadth and depth for certain discrete domains around disease findings and was broad in all areas.  It captured and documented many of Cerner’s clients’ priority areas well, but some things were missing.  Cost also was a barrier to adoption.

Dr. Chuang said LOINC, Cerner’s terminology related to laboratory result names, did well with intermediary expressions and had a high degree of specificity, but making LOINC, with its narrow scope of focus, usable in an electronic health record system was a challenge. 

Multum was a subsidiary of Cerner Corporation that had been packaged with their CPOE solutions.  Although the data model supported alternative drug vocabularies, Cerner defaulted to Multum, which enabled expression of medications at the point of ordering, prescription, and back-end from a dispensing perspective.

Regulatory requirements forced clinicians to deal with CPT-4s and ICD-9s.  But Dr. Chuang said administrative and clinical workflows intersected, and from a data capture perspective they had to accept that diverging needs had to interrelate.  Dr. Chuang said CPT did a good job with diagnostic imaging concepts only because doctors tended not to order or document things they didn’t get paid for.  CPT lacked a smart hierarchical structure, but had a code and interoperability.  However, data validity and reliability were compromised. 

Dr. Chuang said SNOMED CT was an excellent definitional knowledge base for looking at data interoperability, comparability, aggregation abilities, and quality.  Outside of interoperability, terminologies didn’t solve issues of comparability and data quality.

Panel 2 - Terminology Users: Healthcare Vendors

Dr. Faughnan said McKesson used NDDF Plus extensively for everything from dispensing medications at pharmacies to managing packaged information and leveraging it as the basis for acute care medication order entry.  FirstDataBank, who maintained NDDF Plus, was responsive to their requests and he said the terminology had improved.

McKesson looked forward to using SNOMED CT instead of a modified derivative of ICD-9 to manage diagnostic data.  Dr. Faughnan was dissatisfied with ICD-9 as a terminology for supporting medical decisions.  McKesson chose SNOMED in part because its architecture was designed for growth and maintenance.  His concern was that partitioning terminologies across a dozen domains would cause overlapping concepts.  In addition to representing conditions and diagnoses, McKesson expected to use SNOMED CT to represent procedures, findings, interventions, and lab results in collaboration with lab LOINC.

McKesson liked SNOMED’s capacity to map to the administrative code sets, even though SNOMED had a quite different ontology.  The breadth of coverage, particularly for nursing and allied healthcare, was critical.  SNOMED made it possible for McKesson to use a well-structured reference terminology while leveraging the richness and usability of domain-specific nursing terminologies.

Dr. Faughnan stressed the importance of affordability, a significant customer issue.  He suggested that government encourage the nursing and allied healthcare approach, providing a reference terminology where concepts were well maintained and special purpose terminologies with extensions for particular domains. 

Dr. Faughnan asked the government to decrease the penalty of implementing clinical systems by closing the reimbursement and regulatory gaps.  In particular, if reimbursement rules, such as medical necessity, were written using ICD and CPT without regard to SNOMED, it could be challenging to implement SNOMED CT in real-world systems.  He asked CMS to consider framing and defining medical necessity rules using concepts from SNOMED CT, then using the administrative mappings to generate ICD and CPT versions of these rules.  CMS could then distribute the rules as expressed in either SNOMED or ICD/CPT, along with the mappings CMS used.  That would ensure equivalence.  In preliminary testing, Dr. Faughnan found that medical necessity rules were simpler to express and manage using SNOMED CT, so this approach could save CMS money and effort while also greatly facilitating real-world adoption of clinical terminologies.

He recommended that cost-benefit calculations assume integration of a PMRI terminology when weighing the costs and benefits of going to ICD-10 and ICD-10-PCS.  He urged the government to encourage collaboration and interoperability between SNOMED and RxNorm, eliminating different views of medications.  He cautioned about the hidden costs of maintaining a system when mixing and matching too many terminologies.

Panel 2 - Terminology Users: Healthcare Vendors

Dr. Lau discussed lab LOINC, which provided observation IDs for HL7 transactions.  She said 3M HDD was known for mapping (though they’d come to it reluctantly) and used the LOINC dictionary as an intermediary.  They provided numerical concept IDs (NCIDs) that were stored in the CDR and submitted the term to LOINC in that format.  LOINC assigned a code that 3M HDD added to the dictionary’s back-end.  Whatever representation was sent in an HL7 message, 3M HDD worked to store that NCID in the clinical data repository.  These were mapped to a LOINC meaning so customers could communicate with anyone mapped to LOINC.

Cautioning that if everyone kept looking for the next perfect terminology, they’d never move, Dr. Lau suggested it was better to pick one and find ways to use it.  Pointing out that everyone had been focused on building, she said it was time to pay attention to using.  She emphasized the challenge of sustaining interoperability with multiple people in multiple places determining codes.

Discussion

The Subcommittee asked how many systems used SNOMED CT and/or LOINC, for how long, and in what applications.  Dr. Faughnan explained that it was feasible for McKesson to use SNOMED due to improvements in CT and the government agreement.  Prior costs had been unacceptable.  McKesson used LOINC in lab results and hoped SNOMED CT would develop a way to aggregate similar LOINC codes and display them together in the record.  Dr. Chuang said Cerner presently didn’t use SNOMED CT, but a backlog of clients waited for the government/CAP agreement.  Multum was Cerner’s default medication terminology.  3M HDD also didn’t use SNOMED CT.  Dr. Lau said their customers had no choice but to use LOINC.

Dr. Faughnan said McKesson didn’t plan to use clinical LOINC because of its narrow coverage.  The structure was essential; things couldn’t be maintained without ontologic integrity.  Cerner and 3M HDD used clinical LOINC, though Dr. Chuang emphasized that Cerner used it selectively.  He clarified that post-coordination was necessary because even extending a reference terminology with the necessary additional concepts was a combinatorial nightmare.  Formalizing still relied on definitional knowledge and was a challenge because it involved end-users.  Dr. Levy pointed out that SNOMED CT took an early step towards solving coordination issues: the new version included qualifying relationships as distinct from definitional ones.

Asked about the importance of mapping to SNOMED, Dr. Chuang said the lack of mappings would be a barrier to user acceptance.  Dr. Faughnan said McKesson’s customers wouldn’t accept disadvantages in terms of reimbursement or regulatory compliance; CMS had to do validated mappings.  As a member of SNOMED’s terminology working group, Dr. Levy emphasized there rarely were direct matches and that they weren’t at the stage of terminology development to auto code.  Dr. Lau noted that newer systems (e.g., ARA) did their own mapping.

Tony Martinez reported that over 10,000 organizations and practitioners had registered in the demonstration program on ABC codes for nursing and complementary and alternative medicine.

Panel 3 - Terminology Users: Healthcare Providers

Dr. Zinder said DOD had formally tested Medcin and deployed it on a limited basis in their computerized medical record.  He emphasized its framework, which he viewed as a nomenclature, and its focus on patient/provider interaction, noting Medcin balanced being expressive, usable, and tractable.

In its most basic form, Medcin was a pre-coordinated structured clinical terminology.  Dr. Zinder said the power behind the system was Medcin’s ability to do symptom surveillance.  He noted that providers also needed to be able to input data in order to do surveys.

Dr. Zinder emphasized that DOD didn’t see Medcin as a competitor to SNOMED.  Medcin was useful at the point of care for capturing clinical documentation, but SNOMED was a more granular, atomic and post-coordinated tool and better at the database layer, even without a good interface.  He noted that the National Library of Medicine expressed interest in mapping them.

Dr. Zinder stressed the importance of distinguishing between tools, content, and processes.  Medcin tools met the criteria of a knowledge system.  Medcin allowed templates and was easy to use.  It was tractable, followed the criteria of permanence, and had a low level of redundancy.  It was also scalable and hierarchical, and added content.  Dr. Zinder noted that its content (like that of all structured tools and clinical vocabularies) was limited.

Panel 3 - Terminology Users: Healthcare Providers

Dr. Peterson said the Student Health Center on the Auraria Campus, administered by the Metropolitan State College of Denver, had used Medcin in a paperless system for nearly a year.  Physicians easily used Medcin at the point of care, and multiple people documented the entire clinical encounter from the front desk to checkout.  Structured data was automatically recorded without the healthcare provider investing extra time, and care wasn’t slowed.

Dr. Peterson noted weaknesses.  Vendors didn’t always carry out Medcin’s frequent updates regularly.  A programming problem was that users weren’t able to separately bundle data by problem for patients with multiple problems.  

The center used passive surveillance to identify trends.  Structured data was generated through provider-engineered disease-specific clinical outcome documentation forms. 

Dr. Peterson said Medcin’s vocabulary was useful at the point of care and was a structured clinical documentation tool.  He emphasized that documentation tools that were flexible, multi-purposed, and adaptable to use by the entire healthcare team were key to successful bedside acquisition of structured data.  No single structured nomenclature totally met front-line clinicians’ needs.  Dr. Peterson said they needed to identify the strengths of current terminologies, correct existing gaps, and consider a hybrid system that served everyone’s needs.

Panel 3 - Terminology Users: Healthcare Providers

Dr. Warren told how KUSN turned a live patient information system into an educational tool with a corporate sponsor: SEEDS.  Noting she’d served on the ANA Committee for Information Infrastructure that encouraged SNOMED CT to apply as a recognized terminology, Dr. Warren said no nursing terminologies captured findings as SNOMED RT and CT did.  Her interactions with SNOMED, and the way it put together the suite of languages and terminologies nurses needed, led to choosing SNOMED CT.  Mappings of NANDA and NIC, used in acute care software, were in SNOMED.  NOC would be added in the fall.  The Perioperative Nursing Data Set’s diagnoses and interventions were mapped into SNOMED.  Another advantage was that, in teaching students to operate in different venues, they also learned to cross between terminologies.

With SNOMED CT, students entered once and got SNOMED, NANDA, Perioperative, and Home Health Care Classification codes as well as synonyms.  In designing the interface, Dr. Warren said she could use a synonym to refer to the SNOMED CT code.  And she knew more about diagnosis, so she had more information to use with students.  Dr. Warren said SNOMED CT provided the foundation to bring in all the concepts used by the disciplines.

Panel 3 - Terminology Users: Healthcare Providers

Dr. Madden said SNOMED had been the standard terminology for anatomic pathology for years.  But coding a report with multiple diagnoses required terminologies from which SNOMED could take ideas and extensions.  Dr. Madden added that HL7 architecture was an additional important ancillary technology to permit referencing medical statements in the context in which they occurred.

SNOMED CT was enriched by cross mappings to other standard terminologies, and its pre-coordination was strong.  However, Dr. Madden noted conceptual linkage mechanisms were needed for expressing medical assertions.  He said they needed to move from a model where codes were appended as a group at the end of a report to one where codes could be expressed, using a syntax, in relation to each other.  Further extension of post-coordination was needed.  They also needed to communicate about assertions by attaching concepts.

Dr. Madden stressed the importance of the ability, in a future medical record, to link across multiple documents, link related items within a document, and relate them to timing and patient.  He said SNOMED CT’s richness supported linking, and terminologies needed additional assistance from HL7, clinical document architecture and other components to do the rest.

Panel 3 - Terminology Users: Healthcare Providers

Dr. Dash said LOINC was explicit and specific about how to generate codes and provided true post-coordination.  But even though LOINC had categories for organizing and aggregating, Dr. Dash said LOINC was a relatively flat database.  He said SNOMED provided a hierarchy on top of LOINC that let laboratory data be expressed in a common universal manner.  LOINC provided post-coordination; SNOMED provided aggregability.

Dr. Dash recommended promoting: SNOMED CT as a base standard, mappings to domain specific standards (e.g., LOINC, Medcin), and development of specific post-coordination for SNOMED CT, making the standard usable for messaging and communication in HL7.

Discussion

Asked if lacking a multi-hierarchy was a practical problem for Medcin, Dr. Zinder said Medcin was transferring its transactional database into a big data warehouse for extraction and they were still learning the reporting aspects.  Dr. Peterson said the problem was one couldn’t get a specific answer to a general question or focus clearly on a specific question. 

Dr. Madden said enhanced post-coordination would be necessary to support a medical record.  Rather than tag a medical diagnosis with a single code, concepts from a standard ontology had to be strung together.  Dr. Madden didn’t think any available comprehensive medical terminology supported such compositional coding.  Dr. Warren noted that SNOMED worked with developers of the nursing classifications to incorporate their concepts into SNOMED.  Dr. Zinder said DOD did negation as well as family history with Medcin, using it as level one coding, which went into the next-level echelons already in place for their systems.  Dr. Warren said all ANA-recognized developers and SNOMED were working on mapping goals and outcomes.  She said she hadn’t found any functioning or experimental system that described the continuum of care so they could see the effect of different decisions on the process of care and the impact on patient outcomes.

Dr. Peterson said his clinic was able to bundle single simple problems; the challenge for vendors was determining how to take care of the patient with multiple symptoms one symptom at a time.  Dr. Zinder said that with CHCS-2 all problems were individual and they linked ancillaries so they were part of the note.

Panel 4  - Terminology Users: Healthcare Vendors

Dr. Mays said Apelon was most familiar with the NCI Thesaurus, NDF-RT, and SNOMED CT.  All three had adopted, or were in the process of adopting, features Apelon found useful in software and content delivery: rich hierarchies, compositional definitions, change management, and the ability for clinicians to navigate the terminology.  Dr. Mays said NDF-RT was extremely interesting but unproven.  Apelon believed all three needed more practical experience with drug terminology.

He noted the NCI Thesaurus wasn’t a clinical terminology, but was oriented towards research and the pre-clinical level.  He said the inclusive process NDF-RT had for including various stakeholders deserved to be studied and replicated.  LOINC and UMDNS needed improvement.

Dr. Mays pointed out that interoperability wasn’t compromised by the creation of subsets.  The marketplace would address their creation and maintenance once barriers to innovation (primarily around IP considerations) were removed.

Noting there was an entrenched infrastructure around administrative terminologies and that mapping wasn’t a viable solution among clinical terminologies, Dr. Mays suggested that the Committee recognize a small number of core, non-overlapping clinical terminologies.  He suggested that they might be incorporated into larger core clinical terminologies.  Dr. Mays encouraged the government to promote harmonization of the core models and the common reference taxonomies.  Once core terminologies were chosen, Dr. Mays said a gap analysis would be appropriate.  Gaps could be filled as needed.

Panel 4 - Terminology Users - Healthcare Vendors

Dr. Brandt said an ontology such as SNOMED’s was useful for having a term, modeling system intelligence and knowledge at that term level, and having that drive alerts, reminders, and construction of order sets and documentation templates.  The issue was that providers needed to specify clinical knowledge at the granular level, and it was impractical for vendors to model knowledge at every level of granularity.  Utilizing a hierarchical ontology like SNOMED CT, patient conditions could be expressed more specifically, and knowledge modeled at a more general level could be applied to those patients.  Additionally, patients often had multiple diagnoses, which hindered constructing an interface unless such a knowledge model could be applied.

Dr. Brandt said there wasn’t yet a complete standardized terminology for the representation of ancillary procedures expressed as actionable items in physicians’ orders.  He said vendors needed a standardized nomenclature to model behavior and create system subsets, and he requested help.  Dr. Brandt said SNOMED had all the terms but that he needed to be able to subset them and achieve concurrence with those patient conditions or problems against which the industry could model knowledge.  SNOMED was unique, based on its comprehensiveness and breadth, but Dr. Brandt said that alone wasn’t sufficient.  They were forced to aggregate multiple terminologies and had issues of redundancies and ambiguities.  For decision support, they needed to be able to model more than what SNOMED traditionally considered its charge.

Gaps included needing: support for the expression of clinical orders as intended actions or outcomes in the language providers were used to writing, and a combinatorial grammar to put terms together in a way that provided unambiguous meaning.  Dr. Brandt contended that HL7, CDA, and template committees wouldn’t provide the answer.

Noting that Siemens was an international corporation and that most vendors in their space also looked toward the international market, Dr. Brandt advocated establishment of global standards.

Dr. Brandt requested the federal government establish the scope for each effort, make implementation practical and affordable by implementing national standards and dealing with the licensure issue, and determine real priorities for modeling terms.  He advocated point of care decision support and creating incentives for suppliers to successfully fill those gaps.

Panel 4 - Terminology Users: Healthcare Vendors

Mr. Wood said DMLSS was being developed and deployed to hospitals within DOD.  During Desert Storm, the three services used different systems that couldn’t communicate with one another.  Today they used one.

He explained that DMLSS adopted ECRI’s manufacturer and nomenclature systems.  They got device nomenclature and codes from ECRI, split them into consumables versus equipment (equipment used their own nomenclature system), and added management data.

Mr. Wood noted that DMLSS couldn’t create centrally managed maintenance plans, tie them to central line device codes, and send those out to satisfy the JCAHO requirements in their peacetime hospitals.  Information on maintenance and use in a hospital wasn’t readily available.

Users seeking equipment used a review and approval process.  Once equipment was received, a catalog record was established, which formed the core of their logistics system.  Equipment ordered through the catalog record was populated with management data and standardized nomenclature throughout its life.  Logisticians and a local enterprise system used this equipment record.  Guidance was automated.  The computer used the codes and users used nomenclature. 

Even though this was a logistics system, Mr. Wood reported that it had spread throughout MTF hospitals.  Users could view equipment records and see quality assurance records.  Mr. Wood said standard nomenclature enabled them to tie the whole system together.

Discussion

Dr. Brandt said the next version of SNOMED CT was in development.  For deployment, they needed ability to model knowledge against a reference terminology and specific reference terms.  Terms could have mappings to other vocabularies.  He said a normative terminology would be helpful.  Multiple non-overlapping vocabularies incorporated into a coherent whole could be used.

He pointed out a bigger gap for expressing actions.  Thinking about processes in work steps required a consistent vocabulary for actions that enabled them to define processes.  When physicians wrote orders, usually they were asking someone to do whatever it took to ensure their patient received specific treatment.  This involved steps and different words for actions, which Dr. Brandt considered more complex than simple grammar.  They needed a singular source terminology in which terms could be requested and implemented.  Dr. Mays concurred that LOINC and UMDNS needed work on their structure, hierarchies, and an explicit semantic model.  Dr. Brandt emphasized that the priority was point-of-care clinical decision support.  For a decade, they’d addressed nomenclature in terms of outcomes.  Dr. Brandt contended that nomenclature could support a closer, higher impact and more urgent need.  Using system intelligence to support point of care decision support would make it easier to do what was appropriate for the clinician interacting with the patient.  People seeking clinical systems sought successful interventions, not the ability to measure failures.

Organizational Activities Relative to Clinical Terminologies

Speaking on behalf of the Markle Foundation and Connecting for Health, Dr. Cimino presented 16 preliminary consensus statements about terminology and process requirements reached at their recent meeting.  Terminology requirements included: (1) an interoperable healthcare system required an inclusive information terminology set that was cross-domain; (2) healthcare terminology had to be integrated into a single reference terminology for users’ systems; (3) a single integrated model of healthcare information was needed as a foundation for the integration of current and future healthcare terminologies; (4) healthcare terminology needed to support a single systematized way of describing every aspect of health status, care of individuals, and populations of individuals; (5) the integrated terminology set should be a coherent consensus statement that was transparent to the source so both could be used similarly.

Dr. Cimino presented 11 consensus statements about the process for developing integrated healthcare and terminology models: (1) the process should be open; (2) a comprehensive set of functions with clear roles and responsibilities was required; (3) an oversight group should evaluate and oversee the process; (4) feedback should be built into the process to inform those building and maintaining the terminology; (5) terminology integration functions should encompass responsibilities including oversight, process management, and repository maintenance; (6) a terminology integration oversight role should be established; (7) harmonization was the goal of terminology integration; (8) terminology domain boundaries should be clearly defined; (9) there should be a single terminology within each terminology domain; (10) the integration process needed to encompass linkages between domains; and (11) demonstration projects should determine how to optimize development and implementation of integrated healthcare terminology.

Organizational Activities Relative to Clinical Terminologies

Mr. Larsen presented HIMSS’s summary analysis of the terminology questionnaires. 

Even if the core terminology group described in Dr. Sujansky’s analysis was technically accurate and met the needs for a national standard medical terminology, Mr. Larsen doubted that it would move them closer to achieving a universal health record.

Mr. Larsen considered EHR terminology the most global.  But he pointed out that the industry hadn’t agreed on what an EHR was or did--which would drive the requirements.  Moving forward would be difficult until they’d defined the EHR, determined feasibility, and phased implementation.

Mr. Larsen depicted a dilemma.  If the industry was waiting for a recommendation, whatever they did would freeze technology and other approaches; if the industry wasn’t waiting for codification, why not let other initiatives (e.g., Web antibodies, structured documents) play out?

Without explicitly defining EHR’s purpose, Mr. Larsen noted they couldn’t judge a code set’s value.  They could exclusively move a patient’s health record electronically between primary care providers, but that was different than identifying, in real time, a bioterrorism incident or conducting clinical studies.

Mr. Larsen pointed out that the Subcommittee looked at messages and codes but hadn’t examined the structure of the electronic health record.  In doing so, he suggested they’d recognize a necessary issue of trade-offs.

Mr. Larsen said they’d welcome standardization if: flexibility allowed for incorporating it into a system, an EHR mandate backed it, and it was implemented as part of an international standard.

Discussion

Noting that focusing on what one tried to do, why, and a timetable generated criteria for specific codes, Mr. Larsen suggested that federal government initiatives defining the EHR might be a better place to begin standardizing codes than terminology.  Industry was moving ahead on creating an EMR at the core of an enterprise clinical information system, and large healthcare enterprises saw value (e.g., Leapfrog, compliance, improving patient safety outcomes), but Mr. Larsen said this didn’t assure interoperability between enterprises or meet secondary users’ needs.  He expressed concern that codifying things that moved between enterprises implied that, for one silo purpose, things would be put into standard nomenclature in a standard message.  Dr. Sujansky asked if the same couldn’t be said of the HL7 message standard, used successfully for a long time.  Mr. Larsen noted that HL7 was done within an organization with more control than independent enterprises.

Mr. Blair recalled that when they began this process he’d shared the idea that the first appropriate standardization would be content and structure of the record or an information model identifying it.  But he noted that the Committee’s directive from HIPAA was to study uniform data standards for patient medical record information and the electronic exchange of that information, looking at what was available for adoption.  After their brief experience with earlier HIPAA standards, picking standards that demonstrated some success in implementation was compelling.  Message format standards seemed an area with some degree of industry consensus.  Considering the next logical area, the Committee heard the industry say they needed code sets and terminologies so they could go forward with next steps.  Today, they’d heard that decision support and outcomes analysis were high priorities and that terminologies had to support messages and other activities.  Mr. Blair noted that, after waiting years for the federal government to make standardization of patient records a high priority, this happened.  CHI was driving this forward and communicating their priorities and information requirements and the Subcommittee was trying to be responsive to their needs as well as the private sector. 

Mr. Larsen expressed concerns based on experience with why they didn’t have an electronic health record and the problems HIMSS saw.  He suggested starting with codes and terminology as well as technology.

Vivian Coats explained that UMDNS was used in hundreds of hospitals and was embedded in systems and software for equipment management and procurement.  UMDNS was incorporated in UMLS, mapped to other terminologies (e.g., SNOMED, ICD), and used internationally.  The scope of the terminology comprised radiologic equipment, implantable devices, reagents, and in vitro diagnostics.  A collaboration with the National Library of Medicine (NLM) aimed to extend UMDNS to encompass emerging technologies not yet on the market and devices important in counterterrorism, biodefense, emergency preparedness, and environmental monitoring.  Ms. Coats noted that in conjunction with their work for NLM they were creating and defining a hierarchical structure to sit above the current data set of controlled terminology.

Dr. Mays said mappings from clinical to administrative space presumably would be driven by economic factors; when there was a business case to capture administrative code sets from clinical coding, vendors would provide mappings or CMS would have a business case to provide them.  He noted that even with non-overlapping terminologies there’d be a need to provide correlations among terminologies related to the reference taxonomies.  Dr. Mays said his preference would be to have common reference taxonomies and harmonize amongst non-overlapping terminologies, but just because they chose those terminologies didn’t mean they wouldn’t need mapping.

- Day Three -

Panel 5 - Terminology Users: Health Care Providers & Vendors

Dr. Butler said EPIC used LOINC and Medcin and was beginning to use SNOMED.  Donald Chiolus explained that LOINC gave EPIC’s customers ability without disrupting existing work codes and test knowledge.  Because the laboratory wasn’t reporting, LOINC terminology was specific enough for their purpose and the mapping utility was good for customers.

Dr. Butler said EPIC’s customers mostly used Medcin to link to medical documentation.  EPIC and their customers considered Medcin’s single hierarchy and redundancy disadvantages because of the number of codes with NOS.  Customers also considered it pricey.

EPIC and their customers intended to use SNOMED terminology in the EPIC application to document clinical information.  Dr. Butler said it was easy to place category lists from SNOMED’s vocabulary into EPIC’s customer system.

He said granularity was a major advantage with SNOMED.  The problem was to allow quick and easy coding to the specific granular level in presenting some 88,000 diagnosis terms to physicians and other clinical users.  EPIC was developing interfaces to help physicians and clinicians.  Dr. Butler considered SNOMED’s domain coverage good and the dummy ID and permanent multiple hierarchy important to EPIC’s customers.  He hoped it became the national standard soon because their customers were waiting for it.

EPIC urged NCVHS to recognize and adopt one or more clinically specific terminologies to serve as the core set of the national PMRI standards.  Dr. Butler said the government had to analyze clinical functions to identify gaps and existing terminologies for filling these functions.  EPIC found SNOMED and LOINC the most promising of the presented terminologies, noting they covered different, disjoint aspects of the medical terminology domain.  Dr. Butler also noted that SNOMED’s drug terminology wasn’t sufficient and asked for that to be addressed.

Panel 5 - Terminology Users: Health Care Providers & Vendors

Dr. Dolin said CMT comprised a core and extensions containing what didn’t fit or were big requests that there wasn’t time to integrate.  Kaiser Permanente was 100 percent committed to the continued use of SNOMED CT and Laboratory LOINC.

SNOMED CT, LOINC, and Kaiser’s First Data Bank drug terminology for pharmacy were classified together as a cohesive core, and CMT was their lingua franca of interoperability.  The more CMT was mapped and supported by interfaces, the more it became the cheapest solution.

Dr. Dolin said Kaiser had needs for all three terminologies.  SNOMED CT needed: (1) further development of qualifiers as a way of constraining allowable post-coordination, (2) standardization of SNOMED CT context sets, (3) refinement of the SNOMED/HL7 reference information model overlap, (4) broad understanding and demonstrations of the power of the description logic underpinnings of SNOMED CT, and (5) from SNOMED’s perspective, further enhancement of the term request process.

Laboratory LOINC needed: additional scrutiny over the values used to populate component axis (e.g., the set of measured components), unambiguous definitions with a hierarchical structure for the codes used to define the component axis, and a cross map between these component values and SNOMED CT to facilitate complete integration of those terminologies. 

Kaiser presently used the First Data Bank drug terminology, which Dr. Dolin noted wasn’t the standard.  He said Kaiser preferred to see a national standard drug terminology such as RxNorm; ideally, vendors would adopt or map to this national standard.

Panel 5 - Terminology Users: Health Care Providers & Vendors

Ms. Meadows said SWOG worked with the PMRI terminology in the context of cancer clinical trial data collection and management.  The NCI Thesaurus was developed by NCI to provide a centralized reference terminology based on working vocabularies used by NCI and collaborating groups.  Since their effort standardized questions rather than terminology systems, no match existed in the NCI Thesaurus for the unique focus of a common data element.  Sometimes, several terminologies were needed to accommodate the requirements of reporting mandates.

She expressed two concerns about structuring a vocabulary to facilitate clinical research: flexibility to respond to the client’s dynamics essential for research, and offering timely inclusion of new terms into the vocabulary without creating obstacles for the research process.

Ms. Meadows suggested that in choosing terminologies as standards, the selection criteria should strongly consider the actual use potential to bridge the gap between vocabulary repository and implementation of these vocabularies in clinical settings.  The actual utility of the standardized vocabulary could only be tested when implemented in an end-user application.

Panel 5 - Terminology Users: Health Care Providers & Vendors

Dr. Pittelkow, in collaboration with Dr. Peter Elkin and the Mayo Clinic Department of Internal Medicine’s Laboratory of Biomedical Informatics, said SNOMED CT was the most widely adopted and utilized terminology in the Mayo practice.  SNOMED CT was used extensively for indexing clinical notes with automated data abstraction.

SNOMED CT also was used in dermatology to index Web pages.  Mayo was also developing pilot projects to automate the assignment of E and M codes, and to support fraud and abuse auditing and various quality assurance programs.

Dr. Pittelkow noted shortcomings in overlap that he thought would be disambiguated, and he said NDF-RT or other terminologies might address the poor coverage of some drug brand names.

With keyword indexing, Dr. Pittelkow said SNOMED provided better coverage.  NDF-RT could cover the drugs with various oncologic indications, but had room for improvement.

Mayo believed that PMRI terminologies would support near-complete content coverage and involve open input from clinical and informatics communities to provide a formal knowledge representation language, express their concepts, and ultimately provide the best decision support.

Ultimately, Mayo hoped to apply all developments and advances in natural language processing to support the terminology.  Dr. Pittelkow said the potential for abstracting and linking to a record for the physician to provide accurate E and M coding, and ultimately decision support based on SNOMED CT, would provide significant advances.

Discussion

Dr. Pittelkow said Mayo used SNOMED routinely in several clinical settings.  Dr. Dolin said Kaiser used SNOMED to record information in patient charts and in the national electronic health records.  Dr. Butler said the key for EPIC was educating users about what they could get from a medical vocabulary.  Participants clarified that there was a system connection. 

Ms. Meadows said that when they finished cleaning up and converting the 3,000 terms to ISO compliance, they’d share the results with the Committee and see where they had convergence.  She said they planned to reach out to other terminology systems.

Mr. Blair said vendors indicated in written testimony that, because of SNOMED's hierarchy, its ability to support decision support and outcomes analysis was a top priority.  Dr. Dolin said Kaiser hadn't yet implemented decision support applications using SNOMED.  They planned a targeted Q/A of the hierarchies and believed the hierarchies within SNOMED stored knowledge useful for decision support.

Noting a prototype product called Opticode, Dr. Pittelkow said they were approaching a point where SNOMED and the other terminologies together would be sufficient as the underlying lexicon.

Discussion on previous terminology testimony

The Committee discussed the testimony.  The 20 participants and written testimony addressed eleven of the twelve terminologies; only the ISO 11073 terminology remained.  The Committee reviewed the trends regarding SNOMED CT, LOINC, MEDCIN, and First DataBank's NDDF.

Dr. Sujansky reviewed what the Subcommittee had done up to this point.  He said their first and most important next step was refining their candidate set of 12 terminologies into a preliminary recommendation for a core terminology group.  Their intent was a small group of terminologies with the required domain coverage and minimal overlap.  Beyond this immediate step were challenges with mappings and the user community's desire for a tightly integrated and coordinated core terminology group.  The Committee decided they needed more focused hearings on drug terminologies.  They agreed that zero was an acceptable license cost to the end-user.

Dr. Sujansky said the next step was a final draft of the analysis of the questionnaire responses.  The report would include terminology developer feedback and data they'd received and chosen to add to ensure completeness.  A composite activity report was scheduled for late June; however, it probably wouldn't include preliminary recommendations, which would be prepared by mid-July.  The Subcommittee needed to look more at device terminology and do a further analysis.

Dr. Steindel observed that selection of terminology had just begun.  Members agreed to note in their report that much remained to be done before achieving interoperability; selection of terminology alone wouldn't achieve it.  The Subcommittee will have a letter of preliminary progress and the more obvious recommendations for the Secretary to present to the full Committee at the June meeting.  The Subcommittee will work on the rest from July through September.

ICD-10 Impact Study

Dr. Libicki said the RAND Science and Technology Policy Institute expected to complete its impact study in time for the August Subcommittee meeting.  RAND was studying three questions: the costs and benefits of switching from ICD-9-CM diagnosis codes to ICD-10-CM; the costs and benefits of switching from ICD-9-CM procedure codes to ICD-10-PCS; and, if both switches were advisable, whether they should be done sequentially.

While ideally one would arrive at a benefits-to-costs ratio, Dr. Libicki said they couldn't obtain anything near that level of precision.  But he said they hoped to produce a reliable delineation of the major categories of costs and benefits, understand whether each was positive or negative, and gain a sense of their size and order of magnitude.

Dr. Libicki clarified what they weren't addressing.  He reviewed arcana on what constituted costs and benefits, and he presented a simplified view of the life cycle of coding information.  Citing Canada's new CCI coding system, he noted that automated coding was taking over.

Dr. Libicki discussed the codes themselves, noting that one clear feature of the switch was the extent to which one code mapped into another.  He pointed out that many benefits depended on ICD-10 being a more logical (or at least more detailed and specific) system.  He suggested that ICD-10 might lead to fewer coding and medical errors.  Dr. Libicki said another potential benefit arose from reducing payers' tendency to reject claims that didn't present enough information to judge their validity.

Overall potential costs included requirements for retraining, new software, initial perturbations in the claims process, and possible long run efficiency loss from dealing with codes embedding more information.  Overall benefits included more precise decision support systems and a better understanding of healthcare delivery outcomes.  Many benefits were potentially larger but difficult to estimate with the same degree of precision.

Discussion

Dr. Libicki predicted that, in the short term, there wouldn't be changes in DRGs based on changes in codes, but that post-transition the rate of change in DRGs would accelerate to accommodate them.  The Committee discussed the probability that many codes might never be used.

Enforcement Rule

Dr. Cohn said the Department had issued an interim enforcement rule that would expire on September 16, 2004.  The Subcommittee decided to remain silent on this issue.

Discussion

The Subcommittee discussed several topics for the next full Committee meeting on June 24-25, including a letter related to their PMRI testimony, early recommendations and communications with the Secretary, a brief update on the claims attachment rule and changes to that standard, the annual HIPAA report, work underway on the NCVHS Web site, and the ASCA report.  Thanking everyone for his or her hard work, Dr. Cohn adjourned the meeting at 2:47 p.m.

                                      I hereby certify that, to the best of my knowledge, the foregoing
                                      summary of minutes is accurate and complete.

                                               

                                              /s/                                 June 10, 04

                                     ______________________________________________________

                                         Chair                                  Date