Structured Reporting in Radiology

Curtis P. Langlotz, M.D., Ph.D. eDictation, Inc. and the University of Pennsylvania. langlotz@edictation.com.

Article originally appeared in the Society for Health Services Research in Radiology, Winter 2000 Newsletter. Appears here with author's permission.

Introduction

As many of you know, I made a career change in 1998 to pursue my interests in structured reporting and informatics. Although I have retained adjunct appointments at the University of Pennsylvania, I now spend most of my time outside of a university setting. My current work focuses on a structured reporting software business and on consultation with the National Cancer Institute on issues related to informatics, clinical trials, and terminology development. I have been asked often why I decided to venture out of the academic cocoon, so I thought I would use the SHSRR newsletter to summarize the health services research issues that motivated my decision, and to provide an overview of structured reporting. As you will see, I have not ventured as far as you might think from academia: there are many intriguing academic issues related to reporting.

Structured reporting is a term that encompasses a wide variety of data entry and report generation techniques. Structured reporting software generally permits the use of pre-determined data elements or formats that allow semantically-organized storage and indexing of report elements, often explicitly linked to the image itself.
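
To make the idea concrete, here is a minimal sketch of how such a report might be represented. This is an illustration only, not the format of any particular product; the vocabulary terms, field names, and helper functions are all hypothetical.

```python
# Hypothetical controlled vocabulary: every finding must use one of these
# pre-determined terms, so reports can be indexed and compared later.
VOCAB = {"extracapsular_extension", "seminal_vesicle_invasion", "normal_gland"}

def make_finding(code, present, image_ref=None):
    """Create one coded report element, optionally linked to an image location.

    Rejects free-text terms that fall outside the controlled vocabulary.
    """
    if code not in VOCAB:
        raise ValueError(f"'{code}' is not a controlled-vocabulary term")
    return {"code": code, "present": present, "image": image_ref}

# A structured report: coded data elements rather than narrative prose.
report = {
    "exam": "prostate MRI",
    "findings": [
        make_finding("extracapsular_extension", True, image_ref="series 4, image 12"),
        make_finding("seminal_vesicle_invasion", False),
    ],
}

def positive_for(rep, code):
    """Query a report for a given coded finding."""
    return any(f["code"] == code and f["present"] for f in rep["findings"])
```

Because every report draws on the same terminology, a researcher could run the same query (for example, `positive_for(report, "extracapsular_extension")`) across an entire archive of reports without re-reading the narrative text.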

I became interested in the subject while conducting an evaluation of prostate MRI with one of my research fellows at the University of Pennsylvania (see Manzone, et al. The use of endorectal MR imaging to predict prostate carcinoma recurrence after radical prostatectomy. Radiology 209:537-542, 1998.) Our study used a general approach that is common to most health services research in radiology: assessing the relationship between the results of imaging examinations and the outcomes of patients. Because prostate cancer is an indolent disease for which outcomes are slow in coming, and because we had been performing prostate MRI at Penn for several years, we decided to begin with a retrospective cohort study, with review of imaging reports and correlation to biochemical PSA recurrence.

Unfortunately, we found that many of the prostate MRI reports were at best ambiguous, and at worst internally inconsistent. (At the time, this was a surprising notion, but my subsequent literature review, summarized below, indicates that it is a common occurrence.) In some cases we examined the images themselves simply to resolve our own uncertainty about the conclusions of the report. I have encountered this same difficulty with retrospective report reviews several times since then.

Around the same time, I was asked to lead a discussion group on outcomes research for junior faculty attending the Picker Faculty Development Program at the Association of University Radiologists. One of the attendees asked me what single thing I would do if I were a new junior radiology faculty member interested in health services research. My answer was to suggest building a database that captures all cases related to the area of clinical interest (e.g., shoulder MRI, interventional neuroradiology, CT angiography). The database then could become a local resource for exploratory data analysis, pilot studies, and collaborative research.

Later, I began to reflect on the inefficiency of multiple individual researchers, many studying related topics, each designing local databases to capture information for use in later research. Each researcher would expend a great deal of energy capturing the information in structured form for their database, either by asking their clinical colleagues to fill out a form for each relevant case or by doing the clerical work themselves. But each database likely would employ inclusion criteria, data collection forms, and data definitions that were inconsistent with other databases in ways that could not be resolved easily later.

Interestingly, the National Cancer Institute (NCI) has recognized a similar data consistency problem among the cooperative research groups it funds, and is supporting a terminology development effort called "common data elements" (CDEs) that creates consistent data collection methods across NCI-funded trials. One of my roles at NCI is to coordinate the development of these CDEs for imaging clinical trials. Our first effort, related to chest CT screening trials, will soon be available for download from the NCI's informatics web site (listed below under related links).

Because I had seen the weaknesses of conventional radiology reports in my health services research, I began to feel that the existing radiology reporting process was broken. But I recognized how difficult it would be to create and maintain a research data collection system in parallel with the existing conventional reporting process. So I imagined a software system for use by a clinical radiologist that would reduce reporting time, save reporting expenses, increase report clarity, and, as a by-product, produce a structured database for use by health services researchers and others interested in improving radiology practice. This interest in re-engineering the radiology reporting process led me to form a company called eDict Systems, which provides structured reporting software to radiology practices. The software allows radiologists to create multimedia reports efficiently for internet distribution, while at the same time creating a structured database that uses a single consistent terminology.

Before embarking on the project, I analyzed in detail the many structured reporting systems that have been developed in the past, dating back to the 1960s. I found that most were hampered by the limited hardware and software technology available at the time and by the absence of imperatives to measure imaging outcomes and improve imaging practices. We are fortunate today to have a number of exciting new programming environments that can quite rapidly create rich and dynamic visual interfaces. And there is a renewed interest in assessing and improving the quality of our work as radiologists.

I have provided here a summary of my background research, including a list of the weaknesses of conventional reporting together with lists of the strengths and weaknesses of structured reporting. I have also included a condensed bibliography covering structured reporting and several related topics. The web site created by the laboratory of another SHSRR member, Charles E. Kahn, Jr., M.D., provides another excellent summary of structured reporting. (See the list of related links at the end of this article.)

As you will see, there are many interesting informatics and health services research questions whose answers have the potential to improve the quality of our reporting process. These questions have only served to intensify my interest in structured reporting. I look forward to hearing from you about structured reporting or any other topic.

Disadvantages of Conventional Reporting

The following characteristics of dictation and transcription may affect the cost and quality of care:

Potential Advantages of Structured Reporting

Structured reporting systems may provide some or all of the following improvements over conventional reporting, depending on the usability of the interface and the degree to which reports are structured:

Disadvantages of Structured Reporting

Here are some of the key challenges that may slow the adoption of structured reporting systems:

Structured Reporting Bibliography

Here is a brief listing of key articles describing structured reporting systems and related research:

Report Signature Turn-Around

  1. Bluth E, Havrilla M, Blakeman C. Quality improvement techniques: Value to improve the timeliness of preoperative chest radiographic reports. AJR 1993;160:995-998.
  2. Seltzer S, Kelly P, Adams D, Chiango B, Viero M, Fener E, Rondeau R, Kazanjian N, Laffel G, Shaffer K, Williamson D, Aliabadi P, Gillis A, Holman L. Expediting the turnaround of radiology reports: Use of total quality management to facilitate radiologists' report signing. AJR 1994;162:775-781.

Reporting Clarity

  1. Kong A, Barnett G, Mosteller F, Youtz C. How medical professionals evaluate expressions of probability. New Engl J Med 1986;315:740-744.
  2. American College of Radiology. ACR Standard for Communication: Diagnostic Radiology. Reston, VA: ACR, 1995.

Impact of Reporting Errors

  1. Holman B, Aliabadi P, Silverman S, Weissman B, Rudolph L, Fener E. Medical impact of unedited preliminary radiology reports. Radiology 1994;191:519-521.
  2. Clinger N, Hunter T, Hillman B. Radiology reporting: Attitudes of referring physicians. Radiology 1988;169:825-826.
  3. Sobel J, Pearson M, Gross K, Desmond K, Harrison E, Rubenstein L, Rogers W, Kahn K. Information content and clarity of radiologists' reports for chest radiography. Acad Radiol 1996;3:709-717.

Evaluations of Report Quality

  1. Kalbhen C, Yetter E, Olson M, Posniak H, Aranha G. Assessing the resectability of pancreatic carcinoma: The value of reinterpreting abdominal CT performed at other institutions. AJR 1998;171:1571-1576.
  2. Magen A, Langlotz C, Banner M, Orel S, Sullivan D, Birnbaum B, Ramchandani P, Jacobs J. Interpretation of outside examinations: An undervalued service? American Roentgen Ray Society. Boston, MA: ARRS, 1997.
  3. Elmore JG, Wells CK, Lee CH, Howard DH, Feinstein AR. Variability in radiologists' interpretations of mammograms. New Engl J Med 1994;331:1493-1499.

Foundational Structured Reporting Research

  1. Gagliardi R. The evolution of the X-ray report. AJR 1995;164:501-502.
  2. Clayton P, Ostler D, Gennaro J, Beatty S, Frederick P. A radiology reporting system based on most likely diagnoses. Comp Biomed Res 1980;13:258-270.
  3. Greenes R. OBUS: A microcomputer system for measurement, calculation, reporting, and retrieval of obstetric ultrasound examinations. Radiology 1982;144:879-883.
  4. Greenes R, Barnett G, Klein S, Robbins A, Prior R. Recording, retrieval, and review of medical data by physician-computer interaction. New Engl J Med 1970;282:307-315.
  5. Pendergrass H, Greenes R, Barnett G, Poitras J, Pappalardo A, Marble C. An on-line computer facility for systematized input of radiology reports. Radiology 1969;92:709-713.

Current Structured Reporting Systems

  1. Bell D, Greenes R, Doubilet P. Form-based clinical input from a structured vocabulary: Initial application in ultrasound reporting. JAMIA 1992;Symposium Supplement:789-91.
  2. Kahn C, Wang K, Bell D. Structured entry of radiology reports using world-wide web technology. Radiographics 1996;16:683-691.
  3. Bell D, Pattison-Gordon E, Greenes R. Experiments in concept modeling for radiographic image reports. JAMIA 1994;1:249-262.

Evaluation of Structured Reporting Systems

  1. Bell D, Greenes R. Evaluation of UltraSTAR: Performance of a collaborative structured data entry system. JAMIA 1994;Symposium Supplement:216-222.
  2. Moorman P, van Ginneken A, Siersema P, van der Lei J, van Bemmel J. Evaluation of reporting based on descriptional knowledge. JAMIA 1995;2:365-373.
  3. Gouveia-Oliveira A, Raposo V, Salgado N, Almeida I, Nobre-Leitao C, de Melo F. Longitudinal comparative study: The influence of computers on reporting of clinical data. Endoscopy 1991;23:334-7.
  4. Melson D, Brophy R, Blaine J, Jost R, Brink G. Impact of a voice recognition system on report cycle time and radiologist reading time. In Horii S, Blaine J, eds. Proceedings of Medical Imaging 1998: PACS Design and Evaluation. Bellingham, WA: SPIE, 1998:226-236.

Imaging Lexicons

  1. National Cancer Institute. Dictionary of Common Data Elements http://cii.nci.nih.gov/cde.
  2. D'Orsi C, Kopans D. American College of Radiology's mammography lexicon: Barking up the only tree. AJR 1994;162:595.
  3. American College of Radiology. Glossary of MR Terms. Reston, VA: ACR, 1995.
  4. Tuddenham W. Glossary of terms for thoracic radiology: Recommendations of the Nomenclature Committee of the Fleischner Society. AJR 1984;143:509-517.

Imaging Decision Support

  1. Nishikawa RM, Doi K, Giger ML, Schmidt RA, Vyborny CJ, Monnier-Cholley L, Papaioannou J, Lu P. Computerized detection of clustered microcalcifications: Evaluation of performance on mammograms from multiple centers. Radiographics 1995; 15:443-452.
  2. Swets JA, Getty DJ, Pickett RM, D'Orsi CJ, Seltzer SE, McNeil BJ. Enhancing and evaluating diagnostic accuracy. Med Decis Making 1991;11:9-18.
  3. Poon A, Fagan L, Shortliffe E. The PEN-Ivory project: Exploring user-interface design for the selection of items from large controlled vocabularies of medicine. JAMIA 1996;3:168-183.
  4. Musen M, Weickert K, Miller F, Campbell K, Fagan L. Development of a controlled medical terminology: Knowledge acquisition and knowledge representation. Meth Inf Med 1995;34:85-95.