Structured Reporting in Radiology
Curtis P. Langlotz, M.D., Ph.D., eDictation, Inc. and the University of Pennsylvania
Article originally appeared in the Society for Health Service Research in
Radiology, Winter 2000 Newsletter. Appears here with the author's permission.
As many of you know, I made a career change in 1998 to pursue my interests in
structured reporting and informatics. Although I have retained adjunct
appointments at the University of Pennsylvania, I now spend most of my time
outside of a University setting. My current work focuses on a structured
reporting software business and on consultation with the National Cancer
Institute on issues related to informatics, clinical trials, and terminology
development. I have been asked often why I decided to venture out of the
academic cocoon, so I thought I would use the SHSRR newsletter to summarize
the health services research issues that motivated my decision, and to provide
an overview of structured reporting. As you will see, I have not ventured as
far from academia as you might think: there are many intriguing academic
issues related to structured reporting.
Structured reporting is a term that encompasses a wide variety of data entry
and report generation techniques. Structured reporting software generally
permits the use of pre-determined data elements or formats that allow
semantically-organized storage and indexing of report elements, often explicitly
linked to the image itself.
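To make the idea concrete, here is a minimal sketch in Python of a report assembled from pre-determined data elements rather than free text. The class and field names are invented for illustration only; they are not drawn from any actual product or standard.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Finding:
    anatomy: str                      # term drawn from a controlled lexicon
    observation: str                  # term drawn from a controlled lexicon
    image_link: Optional[str] = None  # optional pointer to the image itself

@dataclass
class StructuredReport:
    exam_type: str
    findings: List[Finding] = field(default_factory=list)

# Build a report from discrete, semantically organized elements.
report = StructuredReport(exam_type="Chest CT")
report.findings.append(
    Finding(anatomy="right upper lobe", observation="nodule",
            image_link="series 3, image 42")
)
print(report.findings[0].observation)  # nodule
```

Because every finding is stored as discrete fields rather than prose, the report can be indexed and queried directly, which is the property the narrative description above emphasizes.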
I became interested in the subject while conducting an evaluation of prostate
MRI with one of my research fellows at the University of Pennsylvania (see
Manzone, et al. The use of endorectal MR imaging to predict prostate carcinoma
recurrence after radical prostatectomy. Radiology 209:537-542, 1998.) Our study
used a general approach that is common to most health services research in
radiology: assessing the relationship between the results of imaging
examinations and the outcomes of patients. Because prostate cancer is an
indolent disease for which outcomes are slow in coming, and because we had been
performing prostate MRI at Penn for several years, we decided to begin with a
retrospective cohort study, with review of imaging reports and correlation to
biochemical PSA recurrence.
Unfortunately, we found that many of the prostate MRI reports were at best
ambiguous, and at worst internally inconsistent. (At the time, this was a
surprising notion, but my subsequent literature review, summarized below,
indicates that it is a common occurrence.) In some cases we examined the images
themselves simply to resolve our own uncertainty about the conclusions of the
report. I have encountered this same difficulty with retrospective report
reviews several times since then.
Around the same time, I was asked to lead a discussion group on outcomes
research for junior faculty attending the Picker Faculty Development Program at
the Association for University Radiologists. One of the attendees asked me what
single thing I would do if I were a new junior radiology faculty member
interested in health services research. My answer was to suggest building a
database that captures all cases related to the area of clinical interest (e.g.,
shoulder MRI, interventional neuroradiology, CT angiography). The database then
could become a local resource for exploratory data analysis, pilot studies, and
other preliminary research. Later, I began to reflect on the inefficiency of
multiple individual researchers, many studying related topics, each designing
local databases to capture information for use in later research. Each
researcher would expend a great deal of energy capturing the information in
structured form for their database, either by asking their clinical colleagues
to fill out a form for each relevant case or by doing the clerical work
themselves. But each database likely would employ inclusion criteria, data
collection forms, and data definitions that were inconsistent with other
databases in ways that could not be resolved easily later.
Interestingly, the National Cancer Institute (NCI) has recognized a similar data
consistency problem among the cooperative research groups it funds, and is
supporting a terminology development effort called "common data
elements" (CDEs) that creates consistent data collection methods across NCI
funded trials. One of my roles at NCI is to coordinate the development of these
CDEs for imaging clinical trials. Our first effort, related to chest CT
screening trials, will soon be available for download from the NCI's informatics
web site (listed below under related links).
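The core idea behind CDEs can be illustrated with a fabricated example: each data element has one agreed name and one agreed value set, so data recorded at different sites can be pooled. The element and values below are invented for illustration; they are not actual NCI CDEs.

```python
# A fabricated "common data element": one agreed name, one agreed value set.
NODULE_MARGIN = {
    "name": "nodule_margin",
    "allowed_values": {"smooth", "lobulated", "spiculated"},
}

def conforms(element, value):
    # A recorded value conforms only if it comes from the agreed value set.
    return value in element["allowed_values"]

print(conforms(NODULE_MARGIN, "spiculated"))     # True
print(conforms(NODULE_MARGIN, "irregular-ish"))  # False
```

A site that records "irregular-ish" instead of a term from the agreed set produces data that cannot be merged with other trials, which is exactly the consistency problem the CDE effort addresses.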
Because I had seen the weaknesses of conventional radiology reports in my health
services research, I began to feel that the existing radiology reporting process
was broken. But I recognized how difficult it would be to create and maintain a
research data collection system in parallel with the existing conventional
reporting process. So I imagined a software system for use by a clinical
radiologist that would reduce reporting time, save reporting expenses, increase
report clarity, and as a by-product, produce a structured database for use by
health services researchers and others interested in improving radiology
practice. This interest in re-engineering the radiology reporting process led me
to form a company called eDict Systems, which provides structured reporting
software to radiology practices. The software allows radiologists to create
multimedia reports efficiently for internet distribution, while at the same time
creating a structured database that uses a single consistent terminology.
Before embarking on the project, I analyzed in detail the many structured
reporting systems that have been developed in the past, dating back to the
1960s. I found that most were hampered by the limited hardware and software
technology available at the time and by the absence of imperatives to measure
imaging outcomes and improve imaging practices. We are fortunate today to have a
number of exciting new programming environments that can quite rapidly create
rich and dynamic visual interfaces. And, there is a renewed interest in
assessing and improving the quality of our work as radiologists.
I have provided here a summary of my background research, including a list of
the weaknesses of conventional reporting together with lists of the strengths
and weaknesses of structured reporting. I have also included a condensed
bibliography covering structured reporting and several related topics. The web
site created by the laboratory of another SHSRR member, Charles E. Kahn, Jr.,
M.D., provides another excellent summary of structured reporting. (See the list
of related links at the end of this article.)
As you will see, there are many interesting informatics and health services
research questions whose answers have the potential to improve the quality of
our reporting process. These questions have only served to intensify my interest
in structured reporting. I look forward to hearing from you about structured
reporting or any other topic.
Disadvantages of Conventional Reporting
The following characteristics of dictation and transcription may affect the
cost and quality of care:
- Transcription services are costly.
Transcription services typically charge 15 cents per line, or between $2 and
$3 per page, corresponding to an average of about $3.50 per report, draining
about 5 percent of practice revenues.
- Preliminary reports are not promptly available.
Transcription is often performed in batches. Even when the transcription
service is fully staffed, it may not be able to absorb the peak volume of
dictated reports. These problems often delay the final report, thereby
frustrating the referring physician, delaying or preventing reimbursement
for the examination, and degrading practice efficiency.
- Dictation and transcription are error prone.
One study of 4871 radiology reports from the Brigham and Women's Hospital
(Boston, MA) found that 33.8% of reports required post-transcription editing
by radiologists prior to signature (Holman et al, Radiology
1994;191:519-521). Nearly 6% of the corrected errors were substantive, such
as errors of missing or incorrect information that would have led to
unnecessary treatment or testing, or that could have caused risk of
complications or morbidity for the patient.
- Report signature causes additional delays.
Even when preliminary (unedited) reports are available on a radiology
information system, they are often of limited utility to the referring
physician. At teaching institutions, these preliminary reports are usually
dictated by trainees, and have not yet been reviewed, edited, and signed by
the senior physician legally responsible for the report. This review is
typically performed in batch fashion, further delaying the final report.
- The needs of referring physicians are not met.
One survey found that 49% of referring physicians thought that chest X-ray
reports sometimes did not address the clinical question. Forty percent of
these physicians thought the reports were occasionally confusing. Another
systematic analysis of chest X-ray reports in 8,426 Medicare patients with
cardiac problems found as many as 14 different terms to describe a single
common abnormal finding ("interstitial edema/infiltrate"), and 23
synonyms for reporting the presence of a finding (Sobel et al. Acad Radiol
1996;3:709-717). The same study found that 14% of chest X-ray reports on
patients admitted for pneumonia failed to mention the presence or absence of
an infiltrate.
- A text report is not useful for subsequent research.
Even in the best of circumstances, when the transcribed and signed report is
available on a radiology information system, only an unstructured text
format is available. Text-based searches of those systems can be time
consuming, and have low accuracy in retrieving a desired subset of reports.
Because the semantic content is not stored or indexed in a
semantically-coherent fashion, decision support is not available at the time
of interpretation. Expensive and inaccurate post-processing or re-coding of
the data is required if subsequent analyses are needed.
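The retrieval problem described in the last item can be made concrete with a toy example. The report texts and the coded term below are invented for illustration, echoing the synonym problem documented by Sobel et al.: a substring search over free text misses reports that used a different synonym, while a coded store answers the same question with one query.

```python
# Invented free-text reports using two synonyms for the same finding.
free_text_reports = [
    "Interstitial edema is present.",
    "Findings consistent with interstitial infiltrate.",
    "No acute cardiopulmonary process.",
]

# A substring search for one synonym misses reports using another.
text_hits = [r for r in free_text_reports if "edema" in r.lower()]
print(len(text_hits))  # 1 -- the "infiltrate" report is missed

# If each report instead stored a coded finding, one query suffices.
structured_reports = [
    {"codes": {"interstitial_edema"}},
    {"codes": {"interstitial_edema"}},  # same concept, same code
    {"codes": set()},
]
coded_hits = [r for r in structured_reports
              if "interstitial_edema" in r["codes"]]
print(len(coded_hits))  # 2
```

The same coded store that fixes retrieval also makes decision support possible at interpretation time, since the finding is machine-readable the moment it is recorded.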
Potential Advantages of Structured Reporting
Structured reporting systems may provide some or all of the following
improvements over conventional reporting, depending on the usability of the
interface, and the degree to which reports are structured:
- Save time dictating.
Routine reports are created faster than with conventional dictation.
- Save time editing reports.
Computer-generated reports have fewer errors than speech-recognition output.
- Turn-around reports promptly.
Reports can be approved and sent at any time.
- Get help with tough cases.
Templates, gamuts, and other forms of real-time decision support are
available on demand.
- Complete, accurate, and appealing reports.
Referring physicians will appreciate clear, focused, multi-media reports.
- Cost savings.
Eliminates transcription expenses from the operating budget.
Disadvantages of Structured Reporting
Here are some of the key challenges that may slow the adoption of structured
reporting systems:
- Potential for increased "look-away time".
Some structured reporting systems, like some speech recognition systems, may
cause the radiologist to monitor the report as it is produced. This
additional task may reduce the amount of time the radiologist spends
examining the images. Although the literature regarding the relationship
between interpretation time and accuracy is variable, this increased
look-away time has the potential to affect accuracy.
- Converts a production task into a search task.
Structured reporting systems derive many of their benefits from the use of
consistent imaging terms. This requires radiologists who use structured
reporting systems to familiarize themselves with the preferred terms for
imaging findings. For unfamiliar findings, a radiologist must initiate a
search for the proper term, rather than improvising an approximate synonym.
- Imaging lexicons are not yet routinely available.
Although some imaging lexicons, such as the American College of Radiology's
Breast Imaging Reporting and Data System (BIRADS) lexicon, have been widely
adopted, and others are being developed for chest imaging and
cross-sectional breast imaging, the wide adoption of structured reporting
systems will require the availability of lexicons for all imaging
subspecialties.
- Difficulty in integrating the report with the image.
Most structured reporting systems, and the DICOM Structured Reporting
standard, provide for the creation of links between imaging findings and
locations on an image. However, efforts such as Integrating the Healthcare
Enterprise (IHE), sponsored by the Radiological Society of North America (RSNA)
and the Healthcare Information Management Systems Society (HIMSS), are still
in the early stages of making this integration routine.
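The kind of link between a finding and an image location that DICOM Structured Reporting provides can be sketched as follows. These classes are invented to show the shape of the idea; they are not the DICOM SR object model, and the UID shown is a made-up example.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ImageRegion:
    sop_instance_uid: str           # identifies the specific image
    points: List[Tuple[int, int]]   # (column, row) outline on that image

@dataclass
class LinkedFinding:
    term: str                       # coded finding drawn from a lexicon
    region: ImageRegion             # where on the image it was observed

# A finding explicitly tied to coordinates on one image.
finding = LinkedFinding(
    term="pulmonary nodule",
    region=ImageRegion("1.2.840.99999.1.42", [(118, 204), (131, 215)]),
)
print(finding.region.sop_instance_uid)  # 1.2.840.99999.1.42
```

Making such links routine is the integration work that efforts like IHE are only beginning to standardize.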
Structured Reporting Bibliography
Here is a brief listing of key articles describing structured reporting
systems and related research:
Report Signature Turn-Around
- Bluth E, Havrilla M, Blakeman C. Quality improvement techniques: Value to
improve the timeliness of preoperative chest radiographic reports. AJR
- Seltzer S, Kelly P, Adams D, Chiango B, Viero M, Fener E, Rondeau R,
Kazanjian N, Laffel G, Shaffer K, Williamson D, Aliabadi P, Gillis A, Holman
L. Expediting the turnaround of radiology reports: Use of total quality
management to facilitate radiologists' report signing. AJR 1994;162:775-781.
- Kong A, Barnett G, Mosteller F, Youtz C. How medical professionals
evaluate expressions of probability. New Engl J Med 1986;315:740-744.
- American College of Radiology. ACR Standard for Communication: Diagnostic
Radiology. Reston, VA: ACR, 1995.
Impact of Reporting Errors
- Holman B, Aliabadi P, Silverman S, Weissman B, Rudolph L, Fener E.
Medical impact of unedited preliminary radiology reports. Radiology
1994;191:519-521.
- Clinger N, Hunter T, Hillman B. Radiology reporting: Attitudes of referring
physicians. Radiology 1988;169:825-826.
- Sobel J, Pearson M, Gross K, Desmond K, Harrison E, Rubenstein L, Rogers
W, Kahn K. Information content and clarity of radiologists' reports for
chest radiography. Acad Radiol 1996;3:709-717.
Evaluations of Report Quality
- Kalbhen C, Yetter E, Olson M, Posniak H, Aranha G. Assessing the
resectability of pancreatic carcinoma: The value of reinterpreting abdominal
CT performed at other institutions. AJR 1998;171:1571-1576.
- Magen A, Langlotz C, Banner M, Orel S, Sullivan D, Birnbaum B, Ramchandani
P, Jacobs J. Interpretation of outside examinations: An undervalued service?
American Roentgen Ray Society. Boston, MA: ARRS, 1997.
- Elmore JG, Wells CK, Lee CH, Howard DH, Feinstein AR. Variability in
radiologists' interpretations of mammograms. New Engl J Med
Foundational Structured Reporting Research
- Gagliardi R. The evolution of the X-ray report. AJR 1995;164:501-502.
- Clayton P, Ostler D, Gennaro J, Beatty S, Frederick P. A radiology
reporting system based on most likely diagnoses. Comp Biomed Res
- Greenes R. OBUS: A microcomputer system for measurement, calculation,
reporting, and retrieval of obstetric ultrasound examinations. Radiology
- Greenes R, Barnett G, Klein S, Robbins A, Prior R. Recording, retrieval,
and review of medical data by physician computer interaction. New Engl J Med
- Pendergrass H, Greenes R, Barnett G, Poitras J, Pappalardo A, Marble C.
An on-line computer facility for systematized input of radiology reports.
Current Structured Reporting Systems
- Bell D, Greenes R, Doubilet P. Form-based clinical input from a structured
vocabulary: Initial application in ultrasound reporting. JAMIA
- Kahn C, Wang K, Bell D. Structured entry of radiology reports using
world-wide web technology. Radiographics 1996;16:683-691.
- Bell D, Pattison-Gordon E, Greenes R. Experiments in concept modeling for
radiographic image reports. JAMIA 1994;1:249-262.
Evaluation of Structured Reporting Systems
- Bell D, Greenes R. Evaluation of UltraSTAR: Performance of a collaborative
structured data entry system. JAMIA 1994;Symposium Supplement:216-222.
- Moorman P, van Ginneken A, Siersema P, van der Lei J, van Bemmel J.
Evaluation of reporting based on descriptional knowledge. JAMIA
- Gouveia-Oliveira A, Raposo V, Salgado N, Almeida I, Nobre-Leitao C, de
Melo F. Longitudinal comparative study: The influence of computers on
reporting of clinical data. Endoscopy 1991;23:334-7.
- Melson D, Brophy R, Blaine J, Jost R, Brink G. Impact of a voice
recognition system on report cycle time and radiologist reading time. In
Horii S, Blaine J, eds. Proceedings of Medical Imaging 1998: PACS Design and
Evaluation, Bellingham, WA: SPIE, 1998:226-236.
Imaging Lexicons
- National Cancer Institute. Dictionary of Common Data Elements. http://cii.nci.nih.gov/cde.
- D'Orsi C, Kopans D. American College of Radiology's mammography lexicon:
Barking up the only tree. AJR 1994; 162:595.
- American College of Radiology. Glossary of MR Terms. Reston, VA: ACR.
- Tuddenham W. Glossary of terms for thoracic radiology: Recommendations of
the Nomenclature Committee of the Fleischner Society. AJR 1984;143:509-517.
Imaging Decision Support
- Nishikawa RM, Doi K, Giger ML, Schmidt RA, Vyborny CJ, Monnier-Cholley L,
Papaioannou J, Lu P. Computerized detection of clustered microcalcifications:
Evaluation of performance on mammograms from multiple centers. Radiographics
- Swets JA, Getty DJ, Pickett RM, D'Orsi CJ, Seltzer SE, McNeil BJ.
Enhancing and evaluating diagnostic accuracy. Med Decis Making 1991;11:9-18.
- Poon A, Fagan L, Shortliffe E. The Pen-Ivory project: Exploring
user-interface design for the selection of items from large controlled
vocabularies of medicine. JAMIA 1996;3:168-183.
- Musen M, Weickert K, Miller F, Campbell K, Fagan L. Development of a
controlled medical terminology: Knowledge acquisition and knowledge
representation. Meth Inform Med 1995;34:85-95.