Article

Repeatability of manual coding of cancer reports in the South African National Cancer Registry, 2010

DOI: 10.1080/10158782.2013.11441539
Author(s): Nomathemba Dube (School of Health Systems and Public Health, University of Pretoria; South African Field Epidemiology and Laboratory Training Programme), Brendan Girdler-Brown (School of Health Systems and Public Health, University of Pretoria), Khin Tint (School of Health Systems and Public Health, University of Pretoria; South African Field Epidemiology and Laboratory Training Programme; National Institute for Communicable Diseases), Patricia Kellett (South African National Cancer Registry, National Institute for Occupational Health)

Abstract

Data validity is an essential aspect of cancer registries, underpinning data quality for research and interventions. This study evaluated the repeatability of manual coding of cancer reports at the South African National Cancer Registry (NCR). This cross-sectional study used the Delphi technique to classify 48 generic tumour sites into those most likely (“difficult”) and least likely (“not difficult”) to give rise to discordant results among coders. Reports received from the Charlotte Maxeke Academic Hospital were manually recoded by five coders (2 301 reports in total, i.e. approximately 460 reports each) for intra-coder agreement, and by four coders (400 reports) for inter-coder agreement. Unweighted kappa statistics were calculated and interpreted using Byrt’s criteria. After four rounds of the Delphi technique, consensus was reached on the classification of 91.7% (44/48) of the sites. The remaining four sites were classified according to modal expert opinion. The overall kappa was higher for intra-coder agreement (0.92) than for inter-coder agreement (0.89). “Not difficult” tumour sites showed better agreement than “difficult” tumour sites. Ten sites (skin other, basal cell carcinoma of the skin, connective tissue, other specified, lung, colorectal, prostate, oesophagus, naso-oropharynx and primary site unknown) were among the top 80% of misclassified sites. The repeatability of manual coding at the NCR was rated as “good” according to Byrt’s criteria. Misclassified sites should be prioritised for coder training and for strengthening the quality assurance system.
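The agreement measure reported above, the unweighted (Cohen’s) kappa, compares observed agreement between two codings against the agreement expected by chance from the coders’ marginal distributions. The sketch below is a minimal illustration, assuming each coder produced one tumour-site code per report; the site labels and the helper name `unweighted_kappa` are hypothetical and not taken from the study.

```python
# Minimal sketch of the unweighted (Cohen's) kappa used in the study,
# under the assumption that each coder assigns one site code per report.
from collections import Counter

def unweighted_kappa(codes_a, codes_b):
    """Cohen's unweighted kappa for two categorical codings of the same reports."""
    assert len(codes_a) == len(codes_b) and codes_a
    n = len(codes_a)
    # Observed proportion of agreement (p_o).
    p_o = sum(a == b for a, b in zip(codes_a, codes_b)) / n
    # Chance-expected agreement (p_e) from each coder's marginal frequencies.
    marg_a, marg_b = Counter(codes_a), Counter(codes_b)
    labels = set(codes_a) | set(codes_b)
    p_e = sum((marg_a[lab] / n) * (marg_b[lab] / n) for lab in labels)
    return (p_o - p_e) / (1 - p_e)

# Hypothetical site codes for five reports coded twice (e.g. intra-coder check).
coding_1 = ["lung", "prostate", "colorectal", "lung", "oesophagus"]
coding_2 = ["lung", "prostate", "colorectal", "skin other", "oesophagus"]
print(round(unweighted_kappa(coding_1, coding_2), 2))  # 0.75
```

The resulting kappa would then be mapped to Byrt’s qualitative bands (e.g. “good”); the exact cut-offs applied should be taken from Byrt (1996) rather than from this sketch.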
