Articles from June 2019


Multiparametric serological testing in autoimmune encephalitis

The spectrum of autoantibodies associated with autoimmune encephalitis has expanded rapidly in recent years, and many new disease subtypes have been defined. The autoantibodies in autoimmune encephalitis are directed against neuronal cell-surface and synaptic antigens. This contrasts with the autoantibodies found in classic paraneoplastic neurological syndromes, which target intracellular antigens. Autoimmune encephalitides generally respond well to immunotherapy. However, treatment must be started promptly to prevent irreversible damage to the brain. The detection of anti-neuronal autoantibodies is a linchpin of diagnosis. Recombinant cell-based immunofluorescence assays enable sensitive monospecific determination of the most important autoantibodies. Since many of the autoantibodies are rare, multiparametric testing is favoured over single-parameter testing to minimise diagnostic gaps.

Autoimmune Encephalitis Syndromes

Autoimmune encephalitis typically manifests with seizures and neuropsychiatric symptoms and may occur with or without cancer. The most frequent and best characterised type is anti-N-methyl-D-aspartate (NMDA) receptor encephalitis. The target antigen of the autoantibodies is the membrane-spanning channel subunit NR1 of the NMDA receptor. Symptoms of anti-NMDA receptor encephalitis encompass psychosis, catatonia, seizures, dyskinesia, autonomic dysfunction and decreased consciousness. It is most common in young adults, and approximately 40-50 per cent of patients present with a neoplasm, predominantly ovarian teratoma.

Other forms of autoimmune encephalitis are linked to autoantibodies against a multitude of further antigens. LGI1 and CASPR2 are specific target antigens of autoantibodies formerly thought to be directed against voltage-gated potassium channels. Anti-LGI1 reactivity is tightly associated with limbic encephalitis, with tumours occurring in 5-10 per cent of cases. Anti-CASPR2 autoantibodies have been described mostly in patients with encephalitis and/or peripheral nerve dysfunction (Morvan's syndrome); 20-50 per cent of cases are linked to thymoma. Anti-DPPX encephalitis is a multifocal neurological disorder with prominent hyperexcitability of the central nervous system and a rare (<10 per cent) association with lymphoma. Autoimmune encephalitis with reactivity against GABAB receptors is characterised by very prominent seizures, memory loss and confusion. Neoplasms, especially small-cell lung carcinoma, occur in about half of patients. Patients with anti-AMPA receptor encephalitis commonly exhibit subacute confusion, memory deficits, seizures and sometimes dementia. Seventy per cent of cases are paraneoplastic, with tumours affecting the lungs, thymus and breast.

There is evidence that autoantibodies in autoimmune encephalitis play a direct pathogenic role through antibody-driven inflammation and/or functional manipulation of the target antigen, which results in impairment of synaptic signal transduction. This is supported by the fact that these disorders can be effectively treated by immunotherapy.

Diagnostic strategy

The diagnosis of autoimmune encephalitis is based on clinical characteristics, magnetic resonance imaging, electroencephalography, cerebrospinal fluid (CSF) analysis and detection of anti-neuronal autoantibodies in CSF and/or serum. Infectious encephalitis and other autoimmune neurological disorders should be taken into account in differential diagnostics. Particularly in unexplained neurological cases, autoantibody screening can secure a diagnosis and may provide the first indication of a tumour. Broad antibody testing based on the most frequently targeted neuronal antigens helps to rapidly discriminate between different types of autoimmune encephalitis with overlapping pathology, especially limbic encephalitis.

Neuronal autoantibody detection

Autoantibodies against neuronal cell-surface antigens can be detected in serum or CSF by indirect immunofluorescence assays (IFA) using neuronal tissue sections, recombinant cells (Figure 1) and cultured primary neuronal cells. Since the target structures of the antibodies are conformation-dependent and fragile, solid-phase detection methods such as ELISA or immunoblot are unsuitable. Positive reactions on the neuronal tissue and cultured cells give rise to characteristic staining patterns depending on the in situ localisation of the native antigens. These substrates are especially useful for detecting autoantibodies that have not yet been characterised.

Recombinant cell-based IFAs provide efficient monospecific detection of defined autoantibodies. The recombinant cells heterologously express individual antigens on their surface, generally at a higher concentration per cell than in the native tissue, which allows a more sensitive detection of the corresponding autoantibodies. To achieve maximal diagnostic performance, only the relevant epitopes of the autoantigens are expressed.

With biochip mosaic technology, different substrates can be variably combined in one reaction field and incubated in parallel (Figure 2). Thus, a comprehensive antibody profile can be established with one test run. The IFA Autoimmune Encephalitis Mosaic 6, for instance, provides six recombinant cell substrates for simultaneous detection of antibodies against NMDA receptors, AMPA 1/2 receptors, GABAB receptors, LGI1, CASPR2 and DPPX.

Advantages of multiparametric testing

Multiparametric autoantibody screening significantly increases the hit rate compared to single-parameter testing. This was demonstrated by a comprehensive analysis of antibody prevalences in cohorts of consecutive samples sent to a clinical immunological reference laboratory.

In a cohort of 2,716 serum/CSF samples, anti-neuronal IgG were found in 108 samples. Anti-NMDA receptor antibodies were by far the most prevalent finding (38 per cent), followed by anti-LGI1 (11 per cent) and anti-CASPR2 (11 per cent). In total, 67 per cent of the seropositive samples exhibited autoantibodies against neuronal surface antigens, while autoantibodies against classic paraneoplastic (intracellular) antigens were detected in only 35 per cent. In 31 per cent of the seropositive cases, the antibody finding did not correspond to the parameter requested in the analysis order but was discovered only on account of the broad multiparametric screening.

This effect was even stronger in a cohort of 16,741 samples. 2,353 samples were positive for at least one neuronal parameter; about half of them (52 per cent) revealed antibodies of class IgG. Approximately 11 per cent of the positive samples had been sent in with a monospecific request. Of these cases, 56 per cent revealed the requested antibody, in 5 per cent a second antibody was found in addition, and 49 per cent were positive for a parameter other than that requested. Thus, the increase in findings due to multiparametric testing amounted to 87 per cent. Again, the NMDA receptor was the most frequently targeted antigen of IgG antibodies (26 per cent). IgG antibodies against neuronal surface antigens were found around twice as frequently as those against classic intracellular antigens.
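
For readers who want to check the arithmetic behind the reported 87 per cent, one plausible reading of the figures is that the other-than-requested findings are set in relation to the findings the monospecific requests alone would have yielded; a minimal sketch under that assumption:

```python
# Figures from the 16,741-sample cohort described above.
positive_samples = 2353
monospecific = round(positive_samples * 0.11)  # ~259 samples with a single-parameter request

requested_found = 0.56   # share in which the requested antibody was found
other_found = 0.49       # share positive for a parameter other than that requested

# Additional findings relative to what monospecific testing alone would yield:
increase = other_found / requested_found
print(f"{increase:.0%}")  # ~88 per cent, consistent with the reported 87 per cent
```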

The high prevalence of autoantibodies against cell surface antigens found in these studies underscores the rising relevance of this novel class of neuronal antigens for autoimmune encephalitis diagnostics.

Automation of antibody evaluation

Due to the ever-increasing number of neuronal autoantibodies that need to be analysed, a platform for automated immunofluorescence evaluation was developed to accelerate the analytical process and reduce personnel workload. The EUROPattern system consists of an automated microscope, which takes focused images of the fluorescence patterns for display on the computer screen, and sophisticated classification software based on deep-learning AI for positive/negative interpretation of the recorded immunofluorescence signals. The cell nuclei are counter-stained with propidium iodide, which verifies correct performance of the incubation. This is important for negative samples, which do not show a fluorescence signal.

The high quality of the automatically acquired images was demonstrated by comparing on-screen appraisal with visual microscopy using 753 incubations of numerous serum samples. The two evaluation strategies revealed a concordance of 100 per cent with respect to positive/negative discrimination (excluding samples with ambiguous signals at the microscope to avoid inter-reader deviations). Computer-aided immunofluorescence microscopy thus considerably facilitates the microscopic analysis, supporting laboratory personnel in the rapid issuance of diagnostic findings.

Perspectives

The discovery in recent years of a myriad of new autoimmune encephalitis syndromes has led to a paradigm shift in the diagnosis and treatment of these neuropsychiatric disorders. Previously, they were frequently misclassified as psychiatric disorders, dementia or epilepsy, with patients often consigned to lifelong intensive psychiatric care. Nowadays, they can be effectively treated with immunotherapy and, if cancer is present, with tumour-directed therapy. Timely treatment often leads to substantial and even full recovery. Multiparametric tests based on recombinant cell technology enable fast detection of the disease-associated autoantibodies.

The assays encompass well-known parameters as well as autoantibodies that few clinicians are aware of, ensuring a broad-ranging analysis. Due to the rapid rate of discovery of new target autoantigens and syndromes, it is anticipated that the spectrum of autoantibody assays will continue to expand in the near future. This would enable the diagnostic net to be cast wider still, enabling more patients to benefit from life-saving immunotherapy.

References available on request.

Misclassification risks associated with interferences and assay quality

Errors caused by interferences are a small fraction of the overall lab test error rate, and errors caused by biotin interference are an even smaller fraction of that. While misclassification due to biotin interference is possible, findings from a risk assessment demonstrate the risk to be significantly lower than the risk from common sources of error that laboratories manage with expertise every day. Published clinical studies also show that the potential for result misclassification is associated with overall assay performance (e.g., clinical sensitivity and specificity, PPVs and NPVs). Laboratories should continue to have strict and appropriate protocols in place to reduce and prevent lab errors. There is also an opportunity for labs to take a leadership role in educating clinicians about analytical limitations and interferences, as well as ways they can partner to further optimise laboratory results and support clinical decision-making.

Role of interferences in overall laboratory error rate 

As laboratory medicine is increasingly considered a cornerstone of predictive medical decision-making, understanding and removing or reducing lab errors is paramount.

Among the common causes of test error are inadequate sample collection and transport, patient misidentification, inappropriate test requests and delays in reporting results that fall within critical limits. Interfering substances are also known sporadic causes of error. These can include heterophilic antibodies, HAMA, therapeutic drugs and high doses of certain supplements, including biotin (vitamin B7). However, the frequency with which interferences occur, and consequently produce erroneous results that may affect clinical management, is difficult to quantify.

Errors arising from interference account for a very small proportion of lab test errors. The overall lab test error rate is 0.012-0.6 per cent, and the proportion of these errors that occur in the analytical phase is 7-13 per cent. The analytical-phase error rate in relation to all lab tests is therefore at most about 0.078 per cent (Figure 1). Only a very small portion of these errors is due to interferences, and the potential for assay errors caused by biotin interference is lower still.
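
The 0.078 per cent figure follows from multiplying the upper bounds of the two ranges quoted above; a quick check:

```python
# Upper-bound estimate of the analytical-phase error rate per lab test.
overall_error_rate = 0.006   # 0.6 per cent of all lab tests are erroneous
analytical_share = 0.13      # 13 per cent of those errors are analytical

print(f"{overall_error_rate * analytical_share:.3%}")  # 0.078%
```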

Laboratory awareness and successful management of the common causes of errors have helped reduce the overall test error rate. Due to advances in automation and sample handling, fewer errors now occur in the analytical phase than in the pre- and post-analytical phases.

While all laboratory tests and all immunoassay manufacturers are affected by interferences, laboratories have an opportunity to educate clinicians and hospital administrators about steps undertaken to reduce lab errors from common sources including interferences, and measures in place to minimise risks.

Biotin interference risk in perspective

While the error rates caused by biotin interference are a fraction of the errors caused by assay interferences overall, biotin warrants a special awareness-raising effort.

For the last few years, supplement companies have been marketing biotin supplements for hair, skin and nail health in 2.5 mg, 5 mg and 10 mg doses, 167-333 times the adequate daily intake and the amount contained in a daily multivitamin, despite the lack of supportive clinical evidence. High-dose biotin is also currently being used in clinical trial settings as a potential treatment for patients with progressive multiple sclerosis.

There is no risk of assay interference associated with the intake of biotin as part of a standard multivitamin (typically 0.03-0.06 mg). Intake of high doses of biotin, however, has the potential to interfere with immunoassays.

U.S. biotin sales data over a four-year period (2014 to April 2018) indicate that sales have been trending slightly upwards. The steadiest growth has been in products containing doses of 2.5 mg biotin and under, which carry little to no risk of interference if taken according to the package insert. Sales of products containing a 5 mg dose declined.

While an increase in biotin supplement sales does not necessarily correlate with an increased risk of biotin interference, laboratories and clinicians should be aware of biotin use among their patients and proactively instruct patients to stop taking biotin before a blood draw if very high doses (e.g., 10 mg or more) are taken. If lab results seem discordant with clinical observation, clinicians should ask questions, consult with the laboratory, and repeat the test as necessary.

Assessment of misclassification risk due to biotin and other sources of variation

To get a clearer picture of the risk of misclassification due to interferences, Roche analysed and estimated the risk resulting from biotin interference for its Elecsys assays using a risk assessment model.

Risk was estimated by analysing the probability of occurrence in the context of the intended use, considering clinical practice and the severity of harm.

In general, this risk calculation includes "extrapolated" data for biotin prevalence (anticipating extreme values that have never been seen in prevalence studies so far), the biotin interference curve, and the real-world distribution of specific patient results derived from the intended-use population.

The risk of misclassification for most Roche assays was typically determined to be in the range of 1 in 10,000,000 or lower. This is approximately 1,000-10,000 times lower than the risks of other errors quantified in the literature, such as HAMA at 3-5 in 10,000, assay imprecision error at approximately 1 in 10,000 and biological variation error at up to approximately 1 in 1,000 (Figure 2).
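
Putting the quoted risk magnitudes side by side makes the orders of magnitude explicit; a small sketch (the HAMA figure below uses the midpoint of the quoted 3-5 range):

```python
# Risk figures as quoted in the text, expressed per test.
risks = {
    "biotin interference (Roche estimate)": 1 / 10_000_000,
    "HAMA": 4 / 10_000,            # midpoint of 3-5 in 10,000
    "assay imprecision": 1 / 10_000,
    "biological variation": 1 / 1_000,
}

biotin = risks["biotin interference (Roche estimate)"]
for name, risk in risks.items():
    print(f"{name}: {risk:.0e} ({risk / biotin:,.0f}x the biotin risk)")
```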

Biotin interference in troponin testing 

The probability of biotin interference in troponin assays was also evaluated. Studies show biotin peaks in the bloodstream 1-2 hours post-ingestion. Even if a dose of biotin were taken right before the onset of an acute coronary syndrome (ACS), by the time the patient reaches the ER (a mean of 3.5-6 hours after onset of symptoms), biotin plasma concentrations will have decreased significantly.

The misclassification risk for troponin is mitigated by the fast wash-out of biotin and the use of serial sampling in clinical practice, in accordance with American Heart Association (AHA) and European Society of Cardiology (ESC) guidelines (Figure 3). If an acute myocardial infarction (AMI) is suspected, AHA guidelines recommend baseline and serial troponin testing within 3-6 hours.

The risk of misclassifying a Troponin T test result due to biotin interference was assessed to be < 1 in 10 million in an acute coronary syndrome cohort.

Immunoassay overall quality performance and misclassification risk in cardiac troponin testing

Assay design contributes to its overall quality performance. Biotin has been part of the development of immunoassays for decades due to two primary reasons: a) streptavidin and biotin naturally form a strong, highly specific and stable bond; and b) the addition of biotin to antibodies or other proteins does not alter their biologic properties. This system results in a plethora of highly sensitive, specific and accurate immunoassays supporting informed clinical decision-making in general and emergency settings.

With troponin assays, for example, the risk of a missed AMI is linked to the sensitivity and negative predictive value (NPV) of the test. Clinical studies demonstrate a greater risk of misclassification for hs-cTnT assays arising from overall assay performance (Figure 4).

The risk of misclassifying a patient due to biotin interference (<1 in 10,000,000) is a fraction of the risk due to overall assay performance (6-15 in 1,000). 

Adverse events also affect the potential for misclassification risk (Figure 5). Data show that the Elecsys cTnT-hs (Roche) had 5 adverse events, versus 90 for the Abbott hs-cTnI and 806 for Beckman Coulter. The goal for future developments should be to further minimise all kinds of potential interferences.

Opportunity for lab leadership 

Protecting patient safety is global healthcare's highest priority, and clinical laboratories can partner with clinicians to increase understanding of potential interferences and misclassification risk, as well as strategies for mitigating risks and educating patients.

Clinical labs should use assays with high overall quality performance and reliability, especially in critical settings. Immunoassays should meet the highest available standards of quality and innovation across the criteria that matter most, including longitudinal precision and accuracy, sensitivity, specificity, lot-to-lot consistency and adverse events.

With clear facts and data, laboratories can confidently take a leadership role on the issue of interferences by educating clinicians about interferences, including biotin, and about how labs and clinicians can work together to reduce risks and optimise lab results.

References available on request.  

Fostering excellence

In an interview with Medlab Magazine, Patrick E. Godbey, MD, FCAP, President-elect, College of American Pathologists (CAP), Founder, CEO, and Laboratory Director, Southeastern Pathology Associates, Brunswick, Georgia, U.S., shares some key insights on the current trends transforming the laboratory.

What has been the impact of advancements, such as point-of-care-testing (POCT), big data, automation, and personalised medicine, on the industry?

The CAP is accustomed to advancement and change in an ever-evolving healthcare environment. The impact for the CAP is a continued diligence to monitor how these changes affect laboratories and patients. Not only do we monitor, we also lead and participate in these changes to ensure such advancements do not impede the delivery of high-quality laboratory medicine for the patients we serve. We have dedicated CAP-member pathologists who serve tirelessly on a myriad of committees that address any changes in laboratory medicine. Whether POCT, personalised medicine, automation, or big data, the CAP will ensure its external quality assurance and accreditation programmes reflect current-day laboratory medicine as well as support laboratory professionals in their day-to-day practice.

Has the regional laboratory industry been quick to adopt these trends? 

Rapid change has become a defining feature of pathology. Along with growing regulatory requirements in many markets and the increased need for laboratory services, other factors are driving change in the laboratory market. These factors include the rising prevalence of chronic conditions, such as obesity and diabetes. The resulting increase in routine testing is likely to lead to even greater automation.

How has the industry evolved in the Middle East?

The Middle East and North Africa (MENA) region is seeing an upsurge of new testing methods to support initiatives such as personalised medicine and point-of-care testing. In addition, there is government pressure to see rapid development as investment and demand for healthcare in the Middle East grow. The region is also experiencing new concepts of medical practice characterised by prediction, personalisation, prevention, and patient participation.

There is an increasing demand to reduce costs, which results in pressure to use fewer resources. As in other areas of healthcare, industry consolidation is prevalent in laboratories with the goal of achieving increased economies of scale.

POCT in HIV management: From diagnosis to monitoring

In Indonesia, an estimated 3,200 children were newly infected with HIV through mother-to-child transmission. Since 2010, new HIV infections have decreased by 22 per cent, while AIDS-related deaths have increased by 68 per cent. The implementation of HIV management was expanded in 2016 to 34 provinces, covering comprehensive HIV and sexually transmitted infection (STI) services, TB-HIV collaboration and prevention of mother-to-child transmission (PMTCT), among others.

Laboratory support is an essential part of diagnosing, preventing and controlling HIV/AIDS, and a wide variety of tests is needed in HIV disease management. Investment in laboratories is an investment in health. Quality laboratory services are crucial at every step in responding to epidemics, from accurate testing to effective treatment, care and monitoring of disease, and prevention of new infections.

HIV testing services are considered the gateway to treatment facilities, and the challenge of improving access to HIV testing while ensuring subsequent linkage to care still needs to be solved. From a public health perspective, it is advisable to recommend testing to those at risk for HIV and to make testing easily accessible. The idea is to detect every HIV-positive individual, whether they belong to a high-risk group or are a pregnant woman or a patient with tuberculosis or a reproductive tract infection approaching the health system for other needs, and to refer them to the nearest antiretroviral therapy (ART) centre. Providing quality laboratory services for HIV testing to all those who need it is a challenging task.

HIV diagnostic tests can detect either virus molecules (HIV RNA and p24 antigen) or antibodies to the virus. HIV diagnostic testing in Indonesia follows the WHO recommendation of a serial algorithm with three different methods to increase the accuracy of the tests; HIV rapid tests are used in adults in serial or parallel algorithms using one to three different assays.

CD4 testing has been used for assessing immune status and establishing eligibility for treatment and care. Treatment eligibility thresholds have changed over time, from 200 cells/μl in 2002 to 350 cells/μl in 2010. More recently, the consolidated WHO recommendations suggest initiation at CD4 counts of <500 cells/μl, and universal access and test-and-treat strategies are also being recommended. The CD4 count has also been used for regular monitoring of immunological recovery on treatment, generally at six-monthly intervals. CD4 cell counts shed light on the status of the immune system and can help physicians predict and prevent the development of opportunistic infections.

The HIV viral load (VL) assay, a nucleic-acid-based test, is used to monitor response to treatment; an undetectable viral load defines treatment success. HIV viral load tests provide a picture of viral activity. VL testing is frequently done in centralised facilities and currently requires expensive instrumentation and technical skill and has relatively high costs per assay. Despite these challenges, this assay has gained its rightful place in guidelines and clinical practice and is thought to be the most reliable marker for treatment success.

Expanding scope

Point-of-care testing is an old approach that has been around for decades and remains as controversial today as when it was first introduced. POCT refers to testing performed near to or at the site of patient care, with the result leading to a possible or immediate change in patient care. HIV-related point-of-care testing technologies have become widely available in the last few years through implementation and scale-up. Point-of-care testing provides an opportunity to greatly reduce test turnaround time and to increase the availability of testing, expanding its scope and coverage beyond urban centres to reach rural populations.

Point-of-care testing of HIV refers to testing undertaken by healthcare professionals outside a designated laboratory. The standard methods of HIV testing, enzyme-linked immunosorbent assay (ELISA) or Western blot with confirmatory testing, can take several days to yield results, and a significant proportion of individuals who agree to undergo HIV serologic testing do not return to the testing site to receive their results. POC testing of HIV addresses this delay in detecting HIV status by providing preliminary antibody results. POC tests can be most useful in resource-limited settings (RLS) or outreach settings, where a lack of well-trained laboratory technicians, poor physical infrastructure, extremes of climate, and the lack of an uninterrupted power supply all impact the use of laboratory technologies. Point-of-care testing has been shown to reduce patient loss to follow-up and to increase access to antiretroviral therapy.

Rapid test devices (RTDs) are typically capillary-flow tests for use on whole blood (e.g., fingerprick), plasma or oral fluid. HIV test kits are designed for HIV antibody and antigen detection. Results can be delivered within 20 minutes of the specimen being taken and are therefore available within a single consultation. The rapid turnaround time can guide urgent decision-making, making RTDs suitable for targeted clinical scenarios where the immediate administration of antiretroviral drugs is recommended to reduce the risk of transmission, or where the patient's management may be altered by the availability of a reactive test result.

Access to antiretroviral therapy (ART) has increased dramatically over the past decade in Indonesia. Successful management of HIV requires patients receiving ART to be monitored routinely to assess treatment efficacy and detect treatment failure due to drug resistance. The standard of care for monitoring ART is quantitative viral load testing based on the plasma HIV RNA concentration. A POC test for CD4 count could help clinicians decide when to start antiretroviral treatment, and a POC test for viral load would be of great value in identifying treatment failure and the need for second-line treatment. These devices could rapidly and accurately determine CD4 counts with minimal operator training and infrastructural setup, and at a lower cost than standard laboratory-based equipment. Several CD4 POCT analysers can provide a CD4 count within 20 minutes from a fingerstick or venous sample.

HIV viral load testing is the most accurate method to evaluate the response to ART in HIV-infected patients. It is considered the gold standard for monitoring treatment response and predicting the clinical prognosis of patients on ART. The threshold used to detect ART failure is 1,000 virus copies/mL. The nucleic-acid-based tests detect and measure viral RNA. Several POC devices for viral load are currently available or in the pipeline.
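
For illustration only, the two monitoring thresholds cited in this article translate into a very simple decision sketch (not a clinical algorithm; in practice, guidelines require a confirmatory repeat viral load before a regimen is switched):

```python
VL_FAILURE_COPIES_PER_ML = 1000     # virological failure threshold cited above
CD4_INITIATION_CELLS_PER_UL = 500   # consolidated WHO initiation threshold

def flags_treatment_failure(viral_load_copies_per_ml: float) -> bool:
    """Flag possible ART failure; a confirmatory repeat test is still required."""
    return viral_load_copies_per_ml >= VL_FAILURE_COPIES_PER_ML

def cd4_eligible_for_art(cd4_cells_per_ul: float) -> bool:
    """CD4-based eligibility per the <500 cells/ul recommendation."""
    return cd4_cells_per_ul < CD4_INITIATION_CELLS_PER_UL

print(flags_treatment_failure(1500), cd4_eligible_for_art(420))  # True True
```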

Diagnosing HIV infection in infants early is very beneficial in reducing infant mortality due to HIV infection. The maternal antibodies present in infants make antibody tests unsuitable for diagnosing HIV in infants younger than 18 months. Early infant diagnosis instead requires the detection of viral components, which can include the viral capsid p24 antigen, cell-free RNA or viral DNA integrated into host cells. Current p24 immunoassays are more sensitive than earlier ones. However, similar to nucleic acid amplification-based assays, p24 immunoassays require specialised equipment and skilled technicians. To perform these tests in resource-limited settings, early infant diagnosis rapid tests are required.

In Indonesia, POCT in HIV management has been used for several years. HIV rapid testing and POCT for CD4 and VL have been used in all 34 provinces. HIV rapid testing for initial screening is used in primary healthcare, while CD4 and VL POCT are placed in district hospitals or laboratories to cover testing referrals for the whole area.

Rapidly scaling up point-of-care testing without simultaneously building quality assurance capacity is counterproductive if service providers are unable to guarantee adherence to minimum quality standards with accurate and reliable results. We found several challenges in implementing POC testing. In contrast to standard laboratory-based HIV testing, healthcare workers in the POC setting assume responsibility for specimen collection, testing and counselling of the patient. Appropriate training on the use of kits, reading of results, detection of errors, quality assurance and counselling, together with regular assessment of the staff who will perform POC testing, is required.

RTDs are generally satisfactory for detecting uncomplicated HIV infection (or its absence) but are less sensitive than lab-based ELISAs and automated systems for detecting early infections (seroconversion). The specificity of RTDs is also lower than that of conventional ELISAs, although it can be improved by immediately repeating all RTD positives.
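
The benefit of repeating all RTD positives can be illustrated with a simplified calculation, assuming (as an idealisation) that the two runs err independently and using a hypothetical single-run specificity:

```python
single_run_specificity = 0.98                        # hypothetical value, for illustration
fp_after_repeat = (1 - single_run_specificity) ** 2  # both runs must give a false positive
print(f"{1 - fp_after_repeat:.4f}")                  # combined specificity: 0.9996
```

In practice the two errors are not fully independent (the same interfering factor may affect both runs), so the real gain is smaller, but the direction of the effect holds.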

A variety of rapid tests for diagnosing HIV in resource-limited settings are commercially available. All approved HIV POC tests should have a sensitivity and specificity matching the ELISA kits used in laboratories. POC devices have built-in internal quality controls to monitor the performance of an instrument, and to some extent the user, in real time. External quality assessment (EQA), a term often used interchangeably with proficiency testing, challenges the testing environment and allows an external expert body to examine the processes and provide remedial action. The three components of an effective EQA laboratory programme are site supervision, retesting of specimens and proficiency testing. Proficiency testing is the most widely used approach to monitor the performance of the test used and the quality of testing. It involves blinded control material, meant to mimic patient specimens, sent from a reference laboratory to the testing site, with results sent back for scoring.

Indonesia’s Ministry of Health currently provides EQA for rapid HIV tests, organised by provincial reference laboratories, while CD4 and VL EQA are organised centrally. CD4 EQA is provided by COE CD4 EQAS (Thailand) and QASI (Canada), and VL EQA by NRL (Australia).

Conclusion

POCT will improve access to needed HIV and associated diagnostics, but these assays are not without limitations, which should be noted and reported. There is a need to integrate these technologies cost-effectively and efficiently into clinical algorithms and existing laboratory networks.

Can thalassemia be eradicated by proper screening?

The alpha- and beta-thalassemia genes occur with variable frequency in different regions of the Kingdom of Saudi Arabia, and both β+ and β0 thalassemia have been reported. A screening programme for hemoglobinopathies was first implemented by Aramco Hospital in 1980, followed by university and military hospitals.

Treatment of thalassemia major is complex and expensive and requires a multi-disciplinary approach. Disease complications are common due to excessive iron overload and sub-optimal chelation. Cardiac complications constitute the leading cause of death, followed by infection and endocrine dysfunction. Iron toxicity is a crucial factor in tissue damage in iron-overloaded patients, making an accurate assessment of body iron essential.

The availability of cardiac magnetic resonance imaging T2* has reportedly improved survival among thalassemia patients through early detection of iron overload in the heart and liver; early detection of iron overload led to earlier treatment.

T2*-weighted MRI is a well-validated predictor of both cardiac and hepatic iron content and is superior to other non-invasive methods such as measurement of serum ferritin. Several methods of estimating T2* values are possible, but the most common technique is to acquire a series of breath-hold gradient-echo (GRE) images at progressively increasing echo times (TEs).
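
Conceptually, the T2* value is obtained by fitting a mono-exponential decay, S(TE) = S0 · exp(-TE/T2*), to the signal measured at each echo time. A minimal sketch with synthetic data (the echo times and noise level below are illustrative, not an acquisition protocol):

```python
import numpy as np
from scipy.optimize import curve_fit

def signal(te_ms, s0, t2star_ms):
    # Mono-exponential GRE signal decay model.
    return s0 * np.exp(-te_ms / t2star_ms)

te_ms = np.array([2.0, 4.0, 6.0, 9.0, 12.0, 15.0, 18.0])   # echo times (ms)
measured = signal(te_ms, 100.0, 20.0) + np.random.default_rng(0).normal(0, 1.0, te_ms.size)

(s0_fit, t2star_fit), _ = curve_fit(signal, te_ms, measured, p0=(90.0, 15.0))
print(f"T2* ~ {t2star_fit:.1f} ms")   # a cardiac T2* below ~20 ms suggests iron loading
```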

The second cause of death in our study was infection in splenectomised patients; hydroxyurea has been an effective therapy for avoiding splenectomy in thalassemia disorders.

Comprehensive care through a multi-disciplinary approach, with early detection of disease complications and treatment of the underlying cause, improves the survival of thalassaemic patients.

Following a Royal decree in December 2003, the pre-marital screening programme was implemented in February 2004 by the Saudi Ministry of Health in all healthcare regions and centres of the Kingdom. Six years of pre-marital screening in Saudi Arabia markedly reduced the number of at-risk marriages, which may considerably reduce the genetic disease burden in the Kingdom over the next decades.

Adherence to comprehensive care, with early detection and recognition of disease complications through advanced diagnostic technology, proper treatment and stem cell transplantation, is the key to controlling disease morbidity and mortality and improving survival and quality of life among patients. Therefore, it can be said that thalassemia disorders can be eradicated by good screening and disease control.

References available on request.

Diagnosis and serotyping of dengue virus infection at/near points of care

Cases of dengue have increased almost fourfold from 2000 to 2013, and 30-fold over the last five decades, notably with more frequent, larger outbreaks and more severe cases. According to World Health Organization (WHO) estimates, dengue is endemic in 128 countries, with 3.9 billion people at risk per year. Among them, around 96 million may manifest clinically, and approximately 500,000 people per year may require hospitalisation. In 2018, an increase in recognised dengue cases was reported in India, the Philippines and Singapore.

Dengue virus (DENV), the etiological agent of dengue, is a member of the Flavivirus genus in the Flaviviridae family, which also includes Yellow fever virus and Japanese encephalitis virus. DENV infection causes a wide range of clinical presentations in humans, from asymptomatic infection to acute febrile illness (dengue fever, DF) to severe dengue haemorrhagic fever/dengue shock syndrome (DHF/DSS). Although the overall mortality associated with dengue is low, the burden it places on health services is significant. Early diagnosis of dengue can help doctors select appropriate measures to reduce morbidity and mortality. High viremia has been associated with the development of severe dengue, suggesting that early administration of antivirals in the course of illness could help lessen severity. However, diagnosis of DENV infection cannot rely solely on clinical signs and symptoms, since the majority of infected individuals are either asymptomatic or present with symptoms similar to those of other febrile illnesses.

Current methods to aid diagnosis of DENV infection include antigen detection (virus isolation, immunofluorescence assay, NS1 immunological test), serological assays (plaque reduction neutralisation test (PRNT), IgM/IgG immunological test) and RNA detection (reverse transcription-polymerase chain reaction, RT-PCR). Virus isolation and PRNT are options only for well-equipped laboratories with well-trained staff. NS1 antigen detection and/or serological methods are the most commonly used diagnostic methods. The NS1 enzyme-linked immunosorbent assay (ELISA) can detect circulating NS1 antigen for up to nine days after symptom onset. IgM appears later but lasts longer than NS1 and viral RNA during primary infection; the IgM ELISA has been widely used in public healthcare centres, and interpretation of the results can be enhanced by a rise in titre when testing paired samples from the acute and convalescent phases. However, cross-reactivity to other flaviviruses such as Zika virus remains a concern.

Nucleic acid tests (NATs) generally have greater sensitivity and specificity than antigen-based and serological methods. Using NATs, e.g. real-time RT-PCR, to detect DENV RNA during the first five to six days after symptom onset is recommended by the WHO as an aid for laboratory confirmation of DENV infection. NATs have gradually replaced virus isolation for identifying acutely DENV-infected patients.

Higher prevalence

Dengue disease is caused mainly by four antigenically related serotypes of DENV, namely DENV-1, -2, -3 and -4; differences of about 50-70 per cent in sequence identity are found among their genomic RNAs. Multiple virus serotypes have been found co-circulating in several hyperendemic regions. Numerous studies have reported that some serotypes are more often associated with severe disease or specific clinical manifestations than others. For example, severe disease was associated more with DENV-2 than other serotypes in children in several studies, and more with DENV-1 in adults in a Hong Kong study. A study in western South America found a higher prevalence of musculoskeletal and gastrointestinal manifestations in DENV-3-infected patients and a higher prevalence of cutaneous and respiratory manifestations in DENV-4-infected patients. A study of a DENV-2- and DENV-3-infected cohort in Taiwan showed a higher prevalence of respiratory and cutaneous manifestations (rash) in DENV-3- and DENV-2-infected subjects, respectively. In addition, the chance of developing DHF/DSS is elevated when infection with one of the four serotypes is followed by infection with a heterotypic serotype, likely due to antibody-mediated enhancement of DENV infection. This association has also been observed on a large scale; for example, the replacement of DENV-3 by DENV-1 was associated with a wave of severe dengue epidemics in Sri Lanka in 2009. Hence, DENV serotyping is also important in the prevention and control of dengue.

DENV serotyping by serological methods has been impeded by the cross-reactivity of antibodies induced by different DENV serotypes. A DENV envelope protein-based IgM ELISA capable of differentiating DENV serotypes during the acute phase of dengue was reported recently. However, its clinical performance needs further validation, and existing IgM from previous infection remains a challenge for data interpretation. Several CE-IVD-marked multiplex real-time RT-PCR methods capable of serotyping are available for laboratory settings. A Luminex-based single DNA fragment amplification assay capable of multiplex DENV serotyping was also reported recently.

Rapid on-site detection and serotyping of DENV can alert front-line health professionals to the invasion of a new or long-absent serotype, allowing timely implementation of intervention strategies. Although parallel NS1 antigen and IgM/IgG rapid testing has improved dengue diagnosis at points of need/points of care (PON/POC) in recent years, neither test provides serotype information. In addition, rapid tests are generally less sensitive and specific than ELISA and NAT methods. Their sensitivity can be further lowered during secondary infection, since levels of detectable NS1 can be reduced by existing antibody, and IgM levels are not as high as in primary infection and can be undetectable in some cases.

Real-time RT-PCR assays and the Luminex-based assay allow serotyping of DENV with great sensitivity. However, their execution and/or data interpretation require skilled technicians and relatively expensive equipment that are not commonly available in remote areas or developing countries; long-distance transportation of specimens is another major obstacle.

To facilitate timely near-patient serotyping of DENV in low-resource settings, rapid, easy, mobile NAT methods of high sensitivity and serotype-specificity are still needed to bring accurate detection/serotyping of DENV to PON/POC. On-site serotyping of DENV by a combination of isothermal amplification and DNA sequencing with a portable sequencer was recently demonstrated to be feasible. However, its immediate clinical application at PON/POC is limited by the costs and technical skills required.

A limited number of methods, including reverse-transcription loop-mediated isothermal amplification (RT-LAMP) and the TaqMan probe-based RT-insulated isothermal PCR (RT-iiPCR), became available recently with the potential to enable easy near-patient detection and serotyping of DENV. LAMP can be performed with a simple incubator and generally requires only a very simple nucleic acid extraction step. One DENV serotyping RT-LAMP panel, comprising four singleplex DENV-1, 2, 3 and 4 reactions targeting the 2A, NS4B, NS4A and 3’UTR markers, was reported to generate signals visible to the naked eye or readable by a laboratory fluorescence-monitoring device in 25 minutes. Its clinical performance was preliminarily demonstrated to be similar to that of a real-time RT-PCR. Another RT-LAMP panel also included four reactions, all targeting the NS1 gene; SYBR Green I signals can be detected visually or with a field-deployable Genie II fluorometer (OptiGene, UK) in about 35 minutes. A preliminary study showed that this assay and the CDC real-time RT-PCR assay complemented NS1 testing to increase the diagnostic coverage of febrile patients to similar degrees.

Another singleplex DENV serotyping RT-PCR panel, which works on a mobile PCR system, has been preliminarily evaluated to have performance comparable to that of the CDC real-time RT-PCR. The DENV-1, 2, 3 and 4 reagents, targeting the NS5, E, prM and prM genes, respectively, are based on the TaqMan probe-based iiPCR technology. Automated amplification and detection by iiPCR are achieved consistently in a capillary tube within a simple, insulated heater/detector. CE-marked compact iiPCR devices are commercially available, including a field-deployable model (POCKIT, GeneReach Biotech) and a hand-held series (POCKIT Micro) with built-in rechargeable batteries. Results are detected and interpreted automatically to generate qualitative results within one hour. The platform's performance has been verified and validated to be comparable to that of several reference laboratory methods (real-time PCR, virus isolation) with various markers and sample types. A CE-marked pan-DENV RT-PCR is already available to detect all four DENV serotypes in human plasma and serum on this platform to aid PON/POC identification of DENV. PCR testing requires nucleic acid extraction, and a compact automated nucleic acid extraction system has been bundled with these PCR devices in a durable suitcase to create a mobile PCR laboratory. Furthermore, a fully automated, sample-in-answer-out compact system (POCKIT Central Nucleic Acid Analyser) has recently become available with the CE mark to further minimise the risk of human error and allow easy molecular detection near PON/POC.

Progress in translating novel molecular technology into diagnostics for dengue infection has supported the development of relatively inexpensive, rapid and simple NATs to meet the need for early serotyping of DENV near PON/POC. These tools have the potential to enable timely management, control and monitoring of the different DENV serotypes, especially in under-served communities. Further verification and validation studies of these methods should be expedited to bring them to clinical settings.

References available on request.  

Medlab Middle East 2019 generates business of US$152 million

The 18th edition of the recently concluded Medlab Exhibition & Congress 2019 generated a total business of US$ 152 million and proved to be an instrumental platform for business exchange and education. His Excellency Humaid Al Qatami, Director General of the Dubai Health Authority (DHA) inaugurated the event, which is the MENA region’s largest medical laboratory exhibition and congress. Organised by Informa Markets – Healthcare, the event was officially supported by the UAE Ministry of Health, Government of Dubai, DHA, Health Authority Abu Dhabi, and Dubai Healthcare City Authority.

The event hosted 608 exhibitors, 51 exhibiting countries, 13 country pavilions, 4,673 delegates and 25,661 professional visits. The four-day event provided the MENA medical laboratory industry with a platform to build relationships with international stakeholders and saw clinical laboratory manufacturers from around the world display their latest devices, equipment, innovations and solutions. It enabled companies to showcase progress and achievements in the sector, as well as make the most of new business opportunities in the global medical laboratory field.

A survey at the show found that 97 per cent of exhibitors rated the exhibition as an important platform for their business, 94 per cent will be exhibiting again in 2020, 89 per cent considered the show a success, and 80 per cent were seeking new contacts for future business.

At the show, visitors were able to explore some of the latest laboratory medicine solutions, including diagnostic tests, reagents, disposables and equipment. The event welcomed thousands of medical laboratory professionals, from purchase managers responsible for negotiating supplier contracts to leading manufacturers, distributors and trade professionals in search of equipment to support their clients' needs and budgets.

Industry update

According to a report by Colliers International, the MENA region's clinical laboratory services market is estimated to be worth US$8-10 billion, with approximately 70 per cent of doctors' decisions regarding a patient's diagnosis, treatment, hospital admission and discharge being based on laboratory test results.

The research predicted that the clinical laboratory services market in MENA will grow at a compound annual growth rate (CAGR) of 6-8 per cent until 2025, in line with global projections. Furthermore, figures from the report, titled "Clinical Laboratory Services in the MENA Region 2019" and published as part of the Medlab Market Report series, revealed that the number of inpatients and outpatients in the UAE (Abu Dhabi and Dubai) is expected to grow from 29.3 million in 2016 to 95.3 million in 2030. Similarly, the UAE is projected to have a bed capacity of 25,300 by 2030, up from 12,500 in 2016.
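
The growth implied by these figures can be sanity-checked against the standard CAGR formula, CAGR = (end/start)^(1/years) - 1; a quick calculation with the report's numbers:

```python
# 2016 -> 2030 spans 14 years.
patients_cagr = (95.3 / 29.3) ** (1 / 14) - 1   # in- and outpatient volumes
beds_cagr = (25_300 / 12_500) ** (1 / 14) - 1   # bed capacity
print(f"patients: {patients_cagr:.1%}, beds: {beds_cagr:.1%}")  # ~8.8% and ~5.2%
```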

Reportedly, in Dubai, public sector healthcare facilities conducted around 25.5 million tests in 2017 while the private healthcare sector performed over 6.2 million tests during the same period.

Focus on automation

Laboratory automation, which involves the use of technologies such as robotic devices to achieve greater efficiency in diagnostic testing, emerged as a major trend at Medlab 2019. Clinical laboratory automation plays a major role in reducing avoidable human error and diagnostic delays, as well as improving turnaround time, increasing productivity, using resources effectively and enhancing patient safety.

Commenting on the role of automation in the clinical laboratory on the sidelines of the event, Dr. Shakoor Malik, Chief Scientific Officer, Pure Health, said: “Automation has moved from ‘nice-to-have’ for large reference laboratories to ‘must-have’ for any clinical laboratory. Automation, robotics and laboratory information systems are dominating the industry, which in turn leads to quick and independent sample testing and reporting with a reduction in operational costs.”

The Laboratory Management conference featured talks that stressed that as more and more money is being spent on introducing newer technologies in clinical laboratories, automation is expected to present several benefits, including improving resourcing and enhancing the accuracy and efficiency of laboratory operations.

Multi-disciplinary congress

With 11 conferences, more than 4,673 delegates and 120 international and regional speakers, the Medlab Middle East Congress is the only multi-disciplinary CME-accredited medical laboratory congress in the region.

Day one of the Congress saw the launch of its inaugural Artificial Intelligence (AI) Conference. Exploring the potential for AI to transform the medical laboratory industry in the UAE, the conference also assessed how diagnosis can be revolutionised through futuristic tech such as data robots and “bloodless blood tests”.

Speaking during the conference, Dr. Alain Pluquet, Director of Innovation, Institut Merieux, Lyon, France, said: “AI is playing a huge role in microbiology and there are several commercial applications that are already bringing medical and economic benefits to the industry.”

The Immunology Conference also made its debut at the show. The conference provided a space for discussion on the latest trends and important issues concerning the widespread utilisation of immunological techniques for healthcare advancement. Regional and international pioneers within immunology and relevant fields presented their research and shared novel case-based knowledge throughout the exclusive scientific programme.

Other featured topics included microbiology, molecular diagnostics and genetics, laboratory informatics, haematology and blood transfusion, point of care testing (POCT) and cytogenetics and IVF.

A report titled “In Vitro Fertilization (IVF) & Fertility in the MENA Region”, by Colliers International, revealed that infertility in the MENA region is 15 per cent or higher, compared to 10 per cent worldwide. Male infertility is a growing problem, occurring in approximately 50 per cent of cases in the GCC and Middle East due to lifestyle, diabetes, obesity and genetic factors, as GCC countries have some of the highest diabetes and obesity rates in the world.

According to the report, new innovations and improved testing techniques are gradually creating paradigm shifts in the field of assisted reproductive technology. These shifts were visible at the Cytogenetics & IVF conference, which shed light on topics such as pre-marital screening for consanguineous (related) couples and the development of new genetic tests for embryo screening that can greatly improve the chances of minimising certain genetic diseases common in this region.

The conference also emphasised advances in molecular cytogenetic diagnostic tests, which include karyotyping, fluorescence in situ hybridisation (FISH) testing and advanced chromosomal microarray analysis (CMA), to help improve services in the growing number of fertility centres and genetic labs in the region.

Progress in test strip urinalysis

Until recently, microscopic sediment analysis was the most popular urinalysis methodology. However, this time-consuming method has been associated with considerable analytical error. Although urinary test strips were introduced in 1957, advancing electronics and informatics have created new technical possibilities over the past two decades. This review summarises the major ongoing and future developments in urine test strip analysis.

Although the chemical technology of urinary test strips has made only moderate progress over time, advances in the field of electronic detectors have considerably increased the analytical sensitivity of commercial test strip readers. As early as 2002, Penders demonstrated that automated urine test strip reading enables quantitative analysis of red blood cells (RBCs), white blood cells (WBCs), glucosuria and proteinuria. The reciprocal of the quantitative reflectance result is proportional to the concentration of the urinary analyte. Quantitative applications of such strip readings have been described for both ketones and albumin.

A traditional tetrabromophenol blue dye-binding albuminuria test strip, in combination with a complementary metal oxide semiconductor (CMOS)-based urine strip reader, allows quantitative analysis of albuminuria and determination of the albumin:creatinine ratio. This approach enables, for the first time, quantitative albuminuria readings, even in the microalbuminuria range (20-200 mg/L). Concomitantly, the creatinine test pad on the same strip allows (micro)albumin results to be corrected for sample dilution, which is helpful in clinical interpretation. Applying the same CMOS technology, very sensitive readings can also be obtained for WBC (leukocyte esterase activity) and RBC (peroxidase activity), and their reflectance data can likewise be used quantitatively, enabling excellent sensitivity for these analytes. A noteworthy evolution is the potential use of smartphones for interpreting urine test strip results: mobile platforms have been proposed that combine urinalysis strips with a pocket-sized strip reader capable of transmitting data via a smartphone.
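
A minimal sketch of how such quantitative strip reading works, using the reciprocal-reflectance relationship described above (the calibration constant below is hypothetical; real readers use lot-specific calibration curves):

```python
K_ALBUMIN = 25.0   # hypothetical calibration factor, mg/L per unit of 1/R

def albumin_mg_per_l(reflectance: float) -> float:
    # Concentration is proportional to the reciprocal of the reflectance.
    return K_ALBUMIN / reflectance

def albumin_creatinine_ratio(albumin_mg_l: float, creatinine_g_l: float) -> float:
    """ACR in mg/g, correcting the albumin reading for urinary dilution."""
    return albumin_mg_l / creatinine_g_l

alb = albumin_mg_per_l(0.5)   # 50 mg/L, i.e. within the microalbuminuria range
print(alb, albumin_creatinine_ratio(alb, creatinine_g_l=1.2))
```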

Urine flow cytometers (UFCs) have improved accuracy and allow a significant reduction in labour costs. A popular approach is to combine test strips with UFCs for screening purposes, either by using both strips and UFC or by utilising the strips for parameters unrelated to the UFC-analysed particles. As mechanical coupling of UFCs and test strip readers has been introduced, expert systems now take advantage of the combined information and apply user-definable decision criteria. This implementation significantly reduces microscopy review rates and saves time and labour costs.

The rate of diuresis is characterised by considerable biological variation. As the precision of urinalysis has increased over time, correcting for urinary dilution has become increasingly important in interpreting urinalysis results. Because diuresis variability is a major pre-analytical confounding factor, a number of reference parameters have been introduced to assess urinary dilution and the patient's hydration status. The commonly used reference parameters in urine are conductivity, specific gravity (density) and creatinine. Specific gravity can be measured using either urine test strips or refractometry.

In view of the great demand for cost-effective portable readers, pocket-sized urine strip readers can be combined with dipsticks in a device capable of sending digital information via a smartphone, offering a solution for detecting urological or nephrological diseases in regions with limited availability of experts. Advances in microfluidics have enabled the development of chip-based assays, which might change urinalysis in the years to come. Alongside conventional urinalysis applications, integrated microfluidic chips show promise for the quantitative detection of bladder cancer cells in urine. Similarly, microfluidic paper devices have been described for detecting the bacteria that cause urinary tract infections and sexually transmitted diseases in urine specimens.

Over the past 20 years, automated urinalysis has undergone a technical evolution, and automated test strip reading provides clear added value. Further integration of existing technologies may help to reduce turnaround times. Meanwhile, consolidation of clinical laboratories has reduced their number and increased the distance between patient and laboratory.

References available on request.

Evaluation of Adenosine Deaminase (ADA) level in exudative pleural effusion diagnosis

Article-Evaluation of Adenosine Deaminase (ADA) level in exudative pleural effusion diagnosis

Normally the amount of pleural fluid is around 0.1-0.3 ml/kg and acts as a minimal lubricant for both pleural surfaces. Pleural effusion, the most common pleural disease, causes high mortality and morbidity and affects more than 1.5 million patients per year in the U.S. In Indonesia, 3,000 pleural effusions were diagnosed between January 2017 and February 2019, according to the medical records of Cipto Mangunkusumo National Referral Hospital.

Pleural effusion results from varied underlying health problems, primarily pulmonary or systemic, which shape both the clinical symptoms and the further examination required to establish the aetiology. Inflammatory effusions often cause localised, severe pain on breathing or coughing, whereas a patient whose effusion is caused by congestive heart failure may be asymptomatic or complain of breathlessness in proportion to the fluid volume. The build-up of fluid in pleural effusion can arise through several mechanisms, including increased pleural membrane permeability, increased pulmonary capillary pressure, decreased negative pressure in the pleural cavity, decreased oncotic pressure, and obstruction of the lymphatic vessels.

Classification of pleural effusion

Pleural effusion can be classified by the modified Light's criteria. An effusion is considered exudative if at least one of the following criteria is met: the ratio of pleural fluid protein to serum protein is more than 0.5; the ratio of pleural fluid lactate dehydrogenase (LDH) to serum LDH is more than 0.6; or the pleural fluid LDH level is more than two-thirds of the upper limit of normal for serum LDH. Exudates can occur due to infections and inflammatory disorders such as tuberculosis and pneumonia, malignancy, haemothorax, parapneumonic effusion, and lymphatic obstruction. Transudates, by contrast, are mostly caused by systemic conditions that alter the hydrostatic or oncotic pressure in the pleural cavity, such as congestive heart failure, nephrotic syndrome, liver cirrhosis and hypoalbuminaemia. Other, less common causes are drug-induced effusion, pulmonary embolism, oesophageal rupture, and post-radiotherapy effusion.
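
These three criteria translate directly into code. The sketch below implements the classification exactly as stated, with the upper limit of normal for serum LDH supplied by the local laboratory.

def is_exudate(pf_protein, serum_protein, pf_ldh, serum_ldh, serum_ldh_uln):
    # Modified Light's criteria: exudative if at least one criterion is met.
    return (pf_protein / serum_protein > 0.5
            or pf_ldh / serum_ldh > 0.6
            or pf_ldh > (2.0 / 3.0) * serum_ldh_uln)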

Comprehensive diagnostic approach

Some patients presenting with a pleural effusion are asymptomatic, while others complain of dyspnoea, cough and/or pleuritic chest pain. A thorough medical history may help to differentiate cardiovascular from other aetiologies, covering any history of cancer, recent trauma, chronic hepatitis, occupational exposure, and recent medication in detail. A careful physical examination should be performed to look for clinical signs suggesting an effusion, such as dullness to percussion.

The meniscus sign of pleural effusion can be detected on chest radiographs when about 200 mL of fluid is present on a posteroanterior view, whereas as little as 50 mL of fluid can be seen on a lateral view. Computed tomography (CT) can be used to evaluate effusions that are not apparent on plain radiography, to differentiate between pleural fluid and pleural thickening, and to provide clues for selecting the site of drainage of an empyema. Ultrasonography can accurately detect loculated or small effusions, even in the critical care setting; it is more sensitive in detecting pleural fluid septations and is useful both for establishing the diagnosis and for marking a site for thoracentesis.

Thoracentesis is a minimally invasive procedure indicated in all unilateral effusions of unknown origin, both for diagnosis and for relieving the patient's symptoms. The effusion sample obtained can be analysed to classify the underlying mechanism as transudative or exudative using the modified Light's criteria. Approximately 20-40 ml of pleural fluid is needed for analysis, and the extracted fluid should be processed within four hours. A reddish pleural fluid appearance indicates the presence of blood, as in trauma, pulmonary embolism or malignant disease; after a prolonged period the fluid takes on a brownish tinge. Malignant effusion, tuberculous pleuritis, oesophageal rupture or lupus pleuritis may be indicated by a low glucose level (30-50 mg/dL), usually together with a low pleural pH (less than 7.2). In most cases a high LDH indicates an exudative effusion with ongoing inflammation, whereas a decreasing LDH concentration indicates that the inflammatory process is subsiding.
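
The chemistry cut-offs mentioned above can be expressed as simple screening flags; the helper below is an illustrative sketch using the glucose and pH values quoted in this paragraph, not a validated decision rule.

def chemistry_flags(glucose_mg_dl, ph):
    # Low glucose (about 30-50 mg/dL) and low pH (< 7.2) suggest malignant
    # effusion, tuberculous pleuritis, oesophageal rupture or lupus pleuritis.
    flags = []
    if glucose_mg_dl <= 50:
        flags.append("low glucose")
    if ph < 7.2:
        flags.append("low pH")
    return flags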

The predominant white cell type on the differential count helps to establish the aetiology: neutrophils point to pneumonia or empyema; eosinophilia presents in pneumothorax, haemothorax or asbestosis; mononuclear cells are a sign of chronic inflammation; and lymphocytes predominate in cancer and TB infection. Gram stain and culture of pleural fluid are useful for detecting chronic illness or suspected TB. Since Indonesia is a high-burden country for tuberculosis, adenosine deaminase (ADA), which is reliable, easy, rapid and cost-effective, may be proposed as a proper index for TB diagnosis.

Diagnostic value of Adenosine Deaminase (ADA) in body fluids for tuberculosis

The enzyme ADA catalyses the deamination of adenosine to inosine and of deoxyadenosine to deoxyinosine, and serves as a marker of cell-mediated immunity. ADA is elevated in empyema, and it has also been found to be high in M. tuberculosis infection. However, there is no significant difference between using total ADA and the ADA2 isoform in clinical practice.

ADA analysis can be performed on exudative pleural, ascitic and CSF samples with an increased lymphocyte count using the Giusti method. Approximately 0.02 ml of body fluid and 0.2 ml of an adenosine solution are placed into a test tube and incubated at 37°C for 60 minutes. A phenol-nitroprusside solution and a hypochlorite solution are then added, and the mixture is incubated at 37°C for a further 15 minutes. The ammonia liberated by ADA activity is read with a spectrophotometer at a wavelength of 620 nm and converted to U/L.
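
The final conversion step can be illustrated as a single-point calibration, in which the sample's absorbance is compared with that of a blank and an ammonia standard of known activity read in the same run. This is a hypothetical simplification of the Giusti calculation, and the 50 U/L standard value is an assumed figure.

def ada_u_per_l(a_sample, a_blank, a_standard, standard_u_per_l=50.0):
    # Convert 620 nm absorbances to ADA activity by comparing the ammonia
    # liberated in the sample with that of a standard of known activity.
    return (a_sample - a_blank) / (a_standard - a_blank) * standard_u_per_l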

ADA assay for detecting tuberculous pleurisy

The usefulness of pleural fluid ADA for diagnosing tuberculous pleurisy has been investigated in several studies. Qureshi et al. (2018) reported that 330 (93.75 per cent) of 352 TB pleurisy cases showed ADA levels of 40 IU/L and above, giving a sensitivity of 93.75 per cent, a specificity of 91.42 per cent and a positive predictive value of 98.21 per cent in an area of high tuberculosis prevalence. In Auckland, New Zealand, a low-incidence setting, Blakiston et al. (2018) found that the median pleural fluid ADA in 57 TB pleural effusions remained significantly higher (p<0.001) than in 1,580 non-TB pleural effusions: 58.1 U/L versus 11.4 U/L, respectively.
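
The Qureshi et al. sensitivity can be reproduced directly from the reported counts, as the short worked check below shows; the false-positive count of 6 is back-calculated from the reported positive predictive value and is therefore an approximation.

tp, fn = 330, 22                 # 330 of 352 TB cases at or above 40 IU/L
sensitivity = tp / (tp + fn)     # 0.9375, i.e. 93.75 per cent
fp = 6                           # approximated from the reported PPV
ppv = tp / (tp + fp)             # about 0.9821, i.e. 98.21 per cent
print(round(sensitivity, 4), round(ppv, 4))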

The most widely used cut-off for diagnosing tuberculous pleurisy is approximately 40 IU/L. A study by Yusti G et al. (2018) reported that HIV patients with a diagnosis of TB showed a median ADA value of 70 IU/L (interquartile range (IQR) 41-89), against a median of 27.5 IU/L (IQR 13.5-52) in the non-TB group; hence, ADA level determination could be useful for diagnosing pleural tuberculosis in HIV-infected patients.

Measurement of the LDH/ADA ratio may help clinicians to distinguish tuberculous from parapneumonic pleural effusion. In a study by Wang J et al. (2017), the median pleural fluid LDH and ADA levels and LDH/ADA ratios in the tuberculosis and parapneumonic groups were 364.5 U/L vs 4037 U/L (P < .001), 33.5 U/L vs 43.3 U/L (P = .249), and 10.88 vs 66.91 (P < .0001), respectively, showing that the pleural fluid LDH/ADA ratio was significantly lower in the tuberculosis group than in the parapneumonic pleural effusion group. Prior studies have noted the importance of ADA as a useful diagnostic tool for the management of tuberculous pleurisy presenting as an exudative pleural effusion.
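
As an illustration of how the two markers might be combined, the hypothetical helper below pairs the widely used 40 IU/L ADA cut-off with a low LDH/ADA ratio; the ratio threshold of 15 is an assumed value chosen to sit between the medians reported by Wang et al., not a validated cut-off.

def suggests_tb_effusion(ada_u_l, ldh_u_l, ada_cutoff=40.0, ratio_cutoff=15.0):
    # True when the profile is more consistent with tuberculous than
    # parapneumonic effusion; both thresholds are illustrative.
    return ada_u_l >= ada_cutoff and (ldh_u_l / ada_u_l) < ratio_cutoff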

Conclusion

Pleural fluid adenosine deaminase (ADA) can serve as a reference guide for diagnosing tuberculous pleurisy, given the significantly elevated levels found when tuberculosis infection presents as an exudative pleural effusion.

References available on request.

How do medical labs transact knowledge virtually?

Article-How do medical labs transact knowledge virtually?

The aim is to promote better patient care and quality healthcare. The current period is marked by the transition from the traditional lab model to the new clinical lab 2.0, which is expected to offer premium service delivery to patients and a new synthesis in the way labs work. Lab professionals do not work in isolation: the knowledge input to labs is drawn from many quarters through collective effort, with stakeholders functioning remotely at distributed locations and working together in digital mode.

Knowledge in labs is not confined to mere data or information input; rather, it embraces a wider scope, including intellectual property, expertise, learning and skills exchanged between the academic and non-academic communities. A strong scientific knowledge base is one of the medical labs' traditional key assets, but innovation in labs is currently being challenged by a rapidly changing research landscape. It is therefore important to promote the transnational dimension of knowledge transfer.

Compared with the U.S., the average university in other countries, including in Europe, generates far fewer inventions and patents. Several reasons are cited for this, among them the less systematic and less professional management of knowledge and intellectual property at these 'average' universities. Learning from the successful universities and bringing all knowledge producers together to share thus marks the path to greater knowledge exploitation in labs.

We can try to understand how the complex knowledge that produces technologies moves from the academic and research world into the labs, so that new lab technologies can be supported, enhanced and accelerated. Such analysis explores how complex, un-coded knowledge moves from minds to labs, in order to understand how the process might be optimised.

Medical labs borrow creative knowledge from individual minds and university research labs and translate such un-coded knowledge into practice. Labs, in turn, supply basic researchers with problem sets drawn from live, real-world data. Medical labs also transact knowledge among themselves using remote and distributed systems; formal protocols exist for such transfer, yet undocumented knowledge is required at the same time.

Creating a knowledge grid in medical labs is therefore required. Knowledge transactions occur both between labs themselves and between labs and universities. Some of the top U.S. universities earn licence income equivalent to their R&D budget, obtained mainly in the biomedical sciences. Here we draw a clear distinction between faculty labs, which we call idea labs, and applied labs, where productivity and innovation arise. Idea labs, which have emerged in the last few years, are responsible for developing solutions to identified problems.

Collaboration for knowledge transfer

Knowledge transfer in virtual labs is not limited to exchanges between colleagues and fellow researchers; it includes knowledge transactions between researchers, practitioners and patients. Ideas emerging from interdisciplinary work in idea labs thus reach the wider lab community.

Virtual lab working is characterised by the use of a software platform for collaboration and knowledge transfer that includes wikis for document sharing, discussion boards, mailing lists, conferencing facilities and so on; a web portal to support the sharing and dissemination of knowledge and expertise; and a virtual education centre conceptually linked to that portal.

Collaboration increases productivity, as empirical data demonstrate. Within the same field or theme, people often perform the same work without knowing of each other's efforts, and many labs work on a specific theme without integrating into the larger research area, leaving the research fragmented. Collaboration helps to offset such limitations.

Knowledge transfer activities enable businesses to introduce and embed change, such as developing new technologies and streamlining processes. In successful knowledge transaction processes, the link between knowledge transfer activities and innovation performance has been demonstrated.

Impact of computer-mediated communication

Databases, big data, robotics systems, networking, cloud computing, the Internet of Things (IoT), graphical interfaces, data mining, machine learning, semantic technologies, neurocomputing, intelligent decision support systems and specialised programming languages are some of the technologies and research areas influencing medical informatics. In the areas of medical rehabilitation and assistive technology, ICT has contributed greatly to enhancing quality of life and ensuring the full integration of people into society. The use of cloud computing and the IoT accelerates the flow of information and improves communication in healthcare.

Lab-on-a-chip technology 

Nano-chips are now used in handheld devices that can detect biomarkers present in small amounts and send the data to various destinations for deriving insights. The captured data are combined with real-time health data from other IoT-enabled devices, such as smart systems and smart watches, and analysed using real-time intelligence.

AI systems could ultimately be packaged in a convenient handheld device, allowing people to quickly and regularly measure data and stream this information securely into the cloud from the comfort of their home. There it could be combined with real-time health data from other IoT-enabled devices, such as sleep monitors and smart watches, and analysed by AI systems for insights. Taken together, this data set would give an in-depth view of our health and alert us to the first signs of trouble, helping to stop disease before it progresses.

Faster fibre links for data centres

Newer fibre optic systems can now transmit several thousand gigabits of data per second, and chips are also able to handle such high volumes, a big step up from the top speeds of several hundred gigabits in today's data centres. New medical big data is heterogeneous, transactional and unstructured, collected privately and available publicly, leading to challenges in data transfer and processing.