POCT in HIV management: From diagnosis to monitoring

Indonesia is among the countries still struggling against HIV/AIDS. Based on WHO reports, in 2016, Indonesia had 48,000 new HIV infections and 38,000 AIDS-related deaths. An estimated 3,200 children were newly infected with HIV through mother-to-child transmission. Since 2010, new HIV infections have decreased by 22 per cent while AIDS-related deaths have increased by 68 per cent. The implementation of HIV management was expanded in 2016 to all 34 provinces. It covered comprehensive HIV and sexually transmitted infection (STI) services, TB-HIV collaboration, and prevention of mother-to-child transmission (PMTCT), among others.

Laboratory support is an essential part of HIV management, underpinning the diagnosis, prevention and control of HIV/AIDS. A wide variety of tests is needed in HIV disease management. Investment in laboratories is an investment in health: quality laboratory services are crucial at every step in responding to the epidemic, from accurate testing to effective treatment, care and monitoring of disease, and prevention of new infections.

HIV testing services are considered the gateway to treatment facilities. The challenge of improving access to HIV testing while ensuring subsequent linkage to care still needs to be solved. From a public health perspective, it is advisable to recommend testing to those at risk for HIV and to make testing easily accessible. The idea is to detect every HIV-positive individual, whether they belong to a high-risk group, are pregnant, or are a patient with tuberculosis or a reproductive tract infection approaching the health system for other needs, and to refer him or her to the nearest antiretroviral therapy (ART) centre. Providing quality laboratory services for HIV testing to all those who need it is a challenging task.

HIV diagnostic tests detect either viral components (HIV RNA and the p24 antigen) or antibodies to the virus. HIV diagnostic testing in Indonesia follows the WHO recommendation of a serial algorithm with three different methods to increase the accuracy of the tests; HIV rapid tests are used in adults in serial or parallel algorithms using one to three different assays.
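
As a rough illustration of how such a serial algorithm can be operationalised, the Python sketch below encodes a three-assay serial rule: a non-reactive first assay reports negative, while reactive results are confirmed with a second and, if needed, a third assay. The decision wording and tie-breaking steps are illustrative assumptions, not the national or WHO algorithm itself.

```python
def serial_hiv_algorithm(assay1_reactive, assay2_reactive=None, assay3_reactive=None):
    """Illustrative three-assay serial HIV testing rule (assumed logic, not the
    official algorithm). Each argument is True (reactive), False (non-reactive)
    or None (not yet performed)."""
    if not assay1_reactive:
        return "HIV negative (assay 1 non-reactive)"
    if assay2_reactive is None:
        return "Assay 1 reactive: perform assay 2"
    if assay2_reactive:
        if assay3_reactive is None:
            return "Assays 1 and 2 reactive: perform assay 3 to confirm"
        return "HIV positive" if assay3_reactive else "Indeterminate: retest later"
    # Assay 1 reactive but assay 2 non-reactive: discordant result
    return "Discordant result: repeat testing or refer for laboratory confirmation"

print(serial_hiv_algorithm(True, True, True))   # -> HIV positive
```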

CD4 testing has been used to assess immune status and establish eligibility for treatment and care. Treatment eligibility threshold levels have changed over time, from 200 cells/μl in 2002 [3] to 350 cells/μl in 2010. More recently, the new consolidated WHO recommendations suggest initiation at CD4 counts of <500 cells/μl, and universal access and test-and-treat strategies are also being recommended. The CD4 count has also been used for regular monitoring of immunological recovery on treatment, generally at six-monthly intervals. CD4 cell counts shed light on the status of the immune system and can help physicians predict and prevent the development of opportunistic infections.

The HIV viral load (VL) assay, a nucleic-acid-based test, is used to monitor response to treatment; an undetectable viral load defines treatment success. HIV viral load tests provide a picture of viral activity. VL testing is frequently done in centralised facilities and currently requires expensive instrumentation, technical skill, and has relatively high costs per assay. Despite these challenges, this assay has gained its rightful place in guidelines and clinical practice and is thought to be the most reliable marker for treatment success.

Expanding scope

Point-of-care testing is an old approach that has been around for decades and remains as controversial today as it was when first introduced. POCT refers to testing performed near to or at the site of patient care, with the result leading to a possible or immediate change in patient care. HIV-related point-of-care testing technologies have become widely available in the last few years through implementation and scale-up. Point-of-care testing provides an opportunity to greatly reduce test turnaround time and increase availability, expanding the scope and coverage of testing beyond urban centres to reach rural populations.

Point-of-care testing for HIV refers to testing undertaken by healthcare professionals outside a designated laboratory at the time of patient contact. The standard methods of HIV testing, enzyme-linked immunosorbent assay (ELISA) or Western blot with confirmatory testing, can take several days to return results. A significant proportion of individuals who agree to undergo HIV serologic testing do not return to the testing site to receive their results. POC testing for HIV attempts to address this delay in detecting HIV status by providing preliminary antibody results. POC tests can be most useful in resource-limited settings (RLS) or outreach settings where a lack of well-trained laboratory technicians, poor physical infrastructure, extremes of climate, and the lack of an uninterrupted power supply all impact the use of laboratory technologies. Point-of-care testing has been shown to reduce patient loss to follow-up and increase access to antiretroviral therapy.

Rapid test devices (RTDs) are typically capillary flow tests for use on whole blood (e.g., fingerprick), plasma, or oral fluid. HIV test kits are designed for HIV antibody and antigen detection. The results can be delivered within 20 minutes after the specimen has been taken. The results are available within a single consultation. The rapid turnaround time associated with its use can guide urgent decision making. This makes it suitable for use in targeted clinical scenarios where the immediate administration of antiretroviral drugs is recommended to reduce the risk of transmission or in cases where the patient’s management may be altered by the availability of a reactive test result.

Access to antiretroviral therapy (ART) has increased dramatically over the past decade in Indonesia. Successful management of HIV requires patients receiving ART to be monitored routinely to assess treatment efficacy and detect treatment failure due to drug resistance. The standard of care for monitoring ART is quantitative viral load testing based on the plasma HIV RNA concentration. A POC test for CD4 count could help clinicians decide when to start antiretroviral treatment, and a POC test for viral load would be of great value in identifying treatment failure and the need for second-line treatment. These devices can rapidly and accurately determine CD4 counts with minimal operator training and infrastructural setup, and at lower cost than standard laboratory-based equipment. Several CD4 POCT analysers can give a CD4 count in 20 minutes from a fingerstick or venous sample.

HIV viral load testing is the most accurate method to evaluate the response to ART in HIV-infected patients. It is considered the gold standard for monitoring response to therapy and predicting the clinical prognosis of patients on ART. The threshold used to define treatment failure on ART is 1,000 virus copies/mL. Nucleic-acid-based tests detect and measure viral RNA. Several POC devices for viral load are currently available or in the pipeline.
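
As a minimal illustration of how the monitoring threshold above translates into practice, the sketch below applies the 1,000 copies/mL failure threshold cited in the text; the repeat-testing and adherence-support steps are simplifying assumptions for illustration, not a clinical protocol.

```python
VIROLOGICAL_FAILURE_THRESHOLD = 1000  # copies/mL, as cited in the text

def interpret_viral_load(copies_per_ml, repeated_after_adherence_support=False):
    """Illustrative reading of an HIV viral load result against the
    1,000 copies/mL threshold. The clinical handling shown here is an
    assumption for this sketch, not a treatment guideline."""
    if copies_per_ml < VIROLOGICAL_FAILURE_THRESHOLD:
        return "Suppressed: continue current ART and routine monitoring"
    if not repeated_after_adherence_support:
        return "Above threshold: provide adherence support and repeat viral load"
    return "Confirmed virological failure: evaluate for switch to second-line ART"

print(interpret_viral_load(350))
print(interpret_viral_load(25000, repeated_after_adherence_support=True))
```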

Diagnosing HIV infection in infants early is highly beneficial in reducing infant mortality due to HIV. Maternal antibodies still present in infants make antibody tests unsuitable for diagnosing HIV in children younger than 18 months. Early infant diagnosis therefore requires the detection of viral components, which can include the viral capsid p24 antigen, cell-free RNA, or viral DNA integrated into the host genome. Current p24 immunoassays are more sensitive than earlier p24 assays. However, like nucleic acid amplification-based assays, p24 immunoassays require specialised equipment and skilled technicians. In order to perform these tests in resource-limited settings, early infant diagnosis rapid tests are required.

In Indonesia, POCT has been used in HIV management for several years. HIV rapid testing and POCT for CD4 and VL have been used in all 34 provinces. HIV rapid testing for initial screening is performed in primary healthcare, while CD4 and VL POCT instruments are placed in district hospitals or laboratories to cover testing referrals from the whole area.

Rapidly scaling up point-of-care testing without simultaneously building quality assurance capacity is counterproductive if service providers cannot guarantee adherence to minimum quality standards and deliver accurate and reliable results. We found several challenges in implementing POC testing. In contrast to standard laboratory-based HIV testing, healthcare workers in the POC setting assume responsibility for specimen collection, testing, and counselling of the patient. Providing point-of-care testing requires appropriate training on the use of kits, reading of results, detection of errors, quality assurance and counselling, as well as regular assessment of the staff who will perform POC testing.

RTDs are generally satisfactory for the detection of uncomplicated HIV infection (or its absence) but are less sensitive than lab-based ELISAs and automated systems for detecting early infections (seroconversion). Also, the specificity of RTDs is lower than that of conventional ELISAs, although it can be improved by immediately repeating all RTD positives.

A variety of rapid tests for diagnosing HIV in resource-limited settings are commercially available. All approved HIV POC tests should have a sensitivity and specificity matching the ELISA kits used in laboratories. POC devices have built-in internal quality controls to monitor the performance of the instrument and, to some extent, the user in real time. External quality assessment (EQA), a term often used interchangeably with proficiency testing, challenges the testing environment and allows an external expert body to examine the processes and provide remedial action. The three components of an effective EQA laboratory programme are site supervision, retesting of specimens, and proficiency testing. Proficiency testing is the most widely used approach to monitor the performance of the test used and the quality of testing. It involves blinded control material, meant to mimic patient specimens, being sent from a reference laboratory to the testing site, with results returned for scoring.

Indonesia’s Ministry of Health currently provides EQA for HIV rapid tests organised through provincial reference laboratories, while CD4 and VL EQA are organised centrally. CD4 EQA is provided by COE CD4 EQAS (Thailand) and QASI (Canada), and VL EQA by the NRL (Australia).

Conclusion

POCT will improve access to needed HIV and associated diagnostics, but these assays are not without limitations that should be noted and reported. There is a need to integrate these technologies cost-effectively and efficiently into clinical algorithms and existing laboratory networks.

References available on request.

Can thalassemia be eradicated by proper screening?

The alpha and beta thalassemia genes occur with variable frequency in different regions of the Kingdom of Saudi Arabia, and both β+ and β0 thalassemia have been reported. A screening programme for hemoglobinopathies was first implemented by Aramco Hospital in 1980, followed by university and military hospitals.

Treatment of thalassemia major is complex, expensive and requires a multi-disciplinary approach. Disease complications are common due to excessive iron overload and sub-optimal chelation. Cardiac complications constitute the leading cause of death, followed by infection and endocrine dysfunction. Iron toxicity is a crucial factor in tissue damage in iron-overloaded patients, making an accurate assessment of body iron essential.

The availability of cardiac magnetic resonance imaging T2* has reportedly improved survival among thalassemia patients by enabling early detection of iron overload in the heart and liver; early detection of iron overload leads to earlier treatment.

T2*-weighted MRI is a well-validated predictor of both cardiac and hepatic iron content and is superior to indirect measures such as serum ferritin. Several methods of estimating T2* values are possible, but the most common technique is to acquire a series of breath-hold gradient-echo (GRE) images at progressively increasing echo times (TEs).
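
To make the estimation step concrete: the signal in such a GRE series is commonly modelled as a mono-exponential decay, S(TE) = S0·exp(−TE/T2*), so T2* can be recovered by fitting the measured signals across echo times. The sketch below (Python/NumPy) uses a simple log-linear fit on synthetic, noise-free data; real analyses use dedicated, validated software and must account for noise floors, which this illustration ignores.

```python
import numpy as np

# Mono-exponential decay model: S(TE) = S0 * exp(-TE / T2*)
echo_times_ms = np.array([2.0, 4.0, 6.0, 8.0, 10.0, 12.0])  # example echo times
true_t2star_ms = 20.0                                        # synthetic value for the example
signal = 1000.0 * np.exp(-echo_times_ms / true_t2star_ms)

# Log-linear fit: ln S = ln S0 - TE / T2*, so the fitted slope equals -1/T2*
slope, intercept = np.polyfit(echo_times_ms, np.log(signal), 1)
t2star_est = -1.0 / slope
print(f"Estimated T2* = {t2star_est:.1f} ms")  # ~20 ms for this noise-free example
```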

The second cause of death in our study was infection in splenectomised patients; hydroxyurea has been an effective therapy for avoiding splenectomy in thalassemia disorders.

Comprehensive care through a multi-disciplinary approach, with early detection of disease complications and treatment of the underlying cause, improves the survival of thalassaemic patients.

After a Royal decree in December 2003, the pre-marital screening programme was implemented in February 2004 by the Saudi Ministry of Health in all healthcare regions and centres of the Kingdom. Six years of pre-marital screening in Saudi Arabia markedly reduced the number of at-risk marriages, which may considerably reduce the genetic disease burden in the Kingdom over the next decades.

Adherence to comprehensive care, with early detection and recognition of disease complications through advanced diagnostic technology, proper treatment and stem cell transplantation, is the key to controlling morbidity and mortality and to improving survival and quality of life among patients. Therefore, it can be said that thalassemia disorders can be eradicated by good screening and disease control.

References available on request.

Medlab Africa makes its debut

The clinical laboratory industry within the Southern African Development Community (SADC) is estimated to be a lucrative and fast-growing sector showing many developmental opportunities for market expansion. This can be attributed to the presence of developing economies such as South Africa and Botswana, as well as the rising burden of diseases and favourable government policies within the region.

With the South African medical laboratory services market valued at US$ 1.68 billion in 2018 and an expected year-on-year growth of 3.5 per cent, there has been a rising interest in the industry both regionally and globally.

In order to meet the needs of the laboratory industry and to create a platform for discussion around Afrocentric laboratory issues, Medlab Africa launches this year. The event is co-located with Africa Health and takes place from May 28 to 30 at the Gallagher Convention Centre, Johannesburg, South Africa. Developments within the laboratory landscape make Medlab Africa the ideal environment for buyers and sellers to connect and engage in a country known for having one of the world’s leading healthcare infrastructures.

Medlab Africa will provide attendees with an opportunity to explore a dedicated showcase of the latest products, services and technologies in the medical laboratory field, and a chance to create and manage relationships with new vendors and suppliers in this growing market. The exhibition will offer visitors the chance to engage with more than 80 specialist companies including BMS, Randox, Mindray and Fujifilm, while networking with more than 10,500 laboratory and trade professionals attending the show.

Ryan Sanderson, Africa Health and Medlab Africa’s Exhibition Director says: “The power of an event like Africa Health is that it brings the entire healthcare ecosystem together. Launching Medlab Africa alongside Africa Health is a very welcomed step to bringing healthcare and diagnostic professionals closer together in an environment that can only improve patient outcomes across Sub-Saharan Africa.”

Committed to supporting the education of all faculty and medical lab professionals, the event will host Africa’s only CPD-accredited multi-disciplinary medical laboratory congress. It offers two accredited conference tracks that focus on core lab and specialist lab units along with infectious disease.

This year’s congress is being led by a strong scientific committee, ensuring that current issues and innovation are at the centre of all programmes. The conferences are endorsed by associations such as the Society of Medical Laboratory Technologists of South Africa (SMLTSA) and South African Society of Clinical Cytology (SASCC). Furthermore, proceeds from the conferences will be donated to The Reach for a Dream Foundation charity.

Laboratory Medicine Conference

This newly developed conference will cover the latest developments and initiatives in diagnostics, disease surveillance and laboratory management. It features several well-rounded sessions targeted towards professionals in the field. Taking place over the course of two days, this conference will bring together clinical laboratory professionals from all disciplines, covering topics ranging from molecular diagnostics to haematology.

As part of the conference, attendees can discuss the importance of lab accreditation in ensuring the quality of results. They can also participate in reviewing strategies to increase the number of clinical decisions based on results, analyse new methodologies for diagnosing malaria, and evaluate innovations in clinical diagnostics.

Infectious Diseases Conference

The new one-day Infectious Diseases Conference will host high profile speakers and provide exposure to vital issues that might not get as much coverage as they should, such as the under-diagnosed hepatitis E virus in the region. The agenda is designed to provide delegates with the latest information in the diagnosis, treatment and prevention of a range of significant infectious diseases. The scope of the presentations will deliver up-to-date information for all medical professionals in the focal area of infectious disease.

For more info visit www.africahealthexhibition.com/medlab-africa/en/home.html

"Launching Medlab Africa alongside Africa Health is a very welcomed step to bringing healthcare and diagnostic professionals closer together in an environment that can only improve patient outcomes across Sub-Saharan Africa"

Progress in test strip urinalysis

Urinalysis is a major diagnostic screening test in the modern clinical laboratory. Until recently, microscopic sediment analysis was the most popular urinalysis methodology. However, this time-consuming method has been associated with considerable analytical error. Although urinary test strips were introduced in 1957, over the past two decades advancing electronics and informatics have created new technical possibilities. This review summarises the major ongoing developments in urine test strip analysis and future developments.

Although the chemical technology for urinary test strips has made only moderate progress over time, advances in the field of electronic detectors have considerably increased the analytical sensitivity of commercial test strip readers. As early as 2002, Penders demonstrated that automated urine test strip reading enables quantitative analysis for red blood cells (RBCs), white blood cells (WBCs), glucosuria and proteinuria. The reciprocal value of the quantitative reflectance result is proportional to the concentration of the urinary analyte. Quantitative applications of such strip readings have been described for both ketones and albumin.

A traditional tetrabromophenol blue dye-binding albuminuria test strip, in combination with a complementary metal oxide semiconductor (CMOS)-based urine strip reader, can allow quantitative analysis of albuminuria and determination of the albumin:creatinine ratio. This approach enables, for the first time, quantitative albuminuria readings, even in the microalbuminuria range (20-200 mg/L). Concomitantly, the creatinine test pad on the same strip allows (micro)albumin results to be corrected for sample dilution, which is helpful in clinical interpretation. Applying the same CMOS technology, very sensitive readings can also be obtained for WBC (leukocyte esterase activity) and RBC (peroxidase activity). For WBC and RBC too, reflectance data can be used quantitatively, enabling excellent sensitivity for these analytes. A noteworthy evolution is the potential use of smartphones for interpreting urine test strip results. Mobile platforms have been proposed that combine urinalysis strips with a pocket-sized strip reader capable of transmitting data via a smartphone.
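
As a simplified numerical illustration of the quantitative reading described above: if the reciprocal of the reflectance is taken as proportional to analyte concentration, a calibration factor maps strip readings to concentration, and the albumin and creatinine pads together yield an albumin:creatinine ratio. The calibration constants and readings below are invented for the example; real readers use manufacturer-specific calibration curves.

```python
def concentration_from_reflectance(reflectance, calibration_factor):
    """Illustrative mapping: concentration proportional to 1/reflectance.
    The calibration factor is a made-up constant for this example."""
    return calibration_factor / reflectance

# Hypothetical readings from the albumin and creatinine pads of one strip
albumin_mg_per_l = concentration_from_reflectance(reflectance=0.40, calibration_factor=20.0)     # -> 50 mg/L
creatinine_mmol_per_l = concentration_from_reflectance(reflectance=0.25, calibration_factor=2.5)  # -> 10 mmol/L

# Albumin:creatinine ratio (ACR), commonly reported in mg/mmol
acr = albumin_mg_per_l / creatinine_mmol_per_l
print(f"Albumin {albumin_mg_per_l:.0f} mg/L, creatinine {creatinine_mmol_per_l:.0f} mmol/L, ACR {acr:.1f} mg/mmol")
```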

Urine flow cytometers (UFCs) have improved accuracy and allow a significant reduction in labour costs. A popular approach is combining test strips with UFCs for screening purposes, either using both strips and UFC or utilising the urine strips for parameters unrelated to the UFC-analysed particles. As mechanical coupling of UFCs and test strip readers has been introduced, expert systems now take advantage of the combined information and apply user-definable decision criteria. This implementation significantly reduces microscopy review rates and saves time and labour costs.

The rate of diuresis is subject to substantial biological variation. As the precision of urinalysis has considerably increased over time, correcting for urinary dilution has become increasingly important in the interpretation of urinalysis results. Because diuresis variability is a major pre-analytical confounding factor in urinalysis, a number of reference parameters have been introduced to assess urinary dilution and the patient’s hydration status. The commonly used reference parameters in urine are conductivity, specific gravity (density), and creatinine. Specific gravity can be measured using either urine test strips or refractometry.

In view of the great demand for cost-effective portable readers, pocket-sized urine strip readers can be combined with dipsticks in a device capable of sending digital information via a smartphone, offering a solution for detecting urological or nephrological diseases in regions with limited availability of experts. Advances in microfluidics have enabled the development of chip-based assays, which might change urinalysis in the years to come. Alongside conventional urinalysis applications, integrated microfluidic chips show promise for the quantitative detection of bladder cancer cells in urine. Similarly, microfluidic paper devices have been described for detecting bacteria causing UTIs and sexually transmitted diseases in urine specimens.

Over the past 20 years, automated urinalysis has undergone a technical evolution. In this respect, automated test strip reading provides added value. Further integration of existing technologies may contribute to reducing turnaround times. Consolidation of clinical laboratories has also led to a reduction in their number and has increased the distance between patient and laboratory.

References available on request.

Despite continuous improvements in the field of standardisation, the lion’s share of errors in urinalysis occurs in the pre-analytical phase, which has become more vulnerable. As analytical variation has been greatly reduced, more effort needs to be spent on pre-analytical issues in the future.

Evaluation of Adenosine Deaminase (ADA) level in exudative pleural effusion diagnostic

Pleural effusion is an abnormal accumulation of fluid in the pleural cavity due to an imbalance between excessive fluid production and decreased lymphatic absorption, or both. Normally the amount of pleural fluid is around 0.1 to 0.3 ml/kg and acts as a minimal lubricant for both pleural surfaces. As the most common pleural disease, it causes high mortality and morbidity and affects more than 1.5 million patients per year in the U.S. In Indonesia, 3,000 pleural effusions were diagnosed between January 2017 and February 2019, according to medical records at Cipto Mangunkusumo National Referral Hospital.

Pleural effusion occurs as a result of varied underlying health problems, primarily involving the pulmonary or systemic circulation, which shape the clinical symptoms and direct further examination according to the aetiology. Inflammatory pleural effusion often causes localised, severe pain with breathing or coughing, whereas a patient with an effusion caused by a systemic condition such as congestive heart failure may be asymptomatic or complain of breathlessness in proportion to the fluid volume. The build-up of fluid in pleural effusion can be caused by several mechanisms, including increased pleural membrane permeability, increased pulmonary capillary pressure, decreased negative pressure in the pleural cavity, decreased oncotic pressure, and obstruction of the lymphatic vessels.

Classification of pleural effusion

Pleural effusion can be classified using the modified Light’s criteria. The effusion is considered exudative if at least one of the following criteria is met: the ratio of pleural fluid protein to serum protein is more than 0.5; the ratio of pleural fluid lactate dehydrogenase (LDH) to serum LDH is more than 0.6; or the pleural fluid LDH level is more than two-thirds of the upper limit of normal for serum LDH. Exudative effusion can occur due to infections and inflammatory disorders such as tuberculosis and pneumonia, as well as malignancy, haemothorax, parapneumonic effusion, and lymphatic obstruction. Transudates, by contrast, are mostly caused by systemic conditions such as congestive heart failure, nephrotic syndrome, liver cirrhosis and hypoalbuminaemia, which alter the hydrostatic or oncotic pressure in the pleural cavity. Other, less common causes are drug-induced effusion, pulmonary embolism, oesophageal rupture, and post-radiotherapy effusion.
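
A minimal sketch of the classification logic just described, assuming protein in g/dL and LDH in U/L; the upper limit of normal for serum LDH is laboratory-dependent and is therefore passed in as a parameter here.

```python
def classify_effusion(pf_protein, serum_protein, pf_ldh, serum_ldh, serum_ldh_uln):
    """Modified Light's criteria: the effusion is exudative if at least one
    criterion is met, otherwise transudative. Units must be consistent;
    serum_ldh_uln is the laboratory's upper limit of normal for serum LDH."""
    criteria = [
        pf_protein / serum_protein > 0.5,
        pf_ldh / serum_ldh > 0.6,
        pf_ldh > (2.0 / 3.0) * serum_ldh_uln,
    ]
    return "exudate" if any(criteria) else "transudate"

# Example: high pleural-fluid protein and LDH relative to serum -> exudate
print(classify_effusion(pf_protein=4.0, serum_protein=7.0,
                        pf_ldh=400, serum_ldh=250, serum_ldh_uln=240))
```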

Comprehensive diagnostic approach

Some patients presenting with a pleural effusion can be asymptomatic or complain of dyspnoea, cough, and/or pleuritic chest pain. A thorough medical history may help differentiate cardiovascular from other aetiologies, including a history of cancer, recent trauma, chronic hepatitis, occupational exposures, and detailed recent medication use. A careful physical examination should be performed to look for clinical signs suggesting an effusion, such as dullness to percussion.

The meniscus sign of pleural effusion can be detected on chest radiographs when about 200 mL of fluid is present on posteroanterior views, whereas around 50 mL is sufficient on lateral views. Computed tomography (CT) can be used to evaluate effusions that are not apparent on plain radiography, to differentiate between pleural fluid and pleural thickening, and to provide clues for selecting the site of drainage of an empyema. Ultrasonography can accurately detect loculated or small pleural effusions, even in the critical care setting. It is also more sensitive in detecting pleural fluid septations, is useful for establishing the diagnosis, and can be used to mark a site for thoracentesis.

Thoracentesis is a minimally invasive procedure indicated for any unilateral effusion of unknown origin, both for diagnosis and for relieving symptoms. The effusion sample obtained can be analysed to differentiate transudative from exudative effusions using the modified Light’s criteria. Approximately 20-40 mL of pleural fluid is needed for analysis, and it should be processed within four hours of collection. A reddish pleural fluid appearance indicates the presence of blood, as in trauma, pulmonary embolism, or malignant disease; after a prolonged period the colour alters to a brownish tinge. Malignant effusion, tuberculous pleuritis, oesophageal rupture, or lupus pleuritis may be indicated by a low glucose level (30-50 mg/dL), usually accompanied by a low pleural pH (less than 7.2). In most cases, a higher LDH is present in exudative effusion with ongoing inflammation, whereas a decreasing LDH concentration indicates that the inflammatory process is subsiding.

The predominant white cell type in the differential cell count gives guidance in establishing the aetiology: neutrophils suggest pneumonia or empyema; eosinophilia is present in pneumothorax, haemothorax, or asbestosis; mononuclear cells are a sign of chronic inflammation; and lymphocytes predominate in cancer and TB infection. Gram stain and culture of pleural fluid are useful when chronic illness or TB is suspected. Since Indonesia has become a high-burden country for tuberculosis, other promising diagnostic tools such as adenosine deaminase (ADA), which are more reliable, easy, rapid, and cost-effective, may be proposed as a proper index for TB diagnosis.

Diagnostic value of Adenosine Deaminase (ADA) in tuberculosis body fluid

The ADA enzyme catalyses the deamination of adenosine to inosine and of deoxyadenosine to deoxyinosine, and serves as a marker of cell-mediated immunity. ADA is elevated in empyema and has also been found to be high in M. tuberculosis infection. However, there is no significant difference between the use of total ADA and of the ADA2 isoform in clinical practice.

ADA analysis can be performed on exudative pleural, ascitic and cerebrospinal fluid with an increased lymphocyte count, using the Giusti method. Approximately 0.02 mL of body fluid and 0.2 mL of an adenosine solution are placed into a test tube and incubated at 37°C for 60 minutes. Phenol-nitroprusside and hypochlorite solutions are then added and the mixture is incubated at 37°C for a further 15 minutes. The amount of ammonia liberated by ADA action is read with a spectrophotometer at a wavelength of 620 nm and converted to U/L.
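
As a rough illustration of the final calculation step only, the sketch below converts a sample’s 620 nm absorbance into ADA activity by proportion to an ammonium standard, then scales for incubation time and sample volume. The standard amount, the blank handling and the unit convention are simplified assumptions for this example, not the validated Giusti protocol.

```python
def ada_activity_u_per_l(a_sample, a_blank, a_standard, standard_nh3_umol,
                         sample_volume_ml=0.02, incubation_min=60):
    """Illustrative Giusti-style calculation (simplified assumption, not the
    validated protocol): ammonia liberated is read against an ammonium
    standard at 620 nm and expressed as micromoles of NH3 released per
    minute per litre of sample (U/L)."""
    # Ammonia liberated in the reaction tube (umol), by proportion to the standard tube
    nh3_umol = (a_sample - a_blank) / (a_standard - a_blank) * standard_nh3_umol
    # Express per minute of incubation and per litre of body fluid tested
    return nh3_umol / incubation_min / (sample_volume_ml / 1000.0)

# Hypothetical absorbances and standard amount, for illustration only
print(f"ADA ≈ {ada_activity_u_per_l(0.35, 0.05, 0.45, standard_nh3_umol=0.075):.0f} U/L")
```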

ADA assay for detecting tuberculous pleurisy

The usefulness of pleural fluid ADA for diagnosing tuberculous pleurisy has been investigated in several studies. Qureshi et al. (2018) reported that 330 (93.75 per cent) out of 352 TB pleurisy cases showed ADA levels of 40 IU/L or above, with a sensitivity of 93.75 per cent, specificity of 91.42 per cent and positive predictive value of 98.21 per cent in an area of high tuberculosis prevalence. In Auckland, New Zealand, a low-incidence setting, Blakiston et al. (2018) reported that the median pleural fluid ADA in 57 TB pleural effusions remained significantly higher (p<0.001) than in 1,580 non-TB pleural effusions, at 58.1 U/L versus 11.4 U/L respectively.

The most widely used cut-off for the diagnosis of tuberculous pleurisy is approximately 40 IU/L. A study by Yusti G et al. (2018) reported that HIV patients with a diagnosis of TB showed a median ADA value of 70 IU/L (interquartile range (IQR) 41-89) while the non-TB group had a median of 27.5 IU/L (IQR 13.5-52); hence ADA level determination could be useful for diagnosing pleural tuberculosis in HIV-infected patients.

Measurement of the LDH/ADA ratio may help clinicians distinguish tuberculous from parapneumonic pleural effusion. In a study by Wang J et al. (2017), the median pleural fluid LDH and ADA levels and LDH/ADA ratios in the tuberculosis and parapneumonic groups were 364.5 U/L vs 4,037 U/L (P < .001), 33.5 U/L vs 43.3 U/L (P = .249), and 10.88 vs 66.91 (P < .0001), respectively, showing that the pleural fluid LDH/ADA ratio in the tuberculosis group was significantly lower than in the parapneumonic pleural effusion group. Prior studies have noted the importance of ADA as a useful diagnostic tool in the management of tuberculous pleurisy presenting as an exudative pleural effusion.
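
To show how these two markers might be combined in practice, the sketch below applies the roughly 40 IU/L ADA cut-off cited above and flags a low pleural-fluid LDH/ADA ratio as favouring tuberculous over parapneumonic effusion. The ratio cut-off of 15 used here is an illustrative assumption loosely informed by the study figures above, not an established threshold, and the whole rule is a sketch of the reasoning rather than a validated diagnostic algorithm.

```python
ADA_CUTOFF_IU_PER_L = 40      # widely used cut-off cited in the text
LDH_ADA_RATIO_CUTOFF = 15     # illustrative assumption for this sketch only

def assess_tb_pleurisy(pf_ada_iu_per_l, pf_ldh_u_per_l):
    """Illustrative combined reading of pleural-fluid ADA and the LDH/ADA ratio."""
    ratio = pf_ldh_u_per_l / pf_ada_iu_per_l
    ada_suggestive = pf_ada_iu_per_l >= ADA_CUTOFF_IU_PER_L
    ratio_suggestive = ratio < LDH_ADA_RATIO_CUTOFF
    if ada_suggestive and ratio_suggestive:
        return f"ADA {pf_ada_iu_per_l} IU/L, LDH/ADA {ratio:.1f}: findings favour tuberculous pleurisy"
    if not ada_suggestive:
        return f"ADA {pf_ada_iu_per_l} IU/L below cut-off: tuberculous pleurisy less likely"
    return f"ADA elevated but LDH/ADA {ratio:.1f} is high: consider parapneumonic effusion"

print(assess_tb_pleurisy(pf_ada_iu_per_l=55, pf_ldh_u_per_l=380))
```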

Conclusion

Pleural fluid adenosine deaminase (ADA) can serve as a reference guide for diagnosing tuberculous pleurisy, given the significantly elevated levels found in tuberculous infection presenting as an exudative pleural effusion.

References available on request.

How do medical labs transact knowledge virtually?

Medical laboratories face pressure to adopt new technology solutions to strengthen workflow and to manage time and manpower to ensure improved healthcare. The aim is to promote better patient care and quality healthcare. The current period is marked by the transition from the traditional lab model to the new clinical lab 2.0. Lab 2.0 is expected to offer premium delivery to patients and provide a new synthesis in how labs work. Lab professionals do not work in isolation. The knowledge input to labs is derived from many corners where collective efforts are ensured. The stakeholders function remotely at distributed locations and work collectively in digital mode.

Knowledge in labs is not confined to mere data or information input; rather, it embraces a wider scope, including intellectual property, expertise, learning and skills exchanged between the academic and non-academic community. A strong scientific knowledge base is one of the medical labs’ traditional key assets. Innovation in labs is currently being challenged by a rapidly changing research landscape. It is important to promote the transnational dimension of knowledge transfer.

Compared with the U.S., the average university in other countries, including in Europe, generates far fewer inventions and patents. Several reasons are put forward for this, among which the less systematic and less professional management of knowledge and intellectual property by these ‘average’ universities is often cited. Thus, learning from the successful universities and bringing all knowledge producers together to share mark the growth of knowledge exploitation in labs.

We can try to understand how the complex knowledge that produces technologies moves from the academic and research world to the labs, so that new lab technologies can be supported, enhanced and accelerated. The analysis explores how these types of complex and un-coded knowledge move from minds to labs, in order to understand how this process might be optimised.

Medical labs borrow creative knowledge from individual minds and university research labs and translate such un-coded knowledge into practice. Labs, at the same time, pose problem sets to basic researchers backed by live, real-world data. Medical labs transact knowledge between themselves using remote and distributed systems. Formal protocols exist for such transfer, yet undocumented knowledge is also required.

Creating a knowledge grid in medical labs is required. Knowledge transactions occur between labs themselves and between labs and universities. Some of the top U.S. universities earn licence income equivalent to their R&D budget, obtained mainly in the biomedical sciences. Here we make a clear distinction between faculty labs, which we call idea labs, and applied labs, where productivity and innovation arise. Idea labs, which have emerged in the last few years, are responsible for developing solutions to identified problems.

Collaboration for knowledge transfer

Knowledge transfer in virtual labs is not limited to colleagues and fellow researchers, but includes knowledge transactions between researchers, practitioners and patients. Ideas emerging from interdisciplinary work in idea labs reach the lab community.

Virtual lab working is characterised by the use of a software platform for collaboration and knowledge transfer that includes wikis for document sharing, discussion boards, mailing lists, conferencing facilities, and so on; a web portal to support the sharing and dissemination of knowledge and expertise; and a virtual education centre conceptually linked to that portal.

Collaboration increases productivity, a claim that has been supported empirically by data. Within the same field or theme, people often do the same work without knowing of each other’s efforts. Many labs work on a specific theme without integrating themselves into a larger area, resulting in fragmented research: the same piece of new work is performed by two or more labs unaware of one another. Collaboration helps offset such limitations.

Knowledge transfer activities enable businesses to introduce and embed change, such as developing new technologies and streamlining processes. In successful knowledge transaction processes, the links between knowledge transfer activities and innovation performance have been demonstrated.

Impact of computer-mediated communication

Databases, big data, robotics systems, networking, cloud computing, the Internet of Things, graphical interfaces, data mining, machine learning, semantic technologies, neurocomputing, intelligent decision support systems and specialised programming languages are some of the technologies and research areas influencing medical informatics. In the areas of medical rehabilitation and assistive technology, ICT has contributed greatly to enhancing quality of life and ensuring the full integration of people into society. The use of cloud computing and IoT accelerates the flow of information and improves communication in healthcare.

Lab-on-a-chip technology

Nano-chips can now be packaged in handheld devices that detect biomarkers present in small amounts and send the readings to various destinations for deriving insights. The data captured is combined with real-time health data from other IoT-enabled devices, such as smart systems and smart watches, and analysed using real-time intelligence.

AI systems could ultimately be packaged in a convenient handheld device to allow people to quickly and regularly measure data and send this information securely streaming into the cloud from the convenience of their home. There it could be combined with real-time health data from other IoT-enabled devices, like sleep monitors and smart watches, and analysed by AI systems for insights. When taken together, this data set will give us an in-depth view of our health and alert us to the first signs of trouble, helping to stop the disease before it progresses.

Faster fibre links for data centres

Newer fibre-optic systems can now transmit several thousand gigabits of data per second, and chips able to transmit such volumes represent a big step up from the top speeds of several hundred gigabits in today’s data centres. New medical big data is heterogeneous, transactional and unstructured, collected privately and available publicly, leading to challenges in data transfer and processing.

Diagnosis and serotyping of dengue virus infection at/near points of care

Dengue disease is a major mosquito-borne public health problem in tropical countries and has been continuously spreading to new geographical areas. Cases of dengue increased almost fourfold from 2000 to 2013, and 30-fold over the last five decades, notably with more frequent, larger outbreaks and more severe cases. According to World Health Organization (WHO) estimates, dengue is endemic in 128 countries with 3.9 billion people at risk per year. Among them, around 96 million may manifest clinically, and approximately 500,000 people per year may require hospitalisation. In 2018, an increase in recognised dengue cases was reported in India, the Philippines, and Singapore.

Dengue virus (DENV), the etiological agent of dengue, is a member of the Flavivirus genus in the Flaviviridae family, which also includes Yellow fever virus and Japanese encephalitis virus. DENV infection causes a wide range of clinical presentations in humans, from asymptomatic infection to acute febrile illness (dengue fever, DF), to severe dengue haemorrhagic fever/dengue shock syndrome (DHF/DSS). Although the overall mortality associated with dengue is low, the burden on health services caused by dengue has been significant. Early diagnosis of dengue can help doctors select appropriate measures to reduce morbidity and mortality. High viremia has been associated with the development of severe dengue disease, suggesting that early administration of antivirals in the course of illness could help lessen severity. However, diagnosis of DENV infection cannot rely solely on clinical signs and symptoms, since the majority of infected individuals are either asymptomatic or present with symptoms similar to those of other febrile illnesses.

Current methods to aid diagnosis of DENV infection include antigen detection (virus isolation, immunofluorescence assay, NS1 immunological test), serological assays (plaque reduction neutralization titers (PRNT), IgM/IgG immunological test), and RNA detection (reverse transcription-polymerase chain reaction [RT-PCR]). Virus isolation and PRNT are two options only for well-equipped laboratories with well-trained staff. NS1 antigen detection and/or serological methods are the most commonly used diagnostic methods. NS1 enzyme linked immunosorbent assay [ELISA] can detect circulating NS1 antigen for up to nine days after symptom onset. IgM appears later but lasts longer than NS1 and viral RNA during primary infection; IgM ELISA has been widely used in public healthcare centres; interpretation of the results can be enhanced by a rise in the titre when testing paired samples from both acute- and convalescent-phases. However, their cross reactivity to other flaviviruses such as Zika virus is still a concern.

Nucleic acid tests (NATs) generally have greater sensitivity and specificity than antigen and serological methods. Using NATs, e.g. real-time RT-PCR, to detect DENV RNA during the first five to six days after symptom onset has been recommended by the WHO as an aid for laboratory confirmation of DENV infection. NATs have gradually replaced virus isolation to help identify acutely DENV-infected patients.

Higher prevalence

Dengue disease is caused mainly by four antigenically related serotypes of DENV, namely DENV-1, -2, -3, and -4; about 50 – 70 per cent differences in sequence identity are found among their genomic RNA. Multiple virus serotypes have been found co-circulating in several hyperendemic regions. Numerous studies have reported that some serotypes are more often associated with severe disease or specific clinical manifestations than others. For example, severe disease was associated more with DENV-2 than other serotypes in children in several studies, and more with DENV-1 in adults in a Hong Kong study. A study in western South America found a higher prevalence of musculoskeletal and gastrointestinal manifestations in DENV-3-infected patients and a higher prevalence of cutaneous and respiratory manifestations in DENV-4-infected patients. A study of a DENV-2- and DENV-3-infected cohort in Taiwan showed a higher prevalence of respiratory and cutaneous manifestations (rash) in DENV-3- and DENV-2-infected subjects, respectively. In addition, the chance of developing DHF/DSS is elevated when infection with one of the four serotypes is followed by a heterotypic serotype; this is likely due to antibody-mediated enhancement of DENV infection. This association has also been observed on a large scale; for example, the replacement of DENV-3 by DENV-1 was associated with a wave of severe dengue epidemic in Sri Lanka in 2009. Hence, DENV serotyping is also important in the prevention and control of dengue.

DENV serotyping by serological methods has been impeded by the cross-reactivity of antibodies induced by different DENV serotypes. A DENV envelope protein-based IgM ELISA capable of differentiating DENV serotypes during the acute phase of dengue was reported recently. However, its clinical performance needs to be further validated, and existing IgM from previous infection remains a challenge in data interpretation. Several CE-IVD-marked multiplex real-time RT-PCR methods capable of serotyping are available for laboratory settings. A Luminex-based single DNA fragment amplification assay capable of multiplex DENV serotyping was also reported recently.

Rapid on-site detection and serotyping of DENV can potentially alert front-line health professionals to the invasion of a new or long-absent serotype, allowing timely implementation of intervention strategies. Although parallel NS1 antigen and IgM/IgG rapid testing has improved dengue diagnosis at points of need/points of care (PON/POC) in recent years, neither test provides serotype information. In addition, they are generally less sensitive and specific than ELISA and nucleic acid testing (NAT) methods. Their sensitivity can be further lowered during secondary infection, since the levels of detectable NS1 can be reduced by existing antibody, and levels of IgM are not as high as in primary infection and can be undetectable in some cases.

Real-time RT-PCR assays and the Luminex-based assay allow serotyping of DENV with great sensitivity. However, their execution and/or data interpretation require skilled technicians and relatively expensive equipment that are not commonly available in remote areas or developing countries; long-distance transportation of specimens is another major obstacle.

To facilitate timely near-patient serotyping of DENV at low-resource settings, rapid, easy, mobile, NAT methods of high sensitivity and serotype-specificity are still needed to bring accurate detection/serotyping of DENV to PON/POC. On-site serotyping of DENV by a combination of isothermal amplification and DNA sequencing with a portable sequencer was recently demonstrated to be feasible. However, its immediate clinical applications at PON/POC are limited by the costs and technical skills required.

A limited number of methods, including reverse-transcription loop-mediated isothermal amplification (RT-LAMP) and the TaqMan probe-based RT-insulated isothermal PCR (RT-iiPCR), have become available recently with the potential to enable easy near-patient detection and serotyping of DENV. LAMP can be performed with a simple incubator and generally requires only a very simple nucleic acid extraction step. One DENV serotyping RT-LAMP panel, comprising four singleplex DENV-1, 2, 3, and 4 reactions targeting the 2A, NS4B, NS4A, and 3’UTR markers, was reported to generate signals visualised by the naked eye or a laboratory fluorescence-monitoring device in 25 minutes. Its clinical performance was demonstrated preliminarily to be similar to that of a real-time RT-PCR. Another RT-LAMP panel also included four reactions, all targeting the NS1 gene; SYBR Green I signals can be detected visually or with a field-deployable Genie II fluorometer (OptiGene, UK) in about 35 minutes. A preliminary study showed that this assay and the CDC real-time RT-PCR assay complemented NS1 testing to increase the diagnostic coverage of febrile patients to similar degrees.
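
To illustrate how results from such a four-reaction singleplex panel might be read out, the sketch below maps the pattern of reactive tubes to a serotype call. The handling of multiple reactive tubes (reported as possible co-infection) and of fully negative panels is a simplifying assumption for illustration, not any kit’s official interpretation rule.

```python
def interpret_serotyping_panel(reactions):
    """Illustrative read-out of a four-tube singleplex DENV serotyping panel.
    `reactions` maps serotype name -> True/False (reaction positive).
    The interpretation rules here are assumptions for illustration only."""
    positives = [serotype for serotype, reactive in reactions.items() if reactive]
    if not positives:
        return "No DENV RNA detected by this panel"
    if len(positives) == 1:
        return f"{positives[0]} detected"
    return f"Multiple serotypes reactive ({', '.join(positives)}): possible co-infection, confirm by a reference method"

print(interpret_serotyping_panel({"DENV-1": False, "DENV-2": True, "DENV-3": False, "DENV-4": False}))
```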

Another singleplex DENV serotyping RT-PCR panel that works on a mobile PCR system has been evaluated preliminarily to have performance comparable to that of the CDC real-time RT-PCR. The DENV-1, 2, 3, and 4 reagents, targeting the NS5, E, prM, and prM genes, respectively, are based on the TaqMan probe-based iiPCR technology. Automated amplification and detection by iiPCR are achieved consistently in a capillary tube in a simple, insulated heater/detector. CE-marked compact iiPCR devices are commercially available, including a field-deployable model (POCKIT, GeneReach Biotech) and a hand-held series (POCKIT Micro) with built-in rechargeable batteries. Results are detected and interpreted automatically to generate qualitative results within one hour. The platform’s performance has been verified and validated to be comparable to that of several reference laboratory methods (real-time PCR, virus isolation) across various markers and sample types. A CE-marked pan-DENV RT-PCR is already available on this platform to detect all four DENV serotypes in human plasma and serum, aiding PON/POC identification of DENV. PCR testing requires nucleic acid extraction, and a compact automated nucleic acid extraction system has been bundled with these PCR devices in a durable suitcase to create a mobile PCR laboratory. Furthermore, a fully automated, sample-in-answer-out, compact system (POCKIT Central Nucleic Acid Analyser) has recently become available with the CE mark to further minimise the risk of human error and allow easy molecular bio-detection near PON/POC.

Progress in translating novel molecular technology into diagnostics for dengue infection has helped the development of relatively inexpensive, rapid, and simple NATs to meet the needs in early serotyping of DENV near PON/POC. These tools have potential to enable timely management, control, and monitoring of different DENV serotypes especially in under-served communities. Further verification and validation studies of these methods should be expedited to bring them to clinical settings.

References available on request. 

How VR and AR are empowering healthcare industry

The acceptance of immersive technologies is on the rise and the healthcare industry hasn’t been an exception. According to a report from Reportbuyer, Augmented Reality (AR) and Virtual Reality (VR) in the healthcare industry will touch US$5 billion, growing at a compound annual rate of 36.6 per cent.

With the advent of AR and VR technologies like Magic Leap and Microsoft HoloLens, the gates to new opportunities are now wide open in the healthcare industry. These immersive visual technologies combine virtual and real environments and are usually referred to as Extended Reality or XR technology.

VR is when the user is immersed in a completely virtual environment, while augmented reality, abbreviated as AR, is when virtual environments or objects are overlaid on the real environment to add contextual meaning. Here are some of the ways XR technologies are going to shape the healthcare industry in the near future.

Facilitating medical learning and healthcare training

One of the key benefits of XR technologies lies in improving the quality of learning and training for medical professionals while driving costs down and enhancing retention and understanding.

Realistic 3D visualisation

AR and VR technologies can help medical professionals to learn physiology and anatomy of the human body in an effective manner. Conventional training procedures involve static two-dimensional images where a medical student has to rely on his or her own mental imagination to complete the picture. XR technologies enable students to see every detail in full immersion improving the learning process.

Skills development training

Another aspect of medical training depends purely on performing physical tasks such as inserting a catheter, drawing blood, and performing surgeries. While traditional methods involve learning from textbooks, slide shows, and watching a professional perform these tasks, AR and VR technologies enable students to learn these behavioural skills in a virtual or mixed reality environment by actually performing them.

By actually performing these skills in an immersive environment, medical students not only improve the quality of their learning but also learn to work with a much higher degree of accuracy and precision.

Advanced learning

Apart from the above benefits, medical professionals also get to learn new and innovative procedures and medicinal novelties through an immersive environment, which helps them to retain more information.

One of the major challenges for today’s healthcare professionals is the quickly evolving landscape of medicine, which is changing daily. That’s why they have to stay ahead of the curve by learning and absorbing all the new information to ensure they’re not falling behind. AR and VR technologies help them achieve that feat in an efficient and immersive manner.

Enhanced patient care and education

Not only medical professionals but also patients can derive a number of practical benefits from AR and VR technologies, which help them understand their medical conditions and the details of treatments and procedures.

Improved patient education

XR technologies, especially AR, can be used to provide interactive and immersive education to patients who may be fearful or sceptical. Doctors and healthcare professionals can not only employ VR and AR to teach interns but also use the technology to educate patients during consultation sessions.

This will also enable doctors to instil confidence and trust in their patients, putting them in a much better position to make informed decisions. When patients are able to understand their condition and the treatment approach, they tend to be much more receptive and responsible toward self-care.

Virtual assistance in medical facilities

Augmented reality-based navigation can make it quick and convenient for patients to find exactly what they are looking for. It can also help nurses and other medical professionals to find the right spot as well as equipment during emergencies and critical situations.

Virtual assistance through AR can also enhance the orientation experience in hospitals and medical clinics to ensure the audience grasps the concepts being explained to them clearly and have a much higher chance of retaining the information.

Pain management through VR

VR technology can be employed to distract patients who are in pain or discomfort. Suzanne Hardacre, a midwife, says, “There’s a great opportunity particularly to use this with women in early labour, to try and help them with some breathing and relaxation and take them out of the moment.”

VR technology brings many therapeutic apps and techniques, which can be used to provide a bit of comfort and reassurance to burn victims, women in labour, and other patients in order to provide assistance to them during a painful recovery.

Enhanced diagnosis

One of the crucial aspects of XR technologies is that they can be integrated with artificial intelligence to leverage the technology’s advantage even further. By integrating AI into AR and VR technologies, a number of medical procedures can be sped up to ensure efficient and accurate diagnosis.

Digital entertainment in hospitals

A hospital isn’t a place people want to be in. They are there because they absolutely need to be, and the uncertainty can be frustrating and quite distressing. Moreover, hospitals can be downright scary for little children.

This is where technologies like VR and AR can be used in medical facilities to provide entertainment content to the patients, for example, games for kids, VR stories and orientations for adults. Educational content can also be disseminated through the use of these technologies such as telling parents and kids the significance of timely vaccinations.

Moreover, other educational areas could be targeted, for example the importance of a healthy diet or how different medical equipment such as X-ray machines work. Many hospitals already have play areas for kids; equipping them with immersive technologies like VR and AR can prove quite effective.

Healthcare marketing and advertising through AR

The use of XR technologies in the healthcare industry is still in its infancy and has a long way to go. With every passing day, medical professionals are exploring new ways to embrace VR and AR in different fields of medical profession; marketing and advertising being one of them.

Pharmaceutical companies are already using mobile apps to sell their medicines. These apps make full use of flow animations to show the effects of a formula inside the human body and how it treats a particular condition.

The technology is also being used to introduce a variety of products to doctors through hovering animations, which are far more interactive and immersive than conventional PowerPoint slides.

Conclusion

Like any other disruptive technology, AR and VR will take some time to become mainstream and to overcome a variety of barriers. Medical professionals as well as patients should be ready to embrace the change. When we talk about XR adoption, it can be safely said that we’re getting there.

Top technology giants including Google, Microsoft, Apple, Oculus, Facebook, Amazon and many others have invested heavily in AR and VR technology, and the results are already showing. The day isn’t far when XR technologies will be as ubiquitous as thermometers and stethoscopes.

Evolution of blood transfusion medicine

Article-Evolution of blood transfusion medicine

Every two seconds, someone needs a blood transfusion or blood product: people of all ages who are injured, need surgery or are suffering from illness. Around 117.4 million blood donations were collected globally during 2018; 42 per cent of these were collected in high-income countries, home to 16 per cent of the world’s population.

Blood transfusion saves lives and improves health, but many patients requiring transfusion do not have timely access to safe blood. Providing safe and adequate blood should be an integral part of every country’s national healthcare policy and infrastructure.

WHO recommends that all activities related to blood collection, testing, processing, storage and distribution be coordinated at the national level through effective organisation and integrated blood supply networks. The national blood system should be governed by a national blood policy and legislative framework to promote uniform implementation of standards and consistency in the quality and safety of blood and blood products.

An adequate and reliable supply of safe blood can be assured by a stable base of regular, voluntary, unpaid blood donors. These donors are also the safest group of donors, as the prevalence of blood-borne infections is lowest among this group.

Despite ongoing improvements in the collection, processing, testing, delivery, and monitoring of transfusions during the past several decades, concerns over the safety of these therapies and the process in general continue today.

Blood transfusion in the 21st century is about safe blood, high quality and standardisation. The need for safe and quality blood and blood products is a worldwide issue and cannot be overemphasised. Blood saves lives but can also be life-threatening. Historically, viral infections (e.g. HIV, HCV and HBV) and parasitic infections (e.g. malaria) have been transmitted via transfusion. A heartbreaking example was the 1980s and 1990s, when many patients with haemophilia in the UK, France, Canada, Japan, the U.S. and elsewhere contracted HCV and HIV from blood transfusions and factor concentrates.

Historically, there was concern about transmitting infectious diseases from a donor to a recipient. Now blood is regularly tested for infectious disease transmission, particularly for viruses such as Hepatitis B and C, HIV, and West Nile Virus. Traditionally, serum has been tested to look for the body’s response to past infectious exposure, but many serum tests have been replaced by molecular testing called nucleic acid amplification testing (NAT), which finds active viruses in the donor’s blood to determine infection risk. If an active virus is found, the donor unit is discarded. Blood transfusion has never been safer from known infectious risk than it is today.

There is, hence, an idealistic expectation that the blood supply must be safe whatever the cost. The safe, high-quality and cost-effective practice of transfusing blood and components, in which different professional groups with different functions are involved, has to be ensured. Blood organisations and hospital transfusion services are therefore under pressure to minimise adverse events and the risk of transfusion-transmitted infections. Absolute safety seems unrealistic because of the emergence of new pathogens, e.g. variant Creutzfeldt-Jakob disease (vCJD), whose modes of transmission are not fully elucidated. Besides infection with microorganisms, transfusion also carries other hazards that can cause substantial morbidity without proper management systems.

Minimising risks

Today, the blood transfusion community continues to advance its transfusion systems, guided by the ISBT (International Society of Blood Transfusion), AABB (American Association of Blood Banks), FDA (Food and Drug Administration) and other regulatory and professional organisations. Researchers are also establishing new surveillance systems that record data and transfusion outcomes (haemovigilance systems) to better understand and manage the risks associated with transfusion. They are offering more personalised treatment, limiting transfusions based on careful assessment of need, and ultimately improving patient care.

In addition to infectious disease risks, treating physicians must also manage other risks, such as post-transfusion reactions. These include transfusion-related acute lung injury (TRALI), in which the donor’s immune antibodies cause breathing problems in the recipient; transfusion-associated circulatory overload (TACO), which is swelling caused by the increased blood volume; and post-transfusion iron overload, a build-up of iron in the body usually caused by multiple or regular transfusions.

To minimise these risks, researchers studying the body’s immune response to transfusions have found that modifying the blood prior to transfusion can reduce reactions. In particular, removing white blood cells or irradiating blood to prevent white blood cell proliferation can reduce the likelihood that the recipient will react against the donor blood. Recently, studies found that using male plasma and platelets may eliminate the transmission of certain antibodies that can cause reactions and are found mainly in previously pregnant women and previously transfused donors. However, using these techniques has reduced the amount of blood available for transfusions, so researchers are working to identify better ways to safely increase available blood sources.

Avoidable transfusion errors, mostly in patient identification, remain a serious cause of injury and death. There is also heightened awareness of the risk of transmission of viral and bacterial infections. Of particular concern is the (theoretical) possibility of transmission of vCJD.

Unnecessary transfusions and unsafe transfusion practices expose patients to the risk of serious adverse transfusion reactions and transfusion-transmissible infections. Unnecessary transfusions also reduce the availability of blood products for patients who are in need.

Research and development

In recent years, the demand for red blood cells has declined, primarily because doctors are learning to transfuse more scientifically and use blood only when necessary. However, challenges remain. Type O negative red cells, which can be transfused to people of all blood types, are the most likely to be in short supply, and there is a need for donors of all blood types, all the time.

Patient Blood Management (PBM) is an evidence-based bundle of care to optimise medical and surgical patient outcomes by clinically managing and preserving a patient’s own blood. PBM identifies risk factors and addresses them through the “three pillars” of Patient Blood Management: optimise erythropoiesis (including red cell mass and iron stores), minimise blood loss (surgical) and bleeding (coagulopathy), and harness and optimise the patient-specific physiological reserve for tolerating anaemia while treatment is initiated. In both surgical and medical patients, these pillars are applied before specific treatment (pre-operative), during specific treatment (intra-operative) and in follow-up after specific treatment (post-operative).

Blood banks all over the world are excited about the many developments in blood transfusion medicine that are under research and development. These include a new red blood cell testing technology aimed at patients with illnesses that require frequent transfusions, for example sickle cell anaemia. These patients sometimes develop antibodies that complicate finding compatible blood for transfusion. A new FDA laboratory is evaluating this technology, which identifies the genetic characteristics of a patient’s red blood cells so they can be more precisely matched to a donor.

Exciting research is also progressing on technologies that will significantly reduce any bacteria, viruses and parasites that may be in blood. These technologies would complement existing tests for these infectious agents, thus making a safe blood supply even safer.

There is also the topic of artificial blood. Companies are working to develop oxygen carriers that could substitute for red blood cells. Work is moving forward, and we remain hopeful that one or more of these technologies will eventually prove to be safe and effective.

While blood bankers continue to improve processes, the last decade has also seen developments in innovative approaches, particularly in high-tech cellular therapies and bioengineering.

New techniques can isolate specialised cell populations from blood – most importantly, hematopoietic progenitor cells (HPCs) that are used for stem cell transplantation. Because HPCs are easier to extract and the process is safer for donors, HPC transplantation has replaced bone marrow transplantation for many cancers and other diseases.

Bioinformatics and predictive modelling as tools for clinical diagnostics

Article-Bioinformatics and predictive modelling as tools for clinical diagnostics

Clinicians and medical researchers believe that healthcare systems would enhance their performance if integrative diagnostic approaches were implemented. Although these approaches are advantageous, they are time-consuming, expensive and require complex interpretation, making them hard to implement in clinical laboratories. In the last decade, high-throughput technologies such as NGS, microarrays, RNAseq and MALDI-ToF have been evolving and have demonstrated in research studies enormous potential to be applied as integrative approaches in personalised medicine and clinical diagnostics.

With these technologies fully implemented in clinical laboratories, it would be possible to extract a large amount of information about the genome, transcriptome, proteome, metabolome and phenome of patients. The information available from these “omics” data is rich enough to allow screening and early detection of multiple diseases as well as the detection of therapeutic targets in drug discovery. High-throughput technologies have been improving the rate at which they generate data from biological samples to such a degree that they are becoming acceptable for clinical application and personalised medicine.

For example, MALDI-ToF is an ultra-fast and affordable high-throughput technology that processes multiple samples in a matter of minutes, rendering mass spectra that contain proteomic and/or metabolomic information. NGS and RNAseq, on the other hand, are two high-throughput technologies that give robust information about gene sequences and their expression levels, making it possible to detect mutations and deregulations associated with genetic diseases. In some clinics, MALDI-ToF is already being used for the detection of bacterial strains, whereas NGS, microarrays and RNAseq are used for the detection of some genetic diseases. Most of these high-throughput technologies have a high upfront cost for acquisition and implementation in laboratories. However, the cost of operations is now quite affordable, particularly in the case of MALDI-ToF, which would pay off the investment and optimise the analytical service over the long term. Moreover, these high-throughput technologies should be seen as potential tools for screening multiple diseases from a single patient sample, which would further decrease the operational costs per disease if proper analytical systems are implemented.

Bioinformatics as a solution for clinical diagnostics

Integrating the large amount of data coming from high-throughput technologies towards personalised medicine and diagnostics is not possible without using computational approaches to sort out the complexity of processing and correlating multiple variables at the “omics” level. Bioinformatics is an interdisciplinary field of biology focused on applying computational techniques to analyse and extract information from biomolecular data. It usually integrates techniques from informatics, computer science, molecular biology, genomics, proteomics, mathematics and statistics. Although it started as a field fully dedicated to basic research in evolution and genetics, it has evolved in parallel with high-throughput techniques, resulting in the development of many methods and tools that facilitate the interpretation of “omics” data. In bioinformatics, high-throughput data are processed and analysed systematically, from raw data to results, using pipelines of analysis that exploit the full potential of computers. Bioinformatic pipelines usually contain multiple steps for data quality assessment, feature extraction, dimension reduction, biomarker detection and results generation. This set of analyses is fully automated, with no user interference, although the user can act as a “curator” to validate the outputs (results).
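
To make the pipeline idea concrete, a minimal sketch in Python is shown below. The data, thresholds and model choices (StandardScaler, PCA, logistic regression) are purely illustrative assumptions, not a prescription for any particular laboratory.

```python
# Minimal sketch of an automated "omics" analysis pipeline (illustrative only).
# Assumes a numeric feature matrix X (samples x features, e.g. peak intensities)
# and binary labels y; all thresholds and model choices are hypothetical.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.decomposition import PCA
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 500))     # placeholder high-dimensional "omics" data
y = rng.integers(0, 2, size=100)    # placeholder disease / control labels

# Step 1: data quality assessment - drop samples with missing values.
ok = ~np.isnan(X).any(axis=1)
X, y = X[ok], y[ok]

# Steps 2-4: feature scaling, dimension reduction and a biomarker-based
# classifier, chained so the analysis runs automatically from raw features.
pipeline = Pipeline([
    ("scale", StandardScaler()),
    ("reduce", PCA(n_components=10)),
    ("classify", LogisticRegression(max_iter=1000)),
])
pipeline.fit(X, y)

# Step 5: results generation - per-sample disease probabilities for the report.
print(pipeline.predict_proba(X[:5])[:, 1])
```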

With the evolution of computational power, bioinformatics has gained the potential to tackle big data and integrate large amounts of data much faster than they are produced, becoming a solution for applying high-throughput techniques in clinical diagnostics and personalised medicine. For example, some studies have demonstrated that bioinformatic pipelines developed for the analysis of MALDI-ToF mass spectra can extract diagnostic information from urine, blood and embryo culture media faster than it can be generated. In genomics, several bioinformatic pipelines of analysis for NGS, RNAseq and microarrays have also been developed to extract diagnostic information from the sequencing of viruses, pathogenic bacteria and cancer biopsies.
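
As an illustration of the spectral side of such pipelines, the following sketch picks peak features from a synthetic MALDI-ToF-like spectrum. The data and the noise threshold are invented; real workflows would add baseline correction, smoothing and mass calibration before peak detection.

```python
# Illustrative sketch of extracting peak features from a MALDI-ToF-like spectrum
# (hypothetical synthetic data; thresholds are invented for demonstration).
import numpy as np
from scipy.signal import find_peaks

mz = np.linspace(2000, 20000, 5000)                 # mass-to-charge axis
intensity = np.random.default_rng(1).random(5000)   # placeholder noisy spectrum
intensity[1200] += 50.0                             # simulated protein peak
intensity[3400] += 30.0                             # simulated protein peak

# Normalise to total ion current, then pick peaks above a noise threshold.
intensity = intensity / intensity.sum()
peaks, props = find_peaks(intensity, height=5 * np.median(intensity))

# The (m/z, intensity) pairs become the feature vector passed downstream.
features = list(zip(mz[peaks], props["peak_heights"]))
print(features)
```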

Moreover, bioinformatics tools for processing “omics” data have also been successful in the discovery of novel drug targets for cancer therapy. Bioinformatics can further improve clinical laboratories’ efficiency and costs by saving time and human resources in analysis and in reporting to clinics and patients. This can be done by developing pipelines of analysis with automated reporting and APIs dedicated to giving real-time online access, facilitating communication between laboratories, clinicians and patients. In addition, patients’ historical data and metadata should be kept secure and organised in a structured way (data “warehouses”) so that they can be pulled systematically into bioinformatics pipelines. This would take integrative analysis of patients further by treating their data as a function of time, allowing more personalised diagnostic monitoring and better prognostics.
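
A hypothetical example of such a reporting API is sketched below using FastAPI. The endpoint, fields and in-memory results store are invented for illustration; a production service would need authentication, audit trails and persistent, secure storage.

```python
# Hypothetical sketch of a reporting API giving clinicians real-time access to
# pipeline results; all identifiers and fields are invented for illustration.
from fastapi import FastAPI, HTTPException

app = FastAPI()

# Stand-in for a structured results "warehouse" populated by the pipeline.
RESULTS = {
    "sample-001": {"patient_id": "P-42", "test": "MALDI-ToF screen",
                   "disease_probability": 0.08, "status": "reviewed"},
}

@app.get("/reports/{sample_id}")
def get_report(sample_id: str):
    """Return the automated report for one sample, if it exists."""
    report = RESULTS.get(sample_id)
    if report is None:
        raise HTTPException(status_code=404, detail="Unknown sample")
    return report

# Run with: uvicorn report_api:app  (assuming this file is saved as report_api.py)
```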

Predictive modelling as a complementary diagnostic tool

Predictive modelling frameworks have been extensively used for describing physiological systems and diseases. These frameworks use mathematical models and algorithms to generate predictions about a phenotype or make reasonable estimates with the available data. The integration of these predictors in bioinformatic pipelines is fundamental to making accurate classifications of patient samples by the likelihood of having a particular disease. This methodology is useful in clinical laboratories for screening and early detection of genetic or metabolic diseases. This type of approach also allows estimation of inaccessible body chemistry parameters from others that are measurable, which would otherwise be impossible because of experimental constraints or because the available methods are too invasive or expensive.
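
As a toy illustration of estimating an inaccessible parameter from measurable ones, the sketch below fits a simple regression on synthetic data; the variables and the relationship between them are entirely hypothetical.

```python
# Toy sketch: estimating a hard-to-measure parameter from routinely measured
# analytes using a regression model (all variables and data are hypothetical).
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(2)
routine = rng.normal(size=(200, 3))   # e.g. three routinely measured blood markers
hidden = 2.0 * routine[:, 0] - routine[:, 2] + rng.normal(scale=0.1, size=200)

model = LinearRegression().fit(routine, hidden)

# For a new patient, the model estimates the parameter without the invasive test.
new_patient = rng.normal(size=(1, 3))
print(model.predict(new_patient))
```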

The development of mathematical models and algorithms that generate robust predictions is a hard task and requires rigorous validation procedures before a predictor is ready to be launched onto the market. Not many predictors for diagnostics are available for use or can be adapted to a given clinical laboratory setting. Thus, model development and optimisation for each laboratory would be the ideal scenario. Integrating predictive modelling workflows in bioinformatic pipelines also facilitates model development by systematising the process of validation and model selection using the data and metadata.
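
A minimal sketch of systematised validation and model selection is shown below, assuming synthetic data and two illustrative candidate models; in practice, the selected model would still require evaluation on an independent patient cohort before clinical use.

```python
# Sketch of cross-validated model selection inside a pipeline (synthetic data;
# candidate models and the scoring metric are illustrative choices only).
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(3)
X = rng.normal(size=(120, 40))
y = rng.integers(0, 2, size=120)

candidates = {
    "logistic_regression": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=200, random_state=0),
}

# 5-fold cross-validated AUC for each candidate; the best performer would then
# be confirmed on an independent validation set.
for name, model in candidates.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
    print(f"{name}: mean AUC = {scores.mean():.2f}")
```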

Several types of models can be used to make diagnostic predictions, and the choice depends on the available data, the technology and the nature of the problem. Statistical models based on known distributions of biomarkers are commonly used in the diagnosis of a particular disease. These are easy to implement in bioinformatic pipelines and serve as complementary information for clinicians. Implementing pattern recognition, machine learning and artificial intelligence (AI) algorithms in bioinformatic pipelines is key to optimising mathematical models towards more accurate predictions. Importantly, the use of machine learning and AI algorithms is essential for the idea of personalised medicine because they enable the fitting of generic models of disease to each patient’s scenario and body chemistry. Deterministic models such as logical and kinetic modelling frameworks can also be used for the simulation of physiological scenarios and for making robust predictions with clinical applications.
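
To illustrate the distribution-based approach, the sketch below applies Bayes’ rule to a single biomarker using invented healthy and diseased reference distributions and an assumed disease prevalence; real applications would use published, validated distributions.

```python
# Minimal sketch of a distribution-based diagnostic model: the reference
# distributions and prevalence below are invented for illustration only.
from scipy.stats import norm

healthy = norm(loc=4.0, scale=1.0)    # hypothetical biomarker distribution, healthy
diseased = norm(loc=7.0, scale=1.5)   # hypothetical biomarker distribution, diseased
prevalence = 0.02                     # assumed prior probability of disease

def disease_probability(value: float) -> float:
    """Bayes' rule with the two biomarker distributions as likelihoods."""
    p_d = diseased.pdf(value) * prevalence
    p_h = healthy.pdf(value) * (1.0 - prevalence)
    return p_d / (p_d + p_h)

print(disease_probability(6.5))   # posterior probability for one measurement
```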

For example, simulation of the tumour micro-environment using a logical network model of the regulation of cell adhesion properties made it possible to establish relations between cancer deregulations and metastatic potential. This has huge potential for the future development of bioinformatics tools that predict metastatic potential and suggest the best therapy for each case based on the tumour biopsy. Kinetic models, on the other hand, have the potential to be more precise and to generate a continuous range of prediction values. However, their parameter estimation is complex and requires machine learning algorithms to adapt them to a particular physiological system. These types of models are excellent for describing metabolism and can be very useful as future tools in personalised medicine.
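
As a toy example of a kinetic model, the sketch below simulates a two-step pathway with Michaelis-Menten kinetics; the rate constants are invented and would in practice be estimated from experimental or patient data.

```python
# Toy kinetic model: a pathway S -> I -> P with Michaelis-Menten kinetics,
# simulated over time. All parameters are hypothetical placeholders.
import numpy as np
from scipy.integrate import solve_ivp

VMAX1, KM1, VMAX2, KM2 = 1.0, 0.5, 0.8, 0.3   # invented kinetic parameters

def pathway(t, y):
    s, i = y                                   # substrate and intermediate
    v1 = VMAX1 * s / (KM1 + s)                 # rate of S -> I
    v2 = VMAX2 * i / (KM2 + i)                 # rate of I -> P
    return [-v1, v1 - v2]

sol = solve_ivp(pathway, t_span=(0.0, 20.0), y0=[5.0, 0.0],
                t_eval=np.linspace(0.0, 20.0, 50))
print(sol.y[:, -1])   # substrate and intermediate concentrations at t = 20
```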


Implementing bioinformatics in clinical laboratories

There is still much to be done to realise the full potential of bioinformatics as a diagnostic tool in clinical laboratories. This requires a joint effort between clinical laboratories, healthcare systems and software companies. As initial steps, clinical laboratories and healthcare systems should start to invest in the following:

  • Acquisition of “omics” high-throughput laboratory equipment (NGS, RNAseq and MALDI-ToF).
  • Acquisition of computational resources (high-performance computers and servers).
  • Hire bioinformaticians or specialised outsourcing companies.
  • Implementation of bioinformatic tools specifically designed for each laboratory’s reality.
  • Implementation of predictive models in software applications for clinical diagnostics.
  • Implementation of web platforms for connecting laboratories, patients and clinicians.

Nevertheless, health organisations should also make an effort to recognise, regulate and validate bioinformatics and predictive modelling diagnostic tools. Although this would be a big investment of time, money and resources, it would pay off in the future in terms of the quality of the diagnostic services offered to populations, helping them evolve towards personalised medicine that is affordable for most people.

References available on request.