Articles tagged as "Resources/ Impact/ Development"

Patient expenditures for TB care are impoverishing and may prevent access to care

Household catastrophic payments for tuberculosis care in Nigeria: incidence, determinants, and policy implications for universal health coverage.

Ukwaja KN, Alobu I, Abimbola S, Hopewell PC. Infect Dis Poverty. 2013 Sep 17;2(1):21. [Epub ahead of print]

Background: Studies on costs incurred by patients for tuberculosis (TB) care are limited because these costs are typically reported as averages, and their economic impact is estimated from average patient/household incomes. Average expenditures do not represent the poor, who spend less on treatment than other economic groups. Thus, the extent to which TB expenditures risk sending households into, or further into, poverty, and the determinants of this risk, are unknown. We assessed the incidence and determinants of household catastrophic payments for TB care in rural Nigeria.

Methods: Data used were obtained from a survey of 452 pulmonary TB patients sampled from three rural health facilities in Ebonyi State, Nigeria. Using household direct costs and income data, we analyzed the incidence of household catastrophic payments using, as thresholds, the traditional >10% of household income and the >=40% of non-food income, as recommended by the World Health Organization. We used logistic regression analysis to identify the determinants of catastrophic payments.
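The two thresholds described above amount to a simple classification rule. The sketch below is purely illustrative (the function name and household figures are hypothetical, not taken from the study's dataset):

```python
def is_catastrophic(tb_costs, household_income, food_spending,
                    income_share=0.10, capacity_share=0.40):
    """Flag TB payments as catastrophic under either threshold:
    - direct costs exceed 10% of total household income, or
    - direct costs reach >=40% of non-food (capacity-to-pay) income.
    All figures are annual and in the same currency."""
    non_food_income = household_income - food_spending
    return (tb_costs / household_income > income_share
            or tb_costs / non_food_income >= capacity_share)

# Hypothetical household: US$157 in direct TB costs against an annual
# income of US$1 100, of which US$700 is spent on food.
print(is_catastrophic(157, 1100, 700))   # True (157/1100 = 14.3% > 10%)
```

Under either definition, a household can be pushed over the threshold by costs that look modest as a share of the national average income, which is why the authors argue against reporting averages alone.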

Results: Average direct household costs for TB were US$157 or 14% of average annual incomes. The incidence of catastrophic payment was 44%, with 69% and 15% of the poorest and richest household income-quartiles, respectively, experiencing catastrophic payments. Independent determinants of catastrophic payments were: age >40 years (adjusted odds ratio [aOR] 3.9; 95% confidence interval [CI], 2.0, 7.8), male gender (aOR 3.0; CI 1.8, 5.2), urban residence (aOR 3.8; CI 1.9, 7.7), formal education (aOR 4.7; CI 2.5, 8.9), care at a private facility (aOR 2.9; CI 1.5, 5.9), poor household (aOR 6.7; CI 3.7, 12), household where the patient is the primary earner (aOR 3.8; CI 2.2, 6.6), and HIV co-infection (aOR 3.1; CI 1.7, 5.6).

Conclusions: Current cost-lowering strategies are not enough to prevent households from incurring catastrophic out-of-pocket payments for TB care. Financial and social protection interventions are needed for identified at-risk groups, and community-level interventions may reduce inefficiencies in the care-seeking pathway. These observations should inform post-2015 TB strategies and influence policy-making on health services that are meant to be free of charge.

Abstract access 

Editor’s notes: Household health care expenditures can push households into poverty. Such payments, known as catastrophic payments, mean that households give up the consumption of basic goods and services to pay for health care. This study uses individual-level data on health care expenditures for TB services, together with income levels, to examine the extent to which TB care involves catastrophic payments in Nigeria. Although TB services are subsidized and supposed to be free, this survey confirms that in practice patients pay, most frequently for drugs, laboratory tests and transport. Of particular concern are the high level of pre-diagnostic costs, the greater vulnerability of the poor, and the exacerbation of the situation for those with HIV co-infection. The findings are important for policy makers trying to improve access to TB care, HIV care and health care in general. They emphasize the importance of prepayment schemes to facilitate access to health care when individuals are most in need.


Food insecurity associated with poor treatment outcomes in rural Uganda

Longitudinal assessment of associations between food insecurity, antiretroviral adherence and HIV treatment outcomes in rural Uganda.

Weiser SD, Palar K, Frongillo EA, Tsai AC, Kumbakumba E, Depee S, Hunt PW, Ragland K, Martin J, Bangsberg DR. AIDS. 2013 Aug 9. [Epub ahead of print]

Introduction:  Food insecurity is a potentially important barrier to the success of antiretroviral therapy (ART) programs in resource-limited settings. We undertook a longitudinal study in rural Uganda to estimate the associations between food insecurity and HIV treatment outcomes.

Design:  Longitudinal cohort study.

Methods:  Participants were from the Uganda AIDS Rural Treatment Outcomes study and were followed quarterly for blood draws and structured interviews. We measured food insecurity with the validated Household Food Insecurity Access Scale. Our primary outcomes were: ART nonadherence (adherence <90%) measured by visual analog scale; incomplete viral load suppression (>400 copies/ml); and low CD4 cell count (<350 cells/μl). We used generalized estimating equations to estimate the associations, adjusting for socio-demographic and clinical variables.

Results:  We followed 438 participants for a median of 33 months; 78.5% were food insecure at baseline. In adjusted analyses, food insecurity was associated with higher odds of ART nonadherence [adjusted odds ratio (AOR) 1.56, 95% confidence interval (CI) 1.10-2.20, P < 0.05], incomplete viral suppression (AOR 1.52, 95% CI 1.18-1.96, P < 0.01), and CD4 cell count less than 350 (AOR 1.47, 95% CI 1.24-1.74, P < 0.01). Adding adherence as a covariate to the latter two models removed the association between food insecurity and viral suppression, but not between food insecurity and CD4 cell count.

Conclusions:  Food insecurity is longitudinally associated with poor HIV outcomes in rural Uganda. Intervention research is needed to determine the extent to which improved food security is causally related to improved HIV outcomes and to identify the most effective policies and programs to improve food security and health.

Abstract access

Editor’s notes: The associations between markers of poverty and HIV infection, access to interventions, and HIV-related morbidity and mortality have been much discussed. These associations are complex and change over time. HIV has disproportionately affected sub-Saharan Africa, the world’s poorest region. Within Africa, individuals of higher socioeconomic position carried the highest burden of infection early in the epidemic, although this appears to have changed over time. Much less is known about how these relationships play out in access and adherence to HIV treatment. Using a cohort study, the authors show that food insecurity was common in a rural Ugandan population initiating ART. Self-reported non-adherence and incomplete viral suppression were less common than food insecurity in the whole group. The study shows that the food-insecure group was somewhat more likely to exhibit poor treatment outcomes. It also provides some evidence that poor adherence was responsible for the incomplete viral suppression among the food insecure. Future research should continue to monitor inequalities in access to treatment, and in HIV-related morbidity and mortality, as an important phenomenon in its own right. As the authors suggest, trials of the added benefit to treatment outcomes of interventions that seek to improve food security among treatment initiators are also warranted.


Single dose treatment regimens associated with lower pharmacy costs, fewer hospitalisations and lower hospital costs in the US

Association between daily antiretroviral pill burden and treatment adherence, hospitalisation risk, and other healthcare utilisation and costs in a US medicaid population with HIV.

Cohen CJ, Meyers JL, Davis KL. BMJ Open. 2013 Aug 1;3(8). pii: e003028. doi: 10.1136/bmjopen-2013-003028.

Objectives:  Lower pill burden leads to improved antiretroviral therapy (ART) adherence among HIV patients. Simpler dosing regimens have not been widely explored in real-world populations. We retrospectively assessed ART adherence, all-cause hospitalisation risk and costs, and other healthcare utilisation and costs in Medicaid enrollees with HIV treated with ART as a once-daily single-tablet regimen (STR) or two or more pills per day (2+PPD).

Design:  Patients with an HIV diagnosis from 2005 to 2009 receiving complete ART (ie, two nucleoside/nucleotide reverse transcriptase inhibitors plus a third agent) for ≥60 days as STR or 2+PPD were selected and followed until the first of (1) discontinuation of the complete ART, (2) loss of enrolment or (3) end of database. Adherence was measured using the medication possession ratio. Monthly all-cause healthcare utilisation and costs were observed from regimen initiation until follow-up end.

Results:  Of the 7 381 patients who met inclusion criteria, 1 797 were treated with STR and 5 584 with 2+PPD. STR patients were significantly more likely to reach 95% adherence and had fewer hospitalisations than 2+PPD patients (both p<0.01). STR patients had mean (SD) total monthly costs of $2 959 ($4 962); 2+PPD patients had $3 544 ($5 811; p<0.001). Hospital costs accounted for 53.8% and pharmacy costs accounted for 32.5% of this difference. Multivariate analyses found that STR led to a 23% reduction in hospitalisations and a 17% reduction in overall healthcare costs. ART adherence appears to be a key mechanism mediating hospitalisation risk, as patients with ≥95% adherence (regardless of regimen type) had a lower hospitalisation rate compared with <95% adherence.

Conclusions:  While it was expected that STR patients would have lower pharmacy costs, we also found that STR patients had fewer hospitalisations and lower hospital costs than 2+PPD patients, resulting in significantly lower total healthcare costs for STR patients.

Abstract Full-text [free] access

Editor’s notes: This observational study, using insurance claims from over seven thousand patients diagnosed between 2005 and 2009, looked at differences in adherence between patients taking a single tablet per day and those prescribed ART regimens of more than one tablet per day. Almost a quarter of the patients were on a single-tablet regimen. Adherence was assessed using the medication possession ratio, a common proxy for medication adherence, which measures “the proportion of the ART-exposure period in which supply was maintained for all ART components comprising the regimen”. Though observational studies may be subject to numerous biases, they are the only approach to studying real-world adherence. By showing in subgroup analysis that high and low adherers in each treatment group have similar hospitalisation rates and health care costs, the authors are able to conclude that the key difference in outcomes is attributable to differences in adherence rates between the regimens. Further subgroup analyses address the various potential selection biases present. The authors found that not only were pharmacy costs lower for the single-tablet regimen, but it was easier for patients to adhere to, which in turn led to lower morbidity, hospitalisation rates and health care costs. Though this appears a win-win situation, only a limited number of drug combinations are currently available as single-tablet regimens; they are therefore not appropriate for all patients. This study provides additional impetus for the continued development of single-tablet regimens, to extend their availability to a broader range of patients and improve patient outcomes.
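As a rough sketch of how the medication possession ratio quoted above is calculated (an illustrative computation only; the function and the refill figures are assumptions, not the paper's implementation):

```python
def medication_possession_ratio(days_supplied, period_days):
    """MPR: share of the ART-exposure period for which dispensed
    supply was available, conventionally capped at 1.0."""
    return min(sum(days_supplied) / period_days, 1.0)

# Hypothetical patient: three 90-day refills over 300 days of follow-up.
mpr = medication_possession_ratio([90, 90, 90], 300)
print(mpr)           # 0.9
print(mpr >= 0.95)   # False: below the study's 95% adherence cut-off
```

A patient can therefore miss the 95% threshold with a single late refill, which is one reason regimen simplicity translates so directly into measured adherence.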

Northern America
United States of America

Accelerating initiation of antiretroviral therapy in India in the era of free roll-out

Impact of generic antiretroviral therapy (ART) and free ART programs on time to initiation of ART at a tertiary HIV care center in Chennai, India.

Solomon SS, Lucas GM, Kumarasamy N, Yepthomi T, Balakrishnan P, Ganesh AK, Anand S, Moore RD, Solomon S, Mehta SH. AIDS Care. 2013 Aug;25(8):931-6. doi:10.1080/09540121.2012.748160. Epub 2012 Dec 7.

Antiretroviral therapy (ART) access in the developing world has improved, but whether increased access has translated to more rapid treatment initiation among those who need it is unknown. We characterize time to ART initiation across three eras of ART availability in Chennai, India (1996-1999: pregeneric; 2000-2003: generic; 2004-2007: free rollout). Between 1996 and 2007, 11 171 patients registered for care at the YR Gaitonde Centre for AIDS Research and Education (YRGCARE), a tertiary HIV referral center in southern India. Of these, 5 726 patients became eligible for ART during this period as per Indian guidelines for initiation of ART. Generalized gamma survival models were used to estimate relative times (RT) to ART initiation by calendar periods of eligibility. Time to initiation of ART among patients in Chennai, India was also compared to an HIV clinical cohort in Baltimore, USA. Median age of the YRGCARE patients was 34 years; 77% were male. The median CD4 at presentation was 140 cells/µl. After adjustment for demographics, CD4 and WHO stage, persons in the pregeneric era took 3.25 times longer (95% confidence interval [CI]: 2.53-4.17) to initiate ART versus the generic era and persons in the free rollout era initiated ART more rapidly than the generic era (RT: 0.73; 95% CI: 0.63-0.83). Adjusting for differences across centers, patients at YRGCARE took longer than patients in the Johns Hopkins Clinical Cohort (JHCC) to initiate ART in the pregeneric era (RT: 4.90; 95% CI: 3.37-7.13) but in the free rollout era, YRGCARE patients took only about a quarter of the time (RT: 0.31; 95% CI: 0.22-0.44). These data demonstrate the benefits of generic ART and government rollouts on time to initiation of ART in one developing country setting and suggest that access to ART may be comparable to developed country settings.

Abstract access

Editor’s notes: This study documents changes in the time from HIV diagnosis until initiation of ART over three defined calendar periods, for ART-eligible patients attending a single treatment centre in Chennai, India. Over three periods between 1996 and 2007, characterized by (i) treatment with pre-generics, (ii) treatment with generics, and (iii) free roll-out of ART, there were sequential, very substantial reductions in time to treatment, to the extent that in the latter period the time to treatment was shorter than in a clinical cohort in Baltimore, USA, in adjusted analyses.


High prevalence of neurocognitive dysfunction among ART-naive youth

Neurocognitive Functioning in Antiretroviral Therapy-Naïve Youth With Behaviorally Acquired Human Immunodeficiency Virus.

Nichols SL, Bethel J, Garvie PA, Patton DE, Thornton S, Kapogiannis BG, Ren W, Major-Wilson H, Puga A, Woods SP. J Adolesc Health. 2013 Aug 21. pii: S1054-139X(13)00371-6. doi: 10.1016/j.jadohealth.2013.07.006. [Epub ahead of print]

Purpose:  Youth living with human immunodeficiency virus (HIV) account for over one third of new HIV infections and are at high risk of adverse psychosocial, everyday living, and health outcomes. Human immunodeficiency virus-associated neurocognitive disorders (HAND) are known to affect health outcomes of HIV-infected adults even in the era of combination antiretroviral therapy. Thus, the current study aimed to characterize the prevalence and clinical correlates of HAND in youth living with HIV. Here, we report baseline neurocognitive data for behaviorally HIV-infected youth enrolled in a prospective study evaluating strategies of antiretroviral treatment initiation and use.

Methods:  A total of 220 participants, age 18-24 years, who were naive to treatment (except for prevention of mother-to-child HIV transmission; n = 3), completed a comprehensive neurocognitive, substance use, and behavioral health assessment battery.

Results:  Sixty-seven percent of youth met criteria for HAND (96.4% were asymptomatic and 3.5% were syndromic); deficits in episodic memory and fine-motor skills emerged as the most commonly affected ability areas. Multivariable models showed that lower CD4 count, longer time since HIV diagnosis, and high-risk alcohol use were uniquely associated with neurocognitive deficits.

Conclusions:  Over two thirds of youth with behaviorally acquired HIV evidence neurocognitive deficits, which have modest associations with more advanced HIV disease as well as other factors. Research is needed to determine the impact of such neuropsychiatric morbidity on mental health and HIV disease treatment outcomes (e.g., nonadherence) and transition to independent living responsibilities in HIV-infected youth, as well as its long-term trajectory and possible responsiveness to cognitive rehabilitation and pharmacotherapy.

Keywords:  Adolescent, HIV, HIV-associated neurocognitive disorder, Neurocognitive functioning, Substance use

Abstract Full-text [free] access

Editor’s notes: HIV-associated neurocognitive disorder (HAND) is emerging as an important cause of morbidity among adults, even among those who are successfully treated with ART. This study used a comprehensive battery of previously validated assessments to examine different domains of neurocognitive function, including memory, motor skills, attention and executive function, among youth with behaviourally-acquired HIV. Most participants had become infected within the past 24 months. While over two-thirds of participants had impaired test results consistent with HAND, only a minority were symptomatic, defined as having an impact on daily functioning. There was also a high prevalence of concurrent demographic, psychiatric and psychosocial risk factors for poorer neurocognitive functioning, including alcohol and substance use, lower educational achievement, high unemployment rates and black ethnicity. It is likely that these factors modify the risk of developing HAND, and that “low cognitive reserve” may increase the risk of developing neurocognitive impairment.

Environmental and biological factors interact to cause HAND, and this study, like other similar studies, recruited no controls, which complicates the interpretation of the findings. Additionally, it is a cross-sectional study, and the natural history and risk of progression of impairments in this age-group are not known. Young people are at higher risk of acquiring HIV than any other age-group. The strikingly high rate of HAND in a population who acquired HIV relatively recently is therefore of concern.

Adolescence is a critical period of ongoing brain development, including of the fronto-striatal systems that are particularly vulnerable to HIV infection. It is characterized by the acquisition of skills essential for the transition to adulthood, while also being associated with increased experimentation and risk-taking. These deficits therefore have important implications for functional and behavioural outcomes in this age-group. The findings also raise the concern that, contrary to previous assumptions that neurocognitive disease is a consequence of advanced HIV disease, the central nervous system is affected early in the course of HIV infection. Further research is required to address the possibility that adolescents and youth who recently acquired HIV may be at increased risk of neurocognitive impairment from early subclinical events, even if they remain otherwise healthy.

Northern America
United States of America

Some evidence of impact from external funding for HIV, TB and malaria - and the need for more

Impact of external funding for HIV, tuberculosis and malaria: systematic review.

de Jongh TE, Harnmeijer JH, Atun R, Korenromp EL, Zhao J, Puvimanasinghe J, Baltussen R. Health Policy Plan. 2013 Aug 5. [Epub ahead of print]

Background:  Since 2002, development assistance for health has substantially increased, especially investments for HIV, tuberculosis (TB) and malaria control. We undertook a systematic review to assess and synthesize the existing evidence in the scientific literature on the health impacts of these investments.

Methods and Findings:  We systematically searched databases for peer-reviewed and grey literature, using tailored search strategies. We screened studies for study design and relevance, using predefined inclusion criteria, and selected those that enabled us to link health outcomes or impact to increased external funding. For all included studies, we recorded dataset and study characteristics, health outcomes and impacts. We analysed the data using a causal-chain framework to develop a narrative summary of the published evidence. Thirteen articles, representing 11 individual studies set in Africa and Asia reporting impacts on HIV, tuberculosis and malaria, met the inclusion criteria. Only two of these studies documented the entire causal-chain spanning from funding to programme scale-up, to outputs, outcomes and impacts. Nonetheless, overall we find a positive correlation between consecutive steps in the causal chain, suggesting that external funds for HIV, tuberculosis and malaria programmes contributed to improved health outcomes and impact.

Conclusions:  Despite the large number of supported programmes worldwide and despite an abundance of published studies on HIV, TB and malaria control, we identified very few eligible studies that adequately demonstrated the full process by which external funding has been translated to health impact. Most of these studies did not move beyond demonstrating statistical association, as opposed to contribution or causation. We thus recommend that funding organizations and researchers increase the emphasis on ensuring data capture along the causal pathway to demonstrate effect and contribution of external financing. The findings of these comprehensive and rigorously conducted impact evaluations should also be made publicly accessible.

Keywords: Africa, Asia, Health financing, developing countries, donors, health outcomes, impact

Abstract access

Editor’s notes: In the current context of resource constraints, and after a decade of unprecedented increases in development assistance for health (particularly for HIV, tuberculosis and malaria), donors are increasingly concerned about the value for money of their investments. This study reviewed the available evidence on the impact of external funding, finding a paucity of rigorous scientific evaluation data on the efficiency, effectiveness and impact of these investments.

The identified HIV studies found associations between programme investments and increased access and adherence to ART, as well as reduced HIV-related mortality, but limited evidence of preventive impacts on rates of HIV infection. There were many study limitations, including the lack of randomization or robust controls, and relatively small (or statistically non-significant) observed effects. Few studies provided a full analysis of effectiveness along the causal chain from inputs to impact, and none considered the potential undesirable effects of external funding.

Although the aims of the study were ambitious, this paper highlights the challenges of documenting the impacts of financial investments, with the authors arguing that future evaluations need to adopt a more systemic approach to impact evaluation that better captures the causal pathway between investment inputs and impacts, as well as broader system-wide effects. 

Africa, Asia
Cameroon, China, India, Kenya, Malawi, Zambia

No reductions in HIV incidence yet in rural KwaZulu-Natal, South Africa

Modelling HIV incidence and survival from age-specific seroprevalence after antiretroviral treatment scale-up in rural South Africa.

Mossong J, Grapsa E, Tanser F, Bärnighausen T, Newell ML. AIDS. 2013 Jul 9. [Epub ahead of print]

Objective: Our study uses sex- and age-specific HIV prevalence data from an ongoing population-based demographic and HIV survey to infer HIV incidence and survival in rural KwaZulu-Natal between 2003 and 2011, a period when antiretroviral treatment (ART) was rolled out on a large scale.

Design: Catalytic mathematical model for estimating HIV incidence and differential survival in HIV-infected persons on multiple rounds of HIV sero-prevalence.

Methods: We evaluate trends of HIV incidence and survival by estimating parameters separately for women and men aged 15-49 years during three calendar periods (2003-05, 2006-08, 2009-2011) reflecting increasing ART coverage. We compare model-based estimates of HIV incidence with observed cohort-based estimates from the longitudinal HIV surveillance.

Results: Median survival after HIV infection increased significantly between 2003-2005 and 2009-2011 from 10.0 (95% confidence interval (CI) 8.8-11.2) to 14.2 (95% CI 12.6-15.8) years in women (p < 0.001) and from 10.0 (95% CI 9.2-10.8) to 14.0 (95% CI 10.6-17.4) years in men (p = 0.02). Our model suggests no statistically significant reduction of HIV incidence in the age-group 15-49 in 2009-2011 compared to 2003-2005. Age- and sex-specific model-based HIV incidence estimates were in good agreement with observed cohort-based estimates from the ongoing HIV surveillance.

Conclusions: Our catalytic modelling approach using cross-sectional age-specific HIV prevalence data could be useful to monitor trends of HIV incidence and survival in other African settings with a high ART coverage.

Abstract access

Editor’s notes: Measuring HIV incidence directly is challenging, and different modelling approaches have been used. In this paper, a relatively simple model based solely on age- and sex-specific HIV seroprevalence data from annual serosurveys was used to model HIV incidence in rural KwaZulu-Natal. The models (and observed data) showed little decline in HIV incidence from 2003 to 2011 in either men or women, with high cumulative incidence by age 50 years (about 75% for women and 70% for men), perhaps due to relatively low coverage of ART until recently in the age-groups modelled in this study (15-49 year olds). The peak HIV incidence was at age 21-22 years for women and age 27-29 years for men. More positively, the study did show a significant reduction in HIV-associated mortality, and increased life expectancy. The modelling approach used here could be useful for monitoring incidence and survival in other settings where cross-sectional seroprevalence data are collected.

South Africa

Improving access to HIV viral load testing in resource-limited settings

Pooled HIV-1 viral load testing using dried blood spots to reduce the cost of monitoring antiretroviral treatment in a resource-limited setting.

Pannus P, Fajardo E, Metcalf C, Coulborn RM, Durán LT, Bygrave H, Ellman T, Garone D, Murowa M, Mwenda R, Reid T, Preiser W. J Acquir Immune Defic Syndr. 2013 Jul 25. [Epub ahead of print]

Roll-out of routine HIV-1 viral load monitoring is hampered by high costs and logistical difficulties associated with sample collection and transport. New strategies are needed to overcome these constraints. Dried blood spots from finger-pricks have been shown to be more practical than the use of plasma specimens, and pooling strategies using plasma specimens have been demonstrated to be an efficient method to reduce costs. This study found that combination of finger-prick dried blood spots (DBS) and a pooling strategy is a feasible and efficient option to reduce costs while maintaining accuracy in the context of a district hospital in Malawi.

Abstract access

Editor’s notes: The use of dried blood spots from finger-prick testing overcomes some of the logistical barriers to scaling up viral load monitoring in resource-limited settings. It enables task-shifting of sample collection to lower-cadre healthcare workers, removes phlebotomy costs and, as there is no need for a cold chain, overcomes transport barriers. A pooling strategy, whereby samples from different patients are pooled, analyzed together, and only analyzed individually if the pooled sample ‘flags’ positive, has been shown to reduce the cost of viral load monitoring without compromising test accuracy. This study, conducted by MSF in a district hospital laboratory in Malawi, demonstrates that the combined approach is feasible in this setting, cost efficient in that it reduces the number of viral load tests by approximately 30-50%, and accurate (negative predictive value 97-100%; positive predictive value 96-100%, despite limited sensitivity at the 5 000 copies/ml threshold). Although the cost savings will vary according to the underlying prevalence of virological failure (pooling is less efficient at a high prevalence) and local laboratory costs, this approach could facilitate the roll-out of viral load monitoring in resource-limited settings.


Better virological outcomes with efavirenz compared to nevirapine

Outcomes for efavirenz versus nevirapine-containing regimens for treatment of HIV-1 infection: a systematic review and meta-analysis.

Pillay P, Ford N, Shubber Z, Ferrand RA. PLoS One. 2013 Jul 22;8(7):e68995. doi: 10.1371/journal.pone.0068995. Print 2013.

Introduction: There is conflicting evidence and practice regarding the use of the non-nucleoside reverse transcriptase inhibitors (NNRTI) efavirenz (EFV) and nevirapine (NVP) in first-line antiretroviral therapy (ART).

Methods: We systematically reviewed virological outcomes in HIV-1-infected, treatment-naive patients on regimens containing EFV versus NVP from randomised trials and observational cohort studies. Data sources included PubMed, Embase, the Cochrane Central Register of Controlled Trials, and conference proceedings of the International AIDS Society and the Conference on Retroviruses and Opportunistic Infections, from 1996 to May 2013. Relative risks (RR) and 95% confidence intervals were synthesized using random-effects meta-analysis. Heterogeneity was assessed using the I(2) statistic, and subgroup analyses were performed to assess the potential influence of study design, duration of follow-up, location, and tuberculosis treatment. Sensitivity analyses explored the potential influence of different dosages of NVP and different viral load thresholds.
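The random-effects pooling and I² heterogeneity statistic mentioned in the methods can be sketched with a generic DerSimonian-Laird implementation. This is not the authors' code or data; the two input studies below are hypothetical.

```python
import math

def random_effects_pool(log_rrs, variances):
    """DerSimonian-Laird random-effects meta-analysis: returns the
    pooled relative risk and the I^2 heterogeneity statistic (%)."""
    w = [1.0 / v for v in variances]                 # inverse-variance weights
    fixed = sum(wi * y for wi, y in zip(w, log_rrs)) / sum(w)
    q = sum(wi * (y - fixed) ** 2 for wi, y in zip(w, log_rrs))  # Cochran's Q
    df = len(log_rrs) - 1
    c = sum(w) - sum(wi * wi for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)                    # between-study variance
    w_star = [1.0 / (v + tau2) for v in variances]   # random-effects weights
    pooled = sum(wi * y for wi, y in zip(w_star, log_rrs)) / sum(w_star)
    i2 = max(0.0, (q - df) / q) * 100 if q > 0 else 0.0
    return math.exp(pooled), i2

# Two hypothetical studies with relative risks of 0.80 and 0.90:
rr, i2 = random_effects_pool([math.log(0.8), math.log(0.9)],
                             [0.01, 0.01])
```

When Q is below its degrees of freedom, tau² is truncated at zero and I² is reported as 0%, which is why several of the pooled estimates in the results carry I(2) = 0%.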

Results: Of 5011 citations retrieved, 38 reports of studies comprising 114 391 patients were included for review. EFV was significantly less likely than NVP to lead to virologic failure in both trials (RR 0.85 [0.73-0.99] I(2) = 0%) and observational studies (RR 0.65 [0.59-0.71] I(2) = 54%). EFV was more likely to achieve virologic success than NVP, though marginally significant, in both randomised controlled trials (RR 1.04 [1.00-1.08] I(2) = 0%) and observational studies (RR 1.06 [1.00-1.12] I(2) = 68%).

Conclusion: EFV-based first line ART is significantly less likely to lead to virologic failure compared to NVP-based ART. This finding supports the use of EFV as the preferred NNRTI in first-line treatment regimen for HIV treatment, particularly in resource limited settings.

Abstract  Full-text [free] access

Editor’s notes: Efavirenz and nevirapine are key antiretroviral agents, particularly in resource-limited settings. Nevirapine has been widely used, for reasons including safety during pregnancy and lower cost, despite having lower potency and a higher risk of hepatotoxicity and severe allergic reactions than efavirenz. This article summarizes data on virological outcomes from clinical trials and observational cohort studies comparing efavirenz and nevirapine. The finding that efavirenz is associated with slightly better virological outcomes is not surprising, but it is valuable to have the available data summarised. The result, along with recent recommendations allowing efavirenz to be taken throughout pregnancy, and price reductions, supports the move towards efavirenz-based fixed-dose combinations as first-line antiretroviral treatment in resource-limited settings.


HIV care cascade in the USA reveals 75% with unsuppressed viral load

Differences in human immunodeficiency virus care and treatment among subpopulations in the United States.

Hall HI, Frazier EL, Rhodes P, Holtgrave DR, Furlow-Parmley C, Tang T, Gray KM, Cohen SM, Mermin J, Skarbinski J. JAMA Intern Med. 2013 Jul 22;173(14):1337-44. doi: 10.1001/jamainternmed.2013.6841.

Importance: Early diagnosis of human immunodeficiency virus (HIV) infection, prompt linkage to and sustained care, and antiretroviral therapy are associated with reduced individual morbidity, mortality, and transmission of the virus. However, levels of these indicators may differ among population groups with HIV. Disparities in care and treatment may contribute to the higher incidence rates among groups with higher prevalence of HIV.

Objective: To examine differences between groups of persons living with HIV by sex, age, race/ethnicity, and transmission category at essential steps in the continuum of care.

Design and setting: We obtained data from the National HIV Surveillance System of the Centers for Disease Control and Prevention to determine the number of persons living with HIV who are aware and unaware of their infection using back-calculation models. We calculated the percentage of persons linked to care within 3 months of diagnosis on the basis of CD4 level and viral load test results. We estimated the percentages of persons retained in care, prescribed antiretroviral therapy, and with viral suppression using data from the Medical Monitoring Project, a surveillance system of persons receiving HIV care in select areas representative of all such persons in the United States.

Participants: All HIV-infected persons in the United States.

Main outcomes and measures: Percentage of persons living with HIV who are aware of their infection, linked to care, retained in care, receiving antiretroviral therapy, and achieving viral suppression.

Results: Of the estimated 1 148 200 persons living with HIV in 2009 in the United States, 81.9% had been diagnosed, 65.8% were linked to care, 36.7% were retained in care, 32.7% were prescribed antiretroviral therapy, and 25.3% had a suppressed viral load (≤200 copies/mL). Overall, 857 276 persons with HIV had not achieved viral suppression, including 74.8% of male, 79.0% of black, 73.9% of Hispanic/Latino, and 70.3% of white persons. The percentage of blacks in each step of the continuum was lower than that for whites, but these differences were not statistically significant. Among persons with HIV who were 13 to 24 years of age, only 40.5% had received a diagnosis and 30.6% were linked to care. Persons aged 25 to 34, 35 to 44, and 45 to 54 years were all significantly less likely to achieve viral suppression than were persons aged 55 to 64 years.
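The cascade arithmetic above is simple but easy to misread: each percentage is of all persons living with HIV, not of those completing the previous step. A minimal sketch, using only the totals and percentages reported in the abstract (derived counts are rounded here, so they differ slightly from the published figure of 857 276 persons without viral suppression):

```python
# Estimated persons living with HIV in the United States, 2009 (from the abstract)
PLHIV_2009 = 1_148_200

# Each fraction is of ALL persons living with HIV, not of the prior step
cascade = {
    "diagnosed":          0.819,
    "linked to care":     0.658,
    "retained in care":   0.367,
    "prescribed ART":     0.327,
    "virally suppressed": 0.253,
}

# Absolute counts at each step, and the number not virally suppressed
counts = {step: round(PLHIV_2009 * frac) for step, frac in cascade.items()}
not_suppressed = PLHIV_2009 - counts["virally suppressed"]
```

Because the published percentages are rounded to one decimal place, the derived count of unsuppressed persons (about 857 700) lands within a few hundred of the reported 857 276.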

Conclusions and relevance: Significant age disparities exist at each step of the continuum of care. Additional efforts are needed to ensure that all persons with HIV receive a diagnosis and optimal care to reduce morbidity, mortality, disparities in care and treatment, and ultimately HIV transmission. Ensuring that people stay in care and receive treatment will increase the proportion of HIV-infected individuals who achieve and maintain a suppressed viral load.

Abstract access

Editor’s notes: This elegant analysis brings together data from multiple sources to summarise the proportion of people living with HIV in the USA who have successfully navigated each stage of the care pathway. The output quantifies attrition at each stage of the pathway towards effective treatment in a very simple and clear way, and illustrates how few people are successfully maintained on treatment, as indicated by a suppressed viral load. The observation that three-quarters of people with HIV in the United States have unsuppressed viral load has profound implications for continuing morbidity as well as ongoing transmission. The low proportion of young people aware of their HIV-positive status is a particular cause for concern. Similar analyses may help other countries assess where efforts should be targeted to minimise HIV-related morbidity and reduce transmission.

Northern America
United States of America