Trends in Antimalarial Drug Use in Africa

Abstract

Chloroquine (CQ) was the most frequently used first-line therapy for uncomplicated Plasmodium falciparum (P.f.) malaria from the 1940s through to the 2000s.1 As a result of its high efficacy, good safety profile, and low cost, CQ was a key part of the 1950s Global Malaria Eradication Program.2 However, factors including funding constraints, lack of political support, and the emergence and subsequent spread of resistance to CQ and to the pesticides used in vector control hampered eradication plans.3 Resistance to CQ was first identified in the late 1950s on the Thai–Cambodian border and concomitantly in South America.4–6 The spread of CQ resistance to Africa ensued, with treatment failures confirmed in 1978 in Kenya and Tanzania,7,8 and later reported in West Africa in the 1980s.9,10 Despite declining efficacy, CQ remained the first-line therapy for uncomplicated P.f. malaria in the majority of sub-Saharan African countries until after 2000. An increase in malaria morbidity and mortality in children < 5 years of age was observed during this period, and this trend has been attributed partly to CQ resistance.11,12

During the 1960s, sulphadoxine-pyrimethamine (SP) was introduced in many countries to replace CQ. Because of a rapid decline in efficacy in areas of intense use, first in Southeast Asia in the 1970s and later in East Africa in the late 1980s,13 SP was withdrawn from African countries as a first-line treatment for P.f. malaria between 2003 and 2008. SP continues to be recommended as an intermittent preventive treatment for pregnant women14 and, more recently, as part of seasonal malaria chemoprevention in areas of highly seasonal transmission.15