[Cross-posted from www.the-trench.org]
Book review: Debora MacKenzie, COVID-19: The Pandemic That Never Should Have Happened and How to Stop the Next One (New York: Hachette Books, 2020), 279p.
The book opens with a quote from a poster seen at the first March for Science on 22 April 2017: “Every disaster movie starts with someone ignoring a scientist”. My immediate thought: well, scientists may be clever, but they just cannot express their thoughts in a register that politicians and opinion shapers might understand. Another reflection replaced it instantaneously, this time on politicians: they are so ideological that if their teachings tell them to view everything through a square, then they will only see squares. Try to square a circle if the opposition solely looks at the world through circles. Compromise, if possible, may take policies forward, but maybe not in directions that raise resilience to catastrophes.
Debora MacKenzie is a scientist and a journalist for the New Scientist and other science publications since – to the best of my knowledge – the early 1980s. She is also an engaged writer with a knack for making complex things understandable to a broad audience. And her deep knowledge reaches far wider than the mere ‘science’ of things or processes. This makes COVID-19 so engaging. The book project may have started early in the spring; the contents draw on her many years of probing experts in many disciplines.
She has been alert to the threat of epidemics and pandemics since the start of the century. The Severe Acute Respiratory Syndrome (SARS) crisis that began in China in 2002 and hit her country of origin, Canada, badly is seared in her memory. She wondered whether an outbreak on the scale of ‘svarta döden’ (as the Swedes began calling the Great Pestilence of the mid-1300s in the 15th century) might recur; whether mortality might again reach somewhere between 33% and 50%; and what social, economic, and political factors contributed to the pandemic’s severity. Similar questions guide her investigation into COVID-19. Unsurprisingly, the narrative takes us back many years, even decades before the new coronavirus infected people in China. That story differs greatly from the ones politicians, experts, opinion shapers and other pundits like to feed us.
The perfect storm
In his book ‘The Perfect Storm’ Sebastian Junger recreates the final moments of a fishing boat out in the Atlantic Ocean when a massive cyclone hit the US east coast over the Halloween period in 1991. To many, the title has come to mean a rare event during which separate developments conspire to produce an aggravated outcome. Well, not quite. In the book, each contributing circumstance had been predicted or foretold, but the protagonists failed to act upon the warnings. In this sense, COVID-19 describes how many policy options, economic strategies, and social preferences over the past three to four decades created the preconditions for a perfect storm.
A first contributing element was a growing conviction in the 1960s that humanity had by and large vanquished disease. Technological and social optimism accompanied the view: vaccines were to defeat infections, and prosperity contributed to the overall decline in disease. The formal declaration by the World Health Organisation (WHO) in May 1980 that the global immunisation programme had rid the world of the smallpox scourge capped that optimism. This disposition, however, spawned two other trends whose impact greatly contributed to the difficulties in containing major epidemics in the 21st century.
Governments, especially in the more affluent societies, disinvested in public health. Internationally, they cut back the surveillance stations needed to detect outbreaks or new sources of infection early. The process also coincided with decolonisation, as a result of which surveillance outposts in regions where new diseases often originate no longer provided advance warning of emerging health threats. From the early 1980s onwards, fiscal frugality to reduce national budget deficits combined with supply-side economics in industrialised countries eventually resulted in budgetary constraints on international organisations like the WHO. This too affected surveillance and response capacities. Domestically, the same trends led to significant reductions in investment in health as a public good and to the broad privatisation of health care. Governments likewise no longer saw value in maintaining the development and stockpiling of drugs, vaccines, and diagnostics, leaving research and production decisions in the hands of private sector interests. Those governments also neglected to maintain a surge production capacity for critical goods in case a major health emergency were to arise. After SARS had been overcome in 2003, investment in developing countermeasures halted for lack of a market. MacKenzie argues that if public investment in vaccine research against the coronavirus had continued, we might have had a head start in containing the COVID-19 crisis.
A second major contributing element is governments’ reluctance to heed warnings by scientists. For all the optimism that existed after the eradication of smallpox, by the end of the 1980s infectious disease experts were sounding the first alarm bells about emerging and re-emerging diseases. They also noted the zoonotic origin of many of the new health threats: as humans increasingly destroyed or penetrated the natural habitats of many species, multiple animal pathogens mutated to infect humans and then to become transmissible among humans. These alarm signals did not prompt governments to raise their guard, meaning that when epidemics arrived, they had to rush measures, more often improvising than implementing preconceived policies. When opportunities presented themselves to re-evaluate political and economic choices, governments ignored those early warnings.
MacKenzie identifies a third major strand through the analysis of complex systems. In our globalised societies everything has become tightly and efficiently interconnected. This optimisation benefits profits: supplies arrive ‘just-in-time’ and production is offshored to low-wage countries. One consequence, as most of us discovered during the COVID-19 crisis, is that most medication and medical equipment are manufactured in China (another thing health professionals had warned policy makers about several years earlier). Another upshot is the extreme rigidity of a highly optimised complex system. A shock gets transmitted through much of that system; if a link breaks down, the whole system suffers.
From a security perspective, no system should be optimised to the hilt. ‘Resilience’ and ‘redundancy’ are central concepts. Systems should be able to withstand maximal stresses. Should a link give way, then alternative options or routes must be available to immediately take over the failed link’s functions. The consequences of the rigidly organised complex system of international production, transport and delivery became almost instantaneously visible. With China in lockdown when the number of COVID cases exploded in Europe and the USA, much of the critical equipment such as face masks, disinfectant hand lotions or ventilators was lacking. Their unavailability from production sources – mostly in China – led to panicky decision-making and inept initial responses. The hasty closing of borders in particular contributed to further perturbations in the global system, affecting other economic sectors (including travel and tourism).
On the level of companies big and small, what was good for shareholders and balance sheets proved once more a liability in a crisis. Just as governments did not learn from the first SARS epidemic, companies ignored the lessons of the 2008 financial crisis. Governments this time stepped in with rescue packages costing hundreds of billions, if not trillions, of Euros in taxpayer revenue to save the economy and employment. Big companies and smaller, family-owned businesses are failing, or will fail once government support measures end, and the COVID crisis is not yet over.
As MacKenzie wryly remarks, had governments been less intent on economic and fiscal optimisation and invested tens of billions of Euros in disease surveillance, preparedness and health as a public good over the years, they would now be saving many times the sums being expended on crisis response. And that thought does not even take into account the human and societal toll from the lack of resilience and redundancy.
A ‘black swan’ event COVID-19 is not. MacKenzie illustrates throughout her book how scientists have been sounding alarms for many years. Both the outbreak and its consequences were foreseeable. The lack of preparation was a consequence of political and economic (so-called ‘rational’) decision-making, driven by ideological preferences.
The Wuhan lab, bats, and the USA
The author opens the fourth chapter with a curt answer to where Severe Acute Respiratory Syndrome Coronavirus 2 (SARS-CoV-2) originated: ‘The COVID-19 virus comes from bats’. To emphasise the point, she adds: ‘So did SARS. So do MERS, Ebola, Marburg, Nipah, Hendra, and Lassa viruses.’ The bigger point: We need not look for exotic explanations about the beginning of COVID-19, such as a genetically engineered pathogen or an agent that escaped from a laboratory. She also gives short shrift to the idea that civets or pangolins at a wet market might have transmitted the virus to humans. When Chinese scientists, most from Wuhan where the outbreak was raging, summarised their work in late January 2020, they confirmed their finding that SARS-like coronaviruses had their natural reservoir in bats and that several among them had the potential to infect humans. The virus killing people in the city was 96 percent identical to one found in bats and uses the same cellular receptor. The SARS-like virus found in pangolins differs from it far more.
The Wuhan Institute of Virology has played a key role in this monitoring of bats and research. Not in the sense of biological weapons research or a major biosafety incident, as many press reports and commentators have suggested. Whether the wet market played any role in the outbreak still needs to be determined. However, as MacKenzie points out, locals and bats interact with each other in several ways. Bats on the traditional Chinese menu are the larger fruit bats rather than the much smaller insect-eating horseshoe bats, which are host to SARS-like coronaviruses. Therefore, traditional medicine may play a much larger role. Horseshoe bat faeces are used in products to treat several conditions because of their high Vitamin A content. The list of afflictions includes eye disorders. Researchers have confirmed coronaviruses in fresh horseshoe bat droppings. Drying such faeces might kill most, but not all, pathogens present. The eye has receptors onto which SARS-CoV-2 latches itself, and research suggests that the virus may be persistent there. Practitioners of traditional Chinese medicine recommend the application of a water solution directly to the eye. It is therefore possible that the eye was an important route of infection. Dried bodies of the horseshoe bat are also a traditional remedy against coughs. Impoverished people catch horseshoe bats and collect their droppings, which may have created other routes through which humans contracted COVID. The faeces are also applied as an agricultural fertiliser.
Shortly after the SARS outbreak in China the Wuhan Institute of Virology began searching for the virus in nature. Right from the start the scientists considered bats (and their products sold on markets as food and medicine) as reservoirs for coronaviruses and mapped out the genetic diversity of these viruses and how they attack cells.
The Chinese researchers, however, did not work alone. They partnered with other initiatives, including some from the USA. The PREDICT programme of the US Agency for International Development had set up local labs and surveillance in China and other countries with infectious disease hotspots. Their work included, among other things, the detection of coronaviruses. A team from the EcoHealth Alliance, a PREDICT partner, isolated a live SARS virus that could infect both bat and human cells. It triggered an immediate immune response in people who had suffered SARS in 2003. After fourteen years, in 2017, they had established as a scientific fact that the SARS virus originated in bats.
PREDICT also found that several coronaviruses were on the verge of human infection, meaning they did not require the intermediary of another animal species or much additional adaptation to cause illness. MacKenzie writes that this knowledge was already available seven years ago and had been reported to a meeting on emerging diseases in Vienna in 2016. It was also noted that the new virus type might evade experimental SARS vaccines.
The USA thus not only had the scientific data to understand the risks posed by novel types of coronaviruses, but also knew about the activities of the Wuhan Institute of Virology. Yet the Trump administration shuttered the pandemic monitoring programme. Funding ended in 2019 and activities stopped in September after the money had run out. (It received an emergency extension for six months, starting in April 2020.) This followed earlier actions that had reduced the US Centers for Disease Control and Prevention’s presence in China from 47 to 14 staff members since January 2017, downgrading monitoring capacity. Disease surveillance and early warning – at least for the USA – could not have been interrupted at a more critical moment.
A perfect storm just does not care about conspiracy theories, disinformation campaigns or alternative facts. It just takes place.
A sobering analysis
Debora MacKenzie presents us with a sobering analysis of how an outbreak that infectious disease experts had been expecting for some years could turn into a global pandemic. Decisions made for political, ideological and economic reasons over the past four decades – some relating specifically to public health and infectious disease surveillance; others to how an increasingly interconnected, globalising society was organising itself – created the preconditions for the fast spread of SARS-CoV-2.
She also discusses many decisions by different actors once the outbreak had begun. These have been less the subject of the present review, but they were equally consequential. We can think of the refusal of lower and mid-level bureaucrats in China to transmit early reports of victims to the central government. There were the early actions by Chinese authorities to limit surveillance and containment to people who had travelled to Wuhan; an error many governments in Europe and especially the USA repeated by initially focussing on persons coming out of China. As MacKenzie discusses, the first reflex (seen in so many past pandemics) of blaming foreigners produced measures that exacerbated the spread of the virus.
While the author introduces the reader to the basic science behind the COVID-19 pandemic (and actually makes the effort to explain terms and concepts in an easily digestible way), her broad knowledge of the ways social dynamics shape science and her long experience as a science journalist have yielded a most valuable book on how to understand current events. The book comes early in the pandemic and certain questions remain open. For sure, other sociological and political research questions remain to be formulated and answered.
Through her fluent writing and ease of explaining complex issues, she not only captivates the reader (it is difficult to put the book aside once one has started reading), but also helps him or her make sense of an existential crisis unlike anything most people alive today have experienced before.
Keynote speech at the CONDENsE Conference, Ypres, Belgium, 29 August 2019
(Cross-posted from The Trench)
Good evening ladies and gentlemen, colleagues and friends,
It is a real pleasure to be back in Ieper, Ypres, Ypern or, as British Tommies in the trenches used to say over a century ago, Wipers. As the Last Post ceremony at the Menin Gate reminded us yesterday evening, this city suffered heavily during the First World War. Razed to the ground during four years of combat, including three major battles – the first one in the autumn of 1914, which halted the German advance along this stretch of the frontline and marked the beginning of trench warfare; the second one in the spring of 1915, which opened with the release of chlorine as a new weapon of warfare; and the third one starting in the summer of 1917 and lasting almost to the end of the year, which witnessed the first use of mustard agent, aptly named ‘Yperite’ by the French – Ypres was rebuilt and, as you have been able to see, regained some of its past splendour.
Modern chemical warfare began, as I have just mentioned, in the First World War. It introduced a new type of weapon that was intended to harm humans through interference with their life processes by exposure to highly toxic substances, poisons. Now, poison use was not new.
However, when the chlorine cloud rose from the German trenches near Langemark (north of Ypres) and rolled towards the Allied positions in the late afternoon of 22 April 1915, the selected poisonous substance was not one that occurs naturally. It was the product of chemistry as a scientific enterprise. That the gas had been CONDENsE-d into a liquid held in steel cylinders testified to what was then an advanced engineering process. Volume counted too. When the German Imperial forces released an estimated 150–168 metric tonnes of chlorine from around 6,000 cylinders, the event was a testimonial to industrial prowess. Poison was not a weapon the military at the start of the 20th century were likely to consider. Quite the contrary: some well-established norms against its use in war existed. However, in the autumn of 1914 the Allies fought the German Imperial armies to a standstill in several major battles along a frontline that stretched from Nieuwpoort on the Belgian coast to Pfetterhausen – today, Pfetterhouse – where the borders of France, Germany and Switzerland then met just west of Basel. To restore movement to the Western front, the German military explored many options and eventually accepted the proposal put forward by the eminent chemist Fritz Haber to break the Allied lines by means of liquefied chlorine. 22 April 1915 was the day when three individual trends converged: science, industrialisation and military art.
This particular confluence was not by design. For sure, scientists and the military had already been partners for several decades in the development of new types of explosives or in ballistics research. Industry and the military were also no strangers to each other, as naval shipbuilding in Great Britain or artillery design and production in Imperial Germany testified. Yet these trends were evolutionary, not revolutionary. They gradually incorporated new insights and processes, thereby improving military technology. The chemical weapon, in contrast, took the foot soldier in the trenches by complete surprise. It was to have major social implications and consequences for the conduct of military operations, even if it never became the decisive weapon to end the war that its proponents deeply believed it would.
[Cross-posted from The Trench]
On 2 May the Technical Secretariat of the Organisation for the Prohibition of Chemical Weapons (OPCW) organised a workshop relating to its programme to fully implement Article XI of the Chemical Weapons Convention (CWC). I addressed the States Parties in the session on ‘Promoting chemical knowledge’ and focussed on the responsibilities of chemists, both as members of their scientific associations and as individuals, in preventing the misuse of their discipline.
Consequences down the road
The role of chemists in war is not new. The role of chemists in chemical warfare is of more recent origin. Just over a century ago, modern chemical warfare, which began in my country, Belgium, on 22 April 1915, may have seemed to come out of the blue. Actually, it resulted from the confluence of several trends in Europe and North America. Those trends emerged in the late 18th century. They included the establishment of chemistry as a science and the onset of the first industrial revolution. They gathered pace throughout the 19th century.
Chemistry discovered many new molecules. Organic chemistry—one of the early convergences of chemistry and biology (another one of the new scientific disciplines)—yielded compounds that later often acquired widespread use as intermediaries in industrial production. Many decades after their initial discovery, several also became warfare agents during the First World War. In the first half of the 19th century, chemists also synthesised the first organophosphorus structures, which laid the foundation for the development of the nerve agents from the mid-1930s onwards.
After 1850, industrialisation increasingly shaped the organisation of science; it gave direction to the scientific endeavour; and it helped to restructure the scientific curricula at universities and other institutions of education. The idea of science for science’s sake gave way to a much more utilitarian vision in service of society.
Stagnation on the Western front in the autumn of 1914 proved to be the catalyst for modern chemical warfare. Belligerents drew on national industrial and scientific prowess to try to force the decisive breakthrough on the battlefield and end the carnage. Toxic chemicals used deliberately to harm humans were one choice. Alas.
I am not saying that in the 19th and early 20th century chemists set out to design and develop chemical weapons (CW). All I know is that in each of the belligerent countries, these chemists were fully aware of the social and technological dynamics that were transforming their respective societies; often they were the drivers of these changes.
The First World War was the catalyst that brought science, industry and military art together with the purpose of devising a new mode of warfare. It was almost accidental. (With the design of the atomic bomb a quarter of a century later, the convergence was deliberate, and governments have maintained that interconnectedness ever since.)
Today, our societies are once again undergoing major transformations. Chemistry is changing fast; its interactions with other disciplines are widening as well as deepening. Chemical industry has spread across the planet; many people all over the world are seeking careers in fields that have a more than tangible impact on the CWC. These areas are also critical to development; they are key to ameliorating the conditions of peoples everywhere and meeting future challenges to individual and human survival.
International cooperation and development benefit from peaceful intent
The OPCW’s Advisory Board on Education and Outreach (ABEO) is keenly aware of current transformations that might once again contribute to CW development and acquisition. Its members are also keenly aware that we are facing new situations in which toxic chemicals can be and are being used. A big challenge to the CWC is that our conception of CW is changing fast. Indeed, opportunistic use of industrial toxicants (such as chlorine) on the battlefield, terrorism and non-state actor use of toxic agents, and, more recently, assassinations with substances that had initially been developed or produced for military arsenals, are situations the CWC negotiators could not, and did not, anticipate.
In February of this year, the ABEO produced a report on the role of education and outreach in preventing the re-emergence of CW. It contains many recommendations for the Technical Secretariat to enhance the impact of its activities with States Parties in terms of education and outreach. The report also addresses how chemists everywhere can expand their consciousness about the dual-use characteristics of much of their work. It also seeks to enhance their awareness of the international and domestic scientific and technological environment in which they are functioning. It helps them to anticipate possible outcomes of their work many years into the future.
Engagement of chemists is evident from a key clause in the report’s title: ‘preventing the re-emergence of CW’. The report defines this goal as ‘the collective of actions undertaken by the OPCW, its Secretariat, and the National Authorities to implement the Convention, on the one hand, and by professional, scientific, and academic communities, as well as civil society constituencies and individuals, to advance consciousness, responsibility, and specific behaviours that support purposes not prohibited by the Convention, on the other hand’. (p. 6, para. 2.11)
In other words, ‘Prevention of the re-emergence of chemical weapons’ appeals to the responsibility of stakeholder communities and individuals, including chemists, to uphold the norm in the CWC.
Members of the ABEO have been involved in the development of the Hague Ethical Guidelines to promote responsible practice of chemistry. They are also active in promoting the Ethical Guidelines, including through active learning processes that involve chemists, which are advanced in the ABEO report. Some members have been instrumental in mobilising chemical societies and chemical industry councils to formally condemn the use of chlorine as a weapon. Some among them have also participated in the development of the on-line educational tool ‘Multiple Uses of Chemicals’ to promote the beneficial uses and prevent abuses of multiple-use chemicals, which the Technical Secretariat now supports by offering translation into the six official languages.
Reaching out to today’s chemist and the next generation of chemists (who are now in secondary school) is a task that National Authorities can help to promote, in addition to the ongoing initiatives undertaken by the Technical Secretariat.
At this point, I wish to stress that while the ABEO report suggests educational strategies, it does not offer one-size-fits-all suggestions. There is great need to adapt educational strategies to specific regional and national characteristics.
Awareness of the challenges—those visible today, as well as those looming on the horizon—is a task of permanent education. The ABEO report contains many practical examples of how such permanent education can be organised and practically implemented. It is of benefit to development for peaceful purposes and international collaboration in the scientific field of chemistry worldwide.
States Parties are welcome to approach the ABEO and its members—via the Office of Strategy and Policy of the Technical Secretariat—for assistance and concrete advice on education and outreach to key stakeholder groups.
[Cross-posted from The Trench]
During the Meeting of Experts of states parties to the Biological and Toxin Weapons Convention (BTWC) last August, the Netherlands organised or co-hosted three side events relating to safeguarding the life sciences. A significant incident, in which the Dutch virologist Ron Fouchier and his team were required to obtain an export licence to publish their research on how they had mutated H5N1 into an aerosol-transmissible avian influenza virus variant, undeniably informed the need to clarify national policies and approaches to biorisk management. A month earlier the Appellate Court had annulled the ruling by a lower court in support of the government position on procedural grounds. Does this annulment validate the Dutch government’s position or does it imply that the whole debate about the publication of so-called dual-use research in the life sciences is back to square one? Moreover, in the meantime the debate had evolved from a terrorist proliferation risk to one of health security in which the ethics and utility of this type of gain-of-function research stand central. In other words, do biosafety worries warrant biosecurity policy measures, such as the imposition of non-proliferation export controls?
Some background to the Netherlands decision
In September 2011 the European Scientific Working Group on Influenza (ESWI) held its fourth conference in Malta. Europe and the world were then confronting an outbreak of avian influenza caused by the H5N1 virus. Its rapid spread among birds over long distances caused governments worldwide to order drastic measures in efforts to stem the epidemic. Over 500 people had already contracted the disease, and some 60% of them had died; but all deaths thus far had resulted from direct interaction with fowl and not from human-to-human transmission.
In Malta Ron Fouchier announced that he and his team at the Erasmus Medical Center in Rotterdam had succeeded in transforming H5N1 into a viable aerosol virus. According to one conference report he applied rather colourful language: his team ‘mutated the hell out of H5N1’. The discovery that it required as few as three single mutations to gain the ability to latch onto cells in the nasal and tracheal passageways he described as ‘very bad news’. Transmission among ferrets, the mammal that offers the best laboratory model to study influenza in humans, still did not occur easily when, in Fouchier’s recorded words, ‘someone finally convinced me to do something really, really stupid’. They provoked two further mutations by transferring mutated viruses from the nose of one sick ferret to that of a healthy one, in the process creating the viable aerosol virus.
Initial articles on the gain-of-function research did not suggest any link with bioterrorism, but sometimes carried dramatic titles evoking cataclysmic consequences reminiscent of the 1918 Spanish flu pandemic that killed tens of millions worldwide. However, Fouchier’s dramatic speech caught the attention of counter-terrorism officials on both sides of the Atlantic. When he offered his research results for publication in Science, the US National Science Advisory Board for Biosecurity (NSABB) intervened and eventually accepted publication provided some methodological details were removed from the text. (A parallel paper on H5N1 submitted to Nature by a team led by US-based scientist Yoshihiro Kawaoka met a similar fate.) NSABB considered that the biosecurity risks outweighed any scientific merit in this type of research: rogue laboratory researchers or terrorists might wish to unleash the deadly virus on the human race to devastating effect.
NSABB’s intervention caused controversy, with one side calling it censorship and the other up in arms that publication had been authorised at all. In the United States, self-publication on the internet was not legally restricted at the time. In the Netherlands too the affair had caught the attention of the authorities, many of whom wanted to block publication outright but lacked the appropriate legal tools to do so. The only way to appraise the risks of malfeasance posed by information in the research manuscript was to apply non-proliferation export controls, which according to European Union regulations must be enforced for applied (but not fundamental) research in the life sciences.
The Netherlands decision caused shockwaves among European life scientists. After initial defiance, Fouchier and his team eventually applied for and received an export licence. Otherwise he might have faced up to six years imprisonment and $102,000 in fines. The Erasmus Medical Center subsequently took the government to court to have the principle of export licences for scientific research overturned. On 20 September 2013 the District Court of North Holland ruled in favour of the government, but on 15 July 2015 the Appellate Court in Amsterdam annulled the ruling, saying that the case was without merit in view of the application for and granting of an export licence. Put differently, the lower court should never have taken up the case. As presented during the BTWC Meeting of Experts in August 2015, the Dutch Government believes that the judicial process vindicated its approach to research with potential dual-use implications.
The future of the CWC in the post-destruction phase
Report – No. 15 – 27 March 2013
Yasemin Balci, Richard Guthrie, Ralf Trapp, Cindy Vestergaard, Jean Pascal Zanders
edited by Jean Pascal Zanders
From the Foreword by Ambassador Jacek Bylica, Principal Adviser and Special Envoy for Non-proliferation and Disarmament, European External Action Service:
The international community can be justifiably proud of the Chemical Weapons Convention. It has banned an entire category of weapons of mass destruction and provided for their verifiable elimination under international supervision. A small but effective intergovernmental organisation, the Organisation for the Prohibition of Chemical Weapons (OPCW), has been created for this purpose.
In the present international situation it is important to note that the Convention has created a de facto legal norm against the production, possession and usage of chemical weapons for military purposes. This prohibition goes beyond the letter of the Convention and stems from the reactions to the tragic experience of World War I and more recent cases of CW usage, including against non-combatants.
This volume features contributions derived from some of the presentations made by world-class experts at the workshop organised by the EU Institute for Security Studies in cooperation with the European External Action Service on 10 September 2012. The workshop offered an opportunity to reflect on some of the challenges facing the CWC over the next decade in preparation for the Third Review Conference at The Hague in April 2013. I am confident that this report presents an invaluable contribution to the debate on the future direction of our joint efforts aimed at the total and irreversible elimination of chemical weapons from the face of the Earth.