Towards inclusive social appraisal: risk, participation and democracy in governance of synthetic biology


Abstract

Frameworks that govern the development and application of novel products, such as the products of synthetic biology, should involve all those who are interested in or potentially affected by the products. The governance arrangements for novel products should also provide a democratic mechanism that allows affected parties to express their opinions on the direction that innovation does or does not take. In this paper we examine rationales, obstacles and opportunities for public participation in governance of novel synthetic biology products. Our analysis addresses issues such as uncertainties, the consideration of alternative innovations, and broader social and environmental implications. The crucial issues in play go beyond safety alone, to include contending social values around diverse notions of benefit and harm. The paper highlights the need for more inclusive social appraisal mechanisms to inform governance of synthetic biology and alternative products, and discusses a few practical methods to help achieve this goal.

Background

New synthetic biology [1] and gene drive [2] technologies raise prospects like: altering or suppressing entire populations of disease vectors or agricultural pests by releasing just a few genetically modified organisms [3]; or revolutionising agricultural production by synthesising products such as palm oil [4]. These developments have led to widespread reconsideration of current risk-based governance mechanisms as means to balance associated risks and opportunities [5]. Here we define governance to be the set of regulatory processes, mechanisms and organizations through which political actors influence environmental actions and outcomes [6]. Risk-based mechanisms aim to help the decision maker balance the potential benefits of innovation against the potential harms to humans and their environments, whilst recognising the uncertainties associated with both [7].

Current reconsiderations within the field of risk assessment emphasise that assessments should not be restricted in scope to human safety-related parameters but should embrace wider ecological and societal issues [8], which will likely raise deeper questions about the governance of synthetic biology in democratic societies [9,10,11]. Other pertinent issues in this view include the need for methods to help formalise the inherently subjective choices that determine how a risk assessment is framed [12], to characterise the quality of quantitative information used in a risk assessment [13], and to make governance procedures flexible and precautionary in the face of the “deep uncertainty” that accompanies many new technologies [12].

The desire to include deliberative procedures within the governance arrangement of synthetic biology has led various agencies to assert the need to engage communities, stakeholders and broader publics in decision making processes. The National Academies of Sciences, Engineering and Medicine [13] for example, recommends that “defined mechanisms and avenues for [public] engagement should be built into the risk assessment and decision-making process from the beginning”. This recommendation is echoed by the International Risk Governance Council [14] who call for regulatory agencies to “prototype new approaches for iterative risk analyses that incorporate external peer review and public participation”.

It implies no necessary criticism of such calls to note that public participation, as opposed to communication, is not routinely practised in existing risk assessments, and important questions can be raised about underlying rationales. In the words of a former executive in the US Environmental Protection Agency, there typically exist highly diverse and often contending motivations to enact procedures nominally referred to as “participatory” [15]. Aims may variously be: ‘substantive’ - i.e. about making better decisions; ‘normative’ – i.e. about pursuing appropriate process in a democracy; or ‘instrumental’ - about engineering pre-existing aims [16]. This latter category includes: fostering trust [17]; providing justification [18]; securing acceptance [19]; and managing blame [20]. In this paper we focus on substantive approaches to public participation that involve genuine empowerment of all affected parties in the interests of making better choices among contending innovation or policy pathways in any given field [21].

It is also important to recognise that these drivers for more participatory practices are not new. Similar calls for more inclusive risk-based governance have been made previously in the context of genetically modified plants [22], and for risk assessment more generally [23], on the stated grounds that: (i) early public engagement can provide information that improves decisions; (ii) including stakeholders and the public in the decision-making process leads to more trusted decisions; and, (iii) citizens have a right to influence decisions about issues that affect them [24]. If public engagement exercises around synthetic biology or gene drives are to be credible or robust in the substantive terms described above, then they should not be restricted to issues of risk or safety alone, nor confined merely to the ways in which a new technology should be introduced [25].

Rather than focussing on one particular innovation, participatory practices need to provide a balanced analysis of the relative pros and cons of a diversity of contrasting policies and innovations that are able to address the same societal or environmental functions [21]. This prevents the process being reduced simply to a means to modulate implementation of one particular technology. Comparative appraisal opens the possibility that a strongly-backed new technology might, under appropriate circumstances or perspectives, be deprioritised in favour of an alternative strategy [21]. As highlighted in a recent report for the UK Government Chief Scientist, the enabling of societies to exercise agency over the directions taken by innovation in this way is, at one level, a basic imperative of democracy [26].

It is against this background that this paper examines the opportunities and barriers that exist to substantive public engagement and participation in risk-based governance frameworks. A key emphasis will lie in exploring relations with quantitative, probabilistic approaches to risk assessment that have recently been recommended for synthetic biology products like gene drives. We pay attention to examining mechanisms that assist substantive public engagement and so allow for more societally robust and democratically accountable social choices of technology in these fields [27].

The issues addressed in this paper, then, are not just about whether or what kind of public participation might be required in each step in a typical risk assessment. Instead, it is risk assessment itself that is placed in wider context. Nor is this simply a matter of ‘bolting on’ some additional processes around a conventional regulatory appraisal. Since the answers obtained in risk assessment depend on both questions and assumptions, the point becomes clear that if risk assessment itself is to be regarded as rigorous, then it needs to be as systematic and robust about its own qualitative framing conditions as it already tries to be about quantitative data and analysis [28].

So, public participation should not be seen as a matter of ‘political correctness’ or as a means to achieve a pre-conceived end, but rather as inherent to the rigour and effectiveness of regulatory assessment. It allows attention to extend beyond crucial questions over ‘how safe?’, to address equally imperative issues over ‘which way?’ innovation should go in any given field; and ‘who says?’ and ‘why?’ [28]. If these substantive kinds of issue are attended to, then regulatory assessment of synthetic biology and gene drives can move from a purely risk-based analysis, to diverse – more substantive – processes of ‘social appraisal’ [17].

Risk assessment and social appraisal

In this section, we consider the practical implications of the issues raised above for a risk-based governance system for synthetic biology products such as gene drives. To this end, Fig. 1 provides an overview of the steps involved in a conventional risk assessment process. It is important to emphasise that this figure is a simplified and idealised version of a risk assessment process. In practice, risk assessments rarely if ever follow all the stages, or pursue the sequence described here with precision. It is presented here as a model to facilitate the discussion within the paper.

Fig. 1

An idealised scientific risk assessment process (amended from [29]). A flow chart showing steps where public participation would be essential under a social appraisal process and easily facilitated (solid green), essential but harder to achieve (solid orange), and essential but difficult to achieve (solid blue) for novel synthetic biology products. Pattern shading represents steps where public participation may be useful but not essential to a social appraisal process

The assessment has essentially three stages, labelled: (i) Identify, define and agree; (ii) Calculate, evaluate and manage; and (iii) Monitor, validate and compare. The steps within each of these three stages were originally described in the context of assessing and managing the environmental risks posed by the release of transgenic fish [29]. We have annotated these steps to distinguish those parts of the overall risk assessment process where public participation is essential if the assessment is to broaden and achieve the substantive imperative of a social appraisal process.

To begin, it is widely recognised to be essential, at least for novel technologies such as gene drives, that ‘the public’ are engaged in the opening steps of the first (identify, define and agree) stage of a risk assessment process. Indeed, these first steps are designed to engage the public in an initial discussion about the problem that the technology or product is designed to address, and the extent to which this problem might be addressed with other new or existing technologies or practices.

Public engagement at this stage is essential for several reasons. In the first instance, different ‘publics’, that is constituencies with contrasting understandings, values and interests, must agree that the problem at hand is real and that a novel technological solution is at least worth contemplating. Failure to successfully engage diverse publics at this stage is likely to lead to immediate opposition, as occurred with Oxitec’s proposal to release genetically engineered sterile male mosquitoes in the Florida Keys [30]. Furthermore, any activity that potentially exposes publics to environmental or health risks is unethical if there are alternative, more benign options, or if there is no expectation of possible benefits [31]. In other words, there must be some agreed basis that a problem currently exists, leading to appropriate justification of need [32].

If a particular technological ‘solution’, such as a synthetic gene drive, is propounded as a potentially effective response to an agreed problem, either on its own or as part of an integrated package of solutions, then additional community engagement is desirable for ethical and practical reasons [33]. In terms of the risk analysis steps, stakeholders can at this point make useful contributions to: (i) defining the boundaries and scope of the assessment, for instance concerning which alternatives are considered; (ii) describing conceptual models of the environmental and socio-economic systems that the options will interact with; (iii) identifying valued components or processes of these systems (assets); and (iv) identifying circumstances that could lead to adverse outcomes (hazards) if the technology is deployed. Facilitated discussions with broad groups of stakeholders at this stage have been shown to improve the conceptual understanding of systems and the hazard identification stage [34].

In these early stages of the process it is particularly important that the governance regime does not restrict or suppress an adequate exploration of alternative (social or technological) responses. Appraisal should devote symmetrical attention to all considered alternatives and offer a balanced picture of associated pros and cons as seen by affected stakeholders – particularly those that have no commercial interest in the technology concerned. For instance, many effective alternative innovations in seed production are often excluded from regulatory appraisal processes around the world, in favour of more energetically-propounded transgenic options that offer attractive private benefits from intellectual property, profits from value chains or sales of associated products [24]. These neglected alternatives can include: marker-assisted selection [35]; participatory breeding [36]; agricultural extension services [37]; and open source seed sharing methods [38], which all harness the innovative capacities of farmers themselves and help tailor crop development to local conditions [39]. If alternative approaches like these are to be given a fair hearing, then they must be addressed by wider practices of social appraisal that extend risk assessment attention beyond a single or narrow set of options.

The third main component in Fig. 1 - identifying assets - becomes particularly important when addressing novel technologies, because the ‘assets’ determine assessment endpoints – i.e. values that risk assessment is trying to protect. The assessment endpoints in turn determine the measurement endpoints – i.e. the things that the risk assessment will make predictions about [40], and these should be used to identify the risk acceptance criteria that support a decision and a compliance monitoring strategy. Public engagement is essential at this step, in order to ensure that the risk endpoints, and hence acceptance criteria, reflect the values that affected communities actually hold. If risk assessment fails to do this, the focal product is unlikely to be considered acceptable, irrespective of the assessment outcomes. If assessment is not understood to address community values, it is likely to be perceived as irrelevant, at best – or at worst, as representing other vested interests.

Practically it is more difficult to undertake public engagement in the second stage in the idealised framework (labelled Calculate, evaluate and manage in Fig. 1). This stage aims to determine risks for an agreed set of priority hazards. Risk calculations can be performed qualitatively or quantitatively, but here we focus on probabilistic risk assessment calculations. With a probabilistic risk assessment, the scope for public engagement in this second stage is much reduced, largely because of the particular kinds of expertise required and associated barriers around styles of knowledge and understanding including language (discussed below).

Probabilistic risk assessment for novel technologies, at least initially, must rely on opinions and beliefs. Classical actuarial approaches are not possible because the technology’s operational history is limited and/or its potential adverse outcomes occur at a very low frequency. A key feature of probabilistic risk assessment in these circumstances is that opinions are typically elicited from experts using formal methods (see for example [41]) that are carefully designed to minimise the various forms of ambiguity caused by the natural vagueness of language [42, 43], and to provide predictions that can be (in)validated with observations. In theory these methods could be employed to elicit the opinions of stakeholders and thereby help to expose differences of opinion that might be masked by different interpretations of the same word, such as “negligible”. But we are unaware of any examples of this in practice, perhaps because of the barrier that probabilistic methods present.
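
To make the elicitation point concrete, the following minimal sketch (our illustration, not a method from the paper or from [41]) shows one common way of aggregating elicited expert probabilities, a linear opinion pool. The expert figures are hypothetical; note how a verbal label such as “negligible” could mask the fifty-fold numerical disagreement between the first and third expert.

```python
# Illustrative sketch: aggregating expert-elicited probabilities for a
# low-frequency adverse outcome using a linear opinion pool (a weighted
# average of the experts' probability judgements).

def linear_opinion_pool(probabilities, weights=None):
    """Weighted average of expert probability judgements."""
    if weights is None:
        # Default: give every expert equal weight.
        weights = [1.0 / len(probabilities)] * len(probabilities)
    if abs(sum(weights) - 1.0) > 1e-9:
        raise ValueError("weights must sum to 1")
    return sum(w * p for w, p in zip(weights, probabilities))

# Three hypothetical experts judge the annual probability of an adverse
# outcome; each might describe their own figure as "negligible".
expert_judgements = [0.001, 0.01, 0.05]
pooled = linear_opinion_pool(expert_judgements)
```

The same mechanics could in principle be applied to stakeholder judgements, which is precisely the unexploited possibility noted above.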

It is still possible, but somewhat more difficult, to engage publics around the formulation of what in risk assessment parlance are termed ‘loss functions’. These functions express the loss that occurs following a predicted change in the value of a measurement endpoint – i.e. they measure the possible consequences should adverse outcomes occur. In financial contexts, loss functions are typically expressed as a change in the monetary value of a portfolio over time [44]. In a human health context, loss is also relatively readily defined [45, 46]. Such functions can help make explicit the particular value judgements embedded in any given assessment that underlie a specific interpretation of impact, and facilitate comparisons with other equally-reasonable values that might yield different interpretations [47, 48]. In ecological contexts, the concept of loss is more ambiguous and value laden, and it is therefore more desirable, but at the same time more difficult, for publics to be engaged in the formulation of the way in which the ecological consequences of adverse outcomes are measured and expressed [49,50,51].
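
The way a loss function encodes a value judgement can be shown with a small sketch (our construction, using entirely hypothetical numbers): two equally explicit loss functions applied to the same predicted decline in a valued non-target species yield different expected losses, because they embody different views of what matters.

```python
# Illustrative sketch: two contrasting loss functions for the same
# measurement endpoint (a predicted fractional population decline).

def linear_loss(decline_fraction):
    # Value judgement 1: loss is proportional to the decline.
    return decline_fraction

def threshold_loss(decline_fraction, threshold=0.3):
    # Value judgement 2: declines below a policy threshold do not
    # count; beyond it the loss is total.
    return 0.0 if decline_fraction < threshold else 1.0

def expected_loss(loss_fn, outcomes, probs):
    # Expected loss under a discrete predictive distribution.
    return sum(p * loss_fn(x) for x, p in zip(outcomes, probs))

outcomes = [0.0, 0.2, 0.5]   # hypothetical predicted decline fractions
probs = [0.7, 0.2, 0.1]      # hypothetical elicited probabilities
el_linear = expected_loss(linear_loss, outcomes, probs)
el_thresh = expected_loss(threshold_loss, outcomes, probs)
```

Neither function is “correct”; which to use is exactly the kind of embedded value judgement that public engagement can surface.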

It is also essential for interested and affected communities to be engaged in this second stage of the risk assessment procedure around issues of “acceptability”. For new technologies this is typically a difficult stage. Public engagement is only meaningful here if involved communities have previously contributed to the risk acceptance criteria (stage 1) and have also been kept informed of observed outcomes, and of the decisions that arise following these observations (stage 3). Perhaps most importantly, a focus on “acceptance” can only be considered valid if the assessment gives equal attention to the pros and cons of a variety of alternative technologies or strategies with the same policy aims.

Finally, the overall objective of the third and last stage of the conventional risk assessment framework is to compare risk predictions to observed outcomes. Testing risk and benefit predictions against observed outcomes is an important scientific quality criterion: to comply with the scientific method, risk predictions must be, at least theoretically, capable of being invalidated by observations. At this stage it is also possible – and sometimes desirable on cost grounds – to engage interested and affected communities in monitoring strategies through, for example, citizen science activities [52].
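
The falsifiability requirement can be sketched minimally as follows (our construction, with hypothetical figures, not a procedure from the paper): a prediction stated as an interval on an outcome frequency is invalidated whenever monitoring observations fall outside that interval.

```python
# Illustrative sketch: checking a falsifiable risk prediction against
# monitoring data, e.g. data gathered through citizen science.

def prediction_consistent(pred_lo, pred_hi, events, trials):
    """True if the observed outcome frequency lies within the
    predicted interval [pred_lo, pred_hi]."""
    observed = events / trials
    return pred_lo <= observed <= pred_hi

# Hypothetical case: the assessment predicted an adverse outcome
# frequency in [0.0, 0.02]; monitoring recorded 1 event in 200
# observations (frequency 0.005), so the prediction survives the test.
ok = prediction_consistent(0.0, 0.02, events=1, trials=200)
```

A prediction that cannot fail such a test, even in principle, would not meet the quality criterion described above.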

Again, in substantive terms, it is important to emphasise that questions around benefit and harm must be directed to the potential pros and cons associated with a diverse array of alternative policy options. It should be noted, however, that contrasting dimensions of each option may not necessarily be subject to simple trade-offs, and methods beyond those usually associated with risk-based governance mechanisms may be needed to address the complex, dynamic and uncertain relations between wider social and environmental values – as well as entirely-valid and reasonable, but non-utilitarian ways of reasoning [53, 54].

For example, Fig. 2 shows one characterisation, among many other variants in the literature [55,56,57,58], of four contrasting aspects of incertitude. This identifies a variety of methods that may serve useful functions in substantive social appraisal – including as bridges, catalysts and frameworks for wider processes of public participation. Terminology can be controversial, so the point of Fig. 2 is not to insist on words. The term ‘uncertainty’, for instance, is used in a variety of sometimes opposing ways [59, 60]. Bayesians assert that subjective probability is the only coherent way to perform the types of comparison across multiple options with uncertain outcomes that a substantive social appraisal process demands [61], and that subjective probability is in principle an adequate way to represent both the uncertainty due to knowledge gaps and the uncertainty caused by the inherent variability of many real-world processes [62].

Fig. 2

Different aspects of incertitude. As distinguished in relation to the fundamental parameters of risk assessment (probabilities and outcomes) (adapted from [28])

Other probability theorists point out, however, that it must be acknowledged, at least in principle, that a situation may always arise in which there exists no firm basis for confidence in the values that might be taken by probability distributions [63]. Indeed, under these conditions, some probability theorists acknowledge a condition under which – in objective terms – probabilities simply “do not exist” [64]. This may be because neither historical evidence nor the completeness of available models is felt to be sufficient to derive likelihoods for all relevant real-world outcomes. To assert single probability distributions under such conditions would involve a “pretence of knowledge” [65]. Following longstanding usage in policy appraisal over the past century [66, 67], it is this condition that is referred to in Fig. 2 here as ‘uncertainty’. Other terms may legitimately be preferred, but it is crucial not to let reluctance to name this condition lead to a situation in which it is simply ignored [28].
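
One transparent way of refusing the “pretence of knowledge” can be sketched as follows (our illustration, with hypothetical numbers; the paper does not prescribe this method): instead of asserting a single probability, carry interval bounds on the probability and report bounds on expected loss.

```python
# Illustrative sketch: bounding expected loss when the probability of
# an adverse outcome is only known to lie in an interval [p_lo, p_hi],
# rather than asserting a single (possibly unfounded) distribution.

def expected_loss_bounds(loss, p_lo, p_hi):
    """Lower and upper bounds on expected loss when the outcome
    probability lies somewhere in [p_lo, p_hi]."""
    if not (0.0 <= p_lo <= p_hi <= 1.0):
        raise ValueError("require 0 <= p_lo <= p_hi <= 1")
    return p_lo * loss, p_hi * loss

# Hypothetical figures: a loss of 100 units, with the probability of
# the adverse outcome known only to lie between 0.001 and 0.05.
lo, hi = expected_loss_bounds(loss=100.0, p_lo=0.001, p_hi=0.05)
```

Reporting the pair (lo, hi), rather than a single point estimate, makes the condition Fig. 2 calls ‘uncertainty’ visible instead of ignoring it.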

The practical point here, however, is clear. The key issues that arise are: (i) probabilistic methods can present a barrier to public participation; and (ii) it is crucial to consider whether alternative approaches to uncertainty provide a coherent and transparent way to analyse the pros and cons of different technological solutions. It is to this end that frameworks like that offered in Fig. 2 can provide one useful input among many, in prompting greater consideration.

A crucial further point in Fig. 2 is that challenges in social appraisal do not just involve problematic knowledge about likelihoods, but also different kinds of ‘contradictory certainty’ over meaning [68]. Referred to as ‘ambiguity’ – meaning the condition of being open to more than one interpretation - in Fig. 2, these disagreements may concern: interests or values; ‘benefits’ or ‘harms’; or alternative policy options [69]. It is a matter of analytical rigour to recognise that such dilemmas also mean that there can in principle exist no uniquely optimal analytical solution [70, 71]. This further underscores the substantive importance of participation [72,73,74,75].

Taken together, the main issue that arises in all this is simply the need to recognise: (a) that some level of ignorance will always exist with a new technology – where “unknown unknowns” [76] mean “we don’t know what we don’t know” [55]; and (b) that a substantive social appraisal entails value-based judgements that probabilistic risk assessment techniques are not designed to address. This makes it important that governance embeds risk-based assessment in a broader social appraisal that includes public participation [77].

Obstacles and promises of public participation

What the preceding discussion has shown is that the scope and methods of risk-based governance need to be broadened if the substantive issues raised by new technologies like synthetic biology and gene drive technologies are to be addressed. These considerations require attention to a broader set of practical methods, beyond those currently used in risk assessment, for wider social appraisal [78]. Here, a multiplicity of forms of public participation become recognisable as crucial means to help achieve both analytical rigour and democratic accountability in the framing and implementation of governance measures.

Potential obstacles to public participation in the crucial first steps of the first stage of a risk assessment include language barriers, conflicting styles of knowledge, and the availability and accessibility of information. The extent to which these are actual barriers will vary on a case-by-case basis, but in general these obstacles are likely to be more acute in developing nations [78].

Cost is also a significant barrier – from the perspectives of both sponsors and participants. Public participation activities can be costly to organize, in terms of labor, logistics, preparatory materials, and the design of interactions [79]. Equally costly is the time and effort expended by participants, especially in a political-cultural environment that does not make it clear that such participation will make a meaningful difference to decision making [80]. Put another way, a major obstacle is establishing the credibility that it will be worthwhile for publics to participate at all.

Under a view that participation focuses primarily on modalities for implementation, so that it is merely about ‘deciding how to do it’, the value of engagement lies in the sharing of perspectives and knowledge, and such sharing can be costly. For example, the costs of organizing engagement at each step of a laboratory study would be prohibitive, suggesting that some balance needs to be struck between the frequency and intensity of engagement and the costs of enacting it.

The transaction costs of public engagement are more difficult to compute and less relevant, however, under a contrasting view that participation should extend to the possibility that an alternative strategy will be substituted. Here the value of engagement could be larger, but harder to determine, if the costs of adverse effects are avoided by pursuing an alternative strategy.

Either way, we note that “transaction costs” implies that what is valuable is the exchanged “material”, while the transaction itself is worthless and should be minimized as far as possible for the sake of efficiency. In the case of engagement, however, the transaction has value - both in terms of what is exchanged and the experience of connecting with others who have different knowledge, perspectives, and values. These transactions have the potential to build relationships, trust, and insight. The value of these achievements is difficult to measure, but we nonetheless suggest it is helpful to consider both the “transaction costs” and the “transaction benefits” of stakeholder and public engagement.

Synthetic biology products, and gene drives in particular, may raise specific challenges in this context: (i) they are often described in very technical, domain-specific terminology that is not accessible to a wide audience. To understand their production processes and modes of action requires a high degree of training or a significant amount of editorial effort to de-mystify the language; (ii) techniques such as CRISPR-based gene editing have lowered technological barriers and substantially compressed design-to-production cycles, enabling the field to move rapidly. This in turn can reduce the lead-in time for risk assessment and social engagement activity; and (iii) low-threshold gene drives will theoretically spread throughout an entire target population. If a target species has a large range, for example a mosquito malaria vector found across sub-Saharan Africa, the number and diversity of potential stakeholders could be larger than that encountered with other new products. The nature and severity of these issues will vary on a case-by-case basis, but taken together they could significantly raise the costs of a substantive engagement process.

On the other hand, of course, the costs incurred when unduly narrow governance circumscribes assessment, excludes alternatives or sidelines relevant uncertainties can prove to be very large [81, 82]. Once such effects are realised, it can be extremely costly to address (potentially irreversible) environmental harms [83, 84] or to shift away from technologies that have already become locked in [26, 85,86,87]. These possible wider burdens of narrow governance are important to take seriously, because they often fall most heavily on people who are in other ways most excluded and vulnerable [21, 88,89,90].

Against this background, there is a diversity of practical participatory methods. For instance, risk assessment may usefully be informed by carefully structured workshops such as those of the Problem Formulation and Options Assessment (PFOA) methodology [91]. The PFOA methodology has been specifically designed to front-end a risk-based governance framework for novel technologies and has been successfully applied to GM crops in Kenya [92]. Multicriteria mapping is another approach that has been applied with some success [93]. Yielding concrete quantitative pictures, alongside a rich body of qualitative information, concerning a diversity of contrasting innovation pathways, it too has been used to explore GM maize and parallel options in Kenya [94].

Engaging stakeholders in the conceptual modelling and hazard identification stage of the risk assessment can be facilitated by using graphical conceptual modelling methods, such as cartoons, influence diagrams and Signed Directed Graphs [95]. Graphical conceptual methods also provide a structure for, and therefore facilitate, public participation in the identification of assessment and measurement endpoints, and can also guide the development of quantitative models that are otherwise difficult to engage the public in due to the technical hurdles that they present. Signed Directed Graphs have added advantages in this context as they provide the basis for analysis of the effects of feedback in complex systems [96], including the socio-economic systems that are coupled to, and drivers for, the environmental systems that may be perturbed by a novel technology.
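The feedback analysis that Signed Directed Graphs support can be illustrated with a minimal sketch (the pest-control system and its edge signs are our hypothetical example, not drawn from [95] or [96]): the sign of a feedback loop is the product of the signs of its edges, so an odd number of negative edges gives a negative (self-damping) loop and an even number a positive (self-reinforcing) one.

```python
# Illustrative sketch: sign of a feedback loop in a Signed Directed
# Graph, for a hypothetical coupled socio-economic/ecological system.
# Edge sign +1 means the source promotes the target; -1 means it
# suppresses the target.

edges = {
    ("pest", "crop_damage"): +1,       # more pests -> more damage
    ("crop_damage", "control_effort"): +1,  # damage -> more control
    ("control_effort", "pest"): -1,    # control suppresses the pest
}

def loop_sign(loop, edges):
    """Sign of a feedback loop given as a sequence of node names."""
    sign = 1
    # Pair each node with its successor, wrapping back to the start.
    for a, b in zip(loop, loop[1:] + loop[:1]):
        sign *= edges[(a, b)]
    return sign

# One negative edge -> a negative, stabilising feedback loop.
sign = loop_sign(["pest", "crop_damage", "control_effort"], edges)
```

Because the graph and its edge signs can be built in facilitated discussion with stakeholders, the same structure supports both public participation and subsequent quantitative analysis.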

Irrespective of any particular methods, substantive aims in technology governance should also be responsive to ‘uninvited’ engagement by marginal voices on their own terms [97]. Many methods also exist to help enable this – including: open space [98], participatory rural appraisal [99], deliberative mapping [100], do-it-yourself juries [101], participatory technology assessment (https://ecastnetwork.org), and action research [102] – which if properly undertaken can help to further these aims.

Despite the complexity and diversity of views, then, it is possible to draw some firm overall conclusions. Quite simply, broader public participation in environmental decision making leads to better quality decisions [103]. There exists much scope for, as well as obstacles to, practical extensions of existing procedures in order to enable meaningful participatory deliberation.

In the end, the most important questions in this process are not just about ‘yes or no?’, ‘how much?’ or ‘how fast?’ concerning a circumscribed partisan selection of possible ‘solutions’ – but rather about fundamental issues for democracy over: ‘which way?’, ‘who says?’ and ‘why?’ Addressing such questions – in collaborations that span domains of expertise and civil society – can ‘open up’ a diversity of alternative viable policy responses [104]. How societies address the uncertain benefits and risks of these alternative responses, however, remains a contentious issue and one that requires much more attention than this paper allows.

References

1. Brown ZS, Carter L, Gould F. An introduction to the proceedings of the environmental release of engineered pests: building an international governance framework. BMC Proc. 2018.

2. Collins JP. Gene drives in our future: challenges of and opportunities for using a self-sustaining technology in pest and vector management. BMC Proc. 2018.

3. Esvelt KM, Smidler AL, Catteruccia F, Church GM. Concerning RNA-guided gene drives for the alteration of wild populations. eLife. 2014;3. https://doi.org/10.7554/eLife.03401.

4. Piaggio AJ, Segelbacher G, Seddon PJ, Alphey L, Bennett EL, Carlson RH, Friedman RM, Kanavy D, Phelan R, Redford KH, Rosales M, Slobodian L, Wheeler K. Is it time for synthetic biodiversity conservation? Trends Ecol Evol. 2017;32:97–107. https://doi.org/10.1016/j.tree.2016.10.016.

5. Nuffield Council on Bioethics. Emerging biotechnologies: technology, choice and the public good. London: Nuffield Council on Bioethics; 2012.

6. Lemos MC, Agrawal A. Environmental governance. Annu Rev Environ Resour. 2006;31:297–325.

7. Schmidt M, Kelle A, Ganguli-Mitra A, de Vriend H, editors. Synthetic biology: the technoscience and its societal consequences. Berlin: Springer; 2009.

8. Wynne B. Risk and environment as legitimatory discourses of technology: reflexivity inside out. Curr Sociol. 2002;50:459–77.

9. Jasanoff S. Designs on nature: science and democracy in Europe and the United States. Princeton: Princeton University Press; 2005.

10. Kaebnick G. Humans in nature: the world as we find it and the world as we create it. Oxford: Oxford University Press; 2014.

11. Wickson F, Wynne B. The anglerfish deception. The light of proposed reform in the regulation of GM crops hides underlying problems in EU science and governance. EMBO Rep. 2012;13(2):100–5.

12. Funtowicz S, Ravetz JR. Uncertainty and quality in science for policy. Dordrecht: Kluwer Academic Publishers; 1990.

13. National Academies of Sciences, Engineering, and Medicine. Gene drives on the horizon: advancing science, navigating uncertainty, and aligning research with public values. Washington: The National Academies Press; 2016.

14. IRGC. Guidelines for the appropriate risk governance of synthetic biology (policy brief). Geneva: International Risk Governance Council; 2010.

15. National Academies of Sciences, Engineering, and Medicine. Preparing for future products of biotechnology. Washington: The National Academies Press; 2017.

16. Fiorino DJ. Citizen participation and environmental risk: a survey of institutional mechanisms. Sci Technol Hum Values. 1990;15(2):226–43.

17. O’Neill O. Autonomy and trust in bioethics. Cambridge: Cambridge University Press; 2002.

18. Collingridge D. Critical decision making: a new theory of social choice. London: Frances Pinter; 1982.

19. Wynne B. Redefining the issues of risk and public acceptance: the social viability of technology. Futures. 1983;15(1):13–32.

20. Hood C. The blame game: spin, bureaucracy, and self-preservation in government. Princeton: Princeton University Press; 2011.

21. Leach M, Scoones I, Stirling A. Dynamic sustainabilities: technology, environment, social justice. London: Routledge; 2010.

22. National Academies of Sciences. Environmental effects of transgenic plants. Washington: National Academy Press; 2002.

23. Stern PC, Fineberg HV. Understanding risk: informing decisions in a democratic society. Washington: National Academy Press; 1996.

24. Kuzma J, Kokotovich A. Renegotiating GM crop regulation. EMBO Rep. 2011;12:883–8. https://doi.org/10.1038/embor.2011.160.

25. Stirling A. Towards innovation democracy? Participation, responsibility and precaution in innovation governance. London: UK Government; 2014.

26. Government Chief Scientific Adviser. Annual report — innovation: managing risk, not avoiding it. London: The Government Office for Science; 2014.

27. Ely A, Van Zwanenberg P, Stirling A. Broadening out and opening up technology assessment: approaches to enhance international development, co-ordination and democratisation. Res Policy. 2014;43(3):505–18.

28. Stirling A. Keep it complex. Nature. 2010;468:1029–31.

29. Hayes KR, Kapuscinski AR, Dana G, Li S, Devlin RH. Introduction to environmental risk assessment for transgenic fish. In: Kapuscinski AR, Hayes KR, Li S, Dana G, editors. Environmental risk assessment of genetically modified organisms, volume 3: methodologies for transgenic fish. Oxfordshire: CABI Publishing; 2007. p. 1–28.

30. Maxmen A. Florida abuzz over mosquito plan. Nature. 2012;487:286. https://doi.org/10.1038/487286a.

31. Resnik DB. Ethical issues in field trials of genetically modified disease-resistant mosquitoes. Dev World Bioeth. 2014;14:37–46. https://doi.org/10.1111/dewb.12011.

32. Owen R, Bessant J, Heintz M, editors. Responsible innovation: managing the responsible emergence of science and innovation in society. Chichester: Wiley; 2013.

33. Kolopack PA, Parsons JA, Lavery JV. What makes community engagement effective? Lessons from the Eliminate Dengue program in Queensland, Australia. PLoS Negl Trop Dis. 2015;9:e0003713. https://doi.org/10.1371/journal.pntd.0003713.

34. Dana GV, Kapuscinski AR, Donaldson JS. Integrating diverse scientific and practitioner knowledge in ecological risk analysis: a case study of biodiversity risk assessment in South Africa. J Environ Manag. 2012;98:134–46.

35. Poltronieri P, Reca IB. Transgenic, cisgenic and novel plant products, regulation and safety assessment. In: Poltronieri P, Hong Y, editors. Applied plant genomics and biotechnology. Cambridge: Elsevier Woodhead; 2014.

36. FAO. Save and grow: a policymaker’s guide to the sustainable intensification of smallholder crop production. Rome: UN Food and Agriculture Organisation; 2011.

37. Leeuwis C. Communication for rural innovation: rethinking agricultural extension. Oxford: Blackwell; 2004.

38. Lockie S, Carpenter D, editors. Agriculture, biodiversity and markets: livelihoods and agroecology in comparative perspective. London: Earthscan; 2010.

39. Altieri M, Nicholls CI. Agroecology and the search for a truly sustainable agriculture. Mexico City: UNEP; 2005.

40. Suter GW. Ecological risk assessment. Boca Raton: CRC Press; 2006. p. 680.

41. Hosack GR, Hayes KR, Barry SC. Prior elicitation for Bayesian generalised linear models with application to risk control option assessment. Reliab Eng Syst Saf. 2017;167:351–61.

42. Carey JM, Burgman MA. Linguistic uncertainty in qualitative risk analysis and how to minimize it. Ann N Y Acad Sci. 2008;1128:13–7.

43. Regan HM, Colyvan M, Burgman MA. A taxonomy and treatment of uncertainty for ecology and conservation biology. Ecol Appl. 2002;12:618–28.

44. McNeil AJ, Frey R, Embrechts P. Quantitative risk management. Princeton: Princeton University Press; 2015. p. 699.

45. Fox DR. Statistical issues in ecological risk assessment. Hum Ecol Risk Assess. 2006;12:120–9. https://doi.org/10.1080/10807030500430476.

46. McDowell I. Measuring health: a guide to rating scales and questionnaires. Oxford: Oxford University Press; 2006. p. 765.

47. Layard R, Glaister S, editors. Cost-benefit analysis. Cambridge: Cambridge University Press; 1994.

48. Mishan EJ, Quah E. Cost-benefit analysis. London: Routledge; 2007.

49. O’Neill J. Ecology, policy and politics: human wellbeing and the natural world. Abingdon: Taylor & Francis; 1993.

50. Kalof L, Satterfield T, editors. The Earthscan reader in environmental values. London: Earthscan; 2005.

51. Hanley N, Spash C. Cost-benefit analysis and the environment. London: Edward Elgar; 1993.

52. Irwin A. Citizen science: a study of people, expertise and sustainable development. London: Routledge; 1995.

53. Spash CL. Investigating individual motives for environmental action: lexicographic preferences, beliefs and attitudes. In: Environmental Science and Technology Library, vol. 13. Kluwer Academic Publishers; 1998. p. 46–62.

54. Foster J, editor. Valuing nature: economics, ethics and environment. London: Routledge; 1997.

55. Wynne B. Uncertainty and environmental learning: reconceiving science and policy in the preventive paradigm. Glob Environ Chang. 1992;2:111–27.

56. Faber M, Proops JLR. Evolution, time, production and the environment. Berlin: Springer; 1990.

57. van der Sluijs JP, Risbey JS, Kloprogge P, Ravetz JR, Funtowicz SO, Quintana SC, Pereira AG, De Marchi B, Petersen AC, Janssen PHM, Hoppe R, Huijs SWF. RIVM/MNP guidance for uncertainty assessment and communication: detailed guidance; 2003.

58. Walker WE, Rotmans J, Harremoës P, van der Sluijs JP, van Asselt MBA, Janssen P, von Krauss MPK. Defining uncertainty: a conceptual basis for uncertainty management. Integr Assess. 2003;4(1):5–17.

59. Roeser S, Hillerbrand R, Sandin P, Peterson M, editors. Handbook of risk theory: epistemology, decision theory, ethics and social implications of risk. Berlin: Springer; 2012.

60. Morgan MG, Henrion M. Uncertainty: a guide to dealing with uncertainty in quantitative risk and policy analysis. Cambridge: Cambridge University Press; 1990.

61. Lindley DV. Understanding uncertainty. Hoboken: Wiley; 2006.

62. O’Hagan A, Oakley JE. Probability is perfect, but we can’t elicit it perfectly. Reliab Eng Syst Saf. 2004;85:239–48.

63. Spiegelhalter DJ, Riesch H. Don’t know, can’t know: embracing deeper uncertainties when analysing risks. Philos Trans A Math Phys Eng Sci. 2011;369(1956):4730–50.

64. de Finetti B. Theory of probability: a critical introductory treatment. Chichester: Wiley; 1974.

65. Hayek FA. New studies in philosophy, politics, economics and the history of ideas. London: Routledge; 1978.

66. Knight FH. Risk, uncertainty and profit. Boston: Houghton Mifflin; 1921.

67. Keynes JM, Lewis CI. A treatise on probability. Philos Rev. 1922;31(2):180.

68. Thompson M, Warburton M. Decision making under contradictory certainties: how to save the Himalayas when you can’t find what’s wrong with them. J Appl Syst Anal. 1985;12:3–34.

69. O’Brien M. Making better environmental decisions: an alternative to risk assessment. Cambridge: MIT Press; 2000.

70. Arrow KJ. Social choice and individual values. New Haven: Yale University Press; 1963.

71. Stirling A. Risk, uncertainty and precaution: some instrumental implications from the social sciences. In: Berkhout F, Leach M, Scoones I, editors. Negotiating change: new perspectives from the social sciences. Cheltenham: Edward Elgar; 2003.

72. OECD. Open government: fostering dialogue with civil society. Paris: Organisation for Economic Co-operation and Development; 2003.

73. OECD. Evaluating public participation in policy making. Paris: Organisation for Economic Co-operation and Development; 2005.

74. Royal Society. Science in society. London: Royal Society; 2004.

75. IRGC. Risk governance: towards an integrative approach. Geneva: International Risk Governance Council; 2013.

76. Brooks H. The typology of surprises in technology, institutions and development. Cambridge: IIASA/CUP; 1986.

77. Stern PC, Fineberg HV. Understanding risk: informing decisions in a democratic society. Washington: National Academy Press; 1996.

78. Ely A, Van Zwanenberg P, Stirling A. Broadening out and opening up technology assessment: approaches to enhance international development, co-ordination and democratisation. Res Policy. 2014;43(3):505–18.

79. Kleinman DL, Delborne JA, Anderson AA. Engaging citizens: the high cost of citizen participation in high technology. Public Underst Sci. 2011;20:221–40. https://doi.org/10.1177/0963662509347137.

80. Delborne JA, Schneider J, Bal R, Cozzens S, Worthington R. Policy pathways, policy networks, and citizen deliberation: disseminating the results of World Wide Views on Global Warming in the USA. Sci Public Policy. 2013;40:378–92. https://doi.org/10.1093/scipol/scs124.

81. Gee D, Grandjean P, Hansen SF, van den Hove S, MacGarvin M, Martin J, Nielsen G, Quist D, Stanners D, editors. Late lessons from early warnings: science, precaution, innovation. Copenhagen: European Environment Agency; 2013.

82. Harremoës P, editor. Late lessons from early warnings: the precautionary principle 1896–2000. Luxembourg: Office for Official Publications of the European Communities; 2001.

83. Arrow KJ, Fisher AC. Environmental preservation, uncertainty, and irreversibility. Q J Econ. 1974;88(2):312–9.

84. Faber M, Manstetten R, Proops JLR. On the conceptual foundations of ecological economics: a teleological approach. Ecol Econ. 1995;12:41–54.

85. Dosi G. Technological paradigms and technological trajectories: a suggested interpretation of the determinants and directions of technological change. Res Policy. 1982;11(3):147–62.

86. Arthur WB. Competing technologies, increasing returns, and lock-in by historical events. Econ J. 1989;99(394):116–31.

87. Walker W. Entrapment in large technology systems: institutional commitment and power relations. Res Policy. 2000;29(7–8):833–46.

88. Sandler R, Pezzullo PC, editors. Environmental justice and environmentalism: the social justice challenge to the environmental movement. Cambridge: MIT Press; 2007.

89. Shrader-Frechette K. Environmental justice: creating equality, reclaiming democracy. Oxford: Oxford University Press; 2002.

90. Ashford NA, Caldart CC. Environmental law, policy and economics: reclaiming the environmental agenda. Cambridge: MIT Press; 2008.

91. Nelson KC, Andow DA, Banker MJ. Problem formulation and option assessment (PFOA) linking governance and environmental risk assessment for technologies: a methodology for problem analysis of nanotechnologies and genetically engineered organisms. J Law Med Ethics. 2009;37(4):732–48.

92. Nelson KC, Kibata G, Muhammad L, Okuro JO, Muyekho F, Odindo M, Ely A, Waquil JM. Problem formulation and options assessment (PFOA) for genetically modified organisms: the Kenya case study. In: Hilbeck A, Andow DA, editors. Environmental risk assessment of genetically modified organisms, volume 1: a case study of Bt maize in Kenya. Wallingford: CABI Publishing; 2004. p. 57–82.

93. Stirling A, Mayer S. A novel approach to the appraisal of technological risk: a multicriteria mapping study of a genetically modified crop. Environ Plan C Gov Policy. 2001;19(4):529–55.

94. Millstone E, Scoones I, Ely A, Shah E, Stagl S, Thompson J, Odame H, Kibaara B, Nderitu S, Karin F, Adwera A. Pathways in and out of maize; 2009.

95. Hayes KR, Leung B, Thresher RE, Dambacher JM, Hosack GR. Assessing the risks of genetic control techniques with reference to the common carp (Cyprinus carpio) in Australia. Biol Invasions. 2014;16:1273–88. https://doi.org/10.1007/s10530-012-0392-9.

96. Dambacher JM, Li HW, Rossignol PA. Qualitative predictions in model ecosystems. Ecol Model. 2003;161:79–93.

97. Wynne B. Public participation in science and technology: performing and obscuring a political–conceptual category mistake. East Asian Sci Technol Soc. 2007;1(1):99–110.

98. Owen H. Open space technology: a user’s guide. Potomac: Berrett-Koehler; 2008.

99. Chambers R. Participatory rural appraisal (PRA): analysis of experience. World Dev. 1994;22(9):1253–68.

100. Davies G, Burgess J, Eames M, Mayer S, Staley K, Stirling A, Williamson S. Deliberative mapping: appraising options for addressing ‘the kidney gap’. London: Wellcome Trust; 2003.

101. Wakeford T. Citizen foresight: a tool to enhance democratic policy-making. London: LoGIS, Genetics Forum and University of East London; 1999.

102. Reason P, Bradbury H, editors. Handbook of action research: participative inquiry and practice. London: Sage; 2001.

103. Beierle TC. The quality of stakeholder-based decisions. Risk Anal. 2002;22:739–49.

104. Stirling A. Opening up the politics of knowledge and power in bioscience. PLoS Biol. 2012;10(1):e1001233.

Acknowledgements

This work is part of a special issue of this journal produced as part of a workshop entitled “Environmental Release of Engineered Pests: Building an International Governance Framework,” hosted at North Carolina State University on October 5-6, 2016 and organized by the Genetic Engineering & Society Center and CSIRO. The workshop agenda and presentation slides from the meeting can be found at: https://research.ncsu.edu/ges/research/projects/oecd-crp-meeting/

The opinions expressed and arguments employed in this publication are the sole responsibility of the authors and do not necessarily reflect those of the OECD or of the governments of its Member countries.

Funding

Funding for this publication was provided by the OECD Co-operative Research Programme on Biological Resource Management for Sustainable Agricultural Systems and by the North Carolina Biotechnology Center.

About this supplement

This article has been published as part of BMC Proceedings Volume 12 Supplement 8, 2018: Environmental Release of Engineered Pests: Building an International Governance Framework. The full contents of the supplement are available online at https://bmcproc.biomedcentral.com/articles/supplements/volume-12-supplement-8.

Author information

AS and KH led drafting of the manuscript with assistance from JD. All authors read and approved the final manuscript.

Correspondence to Andrew Stirling.

Ethics declarations

Ethics approval and consent to participate

Not applicable.

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.
