SRA 2013 Annual Meeting Abstracts - The Society for Risk Analysis
December 8-11, 2013 - Baltimore, MD

W2-C.4 Hughes, K; Vickers, C; Clark, B*; Kadry, A; U.S. Environmental Protection Agency; [email protected] A Global Collaborative Approach to Human Health Risk Assessment: The WHO Chemical Risk Assessment Network In response to the need to assess and manage the health risks associated with the rapid increase in production, use, transport and trade of chemicals and products worldwide, and to meet obligations related to health priorities under the Strategic Approach to International Chemicals Management, WHO initiated action to map the way forward to strengthen global collaboration in support of chemical risk assessment. To identify key issues in chemical risk assessment and priority actions to address them, as well as to solicit input on options for international collaboration, WHO convened two meetings, in Geneva in 2010 and in Bonn in 2012. Additionally, WHO conducted a survey of institutions on capacity-building needs and visions for networking amongst the risk assessment community. These consultations supported the establishment of a WHO Chemical Risk Assessment Network comprising representatives from a broad range of organizations with relevant expertise, from all regions and economic strata around the world. It was recommended that the Network be project-oriented and build upon previous initiatives, with activities focusing on identified priority areas, such as capacity building and training, chemical risk assessments and knowledge sharing, risk assessment methodology, and research. In response to this endorsement, and with the assistance of an Initial Steering Group, the WHO Chemical Risk Assessment Network is being launched to share expertise and knowledge and facilitate the coordination of projects. Institutions engaged in assessment of risks to health associated with chemical exposures, as well as potential donors, will be invited to contact WHO about becoming Network Participants. The views expressed in this abstract do not necessarily represent the views or policies of Health Canada, the World Health Organization, or the U.S. Environmental Protection Agency.

W2-A.3 Abdukadirov, S; Mercatus Center, George Mason University; [email protected] See no evil, hear no evil: Political incentives in agency risk tradeoff analysis This study examines how risk trade-offs undermine safety regulations and why agencies fail to account for risk trade-offs in their analysis. Safety regulations often come with unintended consequences in that regulations attempting to reduce risk in one area may increase risks elsewhere. The increases in countervailing risks may even exceed the reduction in targeted risks, leading to a policy that does more harm than good. The unintended consequences could be avoided or their impacts minimized through more careful analysis, including formal risk trade-off analysis, consumer testing, and retrospective analysis. Yet agencies face strong incentives against producing better analysis; increased awareness of risk trade-offs would force agencies to make unpalatable and politically sensitive choices, a prospect they would rather avoid. Further, a narrow focus on their mission often leads agencies to overlook the broader impacts of regulation. In addition, budget constraints induce agencies to prioritize new regulations over the review of existing ones. Thus, policymakers must mandate that agencies produce better analysis and subject their analyses to external oversight.

M4-C.4 Abrahamsen, EB; Asche, F; Milazzo, MF*; University of Messina; [email protected] What are the effects on safety of using safety standards in major hazard industries? To protect people from hazards it is common to use safety standards. There is a general understanding in the literature that such standards will contribute to improved safety. In this paper we discuss this issue when several safety measures are present and safety standards apply to a subset of these alternatives. We show that the use of safety standards does not always give the expected effect on safety, as the implementation of safety standards for some measures can reduce investments in safety measures not affected by the standards. If the restrictions are sufficiently strong, the safety standards may lead to a reduction in overall safety. Our starting point is major hazard industries, but our discussion is to a large extent general and could also be applied in other areas.

M4-C.3 Abrahamsen, EB; Asche, F; Gelyani, A*; Guikema, S; University of Stavanger, Norway (authors 1, 2 and 3); Johns Hopkins University, USA (author 4); [email protected] How often should safety critical valves be tested? The regulations issued by the PSAN (Petroleum Safety Authority Norway) require annual testing of safety critical valves. In the present paper we discuss the rationale for this requirement, as operators consider annual testing too strict. Expected utility theory, the backbone of economic thinking on decisions under uncertainty, is used as the basis for the discussion. We show that requirements formulated by the authorities on how often safety critical valves should be tested will usually be stricter than what the operators prefer. We also show that the requirement of annual testing will likely be too strict from a societal point of view as well, if the only objective of testing is improved valve reliability: this disregards the fact that testing of safety critical valves also has negative effects on safety for those who perform the tests, as well as negative effects for the environment.
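
A generic reliability trade-off helps make this point concrete (a minimal Python sketch, not the authors' expected-utility model; all numbers are hypothetical). With cost c_t per test, expected cost c_f of a demand arriving while the valve is in a failed state, and failure rate lam, the mean fractional dead time over a test interval T is lam*T/2, and the total cost rate c_t/T + c_f*lam*T/2 is minimized at T* = sqrt(2*c_t/(c_f*lam)):

import math

c_t = 3e5      # cost per test, including worker and environmental burden (hypothetical)
c_f = 1e7      # expected cost of a demand on a failed valve (hypothetical)
lam = 0.02     # valve failure rate per year (hypothetical)

# Minimize c_t/T + c_f*lam*T/2 over the test interval T.
t_star = math.sqrt(2 * c_t / (c_f * lam))
print(f"cost-minimizing test interval ~ {t_star:.2f} years")  # ~1.73 yr, less frequent than annual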

P.102 Ackerlund, W.S.; Kleinfelder; [email protected] Public Collaboration on a 30-year Commitment to Assess Superfund Health Outcomes in Butte, Montana A collaborative process is described for gaining meaningful community participation in an atypical Superfund study to investigate the adequacy of health protection achieved through remediation. The process will guide the scope and conduct of health studies of the Butte community to be performed every five years for thirty years. The process emerged to address differences among stakeholders about health study needs and concerns for trust and transparency in the risk analysis. It effectively coordinates a technical working group (comprising responsible parties, oversight agencies, supporting agencies, and consultants), a citizens’ advisory committee, EPA’s Technical Assistance Grant (TAG), and citizens at large. Major collaboration principles applied, lessons learned, and project benefits to date are identified. The collaboration effort provides a useful example of how to achieve common understanding among diverse stakeholders about public health risks.

M4-J.4 Adler, MD; Duke University; [email protected] Distributive Weights: A Defense This talk defends the use of distributive weights in cost-benefit analysis. A substantial body of scholarship in welfare economics explains how to specify weights so as to mimic a utilitarian or equity-regarding social welfare function. The literature on social welfare functions is vast -- encompassing optimal tax theory, public finance, theoretical social choice, and growth theory including climate change. But despite its firm intellectual foundations, distributive weighting is viewed skeptically by many engaged in cost-benefit analysis in the U.S. -- for various reasons, but perhaps mostly because of the received wisdom that ties cost-benefit analysis to Kaldor-Hicks efficiency. In this talk, I rehearse the strong case against the Kaldor-Hicks approach; review the theory of distributive weighting; provide a concrete formula for weights; and rebut a slate of objections to weighting, such as the fact that it involves value judgments (as if traditional cost-benefit analysis doesn’t!), the supposed optimality of redistribution through the tax system, and others.
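
The standard formula in this literature sets a person's weight equal to the marginal utility of income; under isoelastic utility this is w(y) = (y_ref/y)^eta. Whether this matches the talk's exact formula is an assumption; the Python sketch below uses invented incomes and curvature.

# Distributive weights as marginal utility of income under isoelastic utility.
# eta (curvature / inequality aversion) and all incomes are illustrative only.
eta = 1.5
y_ref = 60_000                    # reference income, e.g. the median
for y in (20_000, 60_000, 180_000):
    w = (y_ref / y) ** eta        # weight per dollar of benefit to this person
    print(f"income {y:>7,}: weight {w:.2f}")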

W2-C.2 Afriyie-Gyawu, E*; Shankar, P; Adu-Oppong, A; Hsu, J-P; University; [email protected] Toxicological and Public Health Implications of the Use of Scrap Rubber Tires for Smoking Meat in Africa Many slaughterhouses in some developing countries (such as Ghana and Nigeria) are known to frequently use open fires, set with scraps of automobile tires, to singe the fur of slaughtered goats, sheep, cows, etc. intended for human consumption. This has been a common practice for well over 50 years. It is a potential health risk because tires are made of various chemicals and materials that, when released into the environment under ambient conditions (e.g., open-fire burning), could contaminate the meat and the environment. Chemicals emitted through open tire burning include carbon monoxide, sulfur oxides, nitrogen oxides, particulate matter, volatile organic compounds (e.g., benzene), dioxins, polycyclic aromatic hydrocarbons (PAHs, e.g., benzo-a-pyrene), polychlorinated biphenyls (PCBs), and heavy/toxic metals/metalloids (e.g., arsenic, mercury, cadmium, chromium). Human exposures to these chemicals occur through ingestion (e.g., meat and meat products smoked with scrap rubber tires) and inhalation (indoor and outdoor air polluted with the tire-fire smoke). Thus, the toxicological and public health impact of this practice on the slaughterhouse operators, the consuming public, individuals living in close proximity to the slaughterhouses, and the entire ecosystem cannot be overemphasized. The main objectives are to discuss: 1) chemicals generally present in rubber tires, 2) pollutants emitted via scrap rubber tire burning, 3) types and levels of contaminants in the tire-smoked meat, and 4) potential toxicological and public health implications associated with using scrap rubber tires for smoking meat and other foodstuffs. We also highlight our current research activities in Ghana, West Africa, related to this practice, as well as some preliminary findings. Finally, it is anticipated that this presentation will heighten awareness of this practice as a potential health risk, particularly in developing countries, and stimulate further scientific investigations into this food- and environment-related issue.

M3-E.4 Aguila, I.E.; Jimenez, R.B.*; Ruiz, P.; Universidad Andres Bello; [email protected] Mortality risk from personal exposure to PM2.5 and UFP in different transportation modes: travel by bus, drive a car, take the metro or ride a bicycle? Strong evidence from epidemiological studies suggests that exposure to transport-related pollution increases the risk of premature death. Recent studies have found significant differences between air pollution concentrations in different transportation modes. So far no study has addressed the increase in personal health risks attributable to air pollution exposure in transport environments in Chile. To fill this gap, the main goal of this study was to assess changes in personal health risks from exposure to traffic-related pollutants for commuters in different transportation modes in Santiago, Chile. We estimated premature mortality impacts for the attributable proportion of pollutant exposure to PM2.5 and ultrafine particles (UFP) while commuting in four transportation modes: car, bicycle, bus and walking. Estimates of increased risk attributable to pollutant exposure were calculated based on previous studies (Jimenez and Bronfman, 2012; Dhondt et al., 2012). We assessed personal and social risk of overall exposure and transport-mode-specific exposure to UFP and PM2.5. Changes in exposure concentrations associated with each transport mode were obtained from measurements of direct contributions from traffic emissions and background concentrations to personal exposures in previous studies (Ruiz et al., 2012). Scenario analyses and Monte Carlo simulations were performed in order to assess major sources of uncertainty in health impact estimations, mainly related to C-R functions for UFP, time spent in traffic and mobility patterns. Our preliminary results reveal high levels of risk from exposure to traffic-related pollution for all commuters, and especially for cyclists. This information must be taken into account in the development of transportation policies, especially those aimed at stimulating cycling as a viable alternative to car driving in urban areas. Limitations of the study and further implications for transport policies and regulators will be discussed.
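
For illustration only, a Monte Carlo sketch in Python of this kind of calculation: an uncertain concentration-response (C-R) coefficient is propagated through a log-linear relative-risk model of time-weighted, mode-specific exposure increments. Every number below (concentrations, commute times, baseline mortality) is a hypothetical placeholder, not the study's data.

import numpy as np

rng = np.random.default_rng(42)
n = 10_000

# Relative risk per 10 ug/m3 of long-term PM2.5 exposure (lognormal uncertainty).
rr_per_10 = rng.lognormal(mean=np.log(1.06), sigma=0.02, size=n)

baseline_mortality = 5.5e-3          # annual deaths per person (hypothetical)
background = 25.0                    # ambient PM2.5, ug/m3 (hypothetical)
in_mode = {"car": 35.0, "bus": 55.0, "bicycle": 45.0, "walk": 40.0}   # ug/m3
minutes_per_day = {"car": 60, "bus": 70, "bicycle": 40, "walk": 30}

for mode, conc in in_mode.items():
    # Time-weighted increment in daily mean exposure due to commuting.
    frac = minutes_per_day[mode] / (24 * 60)
    delta = (conc - background) * frac
    rr = rr_per_10 ** (delta / 10.0)          # log-linear C-R function
    af = (rr - 1.0) / rr                      # attributable fraction
    excess = baseline_mortality * af          # excess annual risk per commuter
    print(f"{mode:8s} median excess risk {np.median(excess):.2e} "
          f"(95% CI {np.percentile(excess, 2.5):.2e}-{np.percentile(excess, 97.5):.2e})")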

P.40 Agurenko, AO*; Khokhlova, AV; RIHMI-WDC; [email protected] Long-term variability of wind regime in the atmosphere over the Arctic Meteorological observations are very important in estimating climate risks, i.e., in determining the frequency of potentially hazardous events and their intensity, as well as in designing efficient strategies of disaster mitigation. Protection against hazardous natural processes can be effective only with knowledge of both their origin and evolution. For this reason, research on long-term trends in climate parameters at global and regional scales under a changing climate is of continuing interest. The aim of this work is to study the long-term wind speed variability in the atmosphere over the northern polar region. The climate of the Arctic is the product of interactions between a large range of physical, chemical, and radiative processes, involving ocean, sea ice, land surface, snow cover, clouds, and aerosols. Many of the interactions operate via atmospheric circulation. The wind speed and its variability significantly determine the circulation regime and the transport of substances. Long-term upper-air observations from the IGRA (Integrated Global Radiosonde Archive) dataset were used in the study. The IGRA consists of radiosonde and pilot balloon observations at over 1500 globally distributed stations. Observations are available for standard, surface, tropopause and significant levels. The work is based on the analysis of long-term time series of wind speed from observations at over 50 stations located at 60-80°N for the period 1972-2010. Series of mean monthly wind speeds and maximum wind speeds were constructed for this period at standard pressure surfaces from the surface to the 30 hPa level. The time series were analyzed and linear trend coefficients were determined. Areas of minimum and maximum changes in mean and maximum wind speeds were identified. The results obtained are of importance in analyzing meteorological risks to develop efficient strategies to mitigate disaster impacts.
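
The core computation is a least-squares linear trend per station and pressure level; a minimal Python sketch with a synthetic series standing in for an IGRA record:

import numpy as np

rng = np.random.default_rng(0)
years = np.arange(1972, 2011)
# Synthetic annual-mean wind speed at one station and pressure level (m/s).
speed = 12.0 + 0.02 * (years - years[0]) + rng.normal(0.0, 0.8, years.size)

slope, intercept = np.polyfit(years, speed, deg=1)   # least-squares linear trend
print(f"linear trend coefficient: {slope * 10:.2f} m/s per decade")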

M4-G.2 Akerlof, K.*; Rowan, K. E.; La Porte, T.; Ernst, H.; Nataf, D.; Batten, B.; Rajasekar, M.; Dolan, D.; George Mason University, KA, KER, TLP, DD; U.S. Naval Academy, HE; Center for the Study of Local Issues, Anne Arundel Community College, DN; Dewberry, BB, MR; [email protected] Risky Business: Engaging the Public in Policy Discourse on Sea-Level Rise and Inundation In the United States, public discourse about adaptation to anthropogenic climate change began more recently than debates on reductions in greenhouse gas emissions, and thus far has been more muted. It has been unclear whether public opinion has the potential to become as sharply polarized on adaptation responses as it has been on mitigation policies. To examine this question, we surveyed a representative sample of residents of a coastal county in Maryland, and tested the impact of a community deliberative event that presented sea-level rise information within small-group discussions as a potential strategy to reduce polarization. We found that the same preferences for societal “ways of life,” such as degree of individualism and hierarchy, that have contributed to politically polarized beliefs about climate change are also associated with people’s perceptions of local sea-level rise risk. These preferences are predictive of perceptions of sea-level rise risk to the county—the level at which local governmental policy responses will be decided—whereas living near coastal flooding and inundation hazards is not. Coastal proximity is a significant predictor of sea-level rise risk perceptions, but only for people’s own homes and neighborhoods. The community deliberative event—a daylong process of expert presentations, access to property-level risk data, and small-group discussions—significantly increased topic knowledge among all participants, and significantly increased problem identification, issue concern, and sea-level rise belief among those participants with a worldview predisposing them to lower risk perceptions. With respect to sea-level rise, this implies that policy discussions that emphasize local community identity as a component of public engagement and decision-making may be more effective in bypassing cultural polarization in problem recognition than either larger-scale issue debates or those which neglect the role of social context.

W4-B.1 Albertini, RJ*; Thirman, MJ; University of Vermont, University of Chicago; [email protected] Relevance of genetic changes in circulating blood cells following formaldehyde exposure Genetic changes in circulating blood cells have been proposed as signals of formaldehyde’s in vivo systemic genotoxic effects in humans. Some studies report increases of chromosome aberrations or micronuclei in peripheral blood lymphocytes (PBL) following formaldehyde exposure. However, PBLs are neither hematopoietic stem cells nor their surrogates. PBLs are circulating mature differentiated cells that infiltrate tissues at body surfaces (including nasal mucosa) and are present at sites of contact for formaldehyde inhalation exposures. Genetic changes in PBLs do not reflect changes in acute myeloid leukemia initiating cells (AML-IC). Studies addressing issues of target cell relevance for AML induction cite increases in aneuploidy of chromosomes 7 and 8 in vivo in CFU-GM myeloid precursor cells from formaldehyde-exposed workers. However, AML-IC cannot be measured using CFU-GM or other CFU assays, which do not measure cells with self-renewal capability. Aneuploidy measured in this study could indicate that formaldehyde exposure by inhalation does exert systemic effects – a contention that would be true only if the study truly measured in vivo events. However, the method measured aneuploid metaphases obtained by pooling all cells from all colonies arising from single cells after approximately 14 days of culture. For detected aneuploidy to have arisen in vivo, any colony arising from an exposed cell would have to contain all aneuploid progeny; i.e., the unit of measurement must be aneuploid colonies to reflect aneuploidy arising in vivo. By contrast, the study in question measured aneuploid metaphases, which most likely arose in vitro, not in vivo. Current evidence of genetic changes in circulating blood cells signals neither penetration of genotoxic effects beyond sites of contact following formaldehyde exposures nor effects in AML-ICs. Furthermore, when assessing effects of environmental exposures on peripheral blood counts, it is important to assess for genetic variants that might not be randomly distributed between exposed and non-exposed groups.

T1-K.3 Alfredo, KA*; Roberson, JA; Ghosh, A; Seidel, C; American Water Works Association; Jacobs Engineering; [email protected] Using a Relative Health Indicator (RHI) metric to estimate health risk reductions in drinking water Numerous methodologies are available for cumulative risk assessment (CRA) that are used to assess individual or population risks from exposure to multiple environmental sources, stressors, effects, and pathways. Depending on the objective of the analysis, all impacts can be placed on a common metric such as Quality-Adjusted Life Years (QALYs) or Disability-Adjusted Life Years (DALYs). The resulting metric of risk correspondingly can range from primarily qualitative to primarily quantitative. This project developed a summary metric of relative cumulative health impact resulting from drinking water, the Relative Health Indicator (RHI). An intermediate level of quantification and modeling was chosen, one which retains the concept of an aggregated metric of public health impact (the DALY) and hence allows comparisons to be made across “cups of water,” but avoids the need for development and use of complex bio-mathematical models that are beyond the existing state of the science. The hybrid CRA methodology developed for RHI considers the complexity of health effects caused by individual and multiple contaminants in drinking water, and the scarcity of appropriate health effects and exposure data, but applies simplifications that enable quantitative computations. The RHI metric has several potential applications, from comparing health impacts from contaminants at a single utility to helping prioritize contaminant regulatory determinations at a national policy level. Using the USEPA Six-Year Review data and other available national occurrence surveys, this research explores the applicability of the RHI metric on the national scale, comparing currently regulated contaminants with contaminants that may be regulated in the future.
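
Schematically, a DALY-aggregated indicator of this general kind sums dose x potency x severity over contaminants; the Python sketch below is only one plausible reading with invented contaminant values, not the RHI's actual formula.

# Hedged sketch of a DALY-aggregated drinking-water indicator. Contaminants,
# potencies, severities, and doses are invented for illustration only.
daly_per_case = {"arsenic": 12.0, "benzene": 10.0, "nitrate": 0.5}   # DALYs per case
cases_per_mg = {"arsenic": 2e-4, "benzene": 5e-5, "nitrate": 1e-6}   # cases per mg ingested
mean_dose_mg_per_day = {"arsenic": 0.01, "benzene": 0.002, "nitrate": 1.5}
population = 50_000

burden = sum(
    population * mean_dose_mg_per_day[c] * 365 * cases_per_mg[c] * daly_per_case[c]
    for c in daly_per_case
)
print(f"aggregate burden: {burden:.1f} DALYs/year across the served population")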

T2-A.4 Anadon, LD*; Bosetti, V; Chan, G; Nemet, GF; Verdolini, E; Harvard University; [email protected] Energy technology expert elicitations: their use in models and what we can learn from workshops and meta-analysis Developing energy policies that are robust to a broad set of possible future conditions typically requires characterization of the anticipated performance of individual energy technologies. In particular, decisions about public investments in research, development, and demonstration (RD&D) aimed at promoting technological change in a range of energy technologies ideally require not only an explicit and robust consideration of the uncertainty inherent to innovation, but also a way of considering tradeoffs and making allocations across different technology areas. Over the past few years several groups on both sides of the Atlantic have developed expert elicitations and used energy-economic models to shed light on these questions. This presentation will draw on work from two of these groups and will be divided into four sections. First, we will discuss the lessons learned from the design and implementation of seven energy-technology expert elicitations of forecasted technology cost and performance metrics, highlighting the need for (and difficulties associated with) matching elicitation design and modeling approach. Second, we will present insights drawn from an effort to use expert elicitations to optimize RD&D investment portfolios, including results on decreasing marginal returns to and optimal levels of overall RD&D investments, as well as on the importance of identifying policy scenarios and metrics for evaluation. Third, we will discuss the usefulness of devising online elicitation tools and of combining individual elicitations with group discussions to increase reliability and, in the long run, reduce costs. And fourth, we will discuss the application of meta-analysis techniques to improve our understanding of the impact of expert selection on elicitation results and of the shape of the expected returns to RD&D.

T5-C.2 ANDERSON, EL; Exponent; [email protected] Creating a Field That Matters Risk analysis was widely used in the field of engineering before it was introduced as policy by the U.S. Environmental Protection Agency in 1976 for carcinogen risk assessment. While health applications were obviously very different from those in engineering, there were and remain overlapping methods and approaches. The impact of the use of risk assessment in the health field was profound and widespread; the regulation of exposure to suspect carcinogens was and remains controversial and comes under close scrutiny by the publics whose health public health agencies are required to protect. In short order, it became obvious that the public perception of risk was most often different from the science-based assessed risk. How to create a field that matters from this disparate application of risk analysis formed the impetus for the founding of our journal, Risk Analysis: An International Journal. The Journal became the forum for sharing and merging interdisciplinary, scholarly research on topics in risk analysis to inform the whole of risk analysis. While in the beginning there was only one Editor-in-Chief, by 1999 it had become obvious that at least three Area Editors were needed, for Engineering, Health and Social Sciences, to adequately cover the submissions, proactively solicit interesting special collections of papers, and oversee peer review. As the leading international journal on topics in risk analysis, there is little doubt that Risk Analysis: An International Journal has played an important role in unifying and informing methods and applications in risk analysis. Creating this field, from the early days to the professionally grounded field of today, required imagination, determination, and the interdisciplinary cooperation of many scholars. The success of the Journal is a model for creating a field that matters.

M2-I.2 Andrijcic, E.*; Haimes, Y.Y.; Rose-Hulman Institute of Technology, University of Virginia; [email protected] Developing a multi-phase, iterative and collaborative decision coordination process for transportation infrastructure management The practice of persistent infrastructure underinvestment, coupled with a significant growth in commercial and non-commercial transportation demand, has left the U.S. transportation infrastructure unable to adequately support current and future needs. The lack of political will to allocate the needed funds to bridge infrastructure improvement stems, in part, from the disharmonious goals and objectives among the various stakeholders, and political and other decision makers, as well as the lack of appreciation of the critical interdependencies among the myriad sub-systems of the bridge infrastructure. To address this challenge, we present a multi-phase, iterative and collaborative decision coordination process that is based on the theory of intrinsic meta-modeling via shared state variables. The developed approach enables the harmonization of multiple models representing varied sub-systems and stakeholders’ perspectives. The approach provides decision makers with the ability to better visualize and collaboratively coordinate their shared and conflicting interests with the purpose of achieving public policy solutions for transportation infrastructure that are satisficing to all involved stakeholders, and sustainable over a long planning horizon. We present an illustrative example in which we utilize the meta-modeling coordination to explore the engineering, social, economic, and political implications of insufficient bridge maintenance. We focus on the evolving nature of objectives, interest groups, organizational, political and budgetary baselines, and requirements associated with varied stakeholders, and show that the meta-modeling coordination process enables all stakeholders and decision makers to plan for future emergent changes through collaborative and foresighted efforts. Additionally, we illustrate how the developed process could be utilized to more equally distribute risk ownership among all involved stakeholders.

W3-C.1 Antoniou, G; Gebrayel, A; Mhanna, P; Sarri, M; Stylianou, K; Kouis, P*; Cyprus International Institute, Cyprus University of Technology; [email protected] A Preliminary Characterization of Public Health Risks from Industrial Operations in Jubail This paper characterizes the health risks due to the emissions of the industrial complex at Jubail, Saudi Arabia, one of the largest in the Middle East. Students at the Cyprus International Institute conducted a probabilistic risk assessment using publicly available data on the nature and production capacity of representative facilities – i.e., a refinery, steel plant, fertilizer manufacturer, plastics facility, and methanol plant. A preliminary risk assessment was conducted, using the LCA software Simapro®, to determine typical emissions and the contributions of various pollutants. It indicated that PM2.5, sulfur dioxide and nitrogen oxides dominated mortality impacts. For these pollutants a comprehensive risk assessment, reflecting Jubail conditions, was developed using Analytica®. This coupled emissions from Simapro® with intake fractions for PM and its precursors adapted from Wolff (2000). It used the probabilistic PM exposure-response coefficients developed by Tuomisto (2008) from elicitations of six European epidemiologists. Mortality was valued using the US VSL estimates of Viscusi and Aldy (2003), adapted to the region using benefits transfer with an income elasticity between 0.4 and 0.6. Uncertainty was analyzed with 10 runs of 5,000 iterations each and Latin Hypercube sampling. The analysis suggests that about 200 deaths/yr are attributable to industrial emissions from Jubail. Only a small fraction of these occur among the residents of Jubail. The power plant and refinery are responsible for most of these. Secondary sulfate is the dominant source of health impacts. Potential benefits of pollution control are large (nearly 1 billion US$/year); however, most of these benefits accrue to populations outside Saudi Arabia. Since the analysis relies on publicly available data, the results are somewhat tentative but could easily be refined if the generic characterizations of industrial processes and emissions were replaced with site-specific information on processes and emissions.
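
The impact-pathway arithmetic sketched below in Python (emissions -> intake fraction -> exposure-response -> valuation with a benefits-transferred VSL) mirrors the chain described above; all inputs are hypothetical placeholders chosen merely to land near the reported order of magnitude, not data from the study.

import numpy as np

rng = np.random.default_rng(1)
n = 5_000                                    # iterations per run, as in the abstract

emissions_g = 3e11                           # g/yr of SO2-equivalent emitted (hypothetical)
i_f = rng.lognormal(np.log(1e-6), 0.5, n)    # intake fraction: g inhaled (as sulfate) per g emitted
br_m3_yr = 20.0 * 365                        # breathing volume per person-year (m3)
# Population-summed concentration increment (person-ug/m3):
person_ug = emissions_g * 1e6 * i_f / br_m3_yr

beta = rng.normal(1e-3, 3e-4, n).clip(0)     # fractional mortality increase per ug/m3 (hypothetical)
base_mort = 5e-3                             # baseline deaths per person-year (hypothetical)
deaths = person_ug * beta * base_mort        # attributable deaths per year

elasticity = rng.uniform(0.4, 0.6, n)        # income elasticity range per the abstract
vsl = 7e6 * 0.6 ** elasticity                # benefits-transferred VSL; US VSL and income ratio invented
print(f"median deaths/yr ~ {np.median(deaths):.0f}; "
      f"median damages ~ {np.median(deaths * vsl) / 1e9:.1f} B$/yr")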

T2-D.4 Anyamba, A*; Small, J; Oryang, D; Fanaselle, W; NASA Goddard Space Flight Center; [email protected] Monitoring and Mapping Conditions Associated with Enteric Pathogens using Rainfall and Satellite Vegetation Index Data Foodborne outbreaks associated with fresh produce are increasing in the United States and around the world. The rise in the proportion of foodborne outbreaks linked to fresh produce from the 1970s (<1%) to the 1990s (6%) illustrates this trend, which is largely due to a combination of increased consumption of fresh produce and improved outbreak surveillance techniques. However, such outbreak surveillance, while important, is behind the curve in detecting areas or regions of risk. In an attempt to shift the detection time of potential pathogen contamination (Salmonella, E. coli O157 and human norovirus), FDA’s Center for Food Safety and Applied Nutrition, in collaboration with NASA’s Goddard Space Flight Center, is using time series rainfall and satellite-derived vegetation index data to test and operationalize these climate and environmental variables as leading indicators and drivers of potential contamination. This is based on the assumption that periods of elevated rainfall result in heavy overland runoff leading to accumulation of fecal matter and other waste in zones of potential contamination of agricultural land. Additionally, excess irrigation (which can be detected by satellite vegetation indices) can create conditions for contamination. An index combining cumulative precipitation and the vegetation index is derived and shown to provide an early warning of potential environmental contamination over the major produce areas of the Salinas Valley, Central Valley and Imperial Valley of California. This method is currently being adapted into a real-time Geospatial Risk Assessment model to monitor and provide an advanced alert to FDA and industry about locations and times of likely contamination of leafy greens by enteric pathogens.
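
A toy Python version of such a combined early-warning index: standardize cumulative rainfall and the vegetation index, average the anomalies, and flag exceedances. Data, weights, and threshold are invented; the operational model will differ.

import numpy as np

def zscore(x):
    return (x - x.mean()) / x.std()

rng = np.random.default_rng(7)
rain_30d = rng.gamma(2.0, 15.0, 120)       # rolling 30-day rainfall totals (mm), simulated
ndvi = np.clip(rng.normal(0.45, 0.08, 120), 0.0, 1.0)   # vegetation index, simulated

index = 0.5 * zscore(rain_30d) + 0.5 * zscore(ndvi)     # equal weights (an assumption)
alerts = np.flatnonzero(index > 1.5)       # periods flagged for follow-up sampling
print(f"{alerts.size} alert periods out of {index.size}")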

P.46 ANYIKA, E*; WEKE, PO; ACHIA, TN; UNIVERSITY OF NAIROBI; [email protected] REAL SYSTEMATIC RISK FOR MODELING WEIGHTED PRICES AS AN ASSET FOR DECISION MAKING In this paper we investigate the notion that real non-diversifiable (systematic or market) risk does not exist, or is simply one minus diversifiable (non-systematic) risk. Real systematic risk is then developed from its basic principle: unlike non-systematic risk, it cannot be diversified away by increasing the number of assets in a portfolio. Systematic and non-systematic risk are then weighted against the expected returns of assets to determine the maximum returns of these assets at minimum risk. A Real Risk Weighted Pricing Model is thus developed, which is able to postulate expected returns and risks of assets in the present and the near future. This enables capital allocation, investment and financing decisions to be accurately determined.
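
The diversification principle invoked here has a standard closed form: for n equally weighted assets with common variance s2 and common pairwise covariance c, portfolio variance is s2/n + (1 - 1/n)*c, which tends to the non-diversifiable component c as n grows. A numerical sketch in Python with illustrative numbers:

# Portfolio variance of n equally weighted assets with equal variance s2 and
# equal pairwise covariance c; the large-n limit is the systematic risk floor.
s2, c = 0.04, 0.01
for n_assets in (1, 5, 20, 100, 1000):
    var_p = s2 / n_assets + (1 - 1 / n_assets) * c
    print(f"n={n_assets:5d}  portfolio variance={var_p:.4f}")
# As n grows the variance approaches c = 0.0100: the part that cannot be
# removed by adding assets, i.e. the systematic risk.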

P.80 Aoyagi, M*; Kanamori, Y; Yoshida, A; National Institute for Environmental Studies; [email protected] Irrational fears of radioactivity: Qualitative and quantitative evaluation Using qualitative (focus group interviews) and quantitative (opinion survey) surveys carried out from October 2012 to February 2013, we discuss public understanding of, attitudes toward, and images of the radioactive contamination caused by the Great East Japan Earthquake. As we have already reported for the qualitative part elsewhere (at other SRA chapter meetings), logical explanation could not remove “fears” among the focus group participants. We further explore people’s understanding of radioactive contamination through a quantitative questionnaire survey of Japanese respondents drawn from nationally representative probabilistic samples of males and females between 20 and 80 years old in February 2013. Surprisingly, the most frequently chosen information source for social events in general was TV programs (92%), followed by printed newspapers (75%). Internet resources such as social networking services (6.4%) and newspapers’ websites (10.8%) were chosen less often than radio (22.8%), friends and family members (19.5%), and magazines (14%), although about 70% of our respondents said they used the internet. TV programming is indeed the strongest channel for diffusing information on risks such as global warming or radioactive contamination: 55% of our respondents chose “journalists or commentators appearing on TV programs” as the most trusted information source. This is twice as high as “academics from universities or other research institutes” (26.8%), “national government” (22.2%), or “international governmental bodies” (14.5%). These findings appear to be one reason people hold ambiguous fears of radioactive contamination. We posed three quiz questions on the science of radioactivity and found that more than half of respondents answered at least one incorrectly. The majority of people do not have sufficient knowledge of radioactivity and could not evaluate news or information from national or local governments, because they relied on journalists or commentators appearing on TV programs, whose comments may not always be based on scientific evidence.

W4-E.5 Armstrong, TW; TWA8HR Occupational Hygiene Consulting, LLC; [email protected] Estimates of Legionnaires Disease Risk From Whirlpool Spas Exposure to aerosols of biological agents represents both occupational and public health risks, and presents significant challenges for evaluation, control and health protection. Bioaerosol risk assessments involve many uncertainties in understanding the quantitative relationship between exposure and resulting disease. During the roughly 40 years since the first identified outbreak of Legionnaires disease (LD), many occurrences worldwide have been tied to aerosols from sources such as cooling towers and whirlpool spas (WS). The extensive information available about LD, particularly for exposure from WS, may be used to quantitatively estimate the ranges of risk from the causative organism. This relative wealth of information presents an opportunity for a case study in risk assessment approaches for a biological agent with occupational and public health impact. The study goal was quantitative estimation of the risks for LD from detection-limit and higher counts of Legionella in WS. For a range of Legionella pneumophila (Lp) counts in bulk water, estimated exposures from WS were developed with exposure modeling. LD risk was then estimated using previously published Quantitative Microbial Risk Assessment (QMRA) models. At 500 CFU/L in WS water the median estimate was 3 LD cases amongst 100,000 persons so exposed. The expected risks projected by the QMRA models are linear over the following range: at 5,000 and 50,000 CFU/L, the median risk estimates rose to 3 in 10,000 and 3 in 1,000 respectively. Thus 500 CFU/L of Lp in WS may represent non-zero risk for LD. Further study is needed to reduce uncertainties, which include the intensity, duration, and frequency of exposure, and the distribution of Lp contamination in spas. Additional uncertainties include differences in Legionella virulence and human susceptibility in the exposed population. The results indicate rigorous prevention and control of Lp contamination in WS is crucial for both occupational and public health protection.
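
The reported medians behave like a low-dose-linear exponential QMRA model, risk = 1 - exp(-r*dose). The Python sketch below reverse-engineers illustrative parameters to reproduce the abstract's figures; r and the effective inhaled volume are placeholders, not the study's fitted values.

import math

r = 1e-4                  # exponential dose-response parameter (back-calculated placeholder)
ml_inhaled_equiv = 0.6    # water effectively inhaled as aerosol per spa session (placeholder)

for cfu_per_L in (500, 5_000, 50_000):
    dose = cfu_per_L * ml_inhaled_equiv / 1000.0   # CFU retained per exposure
    risk = 1.0 - math.exp(-r * dose)               # exponential QMRA dose-response
    print(f"{cfu_per_L:6d} CFU/L -> per-exposure risk {risk:.1e}")
# Output reproduces the abstract's medians: ~3e-5, 3e-4, and 3e-3, and shows the
# near-linear scaling of risk with counts at low doses.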

T2-I.3 Assadian, MS*; Sadeghi, F; Isfahan Regional Center for Business Incubators & Science Parks Development; [email protected] A Fuzzy-VIKOR model for risk assessment of environmental management system implementation in construction projects Increasing human demands and the limitation of natural resources have brought environmental issues into greater consideration in recent decades. The construction industry is known as one of the least sustainable industries in the world, accounting for about 50% of global energy and water consumption; around half of climate-change gases are also produced by buildings. These problems indicate a growing tendency toward Environmental Management System (EMS) implementation in construction projects. EMS offers benefits such as increased compliance and reduced waste and energy use. For instance, it has a positive effect on the environmental performance of construction companies, and it drives improvements in construction-site management of water and energy consumption. Beside these positive aspects, as with any strategic policy change, it carries a certain degree of risk which could lead to various problems. Hence, an appropriate risk assessment method is needed. In most cases, as EMS implementation goes together with a change in strategy, the policy maker faces multiple alternatives to select among. Accordingly, this can be described as a multi-criteria decision-making (MCDM) problem. From the different available methods for MCDM problems, we chose VIKOR because there are generally conflicts between environmental criteria and other aspects of projects, including cost, time, and quality. We also applied fuzzy set theory to adjust the model to the uncertainty that accompanies environmental issues. The model is presented for policy and decision makers to increase the probability of successful EMS implementation in the construction industry.
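
For reference, crisp VIKOR computes for each alternative a group utility S, an individual regret R, and a compromise index Q (the fuzzy variant replaces the crisp scores with fuzzy numbers and defuzzifies). A Python sketch with invented alternatives, criteria, and weights:

import numpy as np

scores = np.array([[0.7, 0.5, 0.9],    # alternative A: cost, time, environment
                   [0.6, 0.8, 0.6],    # alternative B
                   [0.9, 0.4, 0.7]])   # alternative C (higher = better here)
w = np.array([0.4, 0.3, 0.3])          # criterion weights (invented)

best, worst = scores.max(axis=0), scores.min(axis=0)
norm = (best - scores) / (best - worst)          # normalized regret per criterion
S = (w * norm).sum(axis=1)                       # group utility
R = (w * norm).max(axis=1)                       # individual regret
v = 0.5                                          # weight on group utility
Q = (v * (S - S.min()) / (S.max() - S.min())
     + (1 - v) * (R - R.min()) / (R.max() - R.min()))
print("VIKOR Q (lower is better):", Q.round(3))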

W3-E.2 Augusto, S; Pinho, P; Botelho, MJ; Palma-Oliveira, JM*; Branquinho, C; University of Lisbon; [email protected] Evaluating the risk of human exposure to environmental PCDD/Fs using biomonitors Lichens, symbioses between fungi and algae, have been used to estimate human exposure to persistent organic pollutants (POPs) at a spatial scale. The fact that lichens are long-lived organisms, with a remarkable ability to bioaccumulate atmospheric pollutants over time, together with their wide geographical distribution, contributes to their success in environmental health studies. Time integration of pollutants by lichens allows correlating low levels of pollutants with chronic effects on human health, whilst their wide distribution allows collecting them from a number of sites using few resources and obtaining information on environmental levels of pollutants with a high spatial resolution. Though some studies have been performed, there is still a lack of knowledge regarding the use of lichens to estimate human exposure to dioxins and furans (PCDD/Fs). PCDD/Fs are toxic POPs whose production is non-intentional; they are produced as by-products of many industrial/combustion processes. In this study we present an industrial case study in which lichens have been integrated into an environmental monitoring plan aiming to assess the environmental impact and human health risk of PCDD/Fs in the vicinity of a cement manufacturing plant with co-incineration of hazardous waste, compared with the surrounding environment (>10 km). For that, PCDD/Fs were analysed in lichens and soils collected over 5 years in the vicinity and in the surrounding area of the cement plant. Data from both lichens and soils were used to select areas with the highest and lowest risk of human exposure to environmental PCDD/Fs. In this case a further element was integrated: since a formal risk assessment was also performed, the two methodologies can be analyzed and compared.

M4-B.1 Aungst, JL; U.S. Food and Drug Administration; [email protected] Regulation and science of BPA Bisphenol A (BPA), a component of polycarbonate resin, which is used in some hard plastic bottles and can coatings, has been the subject of regulatory and political activity, intense international scrutiny, and hundreds of research studies in the past two decades. In 2008, FDA released a draft safety assessment on BPA and since then has continued to release updates from the Agency’s ongoing review of the safety of BPA. Separate from the safety review, FDA has responded to multiple inquiries from the media, organizations, and individuals, as well as citizen and food additive petitions, some of which have resulted in regulatory actions. FDA’s research initiative addressing the safety of low doses of BPA, including assessment of the novel endpoints where concerns have been raised, includes multiple studies currently in progress at the FDA National Center for Toxicological Research and additional collaborations through the National Toxicology Program and the National Institute of Environmental Health Sciences. The purpose of this presentation is to provide an update on current FDA activities, describing previous regulatory actions, and an overview of recent FDA research and collaborations.

P.55 Avanasi Narasimhan, R*; Shin, HM; Vieira, VM; Bartell, SM; UCI, UCI, UCI, UCD; [email protected] Potential impacts of uncertainty in the C8 Science Panel exposure assessment for perfluorooctanoate The C8 Health Project is a cross-sectional epidemiologic study of 69,030 people who were environmentally exposed to perfluorooctanoate (PFOA) near a major U.S. fluoropolymer production facility located in West Virginia. A previously published retrospective exposure assessment model (including PFOA release assessment, integrated fate and transport modeling, and dose reconstruction) predicts the blood serum PFOA concentration for 43,360 non-occupationally exposed residents from 1951-2008; these predictions were validated against 2005-2006 serum PFOA measurements, which are available for every participant (Shin et al., 2011). The fate and transport model that predicts the PFOA water concentration in the six public water districts (PWD) utilizes a number of uncertain physiochemical and hydrogeological parameters. The aim of the present study is to evaluate the extent to which the uncertainty and spatial heterogeneity in the water concentration predictions could influence the serum predictions and relative ranking of exposures for individuals in a variety of epidemiologic studies relying on the retrospective exposure estimates. Using Monte Carlo simulation, we change the individual PWD PFOA water concentration for every year by randomly sampling from lognormal distributions centered on the original predicted concentrations. We evaluate the impacts of uncertainty by comparing the Spearman rank correlation coefficient between the predicted and the measured serum concentrations for each iteration, and by evaluating the similarity of the iterated serum predictions within each year. Preliminary results suggest that random variability/uncertainty in historical water concentrations has little impact on the validity of the serum predictions as measured by comparison with 2005-2006 serum measurements. We are now evaluating the potential impacts of systematic errors in water concentration predictions, using autoregressive correlation structures and shifted distributions.
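
A schematic Python version of this uncertainty check: multiply baseline predictions by lognormal noise of increasing spread and watch how the Spearman rank correlation with measurements degrades. All data are simulated stand-ins, not C8 values.

import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
measured = rng.lognormal(3.0, 1.0, 1000)                # stand-in for measured serum PFOA
predicted = measured * rng.lognormal(0.0, 0.4, 1000)    # stand-in for baseline model predictions

for sigma in (0.1, 0.5, 1.0):
    # Lognormal perturbation centered on the baseline predictions.
    perturbed = predicted * rng.lognormal(0.0, sigma, 1000)
    rho, _ = spearmanr(measured, perturbed)
    print(f"sigma={sigma}: Spearman rho = {rho:.2f}")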

M3-C.1 Aven, T*; Zio, E; University of Stavanger, Norway; [email protected] Foundational issues in risk assessment and management The risk assessment and risk management fields still suffer from a lack of clarity on many key issues. Lack of consensus on even basic terminology and principles, and lack of proper support and justification for many of the definitions and perspectives adopted, lead to an unacceptable situation for operatively managing risk with confidence and success. In this talk we discuss the needs, obstacles and challenges for the establishment of a strong foundation for risk assessment and risk management. We i) review and discuss the present situation and ii) reflect on how best to proceed in the future, to develop the risk discipline in the directions needed.

T4-J.3 Axelrad, DA; Chiu, W; Dockins, C*; US EPA; [email protected] Recent Efforts for Aligning Risk Assessment and Economic Analysis at EPA Risk assessment has long been a tool for informing risk management decisions and setting regulatory standards. In recent years, however, there has been increased attention to how current risk assessments align with benefit-cost and other economic analyses. For example, the Science and Decisions report from the National Academy of Sciences emphasizes the utility of risk assessment for decision making, including its role in providing quantitative estimates of risk that can inform economic analysis of regulatory alternatives. This is a departure from the context in which some risk assessment methods and tools have been developed. While cancer risk assessments at the U.S. Environmental Protection Agency (EPA) provide quantitative estimates of risk, assessments of other human health effects often do not, particularly when there is not a large body of epidemiological evidence. This results in benefits estimates that are incomplete and biased downward, which is an important consideration for regulatory actions where quantified benefits analysis is required by statute. Even when not specifically required, quantitative benefits analysis of risk reductions provides valuable information for decision makers and the public. Over the past year economists, risk assessors, and other analysts at EPA have engaged in a series of activities and projects to enhance communication across these disciplines in order to better align risk assessment outputs with the needs of benefits analysis. This presentation provides a review of these efforts and the key findings that have emerged from them. It provides lessons with respect to how risk assessment and economic analysis can be better aligned to support improved information for decision making.

W4-J.4 Aylward, LL; Summit Toxicology, LLP; [email protected] Do changes in exposure lead to changes in outcomes? Challenges in ascertaining benefits from reductions in environmental exposure levels Changes in environmental chemical exposure levels (both increases and decreases) can be measured for many chemicals. Such changes can be tracked as temporal trends in environmental media concentrations, or, more directly, through temporal trends in biomonitored concentrations of chemicals in human blood or urine. This talk discusses challenges in ascertaining whether such changes have led to measurable health benefits. Case studies are used to illustrate challenges in tracking changes in outcome prevalence or incidence over time, biases introduced in the risk assessment process, and other factors that may result in difficulty ascertaining whether predicted health benefits have actually resulted from changes in exposure levels. Case study chemicals include dioxins, DDT, brominated flame retardants, and methyl mercury.

T2-C.1 Azarkhil, M*; Mosleh, A; Reliability Engineering Program, University of Maryland at College Park; [email protected] Modeling Dynamic Behavior of Complex Systems Operating Crew during Accidents In times of accident, the operating teams of high-risk environments are responsible for the ultimate decision making, management and control of extremely upset situations, and are subject to a significant amount of team interaction. Individual human errors, if aggregated in team interaction loops, can result in major hazards and complex failure modes. We developed a systematic method to explicitly model the operating crew as a social interactive unit and investigate their dynamic behavior under critical situations via simulation. The ultimate goal is to study the effects of team factors and team dynamics on the risk of a complex and safety-critical system, the main focus being on team errors, associated causes and error management inside the team, and their impact on team performance. The framework is used to model and evaluate team activities such as collaborative information collection, establishing shared mental models, team decision making and combined action execution. The crew model consists of models of individual operators, team communication, team error management and detailed causal maps to capture the effects of associated performance shaping factors on operators’ functions. An object-based modeling methodology is applied to represent system elements and different roles and behaviors within the operating team. The IDAC cognitive model is used as the basic infrastructure for the individual operator model, and scenario generation follows typical dynamic probabilistic risk assessment methodologies. We developed a customized library in MATLAB Simulink which facilitates the modeling process for similar applications of the methodology. The method’s capabilities are demonstrated by building and simulating a simplified model of a steam/power generating plant. Different configurations of team characteristics and influencing factors have been simulated and compared. The results are also compared with several theoretical models and empirical studies.

SRA 2013 Annual Meeting Abstracts P.114 Banducci, AM*; Tvermoes, B; Bebenek, I; Monnot, A; Devlin, K; Madl, A; Cardno Chemrisk; [email protected] An exposure and health risk assessment of metals in apple juice Concerns have recently been raised about heavy metal contamination in apple juices and its potential impact on children’s health. Heavy metals such as aluminum (Al), arsenic (As), chromium (Cr), copper (Cu), lead (Pb), manganese (Mn), mercury (Hg), and zinc (Zn) are currently and have been historically used in a number of herbicides, fungicides, insecticides, and other pesticides in the United States and worldwide. As a result, these metals have the potential to contaminate fruit used to make juices. This study investigated the possible human health risks associated with heavy metal contamination in apple juice. The concentration of several metals including Al, As, cadmium (Cd), Cr, Cu, Pb, Mn, Hg and Zn were measured in six commercially available brands of apple juice and three organic brands. The concentrations of total As, Cd, Cr, Cu, Hg, and Zn in all nine apple juice brands sampled were below each metal’s respective FDA maximum contaminant level for bottled water. However, in some juices the levels of Al, Pb, and Mn exceeded FDA maximum contaminant levels for bottled water. Thus, to understand the implications of these findings in regards to children’s health, hazard quotients (HQs) for Al and Mn were calculated to estimate non-carcinogenic risk of heavy metal exposure from apple juice. In addition, blood Pb concentrations were estimated to characterize potential risk from Pb exposure following apple juice consumption. Due to recent concerns, non-carcinogenic risk was also estimated for As. Our results suggest that the exposure concentrations of Al, Mn, Pb, and As that result from apple juice consumption do not pose an increased non-carcinogenic health risk for children.

W2-K.4 Baranowski, C; US Environmental Protection Agency; [email protected] Building a More Resilient Water Sector by Assessing and Responding to Potential Vulnerabilities The U.S. Environmental Protection Agency’s (EPA) Water Security Division (WSD) has developed resources to improve the security and resilience of the nation’s drinking water and wastewater infrastructure. These resources support utilities in assessing vulnerabilities to terrorist attacks, natural disasters and extreme weather events. Several tools are freely available to water utilities building resilience to potential threats and seeking to continue to meet their public health and environmental missions. The Vulnerability Self-Assessment Tool (VSAT) was developed to help utilities meet the requirements of the 2002 Bioterrorism Response Act, under which community drinking water systems serving a population greater than 3,300 were required to perform a vulnerability assessment and update their Emergency Response Plans. VSAT guides utilities through a framework to determine the risk from both security threats and natural hazards and to assess the consequences of these threats in terms of public health and economic impacts to the utility and community. Utilities are also faced with preparing for hazards associated with long-term changes in climate, including droughts and sea-level rise. The Climate Resilience Evaluation and Awareness Tool (CREAT) provides relevant data in the form of climate scenarios describing a range of changes in temperature and precipitation. The risk assessment process encourages utilities to think systematically about the impact of changing conditions and to identify cost-effective strategies for adaptation. Finally, the Water Health and Economic Analysis Tool (WHEAT) provides a means to estimate the consequences for the utility and surrounding community when critical assets are lost, a system becomes contaminated, or hazardous gas is released. Each of these scenarios can be analyzed, and data drawn from the resulting reports directly inform assessments.

M3-I.4 Baroud, H*; Barker, K; University of Oklahoma; [email protected] Modeling Resilience Stochastic Metrics with Bayesian Kernel Methods: Application to Inland Waterway Networks Critical infrastructure systems such as transportation systems have been vulnerable in the past decade to numerous disruptive events. Analyzing the risk involved with such systems requires accounting for preparedness and response decision making under uncertainty. This paper applies a Bayesian kernel approach to model the resilience of infrastructure systems, an approach aimed at accurately quantifying and estimating the resilience of a transportation system under the uncertainty of disruptive events, given data describing the characteristics of the infrastructure system and disruption scenario. The resilience of the overall system relies on the performance of the individual links impacted by the event. Therefore, considering importance measures for individual links helps in identifying the links that most impact the resilience of the entire system, thus providing priorities for preparedness and recovery focus. The “resilience worth” of a link is an index that quantifies how the time to total network service resilience is improved for a certain disruptive event scenario if this link is invulnerable. The model is deployed in an application to an inland waterway transportation network, the Mississippi River Navigation system, for which the recovery of disrupted links, represented by sections of the river, is analyzed by estimating resilience with the Bayesian kernel model. This type of application suffers from sparse data due to the scarce occurrence of disruptions; we therefore supplement our modeling approach with sensitivity analysis with respect to the parameters of the prior distribution. Such a model introduces a higher level of accuracy to the estimation of the resilience worth of individual links, as historical information pertaining to the characteristics of each link helps in shaping the posterior distribution. The model can assist risk managers in enhancing preparedness and recovery activities for different segments of an infrastructure system by analyzing the recovery trajectory from other disrupted segments.
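
The abstract does not specify its kernel model, so as a generic stand-in the Python sketch below uses Gaussian-process regression, a standard Bayesian kernel method, to predict a link's resilience from disruption features; the features, kernel, and hyperparameters are assumptions, and the data are synthetic.

import numpy as np

def rbf(a, b, ell=1.0):
    # Squared-exponential kernel between two feature matrices.
    d = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d / ell**2)

rng = np.random.default_rng(5)
X = rng.uniform(0, 1, (15, 2))     # features per disruption: e.g. severity, repair capacity
y = 0.8 - 0.5 * X[:, 0] + 0.2 * X[:, 1] + rng.normal(0, 0.05, 15)  # observed resilience (synthetic)

X_new = np.array([[0.7, 0.5]])     # a new disruption scenario
K = rbf(X, X) + 0.05**2 * np.eye(15)          # kernel matrix plus observation noise
K_new = rbf(X_new, X)
mu = K_new @ np.linalg.solve(K, y)            # posterior mean resilience
var = rbf(X_new, X_new) - K_new @ np.linalg.solve(K, K_new.T)
print(f"predicted resilience {mu[0]:.2f} +/- {np.sqrt(var[0, 0]):.2f}")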

W3-F.2 Barrett, AM; Global Catastrophic Risk Institute; [email protected] Analyzing and reducing the risks of inadvertent nuclear war between the United States and Russia Despite the fall of the Soviet Union, a number of analysts argue that inadvertent nuclear war between the US and Russia still presents a significant risk. A wide range of events have been mistakenly interpreted as possible indicators of nuclear attack (including weather phenomena, wild animal activity, and control-room training tapes loaded at the wrong time), and such a conclusion could lead the US or Russia to respond in kind. Although many such failure modes have been identified and addressed in some way, additional research could be valuable in identifying both long-standing and new hazards, quantifying their relative risks, and informing policies. Potential risk-reduction strategies could then be considered in various ways by U.S. and Russian authorities, such as in the Nuclear Posture Reviews and revisions of strategy details periodically performed by the U.S. Department of Defense. We develop a mathematical modeling framework using fault trees and Poisson stochastic processes for analyzing the risks of inadvertent nuclear war from U.S. or Russian misinterpretation of false alarms in early warning systems, and for assessing the potential value of inadvertence risk reduction options. The model also uses publicly available information on early-warning systems, near-miss incidents, and other factors to estimate probabilities of a U.S.-Russia crisis, the rates of false alarms, and the probabilities that leaders will launch missiles in response to a false alarm. We discuss results, uncertainties, limitations, and policy implications.
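
A toy Python version of such a framework: false alarms and crises arrive as Poisson processes, and an alarm is misinterpreted badly enough to trigger launch with a probability that is higher during a crisis. All rates and probabilities are invented for illustration.

import math

lam_alarm = 0.5          # serious false alarms per year (hypothetical)
lam_crisis = 0.1         # US-Russia crises per year (hypothetical)
crisis_len = 0.1         # mean crisis duration, years (hypothetical)
p_launch_calm = 1e-5     # P(launch | alarm, no crisis) (hypothetical)
p_launch_crisis = 1e-3   # P(launch | alarm, crisis) (hypothetical)

frac_crisis = min(1.0, lam_crisis * crisis_len)      # long-run fraction of time in crisis
rate = lam_alarm * (frac_crisis * p_launch_crisis +
                    (1 - frac_crisis) * p_launch_calm)
print(f"inadvertent-war rate ~ {rate:.2e}/yr; "
      f"P(at least one event in 50 yr) ~ {1 - math.exp(-50 * rate):.2e}")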

T4-E.5 Bartholomew, MJ; Hooberman, B; Stewart, KN*; Okelo, PO; Graber, G; FDA Center for Veterinary Medicine; FDA Office of Foods and Veterinary Medicine; FDA Center for Veterinary Medicine; FDA Center for Veterinary Medicine; AFSS Consulting; [email protected] Ranking Contaminants in Swine and Poultry Feed In addition to being responsible for the safety and effectiveness of new animal drugs, the FDA Center for Veterinary Medicine (CVM) has responsibility for the safety of the U.S. animal feed supply. The Animal Feed Safety System (AFSS) was developed to integrate a variety of activities related to the Center’s regulatory responsibility concerning animal feed. One facet of the AFSS is a risk-ranking model, described in the talk Risk-Ranking Model for Hazards in Animal Feed, which is being presented at this meeting. The model evaluates the likelihood of exposure and the health consequences from exposure to contaminants in animal feed. In this presentation we apply the model, using examples for two food-producing animal species, swine and poultry. Risks associated with hazards in the animal feed for the two animal species are estimated, ranked, and compared. The AFSS model may be used by CVM to guide the allocation of resources and other management decisions concerning the regulation of animal feeds. Possible implications for risk management decisions based on the findings for the examples will be presented.

M3-D.1 Bartrand, TA*; Marks, HM; Coleman, ME; Donahue, D; Hines, SA; Comer, JE; Taft, SC; Tetra Tech; [email protected] The Influence of Dosing Schedule on Rabbit Responses to Aerosols of Bacillus anthracis Traditional microbial dose response analysis and survival analysis were used to model time of death of New Zealand white rabbits exposed to low aerosol doses of Bacillus anthracis spores. Two sets of experimental data were analyzed. The first set included the times to death of hosts exposed to single doses of B. anthracis spores. The second set provided the times to death for rabbits exposed to multiple daily doses (excluding weekends) of B. anthracis spores. A model predicting times to death, based on an exponential microbial dose-response assessment superimposed with an empirically derived incubation function using survival analysis methods, was fitted to the two data sets. Several additional models for time to death for aerosols of B. anthracis were also assessed for comparison, including models varying the hazard function over time, survival models with different underlying dose-response functions, and a published mechanistic model. None of these models provided a statistically significant improvement in fit over the exponential-based model, in which there was no time-dependent effect on the hazard function. The model therefore suggests that, for the dosing schedule used in this study, the long-term response of the hosts depends only on the net accumulated dose an animal received before dying. This finding may be due to the small size of the data sets and the number of animals that died. Further research with alternative dosing schedules, collection of immune system data (particularly innate immune response), and alternative pathogen-host pairings is needed to clarify the relationship of time to death and dosing schedule.
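
As a worked illustration of the exponential dose-response form referenced above, P(death) = 1 - exp(-k D), where D is the net accumulated dose; the parameter k and the dosing schedules below are invented:

```python
# Minimal sketch, assuming an invented dose-response parameter k and
# invented dosing schedules: under an exponential model with no
# time-dependent hazard, response depends only on total accumulated dose.
import math

def p_response_exponential(cumulative_dose, k=1e-5):
    """Exponential dose-response model: P = 1 - exp(-k * D)."""
    return 1.0 - math.exp(-k * cumulative_dose)

single_dose = [50_000]     # one aerosol exposure (spores)
multi_dose = [10_000] * 5  # five daily exposures, same total dose

# Under the no-time-dependence finding, both schedules predict the same risk.
print(p_response_exponential(sum(single_dose)))
print(p_response_exponential(sum(multi_dose)))
```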

M2-C.1 Bassarak, C*; Pfister, HR; Böhm, G; Leuphana University Lueneburg; University of Bergen; [email protected] Moral aspects in the perception of societal risks Research has long neglected aspects of morality in risk perception. Recently, however, there has been increasing consensus among practitioners and researchers that epistemic risk judgments and moral judgments are closely related. This is particularly the case when it comes to complex societal risks such as terrorism, nuclear power or global warming. These ideas have been supported by a study employing the psychometric paradigm (Slovic, 1987) in which we found that dread, a common dimension of risk perception that has been found to be related to perceived overall risk, is highly blended with morality. However, these data were measured explicitly, and participants were asked for moral and risk judgments on the same occasion. In a second step, we are now interested in the question of whether it makes a difference if one is asked to give either a moral or an epistemic risk judgment about a societal risk. In a laboratory study, participants (N = 51) were explicitly and implicitly asked to give either epistemic risk judgments or moral judgments regarding six societal risk items which were selected on the basis of preceding studies. Implicit judgments were measured using the single-target implicit association test, an assignment test that reports data on errors and latencies. These data are usually transformed into so-called D-scores, which can be interpreted as effect-size measures of association strength. An analysis of variance suggests that D-scores are significantly higher in the morality condition than in the risk condition. From this we conclude that societal risks can be mapped better onto a moral-immoral dimension than onto a risky-safe dimension. Finally, analyses will be presented predicting overall explicit risk judgment from implicit risk and morality judgments. Thus, we seek to gain insight into what affects laypeople’s risk judgments, and to stimulate discussion of how this knowledge may assist political decision making or risk communication.

P.125 Bates, ME*; Linkov, I; Clark, TL; Curran, RW; Bell, HM; US Army Corps of Engineers - Engineer Research and Development Center, Pacific Disaster Center; [email protected] Application of multi-criteria decision analysis to humanitarian assistance and disaster response site suitability analysis Humanitarian Assistance and Disaster Response (HADR) managers often face the complex task of prioritizing limited funds for investment across broad regions of varying need. In selecting regions and sites for project investment, project funders must assess and trade off site investment suitability along multiple dimensions. For example, governmental HADR resources might be invested to fit a combination of needs, including investing-agency mission, local community hazard exposure, local community resilience, and projected investment sustainability, each of which can be decomposed into many relevant sub-criteria. This poster presents a framework for HADR site suitability analysis based on the integration of spatial and non-spatial data from Geographic Information Systems (GIS) and other HADR sources via Multi-Criteria Decision Analysis, an analytical approach for integrating data across traditionally incommensurate criteria via value functions and tradeoff weights. This framework is applied to a case study using HADR data to analyze investment suitability at the department level in El Salvador.
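
A minimal sketch of the weighted value-function aggregation at the core of such an MCDA; the criteria, value functions, weights, and site scores below are illustrative assumptions, not HADR data:

```python
# Minimal sketch, assuming invented criteria and scores: normalize each raw
# criterion with a linear value function, then combine with tradeoff weights.
def normalize(x, worst, best):
    """Linear value function mapping a raw criterion score to [0, 1]."""
    return max(0.0, min(1.0, (x - worst) / (best - worst)))

# Hypothetical candidate sites scored on three criteria.
sites = {
    "Site A": {"hazard_exposure": 7.0, "resilience": 3.0, "sustainability": 0.6},
    "Site B": {"hazard_exposure": 4.0, "resilience": 6.0, "sustainability": 0.8},
}
# (worst, best, weight) per criterion; weights sum to 1.
criteria = {
    "hazard_exposure": (0.0, 10.0, 0.5),  # higher exposure -> higher need
    "resilience":      (10.0, 0.0, 0.3),  # lower resilience -> higher need
    "sustainability":  (0.0, 1.0, 0.2),
}

for name, scores in sites.items():
    total = sum(w * normalize(scores[c], worst, best)
                for c, (worst, best, w) in criteria.items())
    print(f"{name}: suitability = {total:.2f}")
```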

P.124 Bates, ME*; Shoaf, H; Keisler, JM; Dokukin, D; Linkov, I; US Army Corps of Engineers, Engineer Research and Development Center; [email protected] Using portfolio optimization to select an optimal set of water security countermeasures Counterterrorism decisions for infrastructure security can be challenging due to resource constraints and the large number and scope of potential targets and threats to consider. This poster presents a multi-criteria portfolio decision model (PDM) that optimizes countermeasure selection to maximize effectiveness under various counterterrorism budget levels. Multi-criteria decision analysis (MCDA) is used to assess the holistic benefits of protective countermeasures when applied to key infrastructure in specific threat environments. Resulting scores, cost estimates, and synergistic/redundant interactions between projects are used to construct an efficient funding frontier that tracks how budget changes affect optimal portfolio composition. Results are presented for a case study, based on literature data and author judgment, optimizing protection against terrorist threats to a water supply network.
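
A minimal sketch of the portfolio-selection idea: enumerate countermeasure subsets within budget and maximize a benefit score with pairwise synergy/redundancy adjustments. The projects, costs, scores, and interactions below are hypothetical:

```python
# Minimal sketch, assuming invented projects and interactions: brute-force
# portfolio optimization under a budget constraint; sweeping the budget
# traces out an efficient funding frontier.
from itertools import combinations

projects = {"fence": (40, 5.0), "sensors": (60, 7.0), "backup_pumps": (80, 6.5)}
# Pairwise interaction terms: positive = synergy, negative = redundancy.
interactions = {("fence", "sensors"): 1.5, ("sensors", "backup_pumps"): -2.0}

def portfolio_value(names):
    base = sum(projects[n][1] for n in names)
    inter = sum(v for (a, b), v in interactions.items()
                if a in names and b in names)
    return base + inter

def best_portfolio(budget):
    best = (0.0, ())
    for r in range(len(projects) + 1):
        for combo in combinations(projects, r):
            cost = sum(projects[n][0] for n in combo)
            if cost <= budget:
                best = max(best, (portfolio_value(combo), combo))
    return best

for budget in (50, 100, 180):  # sweep budgets to trace the frontier
    value, combo = best_portfolio(budget)
    print(budget, combo, round(value, 1))
```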

W2-B.1 Bateson, T; US EPA; [email protected] Sources of uncertainty in epidemiological studies and their impact on human health risk assessments ILSI’s Health and Environmental Sciences Institute identified the evaluation of causality in epidemiological studies as an emerging science priority in 2010. A multidisciplinary subcommittee of experts convened in RTP, NC in October 2012 to stimulate a dialogue on the methods and issues related to evaluating causality and interpreting evidence from published epidemiological studies within the context of human health risk assessment. One of three workshop breakout groups was charged with characterizing the key sources of uncertainty in environmental and occupational epidemiological studies that inform risk assessments. Experts from academia, government and industry developed specific recommendations on strengthening epidemiological data for use in human health risk assessments. Ideally, a complete analysis of uncertainty would appear in every epidemiologic study, meta-analysis, toxicology/MoA study, and risk assessment. This is undoubtedly an unrealistic expectation, and reality will be driven by resources and specific needs. At a minimum, each study should acknowledge uncertainty and discuss, at least qualitatively, the likely direction and magnitude of error. As a defined goal, every study should incorporate a quantitative sensitivity analysis to examine individual factors and a conjoint analysis of factors. As an ideal, studies should incorporate a quantitative multi-source uncertainty analysis. This approach to case-by-case, tiered analysis is analogous to the discussion in the “Silver Book” (NRC 2009, page 103), with quantitative uncertainty analysis representing the “top tier.” In short, a framework for addressing uncertainty should ultimately include a comprehensive approach to individual studies, meta-analyses, toxicology, MoA, and risk assessment. Disclaimer: The views expressed are those of the authors and do not necessarily reflect the views or policies of the US EPA.

W2-D.1 Batz, MB*; Robertson, LJ; van der Giessen, JW; Dixon, BR; Caipo, ML; Kojima, M; Cahill, S; University of Florida; [email protected] Ranking Foodborne Parasites: Outcomes of an Expert-based Multi-Criteria Analysis In December 2010, the Codex Committee on Food Hygiene (CCFH) requested that the United Nations Food and Agriculture Organization (FAO) and World Health Organization (WHO) provide the Committee with “guidance on the parasite-commodity combinations of particular concern.” FAO and WHO initiated a series of activities to provide this guidance. Foodborne parasites cause a high burden of infectious disease globally, yet generally do not receive the same amount of attention as other microbiological and chemical hazards in food. Data on disease incidence and transmission routes are lacking, a problem exacerbated by symptoms that may be latent or chronic. During a weeklong expert workshop, held in Rome in September 2012, a multi-criteria decision analytic approach was developed and applied. Experts screened an initial list of 95 parasites down to 24 and identified food pathways for each. Seven criteria were identified for parasite scoring, and defined using quantitative metrics where possible: disease prevalence, global distribution, disease emergence, morbidity severity, case-fatality ratio, trade relevance, and socio-economic impact. Each parasite was scored by groups of five experts along these criteria, with revisions following full-group discussions. Groups provided weights for combining criteria into risk scores, which were computed, averaged across groups, and ranked. Top-ranked parasites included Taenia solium, Echinococcus granulosus, E. multilocularis, Toxoplasma gondii, Cryptosporidium, Entamoeba histolytica, Trichinella spiralis, Opisthorchiidae, Ascaris, and Trypanosoma cruzi. Rankings were largely driven by public health impact over other criteria. This multi-criteria ranking is the first of its kind for global foodborne parasites, and served as a useful, systematic, and open approach to providing policy guidance. The approach itself has broader applications, as it could be adapted for regional or national use, or expanded to other infectious diseases.
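
A minimal sketch of the scoring-and-ranking scheme described: per-group criterion scores are averaged, combined with weights, and sorted. All weights and scores below are invented placeholders, not workshop values:

```python
# Minimal sketch, assuming invented scores and weights: average criterion
# scores across expert groups, apply criteria weights, and rank.
import numpy as np

criteria = ["prevalence", "severity", "case_fatality", "trade_relevance"]
weights = np.array([0.4, 0.3, 0.2, 0.1])  # assumed; elicited in the workshop

# Scores (0-10) from three hypothetical expert groups, per parasite.
group_scores = {
    "Taenia solium":     [[9, 8, 6, 4], [8, 9, 6, 5], [9, 8, 7, 4]],
    "Toxoplasma gondii": [[8, 6, 4, 5], [7, 6, 5, 6], [8, 7, 4, 5]],
    "Ascaris":           [[9, 4, 2, 3], [8, 5, 3, 3], [9, 4, 2, 4]],
}

ranked = sorted(
    ((np.mean(s, axis=0) @ weights, name) for name, s in group_scores.items()),
    reverse=True,
)
for score, name in ranked:
    print(f"{name}: {score:.2f}")
```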

W3-F.4 Baum, S; Global Catastrophic Risk Institute; [email protected] The resilience of human civilization in the face of global catastrophes Resilience can be defined as the ability of a system to withstand disruptions and return to its initial state. Resilience concepts have contributed much to risk analysis. This presentation applies resilience concepts to risks to the global human system, including global catastrophes that threaten human extinction. Research on global catastrophic risk has focused on the possibility of catastrophes and how to prevent them. But if human civilization is sufficiently resilient, then survivors may be able to maintain or rebuild civilization. This possibility has important implications for global catastrophic risk policy now, i.e. prior to catastrophes.

W3-J.1 Baxter, J.*; Robinson, L.; Metz, D.; Bolthrunis, S.; Industrial Economics Inc. (Baxter, Metz, and Bolthrunis); Harvard School of Public Health (Robinson); [email protected] Quantitative Adjustments Addressing Under-reporting of Baseline Risks Associated With Recreational Boating Using National Health Care Databases The United States Coast Guard requires a reliable range of estimates of the number of fatalities and injuries that result from recreational boating accidents, and of the monetary value of these fatalities and injuries as well as property damages. It currently collects these data in the Boating Accident Report Database (BARD) system, and uses this information to guide boating safety policy and regulation. We present the results of an analysis quantifying the extent to which BARD may under-report fatalities and injuries, using well-established national databases of deaths, hospitalizations, emergency department visits, and doctor’s office visits. Our results suggest BARD may undercount less severe injuries by more than two orders of magnitude. We also evaluate the importance of these missing data to policy development, considering the relative values of reducing injury risks of varying levels of severity.
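
The under-reporting adjustment reduces to a simple capture-ratio calculation; the counts below are invented for illustration and are not BARD or national-database figures:

```python
# Minimal sketch, assuming invented counts: compare the injuries captured in
# the accident-report database against a fuller national health-care tally.
bard_injury_count = 3_000     # injuries recorded in BARD (hypothetical)
national_db_count = 400_000   # boating injuries in health databases (hypothetical)

underreport_factor = national_db_count / bard_injury_count
print(f"BARD captures roughly 1 in {underreport_factor:.0f} injuries "
      f"(about {100 / underreport_factor:.2f}% of the national total)")
```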

T4-D.1 Beaudrie, CB*; Kandlikar, M; Long, G; Gregory, R; Wilson, T; Satterfield, T; Compass Resource Management Ltd.; [email protected] Nanotechnology Risk Screening using a Structured Decision Making (SDM) Approach The responsible development of new nanomaterials and nano-enabled products requires that potential risks are understood and managed before harmful implications occur. Nonetheless, quantitative and predictive tools for anticipating risk are still in their early stages of development. Until such assessment tools are available, there is a clear need for a robust screening methodology to inform nanomaterial risk management decision making in regulatory agencies and industry. This paper presents the results of a two-day workshop of nanotechnology experts aimed at building an expert-judgment-based risk screening framework for emerging nanomaterials. Drawing upon expertise in nanotoxicology, human exposure, and environmental fate and transport, participants developed an adaptive framework relating key nanomaterial physicochemical characteristics to important hazard and exposure indicators. This framework provides the foundation for development of an open-source tool to aid in risk screening and in identifying opportunities to re-engineer products to minimize risk.

M3-B.4 Beck, NB; American Chemistry Council; [email protected] Discussion: Pulling the Pieces Together This part of the program (the last talk of the double session) will be a discussion and question-and-answer session to more fully tie the seven previous talks together and to explore the similarities/differences and benefits of each of the approaches that have been discussed.

T2-B.4 Becker, RA*; Olden, K; American Chemistry Council (Becker); US Environmental Protection Agency (Olden); [email protected] Progress Made in Improving IRIS: A Panel Discussion Presenters and additional experts, drawn from EPA, academia, non-governmental organizations and industry, will participate in a discussion that will include audience questions and answers. Discussion will include perspectives on changes that have already been made, plans EPA has announced for making further changes, and areas where additional improvements would strengthen IRIS.

T4-I.3 Bekera, B*; Francis, RA; Omitaomu, O; GWU, ORNL; [email protected] Drought Forecasting and Resilience Analysis of Nuclear Power Plants Infrastructure Climate change can affect nuclear power plants through increased temperature and increased incidence of drought. Due to their reliance on water for reactor cooling, if water temperature increases or water resource availability is reduced due to drought, a plant may be forced to dial back production or even shut down completely. The objective of this research is to develop risk profiles for some of the existing and proposed power plants within the Tennessee River basin. In doing so, we hope to provide valuable insight for making risk-informed investment decisions in this critical infrastructure sector. In this paper, we demonstrate how an uncertainty-weighted systems resilience metric developed by the authors may be used to assess impacts of drought on operations of nuclear power plants. This resilience metric integrates absorptive capacity, adaptive capacity and rapidity of service restoration for a system facing unlikely but highly disruptive phenomena such as drought. This is achieved by modeling the system’s response to drought using a compound Poisson process. Finally, we discuss potential implications of our findings for updating facility siting techniques in order to account for imminent scenarios in the foreseeable future.
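
A minimal sketch of a compound Poisson model of drought shocks (not the authors' formulation): event counts are Poisson, and each event carries a random loss. The rates, curtailment fractions, and durations are assumed:

```python
# Minimal sketch, assuming invented rates and magnitudes: simulate total
# drought-curtailed output as a compound Poisson process.
import numpy as np

rng = np.random.default_rng(42)

def simulate_lost_output(years=30.0, event_rate=0.4):
    """Total curtailed output: Poisson event count, random loss per event."""
    n_events = rng.poisson(event_rate * years)
    # Each drought curtails output by a random fraction for a random duration.
    curtailment = rng.uniform(0.1, 0.6, n_events)               # fraction of capacity
    duration = rng.gamma(shape=2.0, scale=30.0, size=n_events)  # days
    return float(np.sum(curtailment * duration))

losses = [simulate_lost_output() for _ in range(10_000)]
print(f"mean curtailed capacity-days: {np.mean(losses):.0f}")
print(f"95th percentile: {np.percentile(losses, 95):.0f}")
```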

T1-A.5 Bell, MZ*; Yang, ZJ; State University of New York at Buffalo; [email protected] Nuclear media discourse post-Fukushima: The state of media coverage pertaining to nuclear energy before and after the Fukushima 2011 nuclear incident Public and lay perceptions of nuclear energy have long been studied, and so too has media representation of nuclear energy, often cited as either heavily scaremongering or deceptively rose-tinted in its portrayals. These studies of risk perception and media content have often centered on certain risk events such as Three Mile Island or Chernobyl. Given the timely opportunity, this paper seeks to continue this legacy of nuclear risk studies, taking the recent example of the Fukushima Daiichi 2011 incident to explore the ways in which a new nuclear risk affects the representation of ‘nuclear energy’, and seeking to understand whether representations in three major US newspapers differ before and after the Fukushima accident. Results show that views represented in the media are less positive during 2011, but a rebound effect may be present, with portrayals becoming more positive in 2012. There is also evidence in the media of support for the continuation of nuclear energy despite Fukushima, with fewer mentions of alternative sources in 2012. We also see more discussion of natural hazard concerns regarding nuclear energy. The overall message seems to be that Fukushima has had an impact on newspaper discourse, yet the media continues to see the nuclear revival as pivotal to the energy scene.

W3-C.3 Belova, A*; Narayan, T; Brown, L; Haskell, J; Bozeman, S; Lamb, J; Abt Associates Inc; [email protected] A Framework to Assess Aflatoxin Public Health Impacts in Developing Countries with Application to Nigeria and Tanzania Aflatoxins are naturally occurring poisons produced mainly by the Aspergillus flavus and Aspergillus parasiticus fungi that thrive in hot, humid, and drought-prone climates. They are commonly found in developing countries. Aflatoxins affect staple crops and are detectable only by specialized testing. Chronic exposure to B1 aflatoxins (AFB1) causes hepatocellular carcinoma (HCC) and is linked to cirrhosis, immune suppression, and stunting. Due to low public awareness, ineffective food safety standards, and food security issues in developing countries, consumption of aflatoxin-contaminated food produced by the local agriculture sector occurs. To assess the magnitude of the resulting public health damage, we developed a country-level health impacts framework. We applied it in Nigeria and Tanzania, focusing on maize and groundnuts. We quantified HCC incidence due to aflatoxin consumption, accounting for hepatitis B prevalence and consumption variability by age, sex, and geography, and monetized the damage using a benefits transfer approach. For Nigeria we estimated an AFB1 contamination level of 67 ppb, suggesting that 7,761 HCC cases out of the 10,130 total HCC cases estimated in 2010 were attributable to aflatoxin exposure. The monetized damage was $380-$3,174 million (in 2010 USD), with the high estimate being ~1.6% of Nigerian GDP in 2010. In Tanzania, AFB1 prevalence is low, but exposure could be significant due to high maize consumption (~400-500 g/day). At the Tanzanian AFB1 standard of 5 ppb, we estimated 546 excess HCC cases, or ~1/3 of the total estimated HCC cases in 2010. The monetized damage at the AFB1 level of 5 ppb was $18-$147 million (in 2010 USD). A sensitivity analysis using zero hepatitis B prevalence resulted in a 3x reduction in health damage. Our damage estimates for various contamination levels could serve as guidelines for sizing government programs to reduce aflatoxin risks (via standard enforcement, control measure subsidies, or immunization).
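
A minimal sketch of the attributable-cases arithmetic such a framework implies, following the widely cited JECFA-style potency approach in which potency is higher in hepatitis-B-positive individuals. All inputs below are illustrative placeholders, not the study's Nigeria or Tanzania values:

```python
# Minimal sketch, assuming invented population inputs and JECFA-style
# potency factors: excess HCC cases attributable to AFB1 exposure.
population = 50_000_000
hbv_prevalence = 0.10        # fraction HBsAg-positive (assumed)
exposure_ng_kg_day = 100.0   # mean AFB1 exposure, ng/kg bw/day (assumed)

# Cancer potency: excess HCC cases per year per 100,000 people per ng/kg bw/day.
potency_hbv_pos = 0.3        # assumed, per JECFA-style estimates
potency_hbv_neg = 0.01

avg_potency = (hbv_prevalence * potency_hbv_pos
               + (1 - hbv_prevalence) * potency_hbv_neg)
annual_cases = avg_potency * exposure_ng_kg_day * population / 100_000
print(f"estimated aflatoxin-attributable HCC cases/year: {annual_cases:,.0f}")
```

Setting hbv_prevalence to zero in such a sketch reproduces the kind of sensitivity analysis reported above, since potency is far lower in hepatitis-B-negative individuals.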

T2-K.4 Belzer, RB; Regulatory Checkbook; [email protected] How Many Substances Are Illegally Listed in the Biennial Report on Carcinogens? The National Toxicology Program (NTP) attempts to publish a biennial Report on Carcinogens, colloquially abbreviated as the “RoC.” The RoC, especially the 12th Edition published in 2011, has proved to be highly controversial. In a recent monograph (Belzer, 2012), I demonstrated that the core defect of the RoC is the listing criteria used by NTP to determine whether a substance is a “known” or “reasonably anticipated” human carcinogen. In particular, the listing criteria cannot be interpreted scientifically because they have no scientific content. In lieu of science, NTP personnel subjectively interpret ambiguous legal thresholds (e.g., “sufficient evidence,” “adequate evidence”) in an undisclosed, non-transparent, and non-reproducible manner. Moreover, the NTP need not provide any evidence, much less demonstrate scientifically, that there is a causal relationship between the purported positive evidence and human carcinogenicity. The reason is that the listing criteria are written in a way that implies causality if the NTP determines that the evidence is “sufficient,” “adequate,” or in certain circumstances, “less than sufficient.” By law, before a substance may be listed, the NTP also must show that “a significant number of persons residing in the United States are exposed” to it. However, substance profiles in the RoC generally do not include this information. Perhaps surprisingly, the absence of exposure information has not been a source of controversy. This paper reviews the 240 substance profiles in the 12th Edition of the RoC to ascertain how many do not contain the exposure information required by law. It is inferred that each of these listings is illegal by statute.

M3-C.4 Ben-Haim, Y; Technion; [email protected] What Military Strategy can Teach Us about Risk-Management and Uncertainty War entails vast risks and uncertainties. Risk managers can learn valuable lessons from the disputes among military theorists. We discuss a foundational dispute between Clausewitz and Jomini in their attempts to understand Napoleon's overwhelming military success. What was the key to his military invincibility? Clausewitz felt that it is futile to seek governing rules or patterns or laws of successful military strategy. The dominant factors in war, according to Clausewitz, are uncertainty and friction (by which he meant the inherent resistance of reality to the will of men at war). Jomini disagreed. He felt that, while war is not a science like the study of nature, it nonetheless has patterns and rules that can be discovered. He felt that he had uncovered the rules that made Napoleon so successful. In modern language we would say that Jomini identified successful methods of operational art: rules for the manipulation of large field units. The main claim of this talk is that the analysis, and even more so the management, of risk must be approached in the tradition of Clausewitz far more than of Jomini. It is the uniqueness and uncertainty of risk situations that present the special challenges that risk analysts and managers must face. Once a risk becomes well known and thoroughly understood it is no longer a risk, and it is managed by a routine. Risks are "risky" precisely because of the endless possibility for surprise. Sound science, careful measurement, and accurate monitoring and reporting are all crucial. Nonetheless, a successful risk professional is able to analyze and manage uncertainty, variability and surprise (info-gap theory is one tool). This is very Clausewitzian and not Jominian at all.

M2-H.2 Bennett, SP*; Waters, JF; Howard, K; Baker, H; McGinn, TJ; Wong, DY; U.S. Department of Homeland Security; [email protected] Indicators and warnings for biological events: enhanced biosurveillance through the fusion of pre-hospital data. The United States is vulnerable to a range of threats that can impact human, agricultural, or environmental health. Managing risks from covert or naturally-occurring, accidental, or deliberate biological events such as emerging infectious disease or bioterrorism is difficult to accomplish through activities that attempt to reduce these events’ likelihood of occurrence. Instead, activities that mitigate these risks largely focus on reducing, managing, or limiting the consequences of biological events once they begin to occur. To do this effectively requires the earliest possible warning that an event is occurring, as well as continuing shared situational awareness throughout the event, to enable effective decision making regarding what management actions should be taken. In this presentation, we will describe advances in the Department of Homeland Security’s National Biosurveillance Integration Center (NBIC) towards integrating and fusing early pre-hospital health data to provide early indicators and warnings. In addition, we will discuss initial concepts for the development of a risk and decision analysis framework to support the use of early warning signals and ongoing event characterization and decision support.

M4-I.4 Benouar, D*; Rovins, J; University of Science and Technology Houari Boumediene (USTHB); [email protected] Forensic disaster investigations (FORIN), a new approach to learn lessons from disasters: A case study of the 2001 Algiers (Algeria) Flood and Debris Flow Disasters are increasingly being understood as ‘processes’ and not discrete ‘events’. Moreover, the causes of disasters are driven by complex engineering, socio-economic, socio-cultural, and various geophysical factors. Such interacting driving factors, occurring across a range of temporal and spatial scales, combine in numerous ways to configure disaster risks. Using some selected disasters in Africa, the dynamics of such risks and their configurations will be explored using a new approach and methodology, namely Forensic Disaster Investigations (also called FORIN studies). The forensic task is perhaps similar to assembling a disaster puzzle. Initially, there are dozens or even hundreds of apparently disorganized pieces; when examined individually, each piece may not provide much information. Methodically, the various pieces are sorted and patiently fitted together in a logical context, taking into account all the parameters. Slowly, an overall picture of the disaster emerges. When a significant portion of the disaster puzzle has been solved, it then becomes easier to see where the remaining pieces fit. The Integrated Research on Disaster Risk programme is proposing new methodologies to examine the root issues surrounding the increase in disaster costs, both human and economic. This paper attempts, as a case study, to investigate the Algiers (Algeria) floods and debris flows of 10 November 2001, which caused the loss of more than 714 human lives, injured more than 312 people, left 116 missing and about 10,000 homeless, and damaged more than 1,500 housing units and scores of schools, bridges and public works. The objective is to dig more deeply into the causes of disasters in an integrated, comprehensive, transparent, and investigative or forensic style. To establish a sound basis for analysis, FORIN relies upon the actual evidence found and applies accepted scientific methodologies and principles to interpret the disaster in all its facets. Often, the analysis requires the simultaneous application of several scientific disciplines.

T4-C.1 Benromdhane, S.A.*; Hubbell, B.J.; US EPA, OAQPS; [email protected] Air pollution mixtures and related health endpoints - A multipollutant risk assessment framework Air pollution regulatory and research strategies have largely focused on single pollutants, groups of chemically related pollutants, e.g. volatile organic compounds, or source-related pollutants, e.g. mobile source emissions. However, population exposures occur in locations with complex mixtures resulting from multiple sources of emissions and containing both gases and particles. Because most epidemiological studies rely on linear statistical models that do not account for interactions between pollutants, estimates of risk can suffer from biases and imprecision due to confounding from co-pollutants, or from incomplete model specification. This can lead to difficulties in interpreting the results for individual pollutants, and can potentially bias estimates in risk analyses conducted using those results. These biases can affect the estimates of risk for individual pollutants, as well as estimates of total risks across groups of pollutants. This presentation evaluates the current single-pollutant-focused risk assessment paradigm and identifies strengths and weaknesses that can be utilized to guide the transition to a multipollutant framework. A multipollutant conceptual scheme for toxicity characterization based on mode of action will be illustrated, showing how multiple pollutants could contribute additively or synergistically to a specific health endpoint. More comprehensive conceptual models based on the principles of exposure and risk analysis are needed to help inform the development of future multipollutant epidemiology studies; these conceptual models can also form the basis for an improved multipollutant exposure and risk analysis framework.

W4-C.3 Bessette, DL*; Campbell-Arvai, V; Arvai, JL; University of Calgary; [email protected] A decision support framework for developing regional energy strategies Developing energy strategies requires more than just technical skill and subject-matter expertise; it requires the ability to identify and prioritize objectives, construct alternatives and forecast their consequences, and confront tradeoffs. These elements must be incorporated into a decision-making framework that also allows for public participation and deliberation, and does so in a rigorous, transparent and defensible manner. If the ultimate goal is to inform policy, then such a process must also be iterative and rapidly deployable. This paper reports on a structured decision-making (SDM) framework deployed in Michigan in 2012 (n = 182) and then again in Alberta, Canada in 2013 (n > 500). This online framework not only incorporated the above elements into the energy-strategy development process, but also for the first time allowed stakeholders to construct their own portfolios of energy alternatives. Previous work in this field has asked participants to compare and assess a small number of portfolios (6-10), with each portfolio focusing on unique objectives. The current framework, using real, context-specific energy constraints and demands, asked participants to build portfolios by choosing from twenty different fuel and energy types and emission-mitigation technologies, while also examining how those portfolios performed and compared across six key objectives. The framework also included a portfolio-ranking task and swing weighting of objectives. Participants reported high satisfaction, comfort with the process informing real decisions, and low cognitive difficulty and overall stress. A comparison of pre- and post-framework knowledge levels showed significantly more knowledge gain among participants in treatments that involved portfolio construction than among those in treatments that omitted the construction element. Finally, participants’ portfolio ranks and a separate rank order generated using a linear additive value model show that participants’ decisions may not be internally consistent.
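
A minimal sketch of a linear additive value model with swing weights, of the kind used above as a consistency benchmark; the portfolios, objective performances, and weights are illustrative:

```python
# Minimal sketch, assuming invented portfolios and weights: normalize each
# objective's performance and combine with swing weights to rank portfolios.
portfolios = {
    "coal-heavy": {"cost": 60, "co2": 900, "reliability": 0.95},
    "gas+wind":   {"cost": 75, "co2": 450, "reliability": 0.90},
    "wind+solar": {"cost": 95, "co2": 120, "reliability": 0.82},
}
# (worst, best, swing weight): each weight reflects how much the
# worst-to-best swing on that objective matters to the participant.
objectives = {
    "cost":        (100.0, 50.0, 0.4),
    "co2":         (1000.0, 0.0, 0.35),
    "reliability": (0.8, 1.0, 0.25),
}

def value(portfolio):
    total = 0.0
    for obj, (worst, best, w) in objectives.items():
        total += w * (portfolio[obj] - worst) / (best - worst)
    return total

for name in sorted(portfolios, key=lambda n: value(portfolios[n]), reverse=True):
    print(f"{name}: {value(portfolios[name]):.2f}")
```

Comparing this model-generated rank order against the ranks participants assign directly is one way to surface the internal inconsistency noted above.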

M2-G.1 Binder, AR*; Zechman, EM; North Carolina State University; [email protected] Recycled water and risk communication: How citizens evaluate new technologies for municipal water systems Population growth, drought, and climate change increase the stresses on water supply, and the potential for increased water scarcity has drawn attention to the possibilities of water reuse for re-engineering the urban water cycle. However, research indicates that consumers may have an initial reaction of disgust to the idea of using recycled water, and scholars have therefore investigated ways to encourage adoption of its use. Because of the possibility of initial negative reactions, there is a substantial body of existing literature on how citizens assess subjective risks of technologies and how those perceptions and interpretations reverberate throughout a social system. The current study builds upon the psychometric and social amplification of risk frameworks by investigating how citizens make sense of a new technology to augment water supplies in their communities. With a representative survey of adults residing in the United States, we measured knowledge, attitudes, interest, behavioral intentions, and other variables surrounding the issue of recycled water. Our data offer a novel look at individual- and social-level sense-making of a new, immediately tangible technology carrying a unique risk signal. Preliminary findings indicate that risk/benefit evaluations of recycled water are not only based on psychological factors such as trust and knowledge, but are also highly influenced by social factors such as interpersonal discussion and social networks. Our findings contribute to applied knowledge about activities that municipal water utilities may or may not control, such as educational campaigns or word-of-mouth communication, and offer insight into the factors that may drive the success of plans by cities and towns to incorporate recycled water into their water supply infrastructure.

W3-K.3 Bjerga, T*; Aven, T; University of Stavanger; [email protected] Adaptive risk management using the new risk perspectives – an example from the oil and gas industry This paper discusses the management of risk in the case of large uncertainties, and the use of adaptive risk management in such situations. This type of management is based on the acknowledgement that a single best decision cannot be made; rather, a set of alternatives should be dynamically tracked to gain information and knowledge about the effects of different courses of action. In the paper we study a case from the oil and gas industry, the main aim being to gain insights into how adaptive risk management could be implemented while giving due attention to the knowledge and uncertainty aspects of risk. In recent years several authors have argued for the adoption of new types of risk perspectives which highlight uncertainties and knowledge rather than probabilities in the way risk is understood and measured; the present paper uses these perspectives as the basis for the discussion.

W4-E.2 Boelter, FB; ENVIRON International; [email protected] Risk Assessment as a Core Competency for Industrial Hygiene Industrial hygiene developed out of public health initiatives and evolved with the industrialization of modern society. Industrial hygienists are frequently referenced as the original health risk assessors. We will cover what risk is and how the four-step paradigm informs occupational exposure assessments. We will discuss risk characterization and perception as well as the various concepts of risk and the different thoughts about safe and zero risk. We will look at human behaviors and modes of action, such as tending to underweight false alarms and waiting to act until a hazard is imminent. We will also examine occupational health parallels to public health and environmental exposure assessments and the concept of “Fit for Purpose”. We will discuss core competencies for the occupational hygienist related to risk assessment, risk characterization, risk management, and risk communication. Often our professional language is one of comparisons following numerical calculations. Most of the time our audience is attuned to a different, nontechnical, more intuitive language. We need to effectively convey our information to those whose behaviors we are trying to influence. We will also explore risk-related initiatives being undertaken in the occupational hygiene community to develop and make available methods and tools for risk-based decision making. Such tools include decision analysis, cost-benefit analysis (or benefit-cost analysis), cost-effectiveness analysis, comparative risk analysis, and value-of-information analysis. Many of these approaches were developed outside of the occupational hygiene profession but within the context of environmental, medical, and financial decision making, yet they are well suited for use in occupational settings. By conveying the risk of outcomes in a manner that aligns with the verbiage and context of “risk”, occupational hygienists will be able to sit at the planning table and pull together the resources needed to continue the important work of adequately protecting workers.

M4-E.2 Boelter, FW*; Xia, Y; Persky, JD; ENVIRON International; [email protected] Cumulative Exposures to Asbestos Fibers from Dropped Ceiling Installation and Maintenance Dropped ceilings came into use around the 1940s in commercial and, to a lesser degree, residential construction for better acoustics, modern appearance, reduction of heating and lighting costs, and compatibility with then-new suspended fluorescent lighting. Several fatal fires, including the 1958 Our Lady of the Angels fire in Chicago, in which combustible building materials including acoustic tiles were blamed for loss of life, hastened the inclusion of fireproof asbestos in interior building materials until the mid to late 1970s. We developed an algorithm for the reconstruction of exposures to airborne asbestos fibers resulting from ceiling tile activities and implemented the algorithm using hypothetical work histories for five categories of people – specialty construction contractors, general building contractors, Do-It-Yourself nonprofessionals (DIYers), maintenance trades, and bystanders. We present new exposure data obtained through two field studies on asbestos-containing ceiling tile removal and replacement and one chamber study involving cutting and installing such tiles. These data, coupled with professional judgment and mathematical modeling (Bayesian decision analysis and stochastic simulation), lead to estimates of 8-h time-weighted average (TWA) as well as 1-year and 10-year cumulative exposures to asbestos fibers from working with asbestos-containing ceiling tiles. Our results estimate mean 1-year cumulative exposures, expressed in f/cc-years, to be 0.007 for specialty construction contractors, 0.0004 for general building contractors, 0.00008 for Do-It-Yourself nonprofessionals, 0.004 for maintenance trades, and 0.0004 for bystanders. The distributions of 8-h TWA dust and asbestos fiber exposures estimated for the five worker categories are compared to historical exposure data, while the cumulative exposure estimates are used to evaluate health risks. None of the five categories of workers receives a cumulative exposure (dose) that would significantly increase their risk for disease development.
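
The 8-h TWA and cumulative-exposure arithmetic underlying such estimates can be illustrated as follows; the task concentrations, durations, and frequencies are assumed, not the study's measurements:

```python
# Minimal sketch, assuming invented task data: compute an 8-h time-weighted
# average and a 1-year cumulative exposure in f/cc-years.
# Tasks during one workday: (fiber concentration in f/cc, duration in hours).
tasks = [(0.05, 1.5), (0.01, 2.0)]  # e.g., tile removal, then cleanup

# Unexposed time in the 8-h shift contributes zero to the TWA numerator.
twa_8h = sum(c * h for c, h in tasks) / 8.0
print(f"8-h TWA: {twa_8h:.4f} f/cc")

# Cumulative exposure: fraction of the working year (~250 shifts) spent on
# such days, times the shift TWA.
exposed_days_per_year = 12  # assumed task frequency
cumulative_1yr = twa_8h * exposed_days_per_year / 250.0
print(f"1-year cumulative exposure: {cumulative_1yr:.5f} f/cc-years")
```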

T2-H.1 Boerman, D*; Gallagher, M; Headquarters, U.S. Air Force; [email protected] U.S. Air Force Risk Assessment Framework The Headquarters of the United States Air Force has adopted major components of the Chairman of the Joint Chiefs of Staff's Risk Assessment System to develop a risk-based decision support approach to inform major decisions, especially with regard to logistics. The major advantages of using the Chairman's system are improved clarity in internal Air Force decisions and improved risk communication between the Air Force and the Joint Staff. Use of a mature, rigorous process improves both identification of risk origin and credibility in communicating risk information. This transition to a Risk Assessment Framework is still underway, and lessons learned in implementing a new risk approach within a complex government organization will benefit risk practitioners in and out of government.

W4-B.3 Boffetta, P*; Mundt, KA; Mundt, DJ; Checkoway, H; Swenberg, J; Adami, H-O; Icahn School of Medicine at Mount Sinai; ENVIRON International Corporation; University of Washington, Seattle; University of North Carolina at Chapel Hill; Harvard University School of Public Health; [email protected] Integrating toxicological & epidemiological evidence of carcinogenicity: Application of Epid-Tox framework for evaluating relationships between formaldehyde & nasopharyngeal cancer & myeloid leukemia Substantial scientific evidence is available to evaluate the carcinogenicity of formaldehyde. Although several evaluations have been conducted, none has formally integrated toxicological and epidemiological evidence. Applying the Epid-Tox framework for systematically combining toxicological and epidemiological evidence, as described by Adami et al. (2011), we present causal inference grids for the association between formaldehyde exposure and risks for nasopharyngeal cancer (NPC) and myeloid leukemia (ML). Separate grids are necessary because of the likely different modes of action, as well as the different epidemiological evidence for these two malignancies. For each grid, we applied the following steps separately to the toxicological and epidemiological evidence: 1) we assessed the quality of primary research studies and categorized them as “acceptable”, “supplementary” or “unacceptable”; 2) we performed a weight of evidence evaluation; and 3) we assigned a scalable conclusion regarding the strength or plausibility of the evidence, stated in terms of “evidence of effect” or “evidence of an absence of effect”. The scalable conclusions are simultaneously placed on the causal relationship grid. The overall causal conclusion is highly dependent on the scalable conclusions for both the toxicological and epidemiological evidence. Therefore, we will present the detailed rationale for each of these and discuss the required assumptions and sources of uncertainties in interpreting these placements. Following this method, the evidence enters the “Likely” quadrant for NPC, whereas for ML the evidence enters the “Unlikely” quadrant.

M2-E.5 Bolger, PM*; Ezendam, J; Exponent, Washington DC; National Institute for Public Health and the Environment, Bilthoven, The Netherlands; [email protected] Peanut allergen: Global Burden of Disease A systematic literature review of peanut allergy was performed within the framework of the Foodborne Epidemiology Reference Group (FERG) of the World Health Organization (WHO), which is tasked with estimating the global burden of disease (BOD) for foodborne diseases. The symptoms of peanut allergy vary from mild to severe, from swollen lips and shortness of breath to anaphylactic shock, which is potentially fatal. The most important parameters were found to be the number of people who suffer from a peanut allergy and the impact it has on their quality of life. BOD is a measure that quantifies the consequences of a disease by combining the loss of health from impaired quality of life and premature mortality. The prevalence of peanut allergy in Western countries is 0.5 to 1.5 per cent of the population; however, there is a lack of prevalence data from developing countries. Geographical differences in prevalence appear to exist, since peanut allergy is uncommon in Turkey and Israel. Symptoms of the allergy are induced when individuals with a peanut allergy eat products that contain peanuts. Although they can be severe, the symptoms are usually short-lasting. Consequently, they will not have a large impact on BOD. The number of people who die due to a peanut allergy is low, which also has a limited impact on BOD. The quality of life of people with a peanut allergy can be significantly impaired, primarily because they are anxious about accidentally eating products that contain peanut. This impairment of quality of life is important in deriving an estimate of the BOD.
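
A minimal sketch of the burden-of-disease arithmetic, where total burden combines years of life lost (YLL) to mortality with years lived with disability (YLD); all inputs are illustrative assumptions:

```python
# Minimal sketch, assuming invented inputs: DALYs = YLL + YLD.
population = 10_000_000
prevalence = 0.01                 # ~1% with peanut allergy (within cited range)
disability_weight = 0.02          # assumed small weight for anxiety/impaired QoL
deaths_per_year = 2               # rare fatal anaphylaxis (assumed)
life_years_lost_per_death = 40.0  # assumed average

yld = population * prevalence * disability_weight  # years lived with disability
yll = deaths_per_year * life_years_lost_per_death  # years of life lost
print(f"YLD = {yld:,.0f}, YLL = {yll:,.0f}, DALYs = {yld + yll:,.0f}")
```

Under assumptions like these the disability term dwarfs the mortality term, mirroring the conclusion above that impaired quality of life, not death, drives the peanut-allergy BOD.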

M3-B.2 Borgert, CJ; Applied Pharmacology and Toxicology; [email protected] Integration three ways: Classical versus mode of action approaches to weight of evidence determinations In scientific investigations, the questions posed - i.e. the hypotheses tested - determine the measurements and conditions under which they are taken, the procedures used to record and analyze data, and the context in which results are interpreted. In risk assessment, the questions addressed are typically articulated in the problem formulation phase. Decades ago, regulatory agencies couched problem formulation according to the questions answerable by the science of the day. As regulatory requirements for risk assessment became codified, so too did the rudiments of problem formulation. Unfortunately, codifying problem formulation prevents risk assessment from evolving to keep pace with scientific advancements. Today, more specific questions can be addressed and answered more precisely with more advanced science, but this science is not being used effectively because, typically, the risk assessment problem formulation step still poses antiquated questions. Problem formulation needs to be modernized so that modern science can better inform risk considerations. Using a well-studied chemical, chloroform, as an example, three Weight of Evidence approaches - the classical IRIS Approach, the Human Relevance Framework Approach, and a Hypothesis-Based Mode of Action Approach - are applied, compared, and contrasted. The analysis illustrates why improving the problem formulation phase is critical to making risk assessment more scientifically accurate, more practical, and more relevant for protecting human health and the environment.

M3-G.4 Bostrom, A.*; Morss, R.E.; Lazo, J.K.; Demuth, J.L.; Lazrus, H.; University of Washington; [email protected] “Every single summer”: Mental models of hurricane risks, forecasts and warnings in Miami Mental models interviews with a random public sample (N=28) from Miami-Dade County in Florida and a follow-on web-based survey of coastal residents in Florida (N=460) illustrate high hurricane awareness, varying depending on length of residence in the area. Residents express concern about wind, flying debris, and precipitation-related flooding, as well as risks encountered in preparing for and cleaning up after hurricanes, but say much less about storm surge. Interviewees describe customary preparations for and activities during hurricane season that paint a picture of a hurricane culture. Inundated with hurricane news during events, some interviewees find the media saturation tiring. The paper concludes with an assessment of reactions to forecasts and warnings, and a discussion of how these relate to causal beliefs about hurricane hazards and implications for the hurricane forecast and warning system. Acknowledgements: Funding from the U.S. National Science Foundation (NSF 0729302) is gratefully acknowledged. Partial support for this research came from a Eunice Kennedy Shriver National Institute of Child Health and Human Development research infrastructure grant, R24 HD042828, to the Center for Studies in Demography & Ecology at the University of Washington.

T4-F.4 Bouder, FE*; Way, D; Lofstedt, RE; Maastricht University; [email protected] Fighting influenza: should European regulators stockpile? This paper presents the final results of a study of the pros and cons of European antiviral stockpiling strategies put in place to combat influenza (interim results presented at SRAE 2013). Recent worries about outbreaks of a new H7N9 strain of bird flu in East Asia and its spread to other continents show the importance of understanding how influenza policy responses are shaped and developed as well as the extent to which these policies may be harmonized. We conducted elite interviews in five European countries (France, Germany, The Netherlands, Sweden and UK) as well as an opinion survey targeting over 3000 European citizens in the same countries plus Spain, carried out in co-operation with Dialogik and Ipsos Mori. The results show significant divergences both among regulators and wider publics as to how seriously they perceive the risks of influenza as well as the benefits and risks of antiviral treatments. The presentation will review the range of antiviral stockpiling strategies adopted in the five countries from the perspective of risk policy and risk communication. The main recommendation is that critical improvements should be based on taking risk communication on board as well as introducing ‘reasoned transparency’ initiatives to help patients make better risk decisions.

T1-A.6 Boyd, AD; University of Calgary; [email protected] Controversy in Energy Technology Innovation: Contrasting Community Perspectives of the Alleged Leak at the Weyburn Carbon Capture and Storage Demonstration Project In January 2011 a local farm couple from Western Canada held a press conference claiming CO2 had leaked from the Weyburn carbon capture and storage (CCS) project onto their land. The Weyburn site is one of the world’s first and largest developments demonstrating the feasibility of CCS in an enhanced oil recovery project. This was the first publicly reported instance of a leak from a CCS demonstration site, and the allegations provide an opportunity to examine how a negative event can affect perceptions of new and emerging technologies. The views of 120 residents in three different communities were explored through in-depth individual and small group interviews. Community case studies included: 1) Weyburn, Saskatchewan, the location of the Weyburn CO2 Project; 2) Priddis, Alberta, the location of a proposed research project that was halted due to local concerns; and 3) Fairview, Alberta, which did not have any plans for a carbon injection project and serves as a comparison community. Results demonstrate that the communities perceived the allegations differently. Most participants who lived near the Weyburn CO2 Project stated that there was no CO2 leak and they were not concerned about future negative events associated with CCS. Residents from Fairview and Priddis were concerned about CO2 leaks, and the allegations ultimately became a factor in the cancellation of the proposed project in Priddis. This study compares and contrasts the differences in community perspectives and considers the influence of early controversy on the development of emerging technologies.

P.136 BRACCA, M; MONZON, A; DEMICHELIS, SO*; ENVIRONMENT LABORATORY - DDPYT - UNLA; [email protected] Bad decisions increase health risks: the reopening of an abandoned asphalt plant, a case study The overall goal of this work is to diminish risk by planning the recovery of use and the habitat remediation of an abandoned asphalt plant (now a micro-dump) belonging to the municipality of Lanus. The site is surrounded by housing, and its obsolescence and accumulated waste constitute an environmental liability that requires early intervention. After an environmental impact study (EIS), we recommended the elimination of the landfill, the neutralization of obsolete material, the relocation of the asphalt plant, and soil remediation. The situation analysis concludes that the ground is not suitable for the development and operation of the plant, and that the existing waste and debris justify intervention. The government proposed not to move the plant but to reactivate it. The negative impacts on human health that would occur in case of reopening are associated with asphalt fumes, with immediate consequences for those directly exposed and an increased risk of various cancers. As for environmental and health damage, the existing waste causes pest invasion, air pollution, and leachate generation that pollutes groundwater. The transfer proposal showed that the investment would be much more expensive than reopening; however, these figures do not include the health costs reopening would generate for the government, since public health is covered by the state. Preventive measures are therefore proposed, as relocating the plant is better than reopening it: clearing the dump and the property and remediating the soil with gas extraction and aeration methods will diminish damages. At present the authorities prefer to reopen the plant, but they have not performed an EIS. If the eradication proposal fails and the plant is reopened, a program that includes surveillance and mitigation must be developed.

T4-J.2 Brand, K; University of Ottawa; [email protected] Outcome-Informed Departures from a Default Science Policy Assumption Default assumptions are integral to chemical risk assessments, serving to bridge knowledge gaps that would otherwise derail quantitative assessment. Each default assumption generally enjoys both an evidential base (establishing it as a reasonable approach for filling its respective knowledge gap) and a policy sanction of being more likely to err on the side of caution. Under certain circumstances a departure from a default assumption (e.g., one concerning dose-response relationship shape) may be both warranted and championed, the champion's focus being on avoiding the prospect of "regulatory overkill" whereby excessive and undue regulatory costs stem from the adoption of an unwarranted default. The question of what standard of proof to demand before warranting such a departure has been the source of longstanding discussion. In this talk I offer a standard expected utility (SEU) framework for informing this question. Under the SEU framework a sliding scale is prescribed as the appropriate standard of proof, revealed to depend centrally upon the ex-ante outcomes that hinge upon the choice between the default and the alternative assumption. The implications of this framework for future deliberations over whether to depart from a default assumption are discussed, as are some of the challenges that may be posed when implementing it in practice.
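
A minimal sketch (not the paper's model) of how an SEU comparison yields a sliding standard of proof: the threshold probability at which departing from the default becomes preferable depends on the outcomes at stake on each side of the choice. The utilities below are illustrative:

```python
# Minimal sketch, assuming invented utilities (negative social costs):
# depart from the default only when expected utility favors the alternative.
def expected_utility(p_alt_correct, u_if_alt_correct, u_if_alt_wrong):
    return p_alt_correct * u_if_alt_correct + (1 - p_alt_correct) * u_if_alt_wrong

# Outcomes of each policy under each state of the world (assumed):
u_default = -10.0           # default retained: some regulatory cost either way
u_depart_alt_correct = -2.0 # departure justified: avoids regulatory overkill
u_depart_alt_wrong = -100.0 # departure unjustified: costly under-protection

# The standard of proof is the probability p at which departing becomes
# preferable; it slides with the stakes on either side of the choice.
p_threshold = ((u_default - u_depart_alt_wrong)
               / (u_depart_alt_correct - u_depart_alt_wrong))
print(f"depart only if P(alternative correct) > {p_threshold:.2f}")
```

Making under-protection less costly, or overkill more costly, lowers the threshold; this is the sliding scale the abstract describes.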

T3-B.3 Brewer, LE*; Teushler, L; Rice, G; Wright, JM; Neas, L; ORISE Fellow in the Research Participation Program at the U.S. EPA, Office of the Science Advisor (LEB); U.S. EPA, National Center for Environmental Assessment (LT, GR, JMW); U.S. EPA, National Health and Environmental Effects Research Laboratory (LN); [email protected] Using directed acyclic graphs in cumulative risk assessment (CRA) CRA is an emerging scientific approach for integrating human health and ecological risks from aggregate exposures to physical, biological, chemical, and psychosocial stressors. A shift in emphasis from the traditional single-chemical risk assessment paradigm to community-focused assessments that combine risks from chemical and non-chemical stressors would benefit from a structured analysis of the potential pathways linking cause and effect. Directed acyclic graphs (DAGs) depict causal associations and provide an improved method for communicating complex relationships linked by quantitative or qualitative data. Rules for constructing DAGs are more rigorous than those for the conceptual models typically created in the problem formulation phase of a risk assessment, and a simple algorithm applied to the graph allows for the identification of confounders, which is critically important to causal inference. As stressor-effect pathways are further elucidated through literature review and data collection, the DAG can be revisited and modified, resulting in a model representative of the best evidence available for risk estimation. A DAG can therefore be utilized in three phases of a CRA: to identify potential confounders in planning, scoping, and problem formulation; to clarify assumptions and illustrate the weight of evidence approach in the analysis phase; and to communicate risks to stakeholders and decision-makers in a clear, transparent manner. In sum, DAGs provide the strong logical structure necessary for understanding the complex causal relationships among stressors and effects, and for assessing cumulative risks to human health and the environment. (The views expressed in this abstract are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA.)
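
A minimal sketch of one DAG utility in this spirit: flag as candidate confounders any variables with directed paths into both the exposure and the outcome (a simple common-ancestor screen, not the full back-door criterion). The example graph is hypothetical:

```python
# Minimal sketch, assuming a hypothetical stressor-effect graph: find common
# ancestors of an exposure and an outcome as candidate confounders.
dag = {  # adjacency list: parent -> children
    "poverty": ["air_pollution", "poor_diet"],
    "air_pollution": ["asthma"],
    "poor_diet": ["asthma"],
    "traffic": ["air_pollution", "noise"],
    "noise": ["stress"],
    "stress": ["asthma"],
}

def ancestors(node):
    """All variables with a directed path into `node`."""
    found = set()
    frontier = [p for p, kids in dag.items() if node in kids]
    while frontier:
        p = frontier.pop()
        if p not in found:
            found.add(p)
            frontier.extend(q for q, kids in dag.items() if p in kids)
    return found

exposure, outcome = "air_pollution", "asthma"
confounders = ancestors(exposure) & (ancestors(outcome) - {exposure})
print(f"candidate confounders of {exposure} -> {outcome}: {confounders}")
```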

M2-D.3 Brinkerhoff, CJ*; Salazar, KD; Lee, JS; Chiu, WA; Oak Ridge Institute for Science & Education, Oak Ridge, TN; ORD/NCEA-IRIS, US EPA, Washington DC; ORD/NCEA-IRIS, US EPA, Research Triangle Park, NC; [email protected] Development of a PBPK Model for ETBE and TBA in Rats and Its Application to Discern Relative Contributions to Liver and Kidney Effects Ethyl tert-butyl ether (ETBE) is an oxygenated gasoline additive. ETBE is rapidly absorbed and metabolized to acetaldehyde and tert-butyl alcohol (TBA). Published studies for both ETBE and TBA in rats have reported liver and kidney effects, including increased organ weights and nephropathy. The magnitudes of these effects vary by chemical and route of exposure. Additionally, exposure to ETBE by inhalation produced an increased incidence of liver adenomas in male rats, but ETBE delivered in drinking water or TBA in drinking water did not. This difference could be due to the higher internal dose of ETBE in the inhalation study compared to the ETBE and TBA drinking water studies. A physiologically-based pharmacokinetic (PBPK) model could estimate internal doses to aid in interpreting these differences in effect; however, there are no existing models of ETBE in rats or of direct administration of TBA. We have developed and applied a PBPK model of ETBE and TBA in rats. The PBPK model was parameterized with toxicokinetic data in rats from intravenous and inhalation studies of TBA and from oral and inhalation studies of ETBE. The PBPK model was used to make quantitative comparisons of the internal blood concentrations of ETBE and TBA associated with kidney and liver effects. The dose-response relationships for ETBE blood concentration and liver adenoma incidence across inhalation and oral routes support the hypothesis that the differences in the incidence of liver adenomas are due to differences in internal dose of ETBE. The model is also being used to evaluate differences between administered and metabolized TBA in dose-response relationships for nephropathy and kidney and liver weight changes. The views expressed in this abstract are those of the authors and do not necessarily represent the views or policies of the U.S. Environmental Protection Agency or other affiliations.


P.122 Bromfield, KB*; Rowe, AJ; Atapattu, AA; Environmental Protection Authority; [email protected] A New Endophyte Risk Assessment Model Fungal endophytes are microorganisms that occur naturally within plant tissues, and do not usually cause any disease symptoms. They are an important component of the plant microbiome, and affect the plant’s growth and its response to pathogens, herbivores, and varied environmental conditions through the production of secondary metabolites (alkaloids). Recent advances in plant biotechnology mean the plant traits conferred by these endophytes in association with their natural host plant can be transferred into new plant species, in much the same way that traits are manipulated in genetically modified organisms. For example, some grass endophytes are being artificially inoculated into cereal crops to confer insect pest and drought resistance. However, some of the alkaloids produced by these endophytes are known to cause illness in grazing livestock, so there is a need to assess the risks of these changes to the chemical profile of the plants receiving these endophytes. We present a model for assessing the risks associated with the manipulation of endophytes across plant groups. We have tested this model using two case studies, presented here, and we are looking to expand its application further. The questions that drive this risk assessment model will inform any plant biosecurity risk assessment, including the assessment of plants with genetically modified traits. This model is the first of its kind and provides regulators with a simple yet effective approach to risk analysis, while ensuring consistency among decision makers.

W4-C.1 Brookes, VJ*; Hernández-Jover, M; Cowled, B; Holyoake, PK; Ward, MP; 1,5 Univ. of Sydney, NSW, Australia. 2 Charles Sturt Univ., Australia. 3 AusVet Animal Health Services, Australia. 4 Dept of Environment and Primary Industries Victoria, Bendigo, Australia.; [email protected] From exotic to endemic: A stakeholder-driven framework examining disease prioritisation and the biosecurity continuum Following the equine influenza incursion into Australia in 2007, an independent review of Australia’s quarantine and biosecurity arrangements was undertaken (Beale et al., 2008). The report highlighted that zero biosecurity risk is not possible or desirable, pointing out both the benefits and risks of the globalisation of trade and travel. In order to manage biosecurity risks, recommendations were made to shift the emphasis of biosecurity policy from border biosecurity, which reduces the risk of entry of pests and diseases, to also include post-border measures that minimise the risk of their establishment and spread. It was recommended that this biosecurity continuum should be a shared responsibility between the general community, businesses, and government, and that it should also be science-based and therefore rigorous. Establishing science-based policy that accounts for more than the economic impacts of a disease incursion is difficult: cultural and social impacts are largely intangible, so any measurements tend to be arbitrary and difficult to reproduce. Public opinion is recognised as a driver of policy, but scientists cannot readily quantify it and resort to terms like “outrage factors” and “fear of the unknown” to explain it. A stakeholder-directed framework was developed to prioritise and examine the biosecurity continuum of exotic diseases. The framework is presented in the context of exotic disease risks to the domestic pig industry in Australia, and incorporates decision analysis, risk analysis and spatial disease modelling. We demonstrate how stakeholder opinion can be captured quantitatively, and therefore reproducibly, to identify and examine the risks of high priority diseases. The framework ultimately aims to identify appropriate surveillance and mitigation strategies according to the level of concern of the industry, rather than only considering the direct economic costs of disease incursions. Beale, R. et al., 2008. One Biosecurity: A Working Partnership.

P.19 Brown, LPM*; Lynch, MK; Post, ES; Belova, A; Abt Associates, Inc.; [email protected] Determining a concentration-response relationship suitable for estimating adult benefits of reduced lead exposure Lead is a highly toxic pollutant that can damage neurological, cardiovascular, and other major organ systems. The neurological effects are particularly pronounced in children. However, the recent literature has found that a wide spectrum of adverse health outcomes can occur in people of all ages. In addition, a threshold below which exposure to lead causes no adverse health effects has not been identified. This suggests that further declines in lead exposure below today’s levels could still yield important benefits. A well-established quantitative risk assessment-based approach to evaluating the benefits of reductions in lead releases for adults does not exist. We will present our efforts to create a rigorous approach to value adult health benefits for endpoints such as cardiovascular mortality. We reviewed recently published government reports and the primary literature. We then assessed the weight-of-evidence for associations between lead exposure and cardiovascular, renal, reproductive, immune, neurologic and cancer endpoints for the purposes of benefits estimation. We closely evaluated the literature and will propose a concentration-response function relating blood lead levels to adverse effects, particularly cardiovascular mortality, in adults. This function could potentially be used to support the benefits analysis of future regulations intended to result in a decrease in lead exposure for adults.

P.3 Burger, J*; Gochfeld, M; Powers, CW; Kosson, D; Clarke, J; Brown, K; Rutgers University, Consortium for Risk Evaluation with Stakeholder Participation, Vanderbilt University; [email protected] Mercury at Oak Ridge: Outcomes from Risk Evaluations can Differ Depending upon Objectives and Methodologies Risk evaluations play an important role in environmental management, remediation, and restoration. Yet when different agencies and groups evaluate risk, the objectives and methods may differ, leading to different conclusions, which can confuse managers, policy-makers, and the public. In this paper we examine two evaluations of the potential risk from mercury contamination deriving from the Y-12 facility at the Department of Energy’s Oak Ridge Reservation (Tennessee, USA). The U.S. Agency for Toxic Substances and Disease Registry (ATSDR) examined the past and present risks from mercury to humans, using data provided in government reports and publications. The Consortium for Risk Evaluation with Stakeholder Participation (CRESP) used a risk-informed prioritization model it developed for managers to evaluate different remediation projects. The CRESP prioritization model considered both human and ecological receptors, as well as future potential risks. Risk was an important component of both evaluations, and both evaluations found that there was a completed pathway of mercury from the source on the Oak Ridge Reservation to offsite human receptors, although the evaluations differed in their final conclusions. In both cases, the pathway to off-site human exposure was through fish consumption. The two evaluations are compared with respect to purpose, specific goals, target audience, receptors, assumptions, time frames, evaluation criteria, and conclusions. When these aspects are considered, the risk evaluations are congruent, although the risk communication messages differ. We conclude that there are many different possible risk evaluations, and the aforementioned variables must be carefully considered when making management decisions, determining remediation goals, and communicating with regulators, managers, public policy makers, and the public.



W2-B.4 Burns, CJ*; Wright, JM; Pierson, J; The Dow Chemical Company; [email protected] Panel discussion on the integration of workshop recommendations to move risk assessment forward Open any introductory epidemiology textbook and there will be at least one chapter devoted to comparing and contrasting study designs. Investigators must tailor their research to the expected disease prevalence, availability of exposure data, population demographics, and ethical demands of the study subjects. Further, these considerations must be made in the very real context of the time, cost and expertise required to complete the study. Integrating epidemiological results into risk assessment applications is often hampered by incomplete or unavailable data or other study limitations. Additional challenges that must be overcome may include problems related to time, cost and available expertise and reliance on traditional approaches. Further, few epidemiological studies are initiated with risk assessment needs in mind; thus, more cross-discipline interaction may help provide relevant data and analyses that can be more readily evaluated and applied. Several approaches to characterize and reduce uncertainty, improve exposure assessment, and promote advanced epidemiological methods were recommended by the 36 participants of the ILSI HESI workshop. If broadly implemented, these recommendations might improve the evaluation of causality based on epidemiological data and, with it, its use in public health decision making. This panel discussion will focus on how to implement these workshop recommendations and break down barriers to their integration into everyday epidemiological evaluation and risk assessment applications.

M4-H.2 Burns, WJ*; Slovic, P; Sellnow, T; Rosoff, H; John, R; Decision Research (authors 1 and 2); University of Kentucky (author 3); (authors 4 and 5); [email protected] Public response to the terrorist attacks on Boston On April 15, 2013, terrorists set off two bombs at the Boston Marathon, killing three people and seriously injuring many others. Within days one terrorist was dead and the other apprehended. A nationwide online survey was conducted on April 16 to determine how the public was responding to this attack. A follow-up survey was done with the same panel of respondents on April 30. Respondents were asked about their confidence in DHS to thwart and respond to terrorism, perceptions of terrorism risk, emotional reactions to the attack (e.g., anger, fear, sadness), their willingness to attend public events, and the kind of information they sought in the first 24 hours after the attack. This analysis is ongoing and the results will be reported during the presentation. These findings inform risk management policy, particularly involving communication with the public during a crisis.

W2-B.3 Burstyn, I; Drexel University; [email protected] On the future of epidemiologic methods in context of risk assessment Incorporation of epidemiologic evidence into human health risk assessment, specifically for a weight of evidence evaluation, is an important part of understanding and characterizing risks from environmental exposures. A thorough synthesis of relevant research, including epidemiology, provides a reasonable approach to setting acceptable levels of human exposure to environmental chemicals. Epidemiologic approaches for causal inference that include computational, frequentist and Bayesian statistical techniques can be applied to weight of evidence evaluations and risk characterization. While there is strong theoretical support for the utility of these approaches, their translation into epidemiologic practice and adaptation to the needs of human health risk assessment are lagging. The focus of the epidemiologic methods breakout group of the HESI workshop was to address methodologic enhancements and the application of these techniques to regulatory scientific evaluations. The group considered methods that are sometimes, but not frequently, used in epidemiologic studies to increase validity and more accurately portray uncertainties in results. As these results are key inputs for regulatory risk assessments, it is important to apply methodologies that appropriately represent validity and precision. Each of these broad approaches acts to improve the validity and better characterize the overall uncertainty of a single study’s findings, and extends to improved characterization of epidemiologic results in weight of evidence assessments. The workshop participants expressed optimism that widespread application of more appropriate methods of causal analysis and modeling can bring both increased application of epidemiologic results to risk assessments and increased confidence among stakeholders that policies based on this improved process would improve the effectiveness of interventions. The presentation will highlight the group’s recommendations and practical discussion points.

M4-B.4 Butterworth, T; George Mason University; [email protected] BPA by the numbers: How the media framed risk The controversy over the safety of the chemical bisphenol A (BPA) has been driven by extensive media coverage, with hundreds of stories over the past six years. This presentation will look at how that coverage framed the purported risk - and how it consistently avoided the key data, quantitative and contextual, that explained that risk.


P.155 CABRERA, VB*; DE LAS POZAS, C; Universidad San Sebastian; [email protected] Ammonia removal from waste water from cattle and livestock and its reuse Waste water from farms should be treated with zeolite, an aluminum-silicate-based volcanic mineral that can remove heavy metals and other toxins, among them ammonia and ammonium. Zeolite resembles a sand; its negative charge attracts and bonds with toxins and other toxic elements found in the waste. The treated water can then be used for watering crops and farmland in the urban areas of Central Chile. If the waste water is not treated with zeolite, contamination is a potential danger, not just for crops but for the soil: infiltration of this water will contaminate the water table and aquifers and will leach through soil layers and strata, so groundwater will be affected. According to the Langmuir model, zeolite applied to waste water from horse stables, cowsheds, livestock barns, and piggery sheds has given an excellent fit for NH4+ and NH3 ions.
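
The Langmuir fit mentioned above can be reproduced in a few lines. The sketch below is illustrative only: the isotherm data and starting parameters are hypothetical placeholders, not the study's measurements.

```python
# Illustrative sketch: fitting the Langmuir isotherm cited in the abstract.
# q = (q_max * K * C) / (1 + K * C), where C is the equilibrium ammonium
# concentration and q the amount adsorbed per gram of zeolite.
# The data points below are hypothetical, for demonstration only.
import numpy as np
from scipy.optimize import curve_fit

def langmuir(C, q_max, K):
    return q_max * K * C / (1.0 + K * C)

C_eq = np.array([5.0, 10.0, 25.0, 50.0, 100.0])   # mg NH4+/L (hypothetical)
q_obs = np.array([2.1, 3.6, 6.0, 7.8, 9.0])       # mg adsorbed/g zeolite

(q_max, K), _ = curve_fit(langmuir, C_eq, q_obs, p0=[10.0, 0.05])
print(f"q_max = {q_max:.2f} mg/g, K = {K:.4f} L/mg")
```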

W2-A.1 Calabrese, E*; Yazigi, D; University of Massachusetts/Mercatus Center; [email protected] Establishing guidelines for more objective risk assessments Major regulations are justified on the grounds that the benefits exceed the costs. Benefit calculations are based on risk assessments (RAs), and thus the real values of many major regulations depend on the quality of their accompanying RAs. In 2006, the Office of Management and Budget (OMB) issued draft guidelines to improve the quality of information and analyses disseminated by federal agencies, which were ultimately reviewed by the National Academy of Sciences (NAS). We will compile federal agency standards for RAs using the NAS review and other federal risk assessment guidance (from NAS, OMB, and other federal agencies). Using this compilation it will be possible to objectively analyze new and existing RAs and to demonstrate, particularly for the topic of human health, “which agencies do not appear to know what good practices are and which agencies do not have the ability, resources or incentives to meet the standards.” This will pave the road for subsequent research into the quality and methodology behind federal risk assessment guidelines and will shed light on the net benefits of many human health and safety regulations.

P.153 Canales, RA*; Sinclair, RG; Soto-Beltran, M; Reynolds, K; The University of Arizona; [email protected] Simulating Non-Dietary Ingestion of Listeria monocytogenes from Residential Surfaces While infection by Listeria monocytogenes in healthy individuals typically leads to only mild symptoms, in susceptible populations listeriosis has a high fatality rate. In fact, the United States Centers for Disease Control lists Listeria as one of the top five pathogens causing foodborne illness resulting in death. The objective of this work is to compose and present a simulation framework for estimating health risks from non-dietary ingestion of Listeria in residential environments. Although there is evidence that the principal sources of Listeria include ready-to-eat foods and unpasteurized dairy products, we take a cue from the chemical risk assessment field and recognize that an exploration of additional exposure pathways may be warranted. The framework is composed of simulated activities and transfer of Listeria from household surfaces to hands, and subsequent transfer from hands to mouth. Hand and mouth activities are modeled using published behavioral data and incorporate values for the surface area of contact. The framework is applied to data collected from an exploratory study of pathogens on household surfaces in an urban low-income community in Lima, Peru. Approximately 25% of fomites tested were positive for Listeria, with positive concentrations ranging from 0.2 to greater than 75 MPN/10 cm2. Inputs were incorporated as truncated probability distributions and the framework was run as a Monte Carlo assessment, resulting in distributions of non-dietary ingestion estimates and risks. While the resulting exposure and risk estimates were relatively low, the primary insight from constructing the framework is that the data needed to develop realistic assessments of non-dietary ingestion exposure to Listeria in residential environments are limited. Limited data exist regarding adult behaviors and contacts with surfaces, and the transfer of Listeria from surfaces to skin and from skin to mouth. When assessing risk, there are also difficulties because dose-response models for Listeria are inconsistent and poorly understood.
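
A minimal sketch of the surface-to-hand-to-mouth Monte Carlo framework described above, assuming hypothetical distribution parameters (the study's actual inputs are not reproduced here):

```python
# Minimal Monte Carlo sketch of the surface -> hand -> mouth pathway.
# All distribution parameters are hypothetical placeholders.
import numpy as np
from scipy.stats import truncnorm

rng = np.random.default_rng(42)
n = 100_000

def trunc_norm(mean, sd, lo, hi, size):
    a, b = (lo - mean) / sd, (hi - mean) / sd
    return truncnorm.rvs(a, b, loc=mean, scale=sd, size=size, random_state=rng)

c_surf = rng.lognormal(mean=np.log(2.0), sigma=1.0, size=n)  # MPN per 10 cm^2
te_sh  = trunc_norm(0.30, 0.10, 0.0, 1.0, n)                 # surface->hand transfer
area   = trunc_norm(10.0, 3.00, 1.0, 20.0, n)                # contact area, cm^2
te_hm  = trunc_norm(0.35, 0.10, 0.0, 1.0, n)                 # hand->mouth transfer

dose = (c_surf / 10.0) * area * te_sh * te_hm                # MPN ingested per event
print("median dose:", np.median(dose), "95th percentile:", np.percentile(dose, 95))
```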

W4-H.5 Canfield, CI*; Bruine de Bruin, W; Wong-Parodi, G; Carnegie Mellon University, Leeds University Business School; [email protected] Designing an electricity bill to motivate savings: The effect of format on responses to electricity use information AIM. Electricity bills could be a low-cost strategy for improving feedback about consumers' home electricity use. Effective feedback would help households to save money on their electricity bills and reduce their environmental impacts. However, complex quantitative feedback may be difficult to understand, especially for consumers with low numeracy or low energy literacy. In a project funded by the US Department of Energy and a partnering electricity company, we built on the health communication literature, which has identified formats for communicating risks to low-numerate individuals. METHOD. In a between-subjects design, 201 adults from diverse backgrounds saw one of three formats for presenting electricity use information: (1) tables, (2) icon graphs, and (3) bar graphs. In their assigned format, each participant saw three information types: (a) historical use, (b) recent electricity use as compared to their neighbors, and (c) historical use, broken down by appliance. RESULTS. Three main findings emerged: First, the table format generated the highest understanding across all three information types for participants of all numeracy and energy literacy levels. Second, the benefit of alternative graphical formats varied depending on information type, in terms of effects on understanding, trust, and liking. Third, individuals with lower numeracy and energy literacy understood all formats less well. CONCLUSIONS. Graphical displays are known to produce cognitive overload when their message becomes too complex. For communicating electricity use information, we find that tables are better. Our results can be applied to design utility bills that are both understandable and motivational to all consumers.


M2-E.2 Carrington, C; U.S. Food and Drug Administration; [email protected] Lead: Global burden of disease At higher doses, lead is known to produce many toxic effects, including hemolytic anemia, peripheral and central nervous system toxicity, renal failure from impaired proximal tubule function, and reproductive toxicity. At lower doses, the effects of greatest concern are impaired neurobehavioral development in children and hypertension and associated cardiovascular diseases in adults. Dose-response relationships for the effects of lead are typically characterized using blood lead as a biomarker. While the diet may be an important source of exposure to lead, sources such as soil, dust, and drinking water are important as well. Dietary surveys designed to provide statistically representative estimates of dietary lead exposure in adults and/or children have been conducted in many countries; outside of Europe, North America and the Pacific Rim, however, information on dietary exposure to lead is very limited. Based on the data compiled from the available literature, national mean population average daily dietary intakes of lead in children in different countries range from about 5 to 50 µg per person. Adult lead intakes are approximately 50% higher. Based on dose-response analyses that integrated results from multiple epidemiological studies, IQ decrements attributable to lead were estimated for children, while increments in systolic blood pressure (SBP) attributable to lead were estimated for adults. Globally, a modest decrement of about 1.3 IQ points may be attributed to dietary exposure to lead. However, lower and higher average decrements were estimated for some countries (range 0.13 to 2.7 IQ points), and effects in individuals within a region may encompass an even larger range. Projected impacts on SBP were generally very small, with maximum estimated increments of less than 0.2 mm Hg. Estimated increments in the relative risk of cardiovascular diseases attributable to dietary lead exposure ranged from about 0.01 to 0.1%.
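
The intake-to-effect chain described above can be illustrated with a toy calculation. Both coefficients below are assumed placeholders, not the values used in the study's dose-response integration:

```python
# Purely illustrative chain from dietary intake to IQ decrement, mirroring the
# structure (not the values) of the dose-response integration described above.
intake = 10.0        # ug lead/day, child dietary intake
bkslope = 0.16       # ug/dL blood lead per ug/day ingested (assumed)
iq_per_ugdl = 0.8    # IQ points lost per ug/dL blood lead (assumed)

blood_pb = intake * bkslope
decrement = blood_pb * iq_per_ugdl
print(f"blood lead ~{blood_pb:.1f} ug/dL -> IQ decrement ~{decrement:.1f} points")
```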

M4-B.2 Castoldi, AF*; Husøy, T; Leclercq, C; Theobald, A; Pratt, I; EFSA, Parma, Italy; Norwegian Scientific Committee for Food Safety (VKM), Oslo, Norway; Council for Research and Experimentation in Agriculture (C.R.A.), Rome, Italy; [email protected] Human health risks related to the presence of BPA in foodstuffs: the assessment of the European Food Safety Authority (EFSA) In the European Union (EU) the use of BPA is authorized (with a migration limit of 0.6 mg/kg food) in all food contact materials other than polycarbonate plastic baby bottles. For the latter articles a temporary ban was decided on a precautionary basis, because of the scientific uncertainties around BPA’s potential effects on the developing organism, expressed by EFSA in 2010. EFSA has thus undertaken a re-evaluation of the health risks for the European population related to the presence of BPA in foodstuffs, encompassing both a new hazard characterization and an updated exposure assessment in light of the most recent scientific evidence (December 2012). In the EFSA evaluation of 2010 the Tolerable Daily Intake (TDI) of 0.05 mg BPA/kg bw/day was based on the NOAEL of 5 mg/kg bw/day from a multi-generation reproductive toxicity study in rats, to which an uncertainty factor of 100 (to account for inter- and intra-species differences) was applied. For the new risk assessment of BPA a weight of evidence approach is being applied to all publicly available toxicological data on humans, laboratory animals and in vitro according to the endpoint of toxicity and taking into account the developmental stage of the subject at the time of exposure. Occurrence data for BPA in the EU have been collected through literature search and an ad hoc call for data addressed to Member States, research institutions, industries, etc. Both average and high chronic total exposure to BPA are being estimated considering different sources and routes of exposure (oral, inhalation and dermal) in the EU population. Specific scenarios are being developed to cover the exposure patterns in the different age classes and vulnerable groups (fetuses, infants and young children) and in specific groups of consumers. EFSA’s characterization of BPA-related health risks for the various subgroups of the EU population is still ongoing and the outcome will be presented at the SRA meeting.

M4-G.6 Cha, E*; Wang, Y; Georgia Institute of Technology; [email protected] Risk-informed decision framework for the built environment: the role of ambiguity Managing risk to the built environment from natural and man-made hazards is an important issue for the prosperity of a nation. Assessing a risk forms the basis for risk management, which often involves epistemic uncertainty, also known as ambiguity, arising from our ignorance about a risk, such as lack of data, errors in collected data, and assumptions made in modeling and analysis. In contrast, aleatory uncertainty arises from the variability of possible outcomes. Epistemic uncertainty exists in the assessment of a hazard occurrence related to its magnitude, the associated likelihoods, and the response of a structure. If ambiguity prevails, the risk perception of a decision maker plays a key role in assessing and managing a risk. Furthermore, the role of risk perception in risk management of civil infrastructure becomes significant because of the potential for catastrophic consequences (e.g., casualties, loss of functions of the built environment, etc.) to the public. Studies have suggested that the risk of low-probability, high-consequence events tends to be overestimated by the public. Consideration of ambiguity and risk perception in the risk assessment of the built environment may lead to a risk management solution that is different from what is obtained when they are not incorporated. We will present a risk-informed decision-making framework that will assist decision makers, particularly governmental agencies, in allocating resources to enhance the safety and security of civil infrastructure. In this framework, epistemic uncertainty is incorporated utilizing the concepts of Choquet capacity and interval probability. The framework will be illustrated with an example of regional hurricane risk management for residential buildings located in Miami-Dade County, Florida, with consideration of climate change effects.

T4-F.1 Chakraborty, S; University of Oxford; [email protected] Regulation, Law, and Pharmaceutical Safety This presentation will discuss a study examining the influence and consequences of regulatory and liability mechanisms on decision-making about product safety in the pharmaceutical sector in the EU and UK beginning in the 1970s. The thirty case studies investigated illustrate that the regulatory regime in Europe, capturing observations from medical practice, has been the overwhelming means of identifying post-marketing safety issues with medicinal products. In contrast with the widespread reliance in the United States of America on litigation as a regulatory mechanism, product liability cases in Europe have not identified drug safety issues and have functioned merely as compensation mechanisms. This study has profound implications for the design of—and restrictions on—regulatory and liability regimes in Europe. The study found that drug safety decisions have increasingly been taken by public regulators and companies within the framework of the comprehensive and robust regulatory structure that has developed since the 1960s, and that product liability litigation has had little or no effect on the substance of such safety outcomes or regulatory actions. In those few cases where liability litigation has occurred, it has typically been some time after regulatory and safety decisions were implemented. Accordingly, ‘private enforcement’ of public law has been unnecessary, and features associated with it, such as extended liability law, class actions and contingency fees, have not been needed. These findings form a major contribution to the academic debate on the comparative utility of regulatory and liability systems, on public versus private enforcement, and on deterrence versus other forms of behaviour control.


P.72 Chang, CH; Chuang, YC; Chen, CC; Wu, KY*; National Taiwan University; [email protected] Probabilistic Assessment of Cancer Risk for N-Nitrosodimethylamine in Drinking Water by Using Bayesian Statistics with Markov Chain Monte Carlo Simulation N-Nitrosodimethylamine (NDMA) is an emerging nitrogenated disinfection by-product (N-DBP) in drinking water formed during chloramination and chlorination. NDMA is genotoxic and carcinogenic to rodents after cytochrome P-450 metabolism, with the liver as its target organ. The potential health risk posed by NDMA exposure through water consumption has been of great concern. With the majority of NDMA measurements in drinking water below the detection limit, the research community has yet to perform a cancer risk assessment. To deal with the predominance of non-detectable (ND) NDMA data in drinking water, Bayesian statistics with Markov chain Monte Carlo simulation was first used to assess the probabilistic cancer risk for NDMA in drinking water, implemented in WinBUGS 1.4 (Bayesian analysis software Using Gibbs Sampling for Windows, version 1.4). The NDMA concentration dataset was taken from a published study in Taiwan, in which 50 drinking water samples were collected and only one sample was detected, at 4.5 ng/L. Posterior distributions revealed a mean concentration of NDMA in drinking water of 0.54 ng/L. The estimated 95th percentile of lifetime cancer risk was 2.897E-6. These results suggest that NDMA levels in drinking water in Taiwan do not pose a significant cancer risk; nevertheless, regulation of NDMA may be needed to fully protect the general public.
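
A rough sketch of the censored-data Bayesian idea described above. The authors used WinBUGS (Gibbs sampling); here a simple Metropolis sampler stands in, and the detection limit and priors are assumed placeholders:

```python
# Censored-data Bayesian sketch: a lognormal concentration model in which each
# non-detect contributes Pr(X < LOD) to the likelihood, sampled by Metropolis.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
detects = np.log([4.5])          # the one detected sample, ng/L
n_nd, log_lod = 49, np.log(3.0)  # 49 non-detects; LOD assumed, ng/L

def log_post(mu, log_sigma):
    sigma = np.exp(log_sigma)
    ll = norm.logpdf(detects, mu, sigma).sum()
    ll += n_nd * norm.logcdf(log_lod, mu, sigma)  # censored contributions
    return ll + norm.logpdf(mu, 0.0, 10.0)        # weak prior on mu; flat on log sigma

mu, ls, samples = 0.0, 0.0, []
for i in range(20_000):
    mu_p, ls_p = mu + rng.normal(0, 0.3), ls + rng.normal(0, 0.3)
    if np.log(rng.uniform()) < log_post(mu_p, ls_p) - log_post(mu, ls):
        mu, ls = mu_p, ls_p
    if i >= 5_000:  # discard burn-in
        samples.append(np.exp(mu + np.exp(ls) ** 2 / 2))  # lognormal mean
print("posterior mean concentration (ng/L):", np.mean(samples))
```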

T1-H.2 Chatterjee, S*; Salazar, D; Hora, S; CREATE, University of Southern California; [email protected] Frequency-severity relationships for human-caused extreme events Rare but extremely severe events can occur due to intentional and accidental hazards. This study explores the relationship between the frequency and severity of such human-caused events. Tools of extreme value statistics and statistical learning are employed to analyze data from the Global Terrorism Database (GTD), developed by the START Center at the University of Maryland, and the Hazardous Materials Incident Database, developed by the U.S. Department of Transportation Pipeline and Hazardous Materials Safety Administration. Event severity is expressed in terms of fatalities, injuries, and direct property damage. We also investigate the scale invariance property of such human-caused events.
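
One standard tool from extreme value statistics for this kind of frequency-severity analysis is a peaks-over-threshold fit with the generalized Pareto distribution. The sketch below uses synthetic severities, not GTD or PHMSA records:

```python
# Illustrative peaks-over-threshold fit: model severities above a high
# threshold with the generalized Pareto distribution (GPD).
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(1)
severities = rng.pareto(1.5, 5000) * 10      # synthetic event severities
threshold = np.percentile(severities, 95)
exceedances = severities[severities > threshold] - threshold

shape, loc, scale = genpareto.fit(exceedances, floc=0.0)
# Probability that an exceeding event is more than 10x the threshold:
p = genpareto.sf(9 * threshold, shape, loc=0.0, scale=scale)
print(f"xi = {shape:.2f}, scale = {scale:.1f}, P(>10x threshold) = {p:.4f}")
```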

M3-E.3 Che, WW*; Frey, HC; Lau, AKH; The Hong Kong University of Science & Technology, North Carolina State University; [email protected] Sensitivity of estimated children's PM2.5 exposure to activity patterns and geographic and seasonal variations Children’s exposure to ambient particulate matter (PM) depends on activity patterns and ventilation in micro-environments, which in turn can depend on climate zone and season. Distributions of inter-individual variability in daily PM2.5 exposure for school-age children (5-18) were estimated using the U.S. Environmental Protection Agency’s Air Pollutants Exposure model (APEX). The key inputs to APEX include ambient concentration, air exchange rate, penetration factor, deposition rate, census data and activity diary data. Inter-individual variability in the exposure of elementary school (6-11), middle school (12-14) and high school (15-18) children was estimated and compared for selected regions in North Carolina (NC), New York City (NYC) and Texas (TX) for four seasons (spring, summer, fall and winter). Home and school are the most important microenvironments for children’s exposure on school days, while home and outdoors predominate on non-school days. The ratio of ambient exposure to ambient PM2.5 concentration (E/C) is significantly different (P<0.05) on school days (mean = 0.51; 90th percentile = 0.57) and on non-school days (mean = 0.64; 90th percentile = 0.74) for all simulated children. Daily maximum temperature influences the distribution of high school children's exposure. Inter-individual variability in estimated daily average E/C varies by a factor of 2 to 3 over a 95% frequency range. The difference in average daily E/C between the selected regions is 3% to 6% in spring and 22% to 29% in fall. The difference in average daily E/C among seasons ranges from 0% to 12% in TX and from 15% to 33% in NYC. Thus, population average E/C for school children is sensitive to day type, climate zone and season; the latter two affect residential air exchange rate. These factors lead to inter-city differences in exposure and can introduce error in epidemiological studies if ambient concentration is used as a surrogate for exposure.
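
The E/C ratio at the heart of this analysis is a time-weighted average over microenvironments. The toy calculation below, with hypothetical times and infiltration factors, shows how school-day values near 0.5 can arise:

```python
# Back-of-envelope E/C ratio: time-weighted average of microenvironment
# concentrations, each an infiltration factor times the ambient PM2.5 level.
# Times and factors are hypothetical, not APEX inputs.
ambient = 12.0  # ug/m^3 ambient PM2.5

# (hours spent, infiltration factor = fraction of ambient PM present indoors)
school_day = [(10.0, 0.45),  # home, awake
              (7.0, 0.50),   # school
              (1.0, 1.00),   # outdoors
              (6.0, 0.45)]   # home, sleep

exposure = sum(t * f * ambient for t, f in school_day) / 24.0
print("daily ambient exposure:", round(exposure, 2), "ug/m^3")
print("E/C ratio:", round(exposure / ambient, 2))   # ~0.49 here
```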

W3-B.2 Checkoway, H*; Boffetta, P; Mundt, KA; Mundt, D; Lees, P; University of Washington, Seattle, Mt Sinai Hospital, ENVIRON International Corporation, Johns Hopkins Bloomberg School of Public Health; [email protected] Review of the epidemiologic evidence for formaldehyde as a human leukemogen Formaldehyde is a common industrial chemical with many important uses. It is also generated by our cells, and by the cells of most living things. The International Agency for Research on Cancer (IARC) and the National Institute of Environmental Health Sciences, National Toxicology Program (NTP) both recently concluded that formaldehyde causes leukemia – particularly myeloid leukemia. Central to these classifications was evidence from epidemiological studies of workers in various industries in which they may be exposed to formaldehyde because they use or produce formaldehyde. Among the more influential studies was the large cohort study of industrial workers conducted by the US National Cancer Institute (NCI). The principal findings were moderate trends of increasing risk for myeloid leukemia (ML) with peak and average exposures. However, no overall excess of all leukemia or myeloid leukemia was found, and no associations were observed for ML with cumulative exposure – the conventional dose metric in epidemiologic research – or for other leukemia types with any of the dose metrics. Evidence of associations between formaldehyde exposure and leukemia reported in other occupational cohort studies is inconsistent, and generally weak. Population-based case-control studies show no or very weak associations with any of the leukemias. We regard the currently available epidemiologic evidence for formaldehyde as a leukemogen as insufficient for classifying formaldehyde as a carcinogen. Nonetheless, in view of the widespread occurrence of environmental formaldehyde exposure, future research addressing associations with the leukemias is warranted. Improvements in disease classification and exposure assessment as well as integration with emerging evidence on modes of action will be essential to advance understanding of formaldehyde’s potential role in the etiology of leukemia.


P.71 Chen, YT*; Chang, CH; Chung, YC; Chen, CC; Wang, GS; Wu, KY; National Taiwan University; [email protected] Probabilistic Risk Assessment of Cisplatin for Medical Staff of Medical Centers in Taiwan Cisplatin, a platinum-based chemotherapeutic medicine widely used in chemotherapy, is mutagenic, can cause chromosomal aberrations and micronuclei, and can induce nephrotoxicity, birth abnormalities, and reproductive effects. Potential cisplatin exposures of medical staff, including the pharmacists and nurses who handle the chemotherapeutic medicine, have been of great concern. A preliminary study was conducted to analyze urinary platinum to assess exposures for 126 medical staff in three medical centers in Taiwan; urinary platinum was detectable in only 5 of these study subjects, in the range from 1.59 to 89.1 ppt. In this study, the urinary platinum levels were used to reconstruct cisplatin exposures for probabilistic risk assessment by using Bayesian statistics and Markov chain Monte Carlo simulation with the WinBUGS software. The results showed that cisplatin exposures were 0.03 mg/kg/day for pharmacists in the departments of chemotherapeutic medicine, with corresponding hazard indexes (HIs) of 129.98 ± 12.95; 0.02 mg/kg/day for nurses in oncology wards, with corresponding HIs of 86.53 ± 8.67; and 0.04 mg/kg/day for nurses in the oncology clinics, with corresponding HIs of 173.14 ± 17.23. These results suggest that interventions should be implemented thoroughly and further studies should be conducted to protect medical staff handling chemotherapy medicines in Taiwan.
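
The hazard indexes above follow the usual HI = dose / reference dose arithmetic. In the sketch below, the reference dose is a placeholder back-calculated from the reported numbers, since the abstract does not state the value used:

```python
# Minimal hazard-index arithmetic: HI = average daily dose / reference dose.
doses = {                         # reconstructed cisplatin exposures, mg/kg/day
    "pharmacists": 0.03,
    "oncology ward nurses": 0.02,
    "oncology clinic nurses": 0.04,
}
rfd = 2.3e-4                      # mg/kg/day; assumed, back-calculated placeholder

for group, dose in doses.items():
    print(f"{group}: HI = {dose / rfd:.1f}")   # ~130, ~87, ~174
```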

P.145 Chen, NC*; Yates, JY; Texas A&M University; [email protected] Decision Aiding for Extreme Event Evacuation Evacuating a large population from an impending extreme event is fraught with complexity, uncertainty and risk. Evacuees have to make decisions on route planning and point-of-destination while emergency managers need to ensure the appropriate personnel and infrastructure are available and capable of facilitating the evacuation. In evacuation, individual evacuees are exhibiting an increasing desire to communicate with family, friends and the local/state/federal authorities in-situ. We develop an agent-based simulation model to examine the impact of communication within social connections and emergency management during regional evacuations. We show how improved communication among evacuees impacts the evacuation process and we demonstrate how this knowledge can lead to improved public evacuation management. Furthermore, to better enable evacuee communication during an event, we formulate a time-dependent discrete optimization model to determine the location of telecommunications equipment as well as the assignment of evacuees to equipment.
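
As a loose illustration of the equipment-location side of this optimization (the actual model is time-dependent and discrete, solved far more carefully), a greedy coverage heuristic over hypothetical zones might look like:

```python
# Hypothetical miniature of locating k telecommunications units to maximize
# the number of evacuees covered; a greedy set-cover heuristic, not the
# authors' formulation.
evacuees = {"A": 120, "B": 80, "C": 200, "D": 60}   # people per zone (invented)
coverage = {"site1": {"A", "B"},                     # zones each site can serve
            "site2": {"B", "C"},
            "site3": {"C", "D"}}
k = 2

candidates, chosen, covered = set(coverage), [], set()
for _ in range(k):
    # pick the site with the largest marginal gain in covered evacuees
    best = max(candidates, key=lambda s: sum(evacuees[z] for z in coverage[s] - covered))
    chosen.append(best)
    candidates.remove(best)
    covered |= coverage[best]

print(chosen, "covering", sum(evacuees[z] for z in covered), "evacuees")
```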

W3-D.2 Chen, Y*; Dennis, S; McGarry, S; Food and Drug Administration; [email protected] FDA’s Risk Assessment Model for Designating High-Risk Foods Pertaining to Product Tracing Required by FSMA Section 204 of the Food Safety Modernization Act (FSMA) requires FDA to designate high-risk foods for which additional record keeping requirements are appropriate and necessary to protect the public health. FSMA outlines the parameters that FDA should use in developing a high-risk foods (HRF) list. These parameters include: (1) the known safety risks of a particular food, (2) the likelihood that a particular food has a high potential risk for microbiological or chemical contamination or would support growth of pathogenic microorganisms, (3) the point in the manufacturing process where contamination is most likely to occur, (4) the likelihood of contamination and steps taken during the manufacturing process to reduce the possibility of contamination, (5) the likelihood that consuming a food will result in a foodborne illness, and (6) the likely or known severity of a foodborne illness attributed to a particular food. A predictive, data-driven risk assessment model with seven criteria that encompass the many factors required by FSMA is being developed. Data from multiple sources are used in implementing the model and provide a list of candidate food-hazard pairs for consideration in designating high-risk foods. The presentation will focus on current Agency thinking and stakeholder input on the development of the HRF list.
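
Conceptually, such a model scores each food-hazard pair against weighted criteria. The sketch below is a hypothetical miniature with invented weights and scores, and fewer criteria than the model's seven:

```python
# Hedged sketch of criteria-based scoring for ranking food-hazard pairs.
# Criteria names, weights, and 1-9 scores are placeholders, not FDA's model.
weights = {"severity": 0.25, "illness likelihood": 0.25,
           "contamination potential": 0.20, "growth potential": 0.15,
           "process vulnerability": 0.15}

pairs = {
    ("leafy greens", "E. coli O157:H7"):
        {"severity": 7, "illness likelihood": 6, "contamination potential": 8,
         "growth potential": 5, "process vulnerability": 6},
    ("canned soup", "Salmonella"):
        {"severity": 6, "illness likelihood": 2, "contamination potential": 2,
         "growth potential": 1, "process vulnerability": 2},
}

for pair, scores in pairs.items():
    total = sum(weights[c] * s for c, s in scores.items())
    print(pair, "-> risk score", round(total, 2))
```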

T2-C.2 Cheung, C; Friesen, S*; Government of Canada; [email protected] The federal all hazards risk assessment: integrating strategic risk into emergency management planning – a Canadian perspective The purpose of this presentation is to describe the All Hazards Risk Assessment (AHRA) framework that was developed collaboratively between Public Safety Canada (PS) and Defence Research and Development Canada’s Centre for Security Science (DRDC CSS). Launched as a pilot in 2010, the AHRA methodology considers public safety and security threats and hazards, both malicious and non-malicious, of national significance and addresses them using a holistic, cross-government approach involving 25 federal departments and agencies. The AHRA provides a framework that other government departments may populate. The AHRA is a Canadian innovation that uses a comprehensive analytical approach towards scenario selection and development, risk scoring and analysis, and risk evaluation. This presentation discusses how the results from this process are integrated broadly amongst federal institutions. It illustrates how the AHRA has become a key component for ensuring a long-term, viable strategy, whereby federal institutions participate in an annual cycle to evaluate priority threats and hazards to Canada and validate their emergency management plans in relation to the identified scenarios. This presentation concludes by discussing the results from the third cycle of development, the merging of risk assessments with a broader capability assessment methodology, and the exploratory work to build a National AHRA Framework with Provinces, Territories and Regions.


P.115 Chiang, SY*; Chang-Chien, GP; Horng, CY; Wu, KY; China Medical U., Taiwan; [email protected] Dietary, occupational, and ecological risk assessment of carbaryl and dimethoate The application of pesticides may cause adverse impacts on human users and environmental receptors. We carried out consumer, occupational, and ecological risk assessments of two commonly used pesticides, carbaryl (a carbamate) and dimethoate (an organophosphate). For the consumer health risk assessment, based on the current tolerance and highest residue levels in crops, fruits and vegetables, the non-carcinogenic risk index (hazard index, HI) of carbaryl, but not dimethoate, was less than 1. Further analysis using the Monte Carlo simulation method showed that the means and upper 95% confidence limits of total HIs for carbaryl and dimethoate in different crops did not exceed one. For the occupational exposure risk assessment, the distributions of pesticide exposure were assessed by HPLC analysis of personal air sampling tubes and patches from 27 workers and 16 farmers. Some of the 95% confidence limits of total HIs for carbaryl and dimethoate were larger than 1, suggesting the importance of strengthening personal protective measures at work. The results from the ecological risk assessment show that carbaryl poses a potential risk to aquatic insects, but not to fishes and frogs, whereas dimethoate has significant ecological hazard effects on minnows, stoneflies, and frogs. These results can serve as a reference for the government's pesticide administration decisions.

P.48 Chikaraishi, M*; Fischbeck, P; Chen, M; The University of Tokyo; [email protected] Exploring the concept of transportation systems risks In present-day society, transport modes such as transit and automobiles are indispensable tools required to maintain a minimum level of wellbeing. Their importance spans the urban-rural dimension. In the U.S., national, state and local governments have generally assumed the responsibility of developing and maintaining the infrastructure of transportation systems (e.g., highways and mass transit rail). The balance between mass transit availability and private vehicle use varies dramatically. In areas with effective mass transit systems, personal vehicle use can, in many cases, be considered voluntary, thus making the associated risk voluntary as well. However, in areas without mass transit, personal vehicle use is a necessity and a large portion of the associated risk is essentially involuntary. Despite the varying characteristics of personal vehicle risks, most traffic risk studies have focused solely on personal vehicle risks (e.g., the number of fatalities or injuries per unit of vehicle travel). In this study, we first propose an alternative transportation risk measure that focuses on the accident risks of the entire system. We particularly argue that the proposed system-risk measure is much more appropriate for policy discussions over the allocation of scarce resources to improve safety. Understanding the impact of shifting personal vehicle risks from involuntary to voluntary by making mass transit more readily available changes the framing of the problem. In this study we compare differences in vehicle and system risks across multiple exposure measures (i.e., per mile, per trip, per driver) for urban, suburban, and rural areas using data from the National Household Travel Survey, the Fatality Analysis Reporting System, and the American Community Survey.

W3-E.4 Clewell, HJ*; Yoon, M; Wu, H; Verner, MA; Longnecker, MP; The Hamner Institutes for Health Sciences, RTP, NC, Harvard Medical School, Boston, and NIEHS, RTP, NC; [email protected] Are epidemiological associations of higher chemical concentrations in blood with health effects meaningful? A number of epidemiological studies have reported associations of higher blood concentrations of environmental chemicals such as perfluoroalkyl acids (PFAAs) and PCBs with a variety of health effects, including low birthweight, delayed onset of menarche, and early onset of menopause. However, the effects of physiological changes during these life stages on the kinetics of a particular chemical are complex and difficult to elucidate without the quantitative structure provided by a physiologically-based pharmacokinetic (PBPK) model. To address the question of whether associations between age of menarche/menopause and chemical concentration can be explained by pharmacokinetics, we have developed human PBPK models that incorporate age-dependent physiological changes. We present two examples of how PBPK models can be used to evaluate associations in epidemiological studies between concentrations of a chemical in blood and physiological outcomes: (1) PFAAs and age at menarche, and (2) PCBs and birthweight. The models indicate that the relationships between blood levels and health outcomes can be explained on the basis of pharmacokinetics rather than toxic effects. In both cases the internal chemical exposure is driven by physiological changes. In the case of PFAAs, menstruation is an important route of excretion; therefore, onset of menstruation links chemical concentration in blood to age at menarche. In the case of PCBs, differing degrees of maternal weight gain during pregnancy, and the resulting variation in fat volumes in pregnant women, serve as a hidden variable underlying the apparent relationship between birthweight and chemical concentration in blood. Many other examples exist of epidemiologic associations between exposure biomarker concentrations and outcomes that, by using PBPK models, may be explainable on the basis of chemical kinetics rather than causality.

M4-A.4 Clewell, HJ*; Gentry, PR; Yager, JW; Hamner Institutes for Health Sciences, RTP, NC, ENVIRON International, Monroe, LA, and University of New Mexico, Albuquerque, NM; [email protected] A risk assessment approach for inorganic arsenic that considers its mode of action Despite the absence of a complete understanding of mechanism(s) underlying the carcinogenicity and toxicity of inorganic arsenic, an alternative to the default linear extrapolation approach is needed that is more consistent with the available evidence suggesting a nonlinear dose-response. Contributory factors to the mode of action for arsenic carcinogenesis include DNA repair inhibition under conditions of oxidative stress, inflammatory and proliferative signaling, leading to a situation in which the cell is no longer able to maintain the integrity of its DNA during replication. It has been suggested that the dose-response for cancer risk assessments could be based on quantitation of molecular endpoints, or “bioindicators” of response, selected on the basis of their association with obligatory precursor events for tumorigenesis (Preston, 2002). We have applied this approach to inorganic arsenic using benchmark dose (BMD) modeling of gene expression changes in human target cells (uroepithelial cells) treated with arsenic in vitro. The BMDs for cellular gene expression changes related to the carcinogenic mode of action for arsenic were used to define the point of departure (POD) for the risk assessment. The POD was then adjusted based on data for pharmacokinetic and pharmacodynamic variability in the human population.
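
The benchmark dose step described above amounts to fitting a concentration-response model to the in vitro gene expression data and solving for the concentration at a chosen benchmark response. A generic sketch with synthetic data and an assumed Hill model:

```python
# Generic BMD sketch: fit a Hill model to a concentration-response and solve
# for the concentration producing a 10% change over baseline. The response
# values are synthetic, and the Hill form is an assumption for illustration.
import numpy as np
from scipy.optimize import curve_fit, brentq

def hill(c, base, vmax, ec50, n):
    return base + vmax * c**n / (ec50**n + c**n)

conc = np.array([0.0, 0.01, 0.1, 0.5, 1.0, 5.0, 10.0])  # uM arsenite
resp = np.array([1.0, 1.02, 1.1, 1.45, 1.8, 2.4, 2.5])  # fold-change (synthetic)

(base, vmax, ec50, n), _ = curve_fit(
    hill, conc, resp, p0=[1.0, 1.5, 1.0, 1.0],
    bounds=([0, 0, 1e-3, 0.3], [10, 10, 100, 5]))

bmr = base * 1.10  # benchmark response: 10% over baseline
bmd = brentq(lambda c: hill(c, base, vmax, ec50, n) - bmr, 1e-6, 10.0)
print(f"BMD(10%) = {bmd:.3f} uM")
```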


W3-B.4 Clewell III, HJ*; Andersen, M; Gentry, PR; The Hamner Institutes for Health Sciences, ENVIRON International Corporation; [email protected] Pharmacokinetics of formaldehyde and the impact of endogenous levels on uptake Formaldehyde is present endogenously, with measurable concentrations in the tissues and exhaled breath of humans. While formaldehyde is highly water soluble and readily absorbed in the respiratory tract, there is little evidence of systemic delivery. The presence of endogenous concentrations of formaldehyde adds complexity to understanding the contribution of exogenous exposure to target tissue concentrations that may result in adverse effects. Computational fluid dynamic (CFD) modeling has been conducted to investigate the impact of the presence of endogenous formaldehyde on the absorption of exogenous formaldehyde from the nasal cavity of rats, monkeys, and humans. Based on the CFD modeling, exogenous exposures to concentrations below 0.2 ppm via inhalation are not expected to cause increases in tissue concentrations above background, even at the site of contact. At high exposure concentrations, formaldehyde concentrations are much greater in the air than in the tissue, which leads to rapid absorption in the anterior nasal passages due to the high rate of formaldehyde partitioning into nasal tissues. At low exposure concentrations, however, the concentration gradient between air and tissue is greatly reduced due to the presence of endogenous formaldehyde in nasal tissues, leading to reduced tissue dose. Biologically-based dose-response models have previously been developed to characterize both the toxicokinetics and toxicodynamics of formaldehyde in the rat and human; however, these models do not consider endogenous production of formaldehyde, and thus their results at low concentrations may be questionable. Newer PK models can now track cellular formaldehyde and the differential formation of DNA adducts from both endogenous and exogenous formaldehyde. These results suggest that understanding endogenous concentrations of a compound such as formaldehyde is of critical importance in characterizing the shape of the dose-response curve in the low-dose region for risk assessment.
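
The uptake argument can be illustrated with a toy flux calculation: net transfer into tissue scales with the gradient between the air concentration and the air-equivalent endogenous tissue concentration. All values below are illustrative placeholders, not CFD model parameters:

```python
# Toy gradient-driven uptake: net flux ~ k * (C_air - C_tissue / P).
# When exogenous air levels fall below the air-equivalent endogenous tissue
# level (C_tissue / P), net uptake goes to zero or reverses.
def net_flux(c_air_ppm, c_tissue_endog, partition, k=1.0):
    """Net flux into tissue, arbitrary units (illustrative)."""
    return k * (c_air_ppm - c_tissue_endog / partition)

c_endog = 0.4  # air-equivalent endogenous tissue level (illustrative)
P = 2.0        # tissue:air partition coefficient (illustrative)

for c_air in [0.1, 0.2, 1.0, 6.0]:
    print(f"{c_air:4.1f} ppm exposure -> net flux {net_flux(c_air, c_endog, P):+.2f}")
# With these placeholders the flux is zero at 0.2 ppm, echoing the abstract's
# conclusion that exposures below ~0.2 ppm add little above background.
```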

W4-J.1 Coglianese, C; University of Pennsylvania; [email protected] Moving Forward in Looking Back: How to Improve Retrospective Regulatory Review Since 1982, the U.S. federal government has implemented an extensive regulatory review process that considers the costs and benefits of proposed regulations before they are adopted. By contrast, the federal government has little by way of institutionalized review of regulations' costs and benefits after they have been adopted. The Obama Administration, like several earlier administrations, has adopted an ad hoc retrospective review of existing regulations, or what it calls "regulatory look-back." This paper assesses the Obama regulatory look-back initiative (what it has undertaken and accomplished) and analyzes additional steps that could be taken to improve, and even institutionalize, the practice of retrospectively evaluating existing risk regulations.

T3-J.1 Coglianese, C*; Carrigan, C; University of Pennsylvania; [email protected] Why Politicians Think Regulation Kills Jobs...When Economists Don't This paper juxtaposes the intense political controversy over the connection between jobs and regulation with the rather benign findings of the economics literature. For many politicians, regulations are "job-killers," while most of the economic research suggests that regulation yields no substantial net change in employment. This paper analyzes and explains the disjunction, showing that politicians respond to the distribution of job impacts while economists consider aggregate effects. For many years, the impact analyses that accompanied new regulatory proposals paid little attention to employment effects because the economists' view prevailed. Since the 2008 recession, however, the political process has demanded a more explicit focus on job impacts. This paper considers the rationale for job impact analysis of new risk regulations from the standpoint of democratic theory.

T1-B.2 Cogliano, V; US Government; [email protected] Enhancing IRIS: Progress to Date and Future Actions EPA has made numerous enhancements to its Integrated Risk Information System (IRIS) program, which evaluates scientific information on health effects that may result from exposure to environmental contaminants. These enhancements are intended (1) to improve the fundamental science of IRIS assessments, (2) to improve the productivity of the program, and (3) to increase transparency so issues are identified and debated early. The IRIS program has also enacted measures to ensure that the assessments it undertakes will be important to public health and to EPA’s priorities. As part of improving the fundamental science of IRIS assessments, the IRIS program is adopting principles of systematic review in all its new assessments. An expanded section on hazard identification will identify all health hazards where there is credible evidence of an effect. Assessments will include toxicity values for multiple health effects, increasing their utility in subsequent risk analyses and decisions. Before work on an assessment begins, conversations with EPA’s program and regional offices will ensure that IRIS assessments meet their varied needs, and input from external stakeholders will help in formulating the problems to be addressed in the assessment. Improved science, improved productivity, and improved transparency – these will be evident in the new, enhanced IRIS program. The views expressed in this abstract do not necessarily represent the views or policies of the U.S. Environmental Protection Agency.


M4-A.3 Cohen, SM; University of Nebraska Medical Center; [email protected] A common mode of action for arsenical toxicity Inorganic arsenic increases the risk of cancer in humans, primarily of the urinary bladder, skin, and lung. Systematic investigation of the mode of action of urinary bladder carcinogenesis in rats and mice strongly supports a mode of action involving generation of reactive trivalent arsenicals which bind to sulfhydryl groups of critical proteins in the target cells, leading to cytotoxicity and consequent regenerative proliferation, increasing the risk of cancer. Arsenicals are not DNA reactive. Evidence for indirect genotoxicity indicates that it occurs only at extremely high concentrations in vitro or in vivo. Intracellular inclusions that occur in mice and humans, similar to those observed with other metals, have been mistaken for micronuclei in a variety of epidemiology studies. Although the evidence for cytotoxicity and regenerative proliferation as the mode of action is strongest for the urinary bladder, evidence is accumulating that a similar process occurs in the lung and skin, and is likely for other epithelial cell systems and for other, non-cancer effects. This mode of action is consistent with a nonlinear dose response with a threshold. In rodents, the no-effect level is 1 ppm of the diet or drinking water, and in vitro the no-effect level is greater than 0.1 µM trivalent arsenic. Administration of inorganic arsenic at doses above 1 ppm is necessary to generate urinary or tissue concentrations above 0.1 µM. In effect, arsenicals produce a preneoplastic lesion, toxicity and cell death with regenerative proliferation, leading to increased risk of cancer if exposure continues over time. This mode of action in animals is consistent with inorganic arsenic epidemiology and other studies in humans for these cell types. Reaction of trivalent arsenicals with critical sulfhydryl groups in target cells is the basis for inorganic arsenic toxicity for both cancer and non-cancer effects.

W3-H.4 Coles, JB*; Zhuang, J; University at Buffalo; [email protected] Ideal Disaster Relief?: Using the IFRC Code of Conduct in model development Bridging the gap between research and practice has been a recognized problem in many fields, and has been especially noticeable in the field of disaster relief. As the number and impact of disasters have increased, there has been great interest from the research community to model and provide solutions for some of the challenges in the field. However, this research has not always been guided by an understanding of the complex and nuanced challenges faced by people working in disaster relief. In this talk we propose a model for the International Federation of Red Cross and Red Crescent Societies (IFRC) Code of Conduct (CoC) for use in relief operations. The CoC provides organizations involved in disaster relief with a clear set of expectations and objectives for behavior in a relief operation. The CoC is nonbinding and is designed to help organizations self-assess and refocus disaster relief operations. Additionally, the code provides a list of standards that could be used to assess a potential partner to ensure operational excellence and make sure that investments in relief are conducted with the utmost integrity. Though there are several standards and codes that apply to aid and disaster relief, the CoC is of particular interest because it examines the methodology of operations rather than just a minimum goal (such as the SPHERE standards) or an overarching philosophy (such as the Seven Fundamental Principles of the IFRC).

P.142 Coles, JB*; Zhuang, J; University at Buffalo; [email protected] Model Validation in Disaster Relief Partner Selection and Maintenance In this research we study how optimization, simulation, and game theory models could help agencies make better decisions after a disaster. To better understand the behavioral dynamics of interagency interaction, we interviewed over 60 agencies about their network behavior using an ego-centric approach, and used these data to propose a set of experiments examining agency decision making in disaster relief operations. The full process of network development is complex, but in this poster we focus on the process of partner selection in an environment that is both cooperative and competitive. The partner selection model proposed here was developed from interviews conducted with agencies involved in disaster relief operations in response to the 2010 earthquake in Haiti, the 2011 tornado in Joplin, Missouri, and Hurricane Sandy along the east coast of the United States. The model will be initially validated using student data to provide a granular estimate of how interagency dynamics work. Once the initial validation is complete, we will conduct a secondary validation process with decision makers working in disaster relief agencies.

P.108 COLON, L; MONZON, A; DEMICHELIS, S*; National University of Lanus; [email protected] Contamination risks and effects on suburban areas from a ceramic and tile factory: a case study The aim of this study is to evaluate the contamination produced by a ceramic and tile factory located in the Industrial Park of Burzaco, Almirante Brown County, Buenos Aires Province, Argentina. The industrial area is surrounded by urbanization without any kind of barrier. The factory discharged uncontrolled residual waters from its production process into an artificial pond that is in contact with the population: the pond eventually discharges into a natural stream and, because it was not adequately built, it also affects groundwater. Waste water, soil and air sources of pollution were analyzed. In addition, a neighborhood at risk, which has been surrounded by the growth of the industrial park, is directly affected by particulate material disposed in open areas on the soil and by uncontrolled gaseous emissions. The vulnerability of the population is increased by heavy rains, since runoff transports particulates from accumulation areas and pollutants from the overflowing lagoon. This work presents an integral system of environmental and waste water management that incorporates technological improvements, including an effluent treatment system and an intervention protocol for the administrative and production sectors, in order to guarantee a correct life cycle for the factory's products and reduce risks to inhabitants.


M2-J.2 Colyvan, M; University of Sydney; [email protected] Value of Information Models and Data Collection in Conservation Biology I will look at recent uses of value of information studies in conservation biology. In the past, it has been mostly assumed that more and better quality data will lead to better conservation management decisions. Indeed, this assumption lies behind and motivates a great deal of work in conservation biology. Of course, more data can lead to better decisions in some cases but decision-theoretic models of the value of information show that in many cases the cost of the data is too high and thus not worth the effort of collecting. While such value of information studies are well known in economics and decision theory circles, their applications in conservation biology are relatively new and rather controversial. I will discuss some reasons to be wary of, at least, wholesale acceptance of such studies. Apart from anything else, value of information models treat conservation biology as a servant to conservation management, where all that matters is the relevant conservation management decision. In short, conservation biology loses some of its scientific independence and the fuzzy boundary between science and policy becomes even less clear.
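
The decision-theoretic machinery behind such value-of-information arguments can be made concrete with a small worked example. The sketch below computes the expected value of perfect information (EVPI) for a hypothetical two-action, two-state conservation decision; the payoff table, the prior, and all names are invented for illustration, not taken from the talk.

# Python sketch: expected value of perfect information for a toy
# conservation decision (all numbers assumed).
p_declining = 0.4  # prior probability that the population is declining

# payoffs[action][state]: conservation value of each action in each state
payoffs = {
    "intervene":  {"declining": 80, "stable": 40},
    "do_nothing": {"declining": 10, "stable": 90},
}

def expected_value(action, p):
    return p * payoffs[action]["declining"] + (1 - p) * payoffs[action]["stable"]

# Best attainable expected value acting on current beliefs only
ev_no_data = max(expected_value(a, p_declining) for a in payoffs)

# With perfect information, the best action is chosen in each state,
# and the prior only weights the two states.
ev_perfect = (p_declining * max(payoffs[a]["declining"] for a in payoffs)
              + (1 - p_declining) * max(payoffs[a]["stable"] for a in payoffs))

print(f"EV without data: {ev_no_data:.0f}; with perfect information: "
      f"{ev_perfect:.0f}; EVPI: {ev_perfect - ev_no_data:.0f}")
# If the survey needed to resolve the uncertainty costs more than the EVPI,
# collecting the data is not worth the effort -- Colyvan's central point.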

M2-I.4 Connelly, EB*; Lambert, JH; Thekdi, SA; University of Virginia, University of Virginia, University of Richmond; [email protected] Robust supply chain investments for disaster preparedness and community resilience: An application to Rio de Janeiro, Brazil Effective disaster preparedness and response requires investment in resilient and agile emergency management systems. Meanwhile, resources for emergency supply chains and related operations are scarce. Resource allocations to these systems must consider multiple criteria and deep uncertainties related to population behaviors, climate change, innovative technologies, wear and tear, extreme events, and others. The methods demonstrated in this paper help to prioritize among emergency supply chain investments by employing an integration of scenario analysis and multi-criteria decision analysis. The results will aid emergency management agencies in maintaining and increasing the performance of emergency supply chains and logistics systems. The methods will be applied to disaster reduction initiatives of first-responder agencies in Rio de Janeiro, Brazil, whose general population and favela communities are vulnerable to landslides, blackouts, radiological events, and other hazards, and which will host the World Cup and the Olympics in the next few years.

W2-F.1 Conrad, JW, Jr*; Paulson, G; Reiss, R; Patterson, J; Conrad Law & Policy Counsel; [email protected] Legal context for US federal agency peer reviews Any discussion of federal agency peer reviews must begin with the legal framework that constrains them (and any desired reforms). For agency-administered peer reviews, the principal legal authorities are the Federal Advisory Committee Act, the Ethics in Government Act and even the federal criminal code. For peer reviews administered by agency contractors, federal acquisition regulations largely govern. This dichotomy is increasingly being recognized as problematic by concerned stakeholders. Executive Branch and agency policies are also highly important, particularly OMB’s Peer Review Bulletin. This presentation will lay out the relevant legal framework and then explore current controversies and proposed solutions.

W3-C.4 Convertino, MC*; Liang, SL; University of Minnesota; [email protected] Unveiling the Spatio-Temporal Cholera Outbreak in Cameroon: a Model for Public Health Engineering Cholera is one of the deadliest and most widespread diseases in developing and undeveloped countries worldwide. Education, water sanitation, and human mobility are together the major factors affecting disease spread, and these factors can be exacerbated by unregulated land development and climate change. Here we investigate the 2010 cholera outbreak in the Far North region of Cameroon, which saw 2046 cases of infection, with 241 cases and a fatality rate of 12% (600 deaths) at the peak of infection. In this study, we further develop a metacommunity model predicting the spatio-temporal evolution of the cholera outbreak by incorporating long-term water resource availability and rainfall-event-dependent resources. Susceptible, infected, and recovered individuals are modeled in the region as a function of their mobility and pathogen spread. We apply a novel radiation model of human mobility to better characterize the secondary pathway of transmission. The model is capable of predicting the spatio-temporal evolution and prevalence of the 2010 cholera epidemic with an average accuracy of 88% with respect to the epidemiological data. We find that cholera is a highly heterogeneous and asynchronous process in which multiple drivers have different relative importance in space. Using global sensitivity and uncertainty analysis, we find hydrogeomorphological and social controls on the distribution and emergence of outbreaks in different health districts. In particular, human mobility and the available water resources are predominantly important in urbanized mountain and floodplain regions, respectively. The model predicts cases at a scale that is two orders of magnitude finer than the health district scale, which allows precise healthcare planning and response after the onset. The model is designed to be parsimonious and readily applicable to any country and scale of analysis facing cholera outbreaks. Moreover, because of the generality of its structure, the model can easily be tuned to different pathogen ecology types for waterborne diseases.
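
As a rough illustration of this class of model, the minimal sketch below couples a susceptible-infected-recovered population to a water-reservoir compartment in a single community. All parameter values are invented placeholders rather than the Cameroon estimates; the full metacommunity model would replicate such a block per health district and couple the districts through human mobility and hydrology.

# Python sketch: single-community SIR model with a water compartment
# (illustrative parameters only, not the fitted Cameroon values).
N = 100_000.0        # community size
beta_w = 0.4         # transmission rate from contaminated water (1/day)
xi = 0.1             # per-capita shedding rate into the reservoir (scaled units)
delta = 0.05         # pathogen decay rate in the reservoir (1/day)
gamma = 0.2          # recovery rate (1/day)

S, I, R, W = N - 10.0, 10.0, 0.0, 0.0
dt = 0.1
for _ in range(int(200 / dt)):                 # 200 days, forward Euler
    new_inf = beta_w * S * W / (W + 1.0)       # saturating dose-response on water
    S, I, R, W = (S - new_inf * dt,
                  I + (new_inf - gamma * I) * dt,
                  R + gamma * I * dt,
                  W + (xi * I / N - delta * W) * dt)

print(f"Final attack rate: {R / N:.1%}")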


P.4 Convertino, MC*; Liang, SL; University of Minnesota; [email protected] Food Safety? A Supply Chain Matter: Probabilistic Risk Model based on the Agro-Food Trade Network Food safety is a major issue for the worldwide population. In 2010, foodborne outbreaks in the USA caused costs of $152 billion, related to 325,000 hospitalized persons and 5,000 deaths due to foodborne illness. To fight this increasing trend, a risk-based system built upon data-driven analysis is needed to inform the efficient targeting of efforts to minimize foodborne risks to the US consumer. Here we propose a model for the assessment of the potential total health risk of food based on the food supply chain (FSC) as a subset of the international agro-food trade network. The number of connected countries, the betweenness centrality of the exporting countries, and the average path length are the supply network variables considered. Considering the safety of each country and the network variables, we introduce a global safety index (GSI) for characterizing the riskiness of each country based on local and FSC variables. The intermediary country risk, the food-pathogen health risk, and the company reliability are the second most important factors for the total health risk. Policies that act on both the supply chain variables and the safety index by means of the GSI reduce the average total health risk by 44%. This reduction is much larger than that achieved by policies focused on individual risk factors of the food life-cycle. The proposed FSC model is scalable to any level of the global food system and offers a novel perspective in which global public health is conceived, monitored and regulated.
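
A minimal sketch of how network topology and a per-country safety score might be folded into a single index, in the spirit of the GSI, is shown below. The toy trade graph, the safety scores, and the equal-weight aggregation rule are all assumptions, not the abstract's actual formula.

# Python sketch: combining trade-network metrics with a safety score
# (toy graph and invented scores; the aggregation rule is a placeholder).
import networkx as nx

G = nx.DiGraph()  # edges point from exporter to importer
G.add_edges_from([("BR", "US"), ("BR", "EU"), ("MX", "US"),
                  ("EU", "US"), ("CN", "EU")])

safety = {"BR": 0.7, "MX": 0.6, "EU": 0.9, "CN": 0.5, "US": 0.8}  # 1 = safest

betw = nx.betweenness_centrality(G)   # exporters that bridge many trade routes
out_deg = dict(G.out_degree())        # number of importing partners

def gsi(country):
    # More bridging and more export links mean a safety lapse in this country
    # propagates further through the chain; equal weights are assumed.
    exposure = 0.5 * betw[country] + 0.5 * out_deg.get(country, 0) / (len(G) - 1)
    return safety[country] * (1 - exposure)

for c in sorted(safety, key=gsi):
    print(f"{c}: GSI = {gsi(c):.2f}")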

P.41 Convertino, MC*; Munoz-Carpena, RMC; Kiker, GK; Perz, SP; University of Minnesota; [email protected] Metacommunity Resilience of the Amazon Tropical Forest Facing Human and Natural Stressors Climate extremes and rapid urbanization are stressors that both shape and threaten ecosystems. Thus, questions arise about future scenarios for ecosystems and how we as a society can potentially control ecosystem evolution considering natural variability and human needs. Here we reproduce biodiversity patterns of the Amazon’s MAP (Madre de Dios - Acre - Pando) tropical rainforest affected by the construction of the transoceanic highway and climate change with a neutral metacommunity model at different scales and resolutions. The influence of environmental variability on species loss and richness increases with scale and decreases with tree clumpiness heterogeneity. At the ecosystem scale, drought sensitivity is 37% higher than at the plot scale, where the difference in scales is of seven orders of magnitude. Conversely, the anthropic disturbance posed by the road is much larger at the plot scale, and undetectable at the ecosystem scale because dispersal is not affected. A nontrivial pattern is found between the species cluster size and the persistence time. Bimodal distributions of clumpiness result in highly stable species richness and persistence time distributions. The species persistence time follows a power law whose exponent increases with the magnitude of disturbance. This power law is preserved together with the distribution of tree cover despite changes in the shape of the species richness distribution. We propose the product of the persistence time, its probability of occurrence and the average species cluster size as a measure of the metacommunity risk of ecosystems as a function of their resilience. A spatial resilience index, defined as the ratio of metacommunity risks in the disturbed and undisturbed cases, is calculated to identify the most resilient communities. Our results show that societal development pressure should take the ecosystem's tree distribution into account in order to minimize biodiversity loss and maximize persistence time. The spatial resilience index can be used to plan agricultural and urban expansion that preserves resilient communities.

T5-C.1 Cox, T; Cox Associates and University of Colorado; [email protected] Possible Futures for Risk Analysis Both the field and the journal Risk Analysis have made major contributions to risk-informed policy making in recent years. Demand is high for skilled assessment and communication of risks and uncertainties, creation and validation of trustworthy risk models, less expensive and more effective procedures to implement risk management guidelines, and better principles and methods for deciding what to do when not enough is known to formulate a conventional decision analysis model. Risk analysis is responding to these and other vital conceptual and practical challenges, and will continue to do so. New and rapidly growing areas, such as infrastructure risk analysis, are likely to add to the importance and impact of the journal. At the same time, the very success of the field has created important challenges to its long-term integrity and credibility. Strong demand for risk-informed decision-making has encouraged short-cuts and methods of dubious reliability which are still branded as risk analyses. Among these are use of unvalidated guesses and input assumptions elicited from selected experts; use of unvalidated statistical and computer simulation risk models and risk projections; presentation of authoritative-looking outcome probabilities and confidence intervals which hide key uncertainties about their underlying assumptions; and ad hoc risk scoring, rating, and ranking procedures deployed without critically assessing whether they actually lead to improved risk management decisions. A strong journal will help to overcome these and other methodological and practical challenges. By showcasing excellent work, encouraging the development and application of sound methods, and explaining and exposing important risk analyses to the scrutiny of a community which cares about genuinely trustworthy and valuable analysis, we can dodge the pitfalls and magnify the value of risk analysis in supporting better policy decisions.

M3-C.2 Cox, T; Cox Associates and University of Colorado; [email protected] Adapting Risk Management to Reduce Regret Two principles for choosing among alternative risk management policies are: (a) Seek to maximize ex ante expected social utility (roughly equivalent to expected net benefit); and (b) Seek to minimize ex post regret, defined as the difference between the maximum value (or net benefit) that could have been achieved, as assessed in hindsight, and the value that actually was achieved. We show that these two principles typically lead to different recommended choices, for both individuals and groups, especially when there are uncertainties or disagreements about probabilities or preferences. Under these realistic conditions of conflict and uncertainty, effective policy-making requires learning to make choices that adaptively reduce or minimize regret. Risk-cost-benefit and expected utility maximization principles that instead seek to identify the best next action, using realistically imperfect information, are subject to over-fitting and other biases that typically over-estimate the net benefits from costly interventions. We discuss conditions under which policy-making can be improved by switching from trying to maximize expected net benefits to trying to minimize ex post regret. This change helps to resolve some long-standing difficulties in risk-cost-benefit analysis, such as how to avoid over- or under-discounting of far future effects and how to decide what to do when future preferences and effects of current actions are highly uncertain.
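
The divergence between the two principles is easy to reproduce with a toy payoff table. In the hypothetical sketch below, one policy maximizes expected net benefit while the other minimizes worst-case ex post regret; all states, policies, and numbers are illustrative.

# Python sketch: expected net benefit vs. worst-case ex post regret
# (invented payoff table and state probabilities).
p = {"mild": 0.9, "severe": 0.1}          # state probabilities (assumed)
benefit = {                                # net benefit by policy and state
    "act_now": {"mild": 10, "severe": -40},
    "hedge":   {"mild": 0,  "severe": -10},
}

def expected_benefit(policy):
    return sum(p[s] * benefit[policy][s] for s in p)

def max_regret(policy):
    # regret in state s: best achievable in s minus what the policy achieved
    return max(max(benefit[q][s] for q in benefit) - benefit[policy][s] for s in p)

for pol in benefit:
    print(f"{pol}: E[benefit] = {expected_benefit(pol):+.1f}, "
          f"max regret = {max_regret(pol)}")
# act_now maximizes expected net benefit (+5 vs -1), but hedge minimizes the
# worst-case ex post regret (10 vs 30), so the two principles disagree here.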


W4-J.3 Cox, T; Cox Associates and University of Colorado; [email protected] Have Historical Reductions in Ozone and Fine Particulate Matter Caused Reductions in Mortality Rates? Between 1999 and 2010, levels of air pollutants in Texas counties changed significantly, with fine particulate matter (PM2.5) and coarser particulate matter (PM10) declining by over 25% in some counties, and ozone exhibiting large variations from year to year. This history provides an opportunity to compare changes in average ambient pollutant levels from year to year in different counties to corresponding changes in all-cause, cardiovascular, and respiratory mortality rates. We test the hypothesis that changes in historical pollution levels caused, or help to predict, changes in mortality rates. The hypothesis of a significant linear relation between changes in pollutant concentrations from year to year and corresponding changes in all-cause, cardiovascular, or respiratory disease risks is not supported by the historical data from Texas counties. Nonparametric tests (Spearman’s rank correlations) also show no significant ordinal associations between yearly changes in pollutant levels and corresponding changes in disease mortality rates. These findings suggest that the substantial short-term benefits of incremental life-saving from reducing PM2.5 and ozone, predicted by the United States Environmental Protection Agency (EPA) and others based on statistical models of exposure-response associations elsewhere, may not hold in Texas counties. This possibility emphasizes both the potential for heterogeneity in air pollution health effects across regions, and the high potential value of information from accountability research that compares model-based predictions of health benefits from reducing air pollutants to historical records of what has actually occurred.
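
The nonparametric test described here is simple to reproduce. The sketch below runs Spearman's rank correlation on synthetic county data in which PM2.5 declines steadily while mortality fluctuates independently of it; the real analysis used Texas county records rather than simulated series.

# Python sketch: Spearman correlation of year-to-year changes
# (synthetic data standing in for one county's 1999-2010 records).
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
years = 12
pm25 = 14 - 0.3 * np.arange(years) + rng.normal(0, 0.8, years)  # declining trend
mortality = 900 + rng.normal(0, 15, years)                      # no pollutant signal

rho, pval = spearmanr(np.diff(pm25), np.diff(mortality))        # yearly changes
print(f"Spearman rho = {rho:.2f}, p = {pval:.2f}")
# A rho indistinguishable from zero is the pattern the abstract reports for
# the actual Texas data: pollutant declines did not predict mortality changes.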

T1-A.3 Cuite, CL*; Hallman, WK; Rutgers, The State University; [email protected] Social Media and Food Crisis Communication This study explores how to use social media to effectively communicate with the public about food risks. Using an Internet-based survey with a national sample of 1,904 adults, we tested how the format of a URL affects how people respond to a message, and tested the effectiveness of different sources of a social media message. We randomly assigned participants to see one of four “tiny” URLs at the end of a social-media-style message (in which no source was identified) concerning a food contamination event. The URLs are from actual government tweets, two with usa.gov (go.usa.gov/YXNC and 1.usa.gov/RvvLKI) and two with random text (is.gd/FfQIDl and ow.ly/emXG7). The two URLs with “usa.gov” were significantly more likely to be perceived as being from the government (F(3, 1167) = 20.78, p < .001) and less likely to be seen as a hoax (F(3, 1175) = 11.32, p < .001), and respondents were more likely to say that they would click on the link to seek more information (F(3, 1182) = 13.65, p < .001). To test source effects, we randomly assigned participants to see a second social media message that was identified as coming from either the company involved in a foodborne illness outbreak, MSNBC, Fox News, the state police, US DHS, or US FDA. There was a significant effect of source on all related dependent variables (understanding: F(6, 1125) = 3.94, p < .001; perceived accuracy: F(6, 1125) = 6.17, p < .001; likely to avoid: F(6, 1125) = 3.42, p < .002; authenticity: F(6, 1125) = 6.75, p < .001). Identifying the company involved in the outbreak, and not identifying the source of the message, were least effective in terms of perceived accuracy, understanding, and plans to avoid the contaminated food. These ratings increased with attribution to media sources, and increased even more for government sources. These findings are important and actionable because they clearly suggest that social media messages are most likely to be useful when clearly identified as coming from the government.

T4-E.1 Dale, A*; Barton, L; Therezien, M; Lowry, G; Casman, E; Carnegie Mellon University (authors 1,4,5), Duke University (authors 2,3); [email protected] An extensible multi-compartment model for nanoparticle risk assessment Within the past decade, nanotechnology has grown into a multi-billion dollar industry. Metal and metal oxide nanoparticles (NPs) represent the largest class of NPs by production volume and are found in many everyday items including paints, sunscreens, clothing, and cosmetics. Increased production and use imply greater potential for environmental release, which is troubling in light of the known toxicity of many metals. Unfortunately, the complex transport behaviors (e.g., aggregation) and surface-area dependent transformations exhibited by NPs have, to date, hindered the development of appropriate exposure models for risk assessment. Powerful and flexible new computational tools are needed to inform government and industry decision-making for the safe adoption of nanotechnology. We present the first generation of a multi-compartment modeling tool to predict the fate and transport of metal and metal oxide NPs released into sewer systems at end-of-life, focusing first on nanosilver (AgNP). The underlying mass balance framework links modules representing a wastewater treatment plant (WWTP), agricultural soils (land application unit, or LAU), surface waters, and sediments. Modules are dynamic and track daughter products (Ag+, Ag2S, Ag=POC, and Ag=FeOOH) as well as pristine Ag(0) NPs and sulfidized AgNPs. Parameter values for the time-dependent transport and transformation of AgNPs in the WWTP and LAU are determined experimentally. Settling behavior in the surface water/sediment module is compared to predictions from a more complex heterogeneous aggregation model based on the Smoluchowski equation. Preliminary findings suggest AgNPs released to the environment via WWTP biosolids and effluent will persist and accumulate in agricultural soils and aquatic sediments in a relatively non-toxic form (Ag2S) compared to Ag(0), though slow release of toxic silver ion will occur under specific environmental conditions and biouptake of Ag2S nanoparticles may result in exposures. Regulatory implications are discussed.
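
A much-reduced sketch of the mass-balance bookkeeping such a multi-compartment model performs is given below: pristine AgNPs enter a treatment plant, sulfidize at an assumed first-order rate, and partition between biosolids (routed to soil) and effluent (routed to surface water). Every rate constant and the partition split are invented placeholders, not the experimentally determined parameters the abstract describes.

# Python sketch: toy two-compartment mass balance for AgNP releases
# (all rates and fractions assumed; not the paper's fitted parameters).
import math

inflow = 1.0         # kg/day of pristine AgNP entering the WWTP
residence = 1.0      # days of treatment before release
k_sulf = 3.0         # 1/day first-order sulfidation rate in the plant
f_biosolids = 0.9    # fraction partitioning to biosolids; rest leaves in effluent

frac_ag2s = 1.0 - math.exp(-k_sulf * residence)   # converted to Ag2S in the plant

soil = {"Ag2S": 0.0, "Ag0": 0.0}    # receives biosolids (land application unit)
water = {"Ag2S": 0.0, "Ag0": 0.0}   # receives effluent (surface water/sediment)
for _ in range(365):
    for pool, frac in ((soil, f_biosolids), (water, 1.0 - f_biosolids)):
        pool["Ag2S"] += inflow * frac * frac_ag2s
        pool["Ag0"] += inflow * frac * (1.0 - frac_ag2s)

print(f"soil: {soil['Ag2S']:.0f} kg Ag2S, {soil['Ag0']:.0f} kg Ag(0); "
      f"water: {water['Ag2S']:.1f} kg Ag2S, {water['Ag0']:.1f} kg Ag(0)")
# Most mass accumulates in soils in the sulfidized (less toxic) form, which is
# the qualitative pattern the abstract reports.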

P.133 Datko-Williams, L*; Young, B; Wilkie, A; Madden, M; Dubois, JJ; Wichers Stanek, L; Johns, D; Oesterling Owens, B; U.S. Environmental Protection Agency; U.S. Centers for Disease Control and Prevention; [email protected] Quantitative assessment of in vivo toxicological interactions from criteria pollutant mixtures containing oxides of nitrogen The U.S. EPA sets National Ambient Air Quality Standards (NAAQS) for individual criteria air pollutants by evaluating sources, ambient concentrations, and health impacts. The Agency recognizes that air pollution exists as a complex mixture; however, biological interactions between mixture components are not well characterized. We reviewed literature cited in EPA’s Integrated Science Assessments and Air Quality Criteria Documents to identify studies of criteria pollutant mixtures and in vivo toxicological interactions. The current analysis considered mixtures containing oxides of nitrogen (NOX) and all health endpoints, although most studies focused on mixtures of NOX + O3 (ozone) and respiratory system effects. Studies with complete response data (mean, variance, n observations) for each treatment group were included in a quantitative analysis of the relationship between combined effects and individual component effects in the mixture. For each endpoint, the interaction was categorized as additive, greater than additive, or less than additive. Additivity was defined as the absence of a statistical difference between the sum of responses to individual pollutants and the response to the mixture of pollutants. Departures from additivity were tested using analysis of variance (H0: combined effects = sum of individual effects, p = 0.05). At least one endpoint deviated from additivity in all animal studies (n=17), while the majority of endpoints in the human studies (n=9) were additive. When studies were compared, no pattern among endpoints or exposure conditions emerged. Thus, this analysis suggests that deviations from additivity exist in health impacts of criteria air pollutant mixtures, although most endpoints and mixtures were not different from additive. The views expressed in this abstract are those of the authors and do not necessarily represent the views or policies of the U.S. EPA.
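
The additivity screen can be illustrated with the contrast test sketched below, which compares the mixture response against the sum of the individual-pollutant responses using group means, SDs, and sample sizes. The treatment-group numbers are made up, and the published analysis used analysis of variance on the reported data rather than this normal-approximation shortcut.

# Python sketch: testing H0 "mixture effect = sum of individual effects"
# from summary statistics (invented numbers).
import math
from scipy import stats

# (mean, sd, n) per treatment group
control = (100.0, 8.0, 10)
nox = (110.0, 9.0, 10)
o3 = (118.0, 9.5, 10)
mix = (138.0, 11.0, 10)

def effect(g):                    # response relative to control
    return g[0] - control[0]

def var_mean(g):                  # variance of a group mean
    return g[1] ** 2 / g[2]

# The contrast mix - nox - o3 + control involves all four group means,
# so all four variances enter its standard error.
contrast = effect(mix) - (effect(nox) + effect(o3))
se = math.sqrt(var_mean(mix) + var_mean(nox) + var_mean(o3) + var_mean(control))
z = contrast / se
p = 2 * stats.norm.sf(abs(z))
print(f"contrast = {contrast:.1f}, z = {z:.2f}, p = {p:.3f}")
# Here p > 0.05, so the endpoint is classed as additive; a significant positive
# contrast would be greater than additive, a negative one less than additive.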


W2-D.4 Davidson, VJ*; Kenny, MF; Fazil, A; Cahill, S; Clarke, R; VJD University of Guelph, MFK Food and Agriculture Organization of the United Nations, AF Public Health Agency of Canada, SC FAO, RC FAO; [email protected] MCDA-ranking of food safety issues to inform policy-makers in Uganda In 2012, the Food and Agriculture Organization of the United Nations (FAO) initiated a project to develop improved tools for formulating food safety policies based on broad consideration of multiple risk factors. The approach has been to work at the country level initially, and the first pilot country study is taking place in Uganda in parallel with a study being conducted by the World Health Organization Foodborne Disease Burden Epidemiology Reference Group (FERG) and the Ministry of Health, Uganda. Evidence developed by the WHO/FERG study about health risks is integrated with information about economic, social and food security risks that are associated with foodborne hazards in Uganda. A series of workshops has been conducted to develop the risk criteria and metrics that are relevant in Uganda. Multi-criteria decision analysis (MCDA) tools are used to prioritize food safety issues resulting from biological and chemical hazards in different food sources. The MCDA tools provide transparency in terms of the factors considered and the weighting of individual risk criteria. The overall goal of the FAO project is to bring together rigorous evidence about public health, economic, social and food security risks as a solid foundation for developing food safety policies in Uganda. Experiences from Uganda as a pilot country will illustrate the feasibility and use of multi-criteria approaches in a developing country context and inform the development of FAO guidance and tools with global application.
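
For readers unfamiliar with the mechanics, a weighted-sum MCDA ranking of the kind described can be sketched in a few lines; the criteria, weights, issues, and scores below are placeholders rather than the Uganda workshop values.

# Python sketch: weighted-sum MCDA ranking (all weights and scores invented).
weights = {"health_burden": 0.40, "economic_loss": 0.25,
           "food_security": 0.20, "social_impact": 0.15}

# 0-100 scores per food safety issue on each criterion
issues = {
    "aflatoxins in maize": {"health_burden": 80, "economic_loss": 70,
                            "food_security": 85, "social_impact": 60},
    "Salmonella in poultry": {"health_burden": 65, "economic_loss": 50,
                              "food_security": 40, "social_impact": 45},
}

def score(issue):
    return sum(weights[c] * issues[issue][c] for c in weights)

for name in sorted(issues, key=score, reverse=True):
    print(f"{name}: {score(name):.1f}")
# The transparency claimed for MCDA lives here: the weights and per-criterion
# scores are explicit, so stakeholders can see exactly what drives the ranking.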

M3-G.1 Demuth, JL; NCAR and CSU; [email protected] Examining the role of personal experience on weather risk perceptions and responses As Hurricane Sandy took aim at New Jersey in October 2012, many residents likely recalled their experiences with Tropical Storm Irene, which made landfall nearby only a year earlier. This is but one example of a hazardous weather situation in which one’s past hazard experience is a relevant and potentially critical factor that influences how one responds to a future weather risk. Hazardous weather is common relative to risks posed by many other types of natural hazards (e.g., earthquakes, wildfires, tsunamis), offering people many opportunities to build reservoirs of experience with forecasts of an event as well as with the event itself, which they can then apply to future threats. It is generally thought that past experience influences one’s recognition of, perceptions of, and beliefs about a risk, which increase one’s behavioral motivation and intentions to protect oneself, and thereby one’s actual response behaviors. Yet the empirical literature reveals mixed findings, with experience having a positive, negative, or no influence. Part of the reason for these mixed results may be that past hazard experience has been both simply and inconsistently conceptualized and measured. This presentation will briefly summarize how past hazard experience has been operationalized in the context of tornado, hurricane, and flood risks, and how experience has been shown empirically to relate to people’s risk perceptions and responses. Then, it will suggest a fuller, more valid way of characterizing this important risk construct.

T1-B.3 Denison, R; Environmental Defense Fund; [email protected] IRIS Improvements: Getting the Balance Right in Scientific Quality, Timeliness, Stakeholder Engagement and Peer Review Over the past several years, EPA’s IRIS program has been plagued with controversy, whipsawed by the differing demands and priorities of various stakeholders. As it struggles to regain its footing, IRIS must undertake the difficult task of striking the right balance between three sets of competing objectives: achieving acceptable scientific quality versus ensuring timeliness of its assessments; providing sufficient transparency and “due process” versus ensuring its process delivers balanced stakeholder input; and tapping necessary expertise versus avoiding conflicts of interest in peer review of its assessments. In this presentation I will first describe how the problems IRIS has faced in recent years can be traced to a lack of balance across these three sets of competing objectives. I will then examine recent enhancements in the IRIS program through the lens of how well they achieve the needed balance.

T2-E.2 Dennerlein, T; Rodriguez, D; MacDonald-Gibson, J*; University of North Carolina at Chapel Hill; [email protected] Predicting the Effects of Urban Design on Public Health: A Case Study in Raleigh, North Carolina The rise in obesity and chronic diseases in the United States has been partially attributed to decreased physical activity from lack of pedestrian-friendly urban designs. This project presents a framework to predict the health benefits of neighborhood designs that increase physical activity. The framework employs the principles of risk assessment, using measures of the built environment in place of the pollutant doses employed in conventional environmental pollutant risk assessments. We constructed a model to simulate pedestrian “exposure” to urban form characteristics associated with increased physical activity and then to connect this exposure to “dose-response” functions that relate physical activity to health outcomes. We demonstrate the model in a case study neighborhood in Raleigh, North Carolina, for which the City Council recently commissioned a new small-area plan intended to promote non-motorized transportation. Like much of Raleigh, this neighborhood is experiencing rapid growth; the current population of 10,400 is projected to reach 59,750 by 2040. We estimate that proposed built environment modifications would double the time neighborhood residents spend walking for transportation, when compared to current conditions. By the year 2040, this increased physical activity is expected to decrease the annual number of premature deaths by 6 (95% CI: 3-8) and the annual cases of four chronic diseases by 680 (95% CI: 400-900). This project is the first to provide a quantitative prediction of the health impacts of urban design in North Carolina. It shows that simple urban design changes could significantly improve the health of communities.
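
The chain of calculation the framework describes, from built environment to walking time to relative risk to cases averted, can be sketched as follows. The dose-response coefficient and the baseline mortality rate are assumed round numbers, not the values used in the Raleigh case study; only the 2040 population figure is taken from the abstract.

# Python sketch: health-impact arithmetic for added walking time
# (beta and baseline rate are illustrative assumptions).
import math

pop = 59_750            # projected 2040 neighborhood population (from the abstract)
baseline_mort = 8e-4    # baseline annual mortality rate (assumed)
extra_walk = 15.0       # added minutes of daily walking (assumed)
beta = 0.01             # log-relative-risk per daily minute walked (assumed)

rr = math.exp(-beta * extra_walk)              # relative risk after the redesign
averted = pop * baseline_mort * (1.0 - rr)     # annual premature deaths averted
print(f"RR = {rr:.3f}; about {averted:.1f} premature deaths averted per year")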


P.20 Deveau, M*; Krewski, D; Nong, A; University of Ottawa; Health Canada; [email protected] Assessing the impact of human metabolic variability on the health risks of occupational and environmental exposures to chloroform Approximately 15,000 Canadians are occupationally exposed to chloroform, primarily in the recreational sector. Non-occupational exposures can occur when chloroform is formed as a disinfection byproduct in drinking water. Occupational and environmental exposure limits are designed to prevent liver toxicity from metabolites and neurological effects from the parent compound. Because chloroform is primarily metabolized by the 2E1 isoform of cytochrome P450 (CYP2E1), variability in the levels of the enzyme in the human population could influence susceptibility to the compound. The objective of this research was to investigate the effect of interindividual variability in CYP2E1 activity on the health risks of chloroform and to identify whether existing exposure limits sufficiently account for these differences. To do this, a human physiologically based pharmacokinetic (PBPK) model for chloroform was used, and distribution data on CYP2E1 in human liver were entered into the model to simulate exposure scenarios for selected occupational and environmental exposure limits. Estimates were obtained for 5th percentile, average and 95th percentile metabolizers. As expected, the 5th percentile group metabolized less chloroform, resulting in higher blood chloroform concentrations. Likewise, the 95th percentile metabolizers had higher levels of metabolism. However, the differences amongst the groups were less than 2-fold, despite much higher variability in CYP2E1 and microsome concentrations; therefore, these factors have only a minor impact on the risks of liver toxicity and acute neurological effects within the population. The population variability in CYP2E1 appears to be sufficiently addressed in the selected occupational and environmental exposure limits.
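
A drastically simplified, one-compartment version of the question the PBPK model answers is sketched below: the steady-state blood concentration when a fixed uptake is cleared partly by a saturable CYP2E1 pathway and partly by a non-metabolic route such as exhalation. All parameter values are assumptions, not the published chloroform model parameters.

# Python sketch: steady-state blood level vs. CYP2E1 capacity
# (cl, uptake, km, and the vmax percentiles are all assumed).
def c_steady(vmax, cl=4.0, uptake=0.5, km=0.5):
    """Blood concentration (mg/L) where uptake = cl*C + vmax*C/(km + C)."""
    lo, hi = 0.0, 10.0
    for _ in range(60):                  # bisection on the mass balance
        mid = 0.5 * (lo + hi)
        cleared = cl * mid + vmax * mid / (km + mid)
        lo, hi = (mid, hi) if cleared < uptake else (lo, mid)
    return 0.5 * (lo + hi)

# CYP2E1 metabolic capacity (mg/h) at the 5th, mean, and 95th percentiles:
# a 4-fold spread, values assumed
for label, vmax in {"5th": 1.0, "mean": 2.0, "95th": 4.0}.items():
    print(f"{label} percentile metabolizer: C_ss = {c_steady(vmax):.3f} mg/L")
# The 4-fold spread in enzyme capacity compresses to roughly a 2-fold spread
# in blood concentration, qualitatively echoing the abstract's finding.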

M4-D.1 DeWaal, CS; Center for Science in the Public Interest; [email protected] Risk Communication: Preparing for the Unexpected Risk communication is a central aspect of risk analysis. This multi-faceted activity is essential to effectively managing a food safety event, including an outbreak of disease or a food/ingredient contamination event. Broadly defined, it encompasses communication between technical experts, regulators and the public about threats to health, safety or the environment. During a food safety emergency, dissemination of accurate information is essential. Risk communicators armed with a strong understanding of the underlying risks and using non-technical terms can ensure that the public responds to a food safety hazard appropriately, and can reduce the likelihood that dissemination of misinformation leads to increased consumer concern. This paper will examine two case studies on risk communication: the emergence of BSE in the US and antibiotic resistance in foodborne pathogens. It will also discuss databases available to assist risk communicators.

W4-H.3 Dieckmann, N*; Peters, E; Gregory, R; Oregon Health & Science University; Decision Research; The Ohio State University; [email protected] The Motivated Evaluation of Numerical Uncertainty Ranges Numerical uncertainty ranges are often used to convey the precision in a forecast as well as the range of possible future states of the world. However, the interpretation of uncertainty ranges is often ambiguous. In two studies, we examine the extent to which end users vary in their perceptions of the relative likelihood of values in a numerical range and test specific hypotheses about how these perceptions are generated and how they might affect decisions. We discuss four primary findings from these studies: 1) There is substantial variation in how people perceive the distribution underlying numerical ranges; 2) Common cues to the correct interpretation (i.e., including a best estimate) explain some, but not all, of the variance in perceptions; 3) People show a tendency to perceive the distribution underlying a range in worldview-consistent ways, particularly in controversial domains like climate change; 4) The influence of distributional perceptions on choices was much stronger among the less numerate. We feel there are significant opportunities to improve uncertainty communication to maximize the likelihood that users will make accurate, unbiased evaluations of uncertain quantities. In highly charged domains like climate change, any changes we could make to help facilitate communication between opposing stakeholders would be well worth the effort.

T1-G.1 Dietz, T*; Henry, AD; Michigan State University; [email protected] Social Learning for Climate Change Governance An effective human response to climate change will require an adaptive risk management approach based on strategies that promote successful social learning. The importance of social learning is driven by uncertainties in how coupled human and natural systems will respond to climate change, in how climate change will interact with many other ongoing social and environmental changes, and in how socio-technical systems will evolve. These factors ensure that effective strategies for addressing climate change will need to be based on flexible strategies that allow societies, organizations and policy networks to learn from ongoing experiences and adjust responses over time given new information and environmental conditions. We examine the known obstacles to effective social learning by drawing on policy network research, and offer suggestions on how social learning might be facilitated through the innovative design of risk management institutions.


T4-K.5 Dietz, T; Michigan State University; [email protected] Gene Rosa: Dedication to a humane societal development Eugene Rosa was a major figure in risk scholarship and in environmental sociology. This paper will review his strategy of examining the ontological and epistemological foundations of risk knowledge, and applying that understanding within specific technological and cultural contexts to understand societal responses to risk. It will examine both his specific contributions to risk scholarship and the lessons to be learned from his overall research strategy.

M4-H.1 Dillon-Merrill, RL*; Tinsley, CH; Georgetown University; [email protected] Structuring Public Private Partnerships to Encourage Near-Miss Reporting Despite decades of research on disasters, these events remain all too commonplace. Scholars across a wide range of disciplines agree that one of the most viable approaches to reducing the occurrence of future disasters is to observe near-misses--situations where a bad outcome could have occurred except for the fortunate intervention of chance--and use these events to identify and eliminate problems in the system before they produce such a catastrophe. Unfortunately, because of a natural propensity for individuals to focus on outcomes, these warning signals can be ignored if the events are perceived as successes rather than near-misses (or near-failures). For many industries, the recognition of near-misses can be improved through the examination of data capturing anomalies across companies. First, since disasters and even near-misses are often rare events, data from multiple companies will increase the sample size, which always helps in scientific analysis. Second, near-misses can be difficult for companies to interpret in an unbiased way because of the "normalization of deviance" phenomenon, but data from multiple companies would allow comparisons to judge whether or not their events were normal relative to others in the industry. Yet significant obstacles exist in most industries that preclude such data pooling and sharing. For example, potential lawsuits can scare companies into guarding rather than sharing any "dirty laundry", even if these mishaps might help the collective. A public-private partnership could be the solution for this collective dilemma. In such a partnership, private companies would be properly protected and correctly incentivized to report the data to government agencies for aggregation. In this research, we examine two existing systems from aviation safety and medical patient safety and then make recommendations for developing a comparable system for cyber risk management.

P.93 Dixon, GN; Cornell University; [email protected] Uneven recall and inaccurate risk assessments from reading balanced news articles of controversial risk topics: The role of exemplars and affect This paper examines how the uneven placement of emotional pictures in a two-sided news article influences readers' information processing and risk perceptions. In study 1, participants (n=198) were randomly assigned to one of three balanced (i.e., two-sided) news articles on vaccine safety – one article with an emotional picture exemplifying vaccine safety arguments only; one article with an emotional picture exemplifying vaccine danger arguments only; and one article with no picture (control condition). In both experimental conditions, readers recalled risk arguments from the side with the exemplar significantly more than the side without it. The control condition yielded no significant difference in recall between the two sides. Study 2, which is currently ongoing, investigates the extent to which affective reactions toward the exemplar mediate the relationship between exemplar exposure and recall. Furthermore, it is hypothesized that the ease with which readers recall such information will significantly influence their risk perceptions. For scientific controversies where the evidence supports only one side of a two-sided news article, uneven placement of an affect-inducing exemplar might lead readers to primarily recall the side that is supported by little or no evidence. This is important because journalists who believe they are presenting a “balanced” article on a risk controversy might unknowingly influence their readers to largely process and recall only one side of a two-sided message, subsequently leading to inaccurate risk assessments.

P.137 Djouder, S; Chabaat, M*; Touati, M; Built Environment Research Laboratory, Dept of Structures and Materials, Civil Engineering Faculty, University of Sciences and Technology Houari Boumediene; [email protected] Kinetics and micromechanics associated with crack growth in brittle materials. In this study, kinetics and micromechanics associated with crack growth in brittle materials are considered. It is known that crack growth characteristics contain information on material strength and fracture mechanisms, and there are sufficient experimental data evidencing that in most cases a growing crack is surrounded by a severe damage zone (DZ) which often precedes the crack itself. During its propagation, the DZ is characterized by a few degrees of freedom (elementary movements) such as translation, rotation, isotropic expansion and distortion. On the basis of a stress field distribution obtained through a Semi-Empirical Approach (SEA), which relies on Green's functions, the driving forces corresponding to these degrees of freedom are formulated within the framework of the plane problem of elastostatics. A number of theoretical models have been proposed for the description of the stress field and kinetics of a damage zone [1, 2]. The traditional one identifies the DZ as a plastic zone and uses the well-developed techniques of plasticity theory to determine its size, shape, energy release rates, etc. According to recent experimental results, however, some damage patterns do not fit any plasticity model, and the shape of the DZ can be difficult to model; a plasticity criterion is therefore not adequate for damage characterization, although the elastoplastic solution is currently employed due to the lack of other approaches. Throughout this study, the SEA is used to evaluate the stress field and the different energy release rates. This approach is based on the representation of displacement discontinuities by means of Green's function theory [3, 4], which has previously been used in a purely theoretical context. Here, we suggest a more realistic model (arbitrary orientations of discontinuities rather than rectilinear ones) for which results can be obtained from experimental data, thus avoiding the difficulties of analytical solutions.


T2-F.4 Dreyer, M*; Kuhn, R; Dialogik non-profit institute for communication and cooperation research; [email protected] Pharmaceutical residues in the water cycle: a case for communicating ‘risk’ to the public? Pharmaceutical residues in the water cycle are emerging anthropogenic contaminants, mainly in the sense that there is growing recognition of their potential significance as a risk management challenge. In some countries of the Western world, the notion is spreading that a precautionary approach is required, in light of studies showing that certain substances can have negative effects on aquatic flora and fauna and that some substances are even present in drinking water, and given an aging population and increasing use of prescription and over-the-counter drugs. Should a precaution-based management program include public awareness-raising programs on prudent use and correct disposal of drugs, and what would need consideration in the design of such a program? This question is attracting increasing attention, but remains an open and largely unresearched issue. In order to address it, it is vital to take into account the way in which people perceive the issue. So far, we have hardly any empirical insights into public perception of the possible risks of pharmaceutical residues to environmental and human health. Our presentation therefore draws largely on theoretical and empirical insights from the broader literature on risk perception and communication and presents a set of tentative conclusions. The presentation will highlight that public awareness programs face the challenge that people may interpret current knowledge about, and growing scientific interest in, the issue as an “early indication of insidious danger” to drinking water supplies. It points out that a key challenge for risk communication aimed at awareness-raising and behavioral adjustment is to balance the assertion that there is a need (and a possibility) to prevent the possible risk from materializing with reassurance that there is currently no reason to be worried or alarmed (about “contaminated” drinking water and/or denial of required medication).

M4-F.1 Driedger, SM*; Jardine, CG; University of Manitoba; [email protected] Strategies to engage knowledge users in understanding best practices for communicating about risk characterized by uncertainty Risk communication research must be responsive to the needs of various health and environment agencies and organizations. Likewise, it is critical that communication best practices gleaned from research be adequately and effectively translated into practice. Risk researchers spend a great deal of time designing studies to try to measure what may work (or not) in communicating uncertainty to the public. Some studies aim to test theory, refine variables, or work with experimental scenarios with large numbers of participants to evaluate issues from an exploratory perspective. However, these studies are not always able to conclude which strategies will be most effective. Likewise, the context-specific nature of risk communication – varying highly by the issue itself (i.e., is the degree of outrage potentially high, is the risk known/voluntary or not, etc.) as well as the audience (i.e., is it a broader public or a specific community directly affected) – is also important in identifying which communication mechanisms, strategies or processes will be helpful as opposed to harmful. The objective of this presentation is to examine the practice of communicating the science and best practices from a systematic review to policy influencers and personnel in environment, public health and health system agencies. To connect our research findings to this group of policy knowledge end-users, we incorporated novel strategies to create dialogue regarding difficult and contentious subjects, including how admitting uncertainty (i.e., not knowing the answer) can increase rather than undermine trust. These strategies included using humor and specific visual illustrations of ineffective means of communicating uncertainty. We provide reflections from the ‘trenches’ as well as feedback obtained throughout the knowledge translation process, with the aim of encouraging dialogue and debate among risk communication specialists and practitioners in the ensuing panel commentary and open discussion of this symposium session.

W4-J.5 Dudley, SE; The George Washington University; [email protected] A Look Back at Regulatory Lookback Efforts Too often, ex ante predictions of regulatory outcomes (reductions in health risks, benefits and costs) are not verified with empirical data ex post. This presentation will survey past efforts at conducting retrospective review of regulatory effects, and offer recommendations based on this experience. Despite experience in other areas, and requirements established by Congress and each of the previous 6 presidents, ex post review of regulations tends to take a back seat to ex ante analysis of regulations before they are issued. While ex-ante regulatory impact analysis has a long tradition in the United States and elsewhere, such analyses necessarily depend on unverifiable assumptions and models, and are thus hypotheses of the effects of regulatory actions. Better retrospective review would allow us to test those hypotheses against actual outcomes, but it too poses challenges. For retrospective review of regulations to be successful, we need better tools for ex post analysis, and better incentives for conducting it.

P.6 Dumitrescu, A*; Lemyre, L; Pincent, C; University of Ottawa; [email protected] Spatial analysis of risk perception: the case of nuclear power plants The beginnings of research in risk perception can be traced to the mid-1960s and the public debate over the use of nuclear energy and associated industries, which had promised cheap, clean and safe energy. Over the past decades, the study of public understanding and perception of risk has developed into a wide-ranging and interdisciplinary field of research, approached from different psychometric-cognitive and cultural perspectives. Among these approaches and theories, there is no doubt that the psychometric paradigm has emerged as the leading theory in this field. While psychometric research has made important contributions, it has been criticized for its limitations in approaching the perception of risk across different geographical locations. Indeed, studies of public understanding and perception of risks in relation to physical space, proximity and place have produced mixed and conflicting results. The National Health Risk Perception Survey 2012, involving a representative sample of approximately 3,000 respondents from the Canadian population, was used to study the proximity effect. Our research proposes a new methodology which aims to explore the relationship between proximity to a hazard and perceptions of risk by analysing the spatiality of risk perception. By geocoding risk perception data, we compared risk perception between the population living in the proximity of a nuclear power plant and the rest of the Canadian population. Our results are consistent with the findings of other studies, which showed that risk perception is lower among people living close to nuclear power plants. Additionally, correlations between living distance from nuclear power plants and risk perception were explored. The analysis of the spatial dimension of risk perception provides an exceptional level of integration of individual, environmental and contextual variables, and also provides a link between risk assessment attributable to a specific hazard and risk perception.


T2-H.2 DuMont, M. K.; Office of the Secretary of Defense; [email protected] Defining Risk to the Defense Strategy The Office of the Under Secretary of Defense for Policy assists the Secretary of Defense in defining strategic risks and trade-offs in order to inform both short- and long-term policy decisions. This paper will consider the challenges of incorporating risk into defense policy decision-making, including how the risk picture changes with different defense strategies and the differences between contingency risks and enduring risks. It will also highlight the unique aspects of the defense environment that affect risk identification, assessment, and mitigation; among them are the size and scope of the defense establishment and the fact that some missions are “no fail.” It will use examples to illuminate how senior defense officials use risk in deciding between alternative strategic options.

W3-I.2 EKANEM, NJ*; MOSLEH, A; University of Maryland, College Park, MD, USA; [email protected] A Model-based, Scenario-driven Human Reliability Analysis Method As a discipline, human reliability analysis (HRA) aims to identify, model, and quantify human failure events (HFEs) in the context of an accident scenario within probabilistic risk assessment (PRA). Despite all the advances made so far in developing HRA methods, many issues still exist, including the lack of an explicit causal model that incorporates relevant psychological and cognitive theories in its core human performance model, the inability to explicitly model interdependencies between factors influencing human performance, and a lack of consistency, traceability and reproducibility in HRA analysis. These issues have contributed to the variability in results seen in applications of the different HRA methods, and also in cases where the same method is applied by different HRA analysts. In an attempt to address these issues, a framework for a “model-based HRA” methodology has recently been proposed which incorporates strong elements of current HRA good practices, leverages lessons learned from empirical studies, and draws on the best features of existing and emerging HRA methods. It is aimed at enabling more credible, consistent, and accurate qualitative and quantitative HRA analysis. We present the set of steps for the practical implementation of the methodology, covering both the qualitative and quantification phases. Bayesian belief networks have been developed to explicitly model the influences and interdependencies among the different components (HFEs, error modes, contextual factors) of this methodology for more accurate HEP estimation. These models have the flexibility to be modified to interface with several existing HRA quantification methods, and can also be used to demonstrate a cause-based, explicit treatment of dependencies among HEPs, which is not adequately addressed by any other HRA method. While the specific instance of this method is used in nuclear power plants, the methodology itself is generic and can be applied in other industries and environments.
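
The quantification idea, conditioning a human error probability (HEP) on interdependent contextual factors through a Bayesian belief network, can be illustrated with the hand-rolled two-factor example below; the network structure and all conditional probability values are invented for illustration.

# Python sketch: a two-factor belief network for HEP estimation
# (structure and CPT values invented; real models are far larger).
from itertools import product

p_tp = {True: 0.3, False: 0.7}                    # P(time pressure)
p_pp_given_tp = {True: {True: 0.5, False: 0.5},   # P(poor procedures | tp):
                 False: {True: 0.2, False: 0.8}}  # the factors are dependent
p_hfe = {(True, True): 0.20, (True, False): 0.05,   # P(HFE | tp, pp)
         (False, True): 0.08, (False, False): 0.01}

def hep(evidence=None):
    """Marginal P(HFE), optionally conditioned on observed factors."""
    evidence = evidence or {}
    num = den = 0.0
    for tp, pp in product([True, False], repeat=2):
        if evidence.get("tp", tp) != tp or evidence.get("pp", pp) != pp:
            continue                              # inconsistent with evidence
        w = p_tp[tp] * p_pp_given_tp[tp][pp]      # joint weight of this context
        num += w * p_hfe[(tp, pp)]
        den += w
    return num / den

print(f"Unconditional HEP: {hep():.3f}")                    # ~0.054
print(f"HEP given time pressure: {hep({'tp': True}):.3f}")  # ~0.125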

T4-A.2 El Haimar, AE*; Santos, JS; The George Washington University; [email protected] Stochastic Input-Output Modeling of Influenza Pandemic Effects on Interdependent Workforce Sectors An influenza pandemic is a serious disaster that can pose significant disruptions to the workforce and associated economic sectors. This paper examines the impact of an influenza pandemic on workforce availability within an interdependent set of economic sectors. In particular, it presents a simulation and analysis of the impacts of such a disaster on the economic sectors in a given region. We introduce a stochastic simulation model based on the dynamic input-output model to capture the propagation of pandemic consequences across the National Capital Region (NCR). The analysis conducted in this paper is based on the 2009 H1N1 pandemic data. Two metrics were used to assess the impacts of the influenza pandemic on the economic sectors: (i) inoperability, which measures the percentage gap between the as-planned output and the actual output, and (ii) economic loss, which quantifies the monetary value of the degraded output. The inoperability and economic loss metrics generate two different rankings of the economic sectors. Results show that most of the critical sectors in terms of inoperability are related to hospitals and healthcare providers. On the other hand, most of the sectors that are critically ranked in terms of economic loss are sectors with significant total production outputs in the NCR, such as federal government agencies. Therefore, policy recommendations relating to potential risk mitigation and recovery strategies should take into account the balance between the inoperability and economic loss metrics. Although the present study is applied to the influenza pandemic in the NCR, it is also applicable to other disasters and other regions.
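
[Editor's note: the inoperability metric defined above has a standard closed form in the inoperability input-output model literature, q = (I − A*)⁻¹ c*, where A* is the normalized interdependency matrix and c* the direct perturbation. A minimal numeric sketch with an invented two-sector economy follows; the matrix, perturbation, and outputs are placeholders, not the paper's data.]

```python
# Inoperability input-output sketch: q = (I - A*)^(-1) c*.
# The 2-sector interdependency matrix and all numbers are invented.
import numpy as np

A_star = np.array([[0.1, 0.3],       # normalized interdependency matrix
                   [0.2, 0.1]])
c_star = np.array([0.01, 0.08])      # direct inoperability (e.g., absenteeism)
x_planned = np.array([200.0, 50.0])  # as-planned sector outputs, $M

q = np.linalg.solve(np.eye(2) - A_star, c_star)  # equilibrium inoperability
loss = x_planned * q                             # monetized degraded output
for i, (qi, li) in enumerate(zip(q, loss)):
    print(f"sector {i}: inoperability {qi:.3f}, economic loss ${li:.1f}M")
```

With these invented numbers, sector 1 ranks first on inoperability while sector 0 ranks first on dollar loss, which is exactly the kind of ranking divergence the abstract reports.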

P.92 Eller, EG*; Calderon, AA; Stephenson Disaster Management Institute, Louisiana State University; [email protected] Managing communication in times of crisis through ambiguity: A framework for crisis communication It has been suggested that the control of ambiguity as part of a crisis communication strategy can be an effective mechanism to manage and affect how organizations are perceived by their stakeholders. Previous research on the perception of ambiguity suggests that positive and negative effects can be attained by both (1) the communicating organization (e.g. through flexibility, credibility, and other outcomes) and (2) the recipient of the message (e.g. stakeholders with varied levels of trust, confusion, etc.). The purpose of the present work is to contribute to the understanding of how, if at all, ambiguity should consciously be managed in crisis communication. Because we consider ambiguity a multidimensional construct, we argue that in crisis communication ambiguity can be found and managed on several levels, such as in the content of the message, in the context of the relationship between the communicating parties, and in the form and context of the communication. We also suggest several characteristics of the recipient of the message that affect the interpretation and impact of ambiguity. The present work offers a practical framework for the management of ambiguity in crisis communication based on prior research and critiqued by a group of experts. Through translational research, this paper presents an applied research framework for scholars and decision makers at all levels, taking into consideration their perspectives, experiences, concerns, comments and ideas during Hurricane Katrina (2005) and the Deepwater Horizon BP oil spill (2010). We believe that the presented framework offers a new perspective on the management of ambiguity in times of crisis, provides a basis for future research, and supplies a practical framework that can be used to collect data to further inform the field of crisis communication.


P.7 Elmontsri, ME; Higher Institute of Occupational Safety and Health; [email protected] Risk Perception in Libya: An Overview “Risk” has become increasingly topical in recent decades in Western countries, where it has mainly been studied through a socio-psychological approach. The aims of those studies were to understand how people react to and perceive specific types of risk, which helps decision and policy makers understand what society is worried about and how such risks affect its decisions. The science of risk analysis has been studied in detail in the developed world, whereas in developing countries such research remains limited. Therefore, the aim of this paper is to examine the ways in which Libyan people perceive the various societal risks that confront them by adopting the psychometric paradigm, using a survey strategy to obtain the required data. This research is intended to provide a valuable and crucial insight into the current risk perception of the Libyan public. It will also provide a baseline for further research in this field.

P.147 Enger, KS; Murali, B; Birdsell, D; Gurian, P; Wagner, DM; Mitchell, J*; Michigan State University; [email protected] EVALUATING LONG TERM INACTIVATION OF BACILLUS SPORES ON COMMON SURFACES Bacillus spores resist inactivation, but the extent of their persistence on common surfaces is unclear. This work addresses knowledge gaps regarding biothreat agents in the environment in order to reduce uncertainty in related risk assessment models. Studies were conducted to investigate the long term inactivation of B. anthracis and three commonly used surrogate organisms: B. cereus, B. atrophaeus, and B. thuringiensis. Models of inactivation kinetics were subsequently evaluated for fit. Spores were applied to 25 cm² rectangles of three materials: laminate countertop, stainless steel, and polystyrene Petri dish. They were kept at 22°C and 50% relative humidity. Viable spores were measured at 1, 30, 90, 196, and 304 days by swabbing rectangles and eluting swabs in phosphate buffered saline. After serial dilution, colonies were grown and counted. R (cran.r-project.org) was used to fit persistence models to the data: exponential, logistic, Juneja and Marks 1 (JM1), Juneja and Marks 2 (JM2), Gompertz, Weibull, lognormal, gamma, biphasic spline, and double exponential. B. thuringiensis counts increased at 24 hours on all materials, with a subsequent decline. Several experiments showed evidence of a U shape: a decrease followed by an increase in spore counts (B. anthracis & B. atrophaeus on laminate; B. anthracis & B. cereus on steel). Spores on polystyrene showed little inactivation. The maximum inactivation was 56% (B. atrophaeus spores on steel at 196 days). Fitting models to the data from laminate and steel indicated that the gamma, lognormal, JM1, and JM2 models fitted the data better than the other models (by lowest BIC, or within 2 units of the lowest BIC). Models fitted to data from the polystyrene material were uninformative because little inactivation was observed. Spore inactivation was not loglinear. U-shaped inactivation curves might be explained by lower adhesion to the surface as the spores age, enhancing recovery.
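
[Editor's note: the model comparison above was done in R; the following is a minimal Python analogue of the fit-and-compare-by-BIC step for two of the named models (exponential and Weibull survival). The spore-survival data points are invented, not the study's measurements.]

```python
# Fit exponential and Weibull survival curves to invented spore-survival
# data and compare by BIC, mirroring the fitting workflow described above.
import numpy as np
from scipy.optimize import curve_fit

t = np.array([1, 30, 90, 196, 304], dtype=float)  # sampling days (from abstract)
s = np.array([1.0, 0.95, 0.80, 0.60, 0.55])       # surviving fraction (invented)

def exponential(t, k):
    return np.exp(-k * t)

def weibull(t, delta, p):
    return np.exp(-(t / delta) ** p)

def bic(model, popt):
    # Gaussian-likelihood BIC: n*ln(RSS/n) + k*ln(n); lower is better,
    # with models within ~2 units treated as comparable.
    rss = np.sum((s - model(t, *popt)) ** 2)
    n, k = len(t), len(popt)
    return n * np.log(rss / n) + k * np.log(n)

for name, model, p0 in [("exponential", exponential, [0.01]),
                        ("Weibull", weibull, [300.0, 1.0])]:
    popt, _ = curve_fit(model, t, s, p0=p0)
    print(f"{name}: params={np.round(popt, 4)}, BIC={bic(model, popt):.1f}")
```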

T1-D.4 England, M; Brouwer, A; Gale, P*; AHVLA; [email protected] Modelling the risks of introduction of ticks infected with Crimean-Congo haemorrhagic fever virus into GB Crimean-Congo haemorrhagic fever virus (CCHFV) is a tick-borne zoonosis. Recently, large outbreaks of CCHFV have occurred in new foci in south-east Europe and the Balkans, most notably in Turkey. Migratory birds have previously been suggested as a factor in the spread of CCHFV into Europe, but here for the first time we present data for international horse movements as a potential risk factor in the spread of CCHFV-infected ticks. The numbers of CCHFV-infected unfed adult ticks in GB that could potentially bite and infect a human from these two pathways were predicted and compared. CCHFV has never been detected in GB and no infected ticks have been reported on birds or horses in GB. GB does not have competent Hyalomma spp. tick vectors for CCHFV, and transmission within GB is only a theoretical possibility assumed here. A spatial analysis of GB under current climatic and land cover conditions predicted the areas of GB where imported Hyalomma spp. ticks could survive to the next life stage following detachment from imported horses or migratory birds. Hyalomma spp. ticks would enter GB on birds as nymphs and on horses as fed and mated adults. A total of 143 CCHFV-infected unfed adult Hyalomma spp. ticks was predicted to be present in GB as a result of importations on horses between 1st April and 31st July each year under current habitat conditions. For the same time period, a total of 11 CCHFV-infected Hyalomma spp. adult ticks was predicted to be present in GB as a result of importations on migratory birds. Although a greater number of CCHFV-infected ticks were imported into GB on migratory birds each year than on horses, the ability of each female to lay an average of 6,500 eggs following detachment in GB led to an overall greater number of CCHFV-infected ticks that could bite a human from the horse pathway than from the bird pathway. Empirical data would be required to validate these predictions.
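
[Editor's note: the horse-versus-bird comparison above is at heart a chain of multiplications along each pathway. The stylized sketch below uses invented stage values throughout; only the 6,500 eggs-per-female figure comes from the abstract, and the outputs are not the study's 143 and 11.]

```python
# Stylized pathway arithmetic for imported CCHFV-infected ticks.
# Stage values are invented except the 6,500 eggs per engorged female.
def expected_adults(imports, p_infected, p_survive, amplification=1.0):
    """Expected CCHFV-infected unfed adult ticks from one pathway."""
    return imports * p_infected * p_survive * amplification

# Horses import fed, mated adult females; each may lay ~6,500 eggs.
eggs_per_female = 6500
p_egg_to_adult = 0.005                  # invented egg-to-unfed-adult survival
horse = expected_adults(imports=400, p_infected=0.02, p_survive=0.5,
                        amplification=eggs_per_female * p_egg_to_adult)
# Birds import nymphs, which moult to single adults (no egg amplification).
bird = expected_adults(imports=3000, p_infected=0.01, p_survive=0.3)

print(f"horse pathway: {horse:.0f} ticks; bird pathway: {bird:.0f} ticks")
```

Even with fewer infected ticks arriving on horses, the egg-amplification factor dominates the product, which is the mechanism behind the abstract's conclusion.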

T4-B.1 Evans, AM*; Rice, GE; Teuschler, LK; Wright, JM; Oak Ridge Institute of Science and Education; [email protected] Using Secondary Data to Evaluate Diverse Groups of Chemical and Nonchemical Stressors in Cumulative Risk Assessment A main impediment to performing cumulative risk assessments (CRAs) is obtaining data for multiple chemical and nonchemical stressors in the same individuals or populations. Secondary data analysis can therefore be utilized as a screening approach to integrate population characteristics, time/activity patterns, and exposure distributions for multiple stressors. Two CRA case studies will be examined to highlight potential challenges of integrating heterogeneous data. The issue of dissimilar variables across data sources (i.e., NHANES vs. Census) was encountered in the first case study (joint noise and volatile organic compound (VOC) exposure related to hearing impairment); many important variables (e.g., time activity patterns) were therefore excluded from the main analysis and examined only in sensitivity analyses (e.g., the average decrease in noise exposure across all populations due to spending 90% of time indoors). Spatial misalignment (i.e., data collected at different scales and/or resolutions) is a common issue in secondary data analysis that was also encountered in the first case study, where noise and VOC exposures were estimated at the area and individual level, respectively. To address this issue, we extrapolated area-level VOC estimates from the individual estimates; this forced all subpopulations in the same area to have the same average exposure, so we were unable to evaluate subpopulation differences in VOC exposure within areas of similar noise exposure. The second case study addresses quantifying nonchemical stressors, specifically chronic psychosocial stress. Despite these challenges, the advantages of secondary data analysis for informing CRAs include identification of population vulnerabilities and differential exposures that can be used by risk management and public health officials to prioritize resources for screening, intervention, and prevention. The views expressed in this abstract are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA.


M2-D.4 Evans, AM*; Rice, GE; Teuschler, LK; Wright, JM; AME-Oak Ridge Institute of Science and Education; GER, LKT, JMW-U.S. Environmental Protection Agency; [email protected] Considering Buffers in Cumulative Risk Assessments Cumulative risk assessments (CRAs) quantitatively or qualitatively evaluate the risks of combined exposures to chemical and nonchemical stressors. CRAs also examine vulnerabilities (e.g., pre-existing health conditions, genetic predisposition, poverty), as these may lead to variability in response to stressors. Resilience, the ability to adapt to and/or recover from certain stressors (i.e., reduced risk), has rarely been considered in CRAs. Buffers (e.g., good nutrition, physical activity, healthy weight) may increase resilience. Existing CRA methods (e.g., the hazard index approach) may not work or may need to be altered for some buffers (e.g., due to U-shaped dose-response curves). Here, we examine the contribution of buffers in CRAs using fish consumption as a case study. Fish consumption is associated with exposure to multiple chemical stressors, including methyl mercury (MeHg) and polychlorinated biphenyls (PCBs), which are associated with adverse neurodevelopment, as well as increased exposure to polyunsaturated fatty acids (PUFAs). PUFA exposures are potential buffers of adverse neurodevelopment, as they are associated with improved cognitive development. To characterize the joint neurodevelopmental hazard posed by MeHg, PCBs, and PUFAs, quantitative and qualitative approaches for CRA will be examined, and general recommendations regarding the integration of buffers and resilience factors will be discussed. Case studies comparing current and proposed methodologies will be important for the future incorporation of buffers and resilience factors in CRAs. The views expressed in this abstract are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA.
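
[Editor's note: the hazard index approach mentioned above sums exposure-to-reference-value ratios, and one reason it "may not work" for buffers is that a protective term has no natural place in the sum. A minimal sketch of the conventional HI follows; the exposures and reference doses are invented, not values for MeHg or PCBs.]

```python
# Conventional hazard index: HI = sum(exposure_i / RfD_i).
# Exposure and reference-dose values below are invented for illustration.
exposures = {"MeHg": 0.08, "PCBs": 0.01}   # ug/kg-day, invented
rfds      = {"MeHg": 0.10, "PCBs": 0.02}   # ug/kg-day, invented

hi = sum(exposures[c] / rfds[c] for c in exposures)
print(f"HI = {hi:.2f}")                    # HI > 1 flags potential concern

# A buffer such as PUFA intake has no slot in this sum; naively subtracting
# a "benefit quotient" would assume monotonic, additive effects, which
# U-shaped dose-response curves violate -- the abstract's point.
```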

M2-G.3 Evensen, DT*; Stedman, RC; Cornell University; [email protected] Fractured discourse: Social representations of shale gas development in the USA and Canada In the five years since discussion of shale gas development proliferated in eastern North America, this topic has dominated conversation for many scientists, policy makers, elected officials, and citizens at large. Our interest is in how this conversation evolves and moves forward. How do communities come to focus on certain issues related to shale gas development? Why do people discuss risks or opportunities associated with shale gas development in specific ways? We sought to answer these questions through the theoretical lens of social representations theory, which examines how complex, scientific issues are translated into and communicated via common-sense language. Social representations theory also postulates that representations of complex phenomena are not only socially held but also socially emergent, meaning that they develop through social processes and exchanges. To study social representations of shale gas development, we conducted in-depth interviews in nine communities across three regions currently or potentially exposed to shale gas development (three each in the states of New York and Pennsylvania and in the province of New Brunswick). We spoke with individuals heavily involved in shaping or facilitating discourse on shale gas issues (total n = 50). Interviews revealed that issues of community character and an aspiration to preserve or foster a desired community lifestyle dominated representations from individuals in favour of shale gas development and from those opposed to it; these representations are often absent from mass media coverage. Also common on both sides of the issue were representations of shale gas development as a policy issue that needs to be informed by factual information. Both sides were quick to characterise the other side as misinformed and short-sighted in its perceptions of risks and benefits. We discuss implications of this research for creative ways to improve the policy discourse on shale gas development.

W3-D.4 Evers, EG*; Chardon, JE; National Institute for Public Health and the Environment; [email protected] A swift quantitative microbiological risk assessment (sQMRA) tool: Improved version We developed a simplified QMRA model aimed especially at comparing the public health risks of pathogen-food product combinations and at education. Here we present an improved version as a follow-up to the first published version. The swift Quantitative Microbiological Risk Assessment (sQMRA) tool is implemented in Excel/@Risk, and pathogen numbers are followed through the food chain, which starts at retail and ends with the number of human cases. Relative risk (compared to other pathogen-food product combinations), not absolute risk, is considered the most important model output. The model includes growth/inactivation during storage at home (categories: room/fridge/freezer), cross-contamination (yes/no) and heating (done/undercooked/raw) during preparation in the kitchen, and a dose-response relationship (binomial/betabinomial). The model optionally includes variability of, e.g., pathogen concentration and food product heating (time, temperature) in the kitchen. The general setup of the sQMRA tool consists of 12 consecutive questions for parameter values, each followed by intermediate model output broken down into treatment categories. In a separate sheet, model input and output are summarized, and exposure (probability of a contaminated portion, number of cfu) as well as the number of human cases are attributed to treatment categories. As a measure of relative risk, intermediate (number of contaminated portions, number of cfu) and final (number of human cases, DALYs) model outputs of the considered pathogen-food product combination are compared with results for other combinations. The sQMRA tool proves useful for quickly obtaining relative public health risk estimates of pathogen-food combinations, which can serve as a guide for risk management or for selecting combinations for classical QMRA. It is also useful for educational purposes because of the insightful presentation of intermediate and final model output.
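
[Editor's note: a compressed sketch of the retail-to-illness chain such a tool implements, with invented inputs. The real tool elicits 12 parameter questions in Excel/@Risk and splits portions into storage, cross-contamination, and heating categories, which this single-path sketch skips.]

```python
# Compressed sQMRA-style chain: retail contamination -> kitchen -> dose-response.
# All numeric inputs are invented; a binomial (single-hit) dose-response is used.
portions = 1_000_000           # servings consumed per year
prevalence = 0.05              # fraction of portions contaminated at retail
cfu_per_portion = 1000         # pathogen load on a contaminated portion
log10_growth_storage = 0.5     # net growth during home storage
log10_reduction_cooking = 3.0  # heating effect ("undercooked" category)
r = 2e-4                       # binomial dose-response parameter, invented

dose = cfu_per_portion * 10 ** (log10_growth_storage - log10_reduction_cooking)
p_ill = 1 - (1 - r) ** dose    # binomial dose-response model
cases = portions * prevalence * p_ill
print(f"dose/portion = {dose:.1f} cfu, P(ill) = {p_ill:.2e}, cases/yr = {cases:.0f}")
```

Running the same chain for a second pathogen-food combination and comparing the case counts is what the tool's relative-risk output amounts to.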

P.21 FAN, KC*; HO, WC; LIN, MH; CAFFREY, JL; WU, TT; PAN, SC; CHEN, PC; WU, TN; SUNG, FC; LIN, RS; China Medical University; [email protected] Ambient air pollution and allergic disease among children The prevalence of childhood eczema, allergic rhinitis, and asthma has been increasing worldwide. Air pollution related to allergic disease is an important public health issue, especially for highly sensitive groups such as children. Critical exposure windows for air pollution related to allergic disease may occur in utero and during the first and second years of life. The purpose of this study is to assess the potential adverse health effects of air pollution related to allergic diseases (eczema, allergic rhinitis and asthma). Two databases are used in this study: 1) the Longitudinal Health Insurance Database 2005 (LHID2005) and 2) the Environmental Protection Agency (EPA) air monitoring database. Geographic Information Systems (GIS) will be used to estimate air pollution exposure. Furthermore, Cox proportional hazards regression models will be used, adjusting for sex, geographic area, urbanization level, household environmental tobacco smoke (ETS) exposure and airborne lead concentrations within three exposure windows: the 10 months before birth, 0 to 1 years of age, and 1 to 2 years of age. All statistical analyses will be performed with SAS version 9.3 (SAS Institute, Cary, NC, USA). Other studies indicate that components of PM2.5 are associated with hospitalization for several childhood respiratory diseases, including pneumonia, bronchitis, and asthma. We therefore anticipate finding that long-term air pollution exposure is not only associated with asthma but also affects children's lung function and contributes to allergic disease.


T3-D.4 Fanaselle, WL*; Hoelzer, K; FDA, CFSAN; [email protected] Reducing the potential for norovirus foodborne illness through surface disinfection Norovirus (NoV) infections are the leading cause of foodborne illness outbreaks worldwide. Disinfection of environmental surfaces is of paramount importance to prevent, contain or control outbreaks, but continues to present considerable practical challenges. Environmental disinfection of this virus is particularly challenging because of the extremely high numbers of virus potentially shed in human stool and vomitus, together with the low infectious dose and high stability in the environment. A systematic evaluation was conducted of the available data on NoV disinfection of surfaces, focusing in particular on what is and is not currently known about the biological, epidemiological and mechanistic determinants of disinfection efficacy. In addition, we evaluated the potential usefulness and limitations of disinfection protocols in outbreak situations, with particular focus on outbreaks in food preparation and service establishments. In this presentation, we will present the results of this data comparison and explain how these data can be used in future risk assessments of NoV.

T2-A.2 Fann, NL; Walker, KW; Gilmore, EA*; U.S. Environmental Protection Agency; [email protected] Characterizing the long-term PM2.5 concentration response function: a comparison of estimates from expert judgment, meta-analysis, and integrated research estimates Regulatory impact assessments (RIAs) for rules that affect fine particle (PM2.5) levels routinely estimate monetized benefits in the tens or hundreds of billions of dollars, attributable largely to reductions in the risk of premature death. The quantitative relationship between changes in exposure to PM2.5 and the risk of death (i.e., the concentration-response function), and the associated assumptions about the likelihood that such a relationship is causal, are key inputs to these analyses. Given the magnitude of the monetized benefits associated with reductions in PM2.5, policy-makers have expressed interest in better characterization of the magnitude, functional form and uncertainties of this concentration-response (C-R) relationship. To meet the demand for this information, researchers have applied a range of alternative research synthesis approaches, including meta-analysis, expert judgment and integrated research estimates, to essentially the same policy question. For example, in 2004, the USEPA undertook an expert elicitation in an attempt to capture a fuller range of uncertainties associated with the C-R relationship. More recently, under the Global Burden of Disease project, a collaborative scientific effort has developed an integrated C-R function, with associated uncertainty, that spans the full range of global ambient PM concentrations. In this respect, the PM2.5-mortality relationship is a unique test-bed that allows us to compare approaches to synthesizing data to inform critical policy questions. In this talk, I first review these approaches as applied to the PM2.5 C-R function and evaluate how these techniques provide different types of information for the analysis of U.S. Environmental Protection Agency (USEPA) rulemaking. Second, I draw broader lessons to evaluate the factors that may make particular approaches more or less suited to informing policy decisions.
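
[Editor's note: whichever synthesis approach supplies the coefficient, it is typically applied through a log-linear health impact function of the form shown below. The beta, baseline mortality rate, population, and concentration change are invented placeholders, not values from any of the efforts cited above.]

```python
# Log-linear PM2.5 health impact function:
#   avoided deaths = y0 * (1 - exp(-beta * dC)) * population
# All parameter values are invented placeholders.
import math

beta = 0.0058          # per ug/m3; placeholder, not any study's estimate
y0 = 0.008             # annual all-cause baseline mortality rate, invented
population = 1_000_000 # exposed adult population, invented
delta_c = 2.0          # modeled PM2.5 reduction, ug/m3, invented

avoided = y0 * (1 - math.exp(-beta * delta_c)) * population
print(f"avoided premature deaths ~ {avoided:.0f} per year")
```

Expert elicitation and integrated estimates effectively replace the point beta with a distribution over beta (and over the causal likelihood), which is what distinguishes the synthesis approaches the talk compares.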

M4-J.3 Fann, NF; Fulcher, CM; Baker, KR; Roman, HA*; Gentile, MA; U.S. Environmental Protection Agency; Industrial Economics Incorporated; [email protected] Characterizing the Distribution of Recent and Projected Air Pollution Risk Among Vulnerable and Susceptible Individuals Recent studies have characterized well the recent and projected total health burden of air pollution at the national scale. The literature has also explored the distribution of air pollution risks, and the level of risk inequality, among and between susceptible and vulnerable populations at the urban scale. This presentation will build upon this literature by demonstrating how source apportionment techniques can be used jointly with inequality coefficients to: attribute the nationwide level and distribution of total air pollution risks across vulnerable and susceptible populations in 2005 and 2016 to 7 emission sectors; and characterize the change in the level of risk inequality among these populations over time. We define population vulnerability and susceptibility using characteristics identified elsewhere in the literature, including baseline health status, socioeconomic status and other attributes that are empirically linked to air pollution-related risk. We calculate inequality coefficients, including the Atkinson index. Our results suggest that reduced emissions from certain sectors between 2005 and 2016, including electricity generating units and mobile sources, have significantly reduced the air pollution health burden among susceptible and vulnerable populations.
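
[Editor's note: for readers unfamiliar with the Atkinson index mentioned above, a minimal sketch of its computation over a vector of person-level risks. The risk values and the inequality-aversion parameter ε = 0.75 are invented for illustration.]

```python
# Atkinson inequality index over individual air pollution risks:
#   A(eps) = 1 - (mean(x^(1-eps)))^(1/(1-eps)) / mean(x),  for eps != 1
# Risk values and eps below are invented.
import numpy as np

risks = np.array([1.0, 1.2, 0.8, 4.0, 0.5]) * 1e-4  # per-person annual risk
eps = 0.75                                          # inequality aversion

ede = np.mean(risks ** (1 - eps)) ** (1 / (1 - eps))  # equally-distributed equivalent
atkinson = 1 - ede / risks.mean()
print(f"Atkinson index A({eps}) = {atkinson:.3f}")    # 0 = perfect equality
```

Computing the index for the 2005 and 2016 risk distributions and comparing the two values is how a change in risk inequality over time can be summarized in one number.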

P.47 Farber, GS; US EPA; [email protected] Design of Institutional Mechanisms for Effective Risk Management: Assignment of Responsibility in the Case of Waste Disposal Policy schemes for disposal of hazardous materials are designed to facilitate risk reduction by reducing exposure to toxic and radioactive hazards. Why do some policy schemes for disposal of hazardous materials work effectively, while others function poorly and fail to mitigate those risks? A great deal of attention is paid to engineering design of waste management units and structures, but insufficient attention is given to the institutional mechanisms and incentives in designing policies. This paper examines the role of several of these mechanisms in obtaining effective risk management outcomes, focusing on the schemes for assigning responsibility for waste disposal.


M3-J.1 Faria, F; Jaramillo, P*; Carnegie Mellon University; [email protected] The Risk Of Increased GHG Emissions From Hydropower Development In The Brazilian Amazon In the last five years Brazil has started building a series of hydroelectric power plants in the Amazon region to fulfill its growing demand for electricity. Hydropower is expected to continue to play an important role in the Brazilian power generation sector, as 60% of the 56 gigawatts (GW) of forecasted electricity requirements by 2021 will come from hydropower. The Brazilian Amazon region is the main frontier for this development because of its large available potential. Until the 1990s hydropower was seen as a renewable energy source with no greenhouse gas emissions, but in recent decades many studies have addressed carbon emissions from reservoirs. The objective of this work is to evaluate the risk of significant releases of greenhouse gases from the development of hydropower projects in the Brazilian Amazon. There are three mechanisms through which greenhouse gases may be released by hydroelectric projects: reservoir emissions due to the decomposition of flooded organic matter, the loss of net primary productivity (NPP) linked with reservoir creation, and the indirect deforestation associated with the construction and operation of hydroelectric power plants. Several publicly available models will be used to estimate the potential greenhouse gas emissions of seventy-four proposed dams in the Brazilian Amazon. These emissions will then be compared with emissions from other power sources.

P.109 Farris, AM; Buss, SD*; Cardona-Marek, T; Alaska Department of Environmental Conservation and SPB Consulting; [email protected] Setting a Regulatory Cleanup Level for the Emerging Contaminant Sulfolane Sulfolane is an industrial solvent used in oil and gas processing. When sulfolane was first detected in the groundwater at the North Pole Refinery in North Pole, Alaska, it was not regulated by the State of Alaska or the United States Environmental Protection Agency (US EPA). In addition, little data was available on the long-term toxicity of the compound. In 2004, the Alaska Department of Environmental Conservation (ADEC) set a cleanup level for sulfolane in groundwater of 350 parts per billion (ppb), based on toxicity levels from a Canadian Council of Ministers of the Environment report. This concentration was not exceeded at the refinery boundaries, so no further characterization was completed until additional monitoring wells were installed in 2009 and concentrations were found to be higher than expected. Sulfolane was then tested for and discovered in over 300 private drinking water wells downgradient of the refinery, as well as in the city municipal wells. This discovery led ADEC to coordinate with the State Department of Health, the Agency for Toxic Substances and Disease Registry (ATSDR), and US EPA toxicologists to better evaluate the chemical's potential health effects and re-evaluate the cleanup level. Nearly 30 toxicologists reviewed the available data on sulfolane and recommended substantially lowering the established level. The State of Alaska calculated a cleanup level of 14 ppb in groundwater based on a US EPA Provisional Peer-Reviewed Toxicity Value protective of groundwater ingestion and on site-specific exposure parameters. Beyond setting the cleanup level, ADEC nominated sulfolane to the National Toxicology Program for additional toxicity research. The nomination was accepted and additional studies are underway.
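
[Editor's note: the groundwater cleanup level described above follows the standard ingestion back-calculation sketched below. The reference dose and exposure parameters here are placeholders chosen only to show the arithmetic; they may differ from ADEC's actual site-specific inputs.]

```python
# Back-calculating a drinking-water cleanup level from a toxicity value:
#   C (ug/L) = RfD (mg/kg-day) * BW (kg) * 1000 / IR (L/day)
# The RfD and exposure parameters are placeholders, not ADEC's inputs.
rfd = 0.001    # mg/kg-day, hypothetical provisional toxicity value
bw = 70.0      # body weight, kg (placeholder)
ir = 5.0       # water ingestion rate, L/day (placeholder)

cleanup_ug_per_l = rfd * bw * 1000 / ir
print(f"cleanup level ~ {cleanup_ug_per_l:.0f} ug/L (ppb)")
```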

T1-J.4 Farrow, S; UMBC; [email protected] Integrating risk and economic performance measures for cybersecurity Economic performance measures that incorporate risk, such as expected net benefits or option price, are poorly developed and infrequently used in many homeland security applications. That generalization appears to apply to cybersecurity as well: the publicly available economics literature contains almost nothing on the economics of cybersecurity, to which the concept of network externalities would seem to be crucial. The advent of the economics of information has led to conceptualizing and modeling “network externalities,” whereby the number of units attached to a network can confer positive indirect effects (e.g., originally, the number of people on a telephone network) or negative indirect effects (e.g., cascading overloads or the spread of a computer virus). Consequently, a risk to an individual node may be limited to that node, or it may cascade and impact other nodes, either virtually or by exiting and propagating through real systems such as power systems. This paper focuses on the conceptual development needed to frame the cybersecurity problem in terms of risk and economic performance measures that integrate risk.

W2-D.2 Fazil, A; Public Health Agency of Canada; [email protected] Development of a Multifactorial Risk Prioritization Framework for Foodborne Pathogens The effective management of food safety hazards requires the establishment of priorities in an objective and consistent manner. Public health impact has often been the basis for this prioritization, but it has increasingly been recognized that economic costs and socio-economic factors also play a role. This presentation will describe the development of a Canadian prioritization framework for foodborne risks that considers public health impact as well as three other factors (market impact, consumer risk acceptance and perception, and social sensitivity), including the outranking multicriteria decision analysis (MCDA) method used to integrate the information. Overall, the framework can support policymakers in complex risk prioritization decisions when different stakeholder groups are involved and when multiple pathogen-food combinations are compared.


P.23 Fiebelkorn, SA*; Bishop, EL; Breheny, D; Cunningham, FH; Dillon, DM; Meredith, C; British American Tobacco, Group R&D; [email protected] Assessment of benzo(a)pyrene (a tobacco smoke toxicant) as a driver of genotoxicity Over 5,600 constituents have been identified in tobacco smoke, some with well established toxicological properties. Our proposed framework for the risk assessment of tobacco smoke toxicants combines computational and in vitro experimental components. Initially we use Margin of Exposure (MOE) calculations to segregate tobacco smoke toxicants into high and low priority for risk management action, using guidelines developed by the European Food Safety Authority (EFSA). We conduct Mode of Action (MOA) analyses on these prioritised toxicants, using the International Programme on Chemical Safety (IPCS) framework. Experimentally, we then test individual priority toxicants for their activity in several in vitro assays, using the MOA for each toxicant to inform assay selection. Here we describe our findings from applying this risk assessment framework to benzo(a)pyrene (BaP) as the prototypical tobacco smoke toxicant. Following a detailed literature search, we generated twelve MOEs for BaP ranging from 16,805 to 2,400,000, indicating a lower priority for risk reduction research. Our MOA analysis for BaP proposed four key events: genotoxicity, mutation, cell proliferation and tumour formation. From our in vitro toxicity data, the concentrations of BaP equating to a point of departure were 1.0-1.28 µg/plate (Ames), 0.75-1.0 µg/ml (micronucleus) and 1.4-1.5 µg/ml (mouse lymphoma assay). These data confirm the genotoxic and mutagenic potential of BaP, supporting the first two key events in the proposed MOA. The data have subsequently been used to generate in vitro MOEs, and these support the in vivo MOE conclusions (1,200,000-30,000,000). Additional in vitro data sets from disease models provide further weight of evidence for the postulated MOA key events. Future refinement of our conclusions for BaP would include the use of PBPK models to predict tissue dose within the respiratory tract of smokers, and a cumulative risk assessment of the various polycyclic aromatic hydrocarbons present in tobacco smoke.
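
[Editor's note: the MOE segregation step above is a simple ratio screened against an interpretive threshold; EFSA guidance treats an MOE of 10,000 or more, based on a BMDL10 for a genotoxic carcinogen, as indicating lower priority. The reference point and exposure values in the sketch below are invented, not BaP figures.]

```python
# Margin of Exposure screening: MOE = reference point / estimated exposure.
# EFSA guidance: MOE >= 10,000 (BMDL10 basis, genotoxic carcinogens) is
# considered low concern. All numbers below are invented placeholders.
bmdl10 = 0.5e6     # ng/kg bw/day, placeholder reference point
exposures = {"smoker_p50": 10.0, "smoker_p95": 100.0}  # ng/kg bw/day, invented

for label, exp in exposures.items():
    moe = bmdl10 / exp
    priority = "low" if moe >= 10_000 else "high"
    print(f"{label}: MOE = {moe:,.0f} -> {priority} priority")
```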

M4-I.3 Figueroa, RH*; Morgan, MG; Fischbeck, PS; Carnegie Mellon University; [email protected] An assessment of the risks of building collapse for the City of Nairobi based on an investigation into East Africa’s construction quality control processes. In developing countries, poor quality control in construction has led to spontaneous building collapse and, in the event of even moderate seismic activity, to major disaster. While earthquake-resistant designs have greatly improved international building codes that are accessible to designers everywhere, builders in developing countries often fail to meet acceptable standards. This paper examines the state of the industry with respect to compliance with standards for concrete used in structures, and assesses the risks of building collapse under different scenarios for the city of Nairobi. The state of the industry is assessed in two ways: 1) by comparing test results reported by established laboratories in Nairobi from a sample of new construction projects with non-destructive-test data collected at twenty-four construction sites; and 2) through the elicitation of construction experts familiar with the Kenyan industry. The findings suggest that there is widespread fraud and that current quality control practices are not effective in ensuring structural reliability. Regulators therefore routinely certify buildings as safe for occupation based, in part, on inaccurate or false laboratory reports. These findings highlight an example of laxity in construction quality control that could be pervasive in many developing countries, as the recent tragedy in Bangladesh and the disaster in Haiti in 2010 suggest. The risk of collapse is assessed by combining building inventory data, seismic performance models of common building types in Nairobi, and estimates obtained by expert elicitation into a probabilistic risk model. Thousands of dangerously weak buildings will be built, and unless better policies are implemented, millions of people will likely be exposed to unnecessarily high risks for generations. The methodology presented can be implemented in many other regions with minimal adjustments.

T3-J.2 Finkel, AM; University of Pennsylvania Law School; [email protected] Lessons from risk assessment controversies for the “job-killing regulations” debate As our talents for collecting data, discerning causal relationships, and refining empirical models continue to improve, risk scientists and economists are struggling in their own ways to provide “high-quality quantification.” Those responsible for developing, supporting, or criticizing estimates of regulatory costs in general, and of the effects of regulation on jobs in particular, can either rise to or dodge the challenges of analyzing thoroughly, humbly, transparently, objectively, logically, and responsively (to public values and preferences). These are all challenges that quantitative risk assessment (QRA) has already confronted, and surmounted with varying degrees of success, over the past several decades. This presentation will draw out various parallels between recent improvements in QRA and the unfinished work of improving job-impact analysis for proposed regulations. I will focus on six such analogies: (1) the attempts to reduce excessive “conservatism” in estimation; (2) the supplanting of point estimates with ranges and distributions acknowledging uncertainty; (3) the emphasis on considering net effects, not merely first-order ones; (4) the commitment to enumerating and justifying all “defaults” used, including the “missing defaults” problem identified by various National Academy committees; (5) the moves to “harmonize” disparate effects and aggregate them using a common currency; and (6) the importance of considering effects across the inter-individual spectrum of susceptibility. I conclude that economists should strongly consider emulating the improvements in QRA, in order to dispel some of the misinformation surrounding the “job-killing regulations” controversy.

T4-A.4 Finkel, AM*; Berk, RA; University of Pennsylvania Law School; [email protected] Using statistical profiling to improve OSHA’s capability to locate workplaces posing grave risks The U.S. Occupational Safety and Health Administration (OSHA) faces a daunting mismatch between the size of its enforcement corps (roughly 2,000 inspectors nationwide) and the number of worksites under its jurisdiction (more than 9 million). OSHA currently targets establishments for inspection via their self-reported injury rates; this system leads the agency to spend time visiting many “sheep in wolves’ clothing” (firms that are fully compliant) and fails to find many true “wolves” (firms inaccurately reporting low injury rates or that expose workers to conditions that have not yet caused grave harm). Predictive targeting is a non-intrusive way to listen better for the signals that firms are sending by their day-to-day behavior. We hypothesize that different “red flags”—in particular, indicators of financial turmoil and evidence that firms are flouting other regulatory requirements—are more strongly related to the severity of workplace hazards than the injury rates are. With a grant from the Robert Wood Johnson Foundation, we have merged into the roughly 100,000 records of OSHA inspections during 2008-09 two other large enforcement databases: one that tracks dollar penalties and the number of calendar quarters in non-compliance for three major EPA programs (air pollution, water pollution, and hazardous waste disposal), and one that tracks similar information for several other Department of Labor (DOL) programs dealing with fair wages, overtime pay, employment discrimination, and the like. The data show that two unmeasured covariates are strongly associated with the severity of hazardous workplace conditions: (1) establishments with frequent non-compliance with EPA and DOL wage/hour regulations; and (2) those located in communities with a high percentage of minority residents. We have recently added to the merged datafile a large time-series of measures of credit scores, indebtedness, sales growth, ownership changes and other financial and managerial indicators at the establishment level, and will present results from analysis of these characteristics.


W3-A.3 Fiorino, D; American University; [email protected] Policy Learning, Chemicals, and Risk: Can Policy Innovation Keep Up with Technology Change? The context for safe management of chemicals has changed dramatically in the four decades since the passage of the Toxic Substances Control Act of 1976. For example, the development of nanotechnologies and synthetic biology has dramatically changed the regulatory context, as have other technology changes. The focus of this presentation is on the need for new models of regulation, or complements to regulation, for responding to these changes in technologies. The rapid changes in the regulatory context mean that the old model of regulation is increasingly poorly suited to chemical risk management issues. This presentation suggests "policy innovations" that offer ways of managing these new chemical challenges effectively. Among them are collaborative governance models, new regulatory frameworks, combinations of regulatory and voluntary programs, and third-party codes and certifications. Each of these is linked conceptually by a focus on flexibility, adaptive management, and policy learning. This presentation builds upon the presenter's previous work on oversight of nanotechnology issues and on the need for new models of environmental regulation.

T1-K.1 Fitzpatrick, JW*; Schoeny, R; Gallagher, K; Ohanian, EV; U.S. Environmental Protection Agency; [email protected] EPA’s Framework for Human Health Risk Assessment to Inform Decision Making The Framework for Human Health Risk Assessment to Inform Decision Making was developed by a technical panel of the EPA’s Risk Assessment Forum to be responsive to the decision-making needs of the agency. Most importantly, it addresses the recommendations presented in the NRC’s Science and Decisions (2009) relative to the design of risk assessments, including planning, scoping, and problem formulation. This Framework will be instrumental in facilitating the implementation of existing and future EPA guidance for conducting human health risk assessments and in improving the utility of risk assessment for informing the risk management decision-making process. In accordance with longstanding agency policy, it also emphasizes the importance of scientific review and of public, stakeholder and community involvement. This presentation will discuss details of the Framework and the agency's associated implementation plans.

W2-J.1 Fitzpatrick, BG*; Angelis, E; Polidan, EJ; Tempest Technologies; [email protected] Estimating The Risk of Rabies Entry into The State of Hawaii Considered one of the oldest documented infectious diseases, rabies remains a dangerous zoonotic disease threat today, exhibiting a global geographic spread and a broad spectrum of mammalian reservoirs and vectors, and possessing the highest case fatality rate of any disease-causing agent. Transmitted via the saliva of an infected mammal, the rabies virus aggressively attacks the central nervous system and, if left untreated, is almost always fatal. Health care costs associated with the detection, prevention and control of rabies in the U.S. are estimated to exceed $300M annually. Distinguishing itself from all other U.S. states, Hawaii maintains a rabies-free status and has never reported an indigenous rabies case (animal or human). The state has upheld this status through a quarantine law enacted in 1912 that requires a post-arrival confinement period of 120 days for imported dogs and cats. Beginning in 1996, the state has conducted a series of risk assessments to identify alternatives and modifications to the 120-day confinement law; current law maintains the 120-day quarantine while also providing a very rigorous 5-day quarantine. In this paper, we examine some additional quarantines of interest to the state, including 30-, 60- and 90-day confinements. The procedure we employ uses scenario trees that structure the individual risk components involved in bringing an animal into Hawaii. Using data from a number of rabies epidemiology studies, we estimate parameters for a Bayesian model of risk probabilities, and we present a number of comparisons of rabies introduction likelihoods.
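
[Editor's note: in a scenario-tree setup like the one described, the entry probability is the product of branch probabilities, each of which can be given a Beta posterior from study counts. A sketch with invented branch names and counts follows; none of these are the paper's parameters.]

```python
# Scenario-tree sketch: P(rabid animal enters) = product of branch
# probabilities, each with a Beta(1 + k, 1 + n - k) posterior from
# k events in n trials (uniform prior). Branches and counts are invented.
import numpy as np

rng = np.random.default_rng(0)
n_draws = 100_000

branches = {"infected_at_origin": (3, 10_000),
            "incubating_at_arrival": (5, 50),
            "evades_quarantine_screen": (1, 200)}

p_entry = np.ones(n_draws)
for k, n in branches.values():
    p_entry *= rng.beta(1 + k, 1 + n - k, n_draws)  # posterior draws per branch

print(f"median P(entry per imported animal) = {np.median(p_entry):.2e}")
print(f"95% interval: {np.percentile(p_entry, [2.5, 97.5])}")
```

Re-running the product with a branch modified to reflect a 30-, 60- or 90-day confinement is how alternative quarantine policies can be compared within one tree.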

M4-A.1 Fitzpatrick, S*; Carrington, C; US Food and Drug Administration; [email protected] Dietary exposure to inorganic arsenic from food in general and rice in particular. Given its widely appreciated toxic properties, arsenic in food has always been an important topic in food safety. It has also long been known that the organic species of arsenic in seafood are far less toxic than inorganic arsenic in food or water. However, modern analytical chemistry has led to the realization that there are many different forms of arsenic in food, with different toxicological properties. At the very least, there are three major categories: inorganic arsenic species, which are the most toxic; methylated arsenic species, which are moderately toxic; and arsenic complexes, which are practically nontoxic. While arsenic complexes are the predominant form in fish, and inorganic arsenic is the predominant form in drinking water, arsenic in most foods comprises a combination of inorganic and methylated species. Recent survey work conducted by the USFDA as well as other agencies indicates that while most foods contain some inorganic arsenic at a relatively constant level, most of the variation in total arsenic concentrations is attributable to the presence of methylated arsenic species in highly varying amounts. In addition to considering the chemical form of arsenic in food, dietary exposure assessments must be tailored to the temporal component of the toxicological evaluation and to the individuals or populations that consume a particular food. Thus, while per capita averages of lifetime intake may serve well as characterizations of public health for some health endpoints (e.g., lifetime cancer risk), risk estimates intended to inform individual consumers are better served by exposure estimates based on frequency of consumption. Although drinking water and smoking can be dominant sources of exposure for some people, the major source of exposure to inorganic arsenic in the United States is the diet. In particular, rice can be the principal source of exposure for individuals and populations who are frequent consumers.


M4-C.1 Flage, R*; Aven, T; Zio, E; Baraldi, P; University; [email protected] Concerns, challenges and directions of development for the issue of representing uncertainty in risk assessment In the analysis of risk related to rare events that may lead to catastrophic consequences with large uncertainty, it is questionable whether the knowledge and information available for the analysis can be reflected properly by probabilities. Approaches other than purely probabilistic ones have been suggested, for example those based on interval probabilities, possibilistic measures, or qualitative methods. In the present paper we look into the problem and identify a number of issues which are foundational for its treatment. The foundational issues addressed reflect on the position that “probability is perfect” and take into open consideration the need for an extended framework for risk assessment that reflects the separation that practically exists between analyst and decision maker.

P.94 Flander, LB*; Keogh, LA; Ugoni, A; Ait Oaukrim, D; Gaff, C; Jenkins, MA; University of Melbourne; [email protected] “Magical thinking” in high risk cancer families About half of people from mutation-carrying families decline genetic counselling and/or testing to identify their mutation status and risk of colorectal cancer (CRC). We report on perceived CRC risk and a qualitative analysis of reasons for declining in this group. We studied 26 participants (mean age 43.1 years; 14 women) in the Australasian Colorectal Cancer Family Registry who were relatives of mismatch repair gene mutation carriers, had not been diagnosed with any cancer at the time of recruitment, and had declined an invitation to attend genetic counselling and/or testing at the time of interview. A structured elicitation protocol was used to capture bounded estimates of perceived risk over the next 10 years. Understanding of genetic testing and CRC risk, reasons for declining testing and self-reported colonoscopy screening were elicited during a 45-minute semi-structured interview. A sub-group of decliners (31%) unconditionally rejected genetic testing, compared to conditional decliners who would consider genetic testing in the future. They were confident their decisions would avoid the potential negative impact of testing. Mean perceived 10-year risk of CRC was 54% [95% CI 37, 71] in unconditional decliners, compared with 20% [95% CI 5, 36] in people who conditionally declined genetic testing. This difference remained after adjusting for potential confounding (age, gender and reported screening colonoscopy). Unconditional decliners thus perceive themselves to be at about 2.6 times higher risk than conditional decliners. Their biased judgment under perceived high risk may be “magical thinking,” which becomes a heuristic to avoid “tempting fate” (Risen and Gilovich 2008). Defensive motives to protect against threatening health information may contribute to unconditional declining of genetic testing (Etchegary and Perrier 2007).

T4-E.3 Flari, V*; Kerins, G; Food and Environment Research Agency; [email protected] Synthetic Biology: prospective products and applications for food/feed and requirements for regulation During their introductory stages, the development of new materials, products and technologies is often hampered by high levels of uncertainty and knowledge gaps regarding the technical risks and benefits to human health and the environment. Lessons learnt (e.g. from genetic modification technologies) indicate that it is better to introduce these aspects at an early stage of development of such products to assure producers that benefits are properly evaluated and reassure consumers that risks are properly managed under an efficient regulatory regime. Our project brought together a multidisciplinary team with expertise in a wide range of relevant areas, including molecular biology, emerging sciences and technologies, elicitation of expert judgment, risk assessment and risk analysis, uncertainty analysis, social science, decision making, and regulatory frameworks. The team worked together to identify (a) potential synthetic biology food/feed products/applications, and (b) relevant uncertainties and gaps in relation to the UK and EU regulatory frameworks that are currently in place. In collaboration with regulators, external experts (e.g. academia), and major stakeholders (e.g. industry, consumer representatives), the team reviewed the regulatory frameworks, and assessed whether they are sufficient to cover all likely requirements. This work aimed to address challenges that regulatory bodies will face when dealing with synthetic biology products/applications; these would include how to: (a) develop approaches to ensure best protection of human health and environment in the light of so many uncertainties, (b) provide regulatory frameworks that would not overburden and hence hinder the development of new products/ applications, (c) develop transparent decision making approaches that incorporate comprehensive uncertainty analysis and clear communication strategies, (d) ensure public participation in the policy making, and (e) develop appropriate mechanisms for implementation.

M2-G.2 Fleishman, LA*; Bruine de Bruin, W; Morgan, MG; Carnegie Mellon University; [email protected] Informing Science Teachers’ Knowledge and Preferences of Low-Carbon Electricity Technologies through a Continuing Education Workshop Do U.S. middle school and high school teachers have the knowledge they need to correct students’ common misunderstandings about strategies to limit emissions of carbon dioxide from the generation of electricity? This paper examines that question with a sample of 6th-12th grade science teachers from Pennsylvania. We find that many of these teachers shared public misunderstandings such as: believing that all Pennsylvania’s electricity needs can be met with wind and solar power, underestimating the cost of solar power, believing nuclear plants emit CO2, and being unsure whether it is possible to capture and sequester carbon dioxide. We found that teachers with more pro-environmental attitudes were more likely to have incorrect knowledge about these topics. In a second stage of the study, we presented teachers with comprehensive and balanced information materials about electricity technologies as part of a continuing-education workshop. Overall, teachers who entered the workshop with less knowledge learned more from our information materials. Moreover, teachers were able to use the materials to form consistent preferences for technologies and to construct low-carbon portfolios of these technologies that were similar to preferences reported in previous work with members of the general public. Teachers reported that the information materials and continuing-education course were useful, and could be easily adapted to their high-school classrooms. We conclude that the materials and continuing-education workshop could benefit science teachers and ultimately their students.


W4-G.3 Flowers, L*; Keshava, C; Chiu, W; National Center for Environmental Assessment, Office of Research and Development, USEPA; [email protected] Assessing the Human Health Risks from Exposure to Naphthalene EPA’s Integrated Risk Information System (IRIS) Program develops human health assessments that provide health effects information on environmental chemicals to which the public may be exposed, and it is currently developing an updated health assessment for naphthalene. Naphthalene is ubiquitous in the environment from the combustion of wood and fossil fuels, and can also be released into air, soil and water from its production and use as an intermediate in chemical synthesis. IRIS assessments contain information to support the first two steps (hazard identification and dose-response analysis) of the risk assessment paradigm and develop quantitative toxicity values for cancer and noncancer health effects. IRIS assessments are not regulations, but they provide a critical part of the scientific foundation for decisions to protect public health across EPA’s programs and regions under an array of environmental laws. For naphthalene, perhaps the most scientifically debated topic has been the mode of action of carcinogenicity following inhalation exposure in animals, particularly with respect to the human relevance of the observed tumors and the shape of the dose-response curve at low dose. The development of the naphthalene IRIS assessment will be discussed, including 1) how recent enhancements to the IRIS process will affect the assessment, 2) how information from recent scientifically relevant workshops will be utilized, and 3) how recent recommendations from the National Research Council on IRIS assessment development will be incorporated. The views expressed are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA.

P.8 Forshee, RA*; Lu, Y; Izurieta, H; Egger, J; Cooney, D; Lash, T; Fox, M; Brown, J; U.S. Food and Drug Administration, SciMetrika LLC, Emory University, Boston University, Harvard Pilgrim; [email protected] Using quantitative bias analysis to characterize the uncertainty of inputs based on epidemiological data Many important public health decisions are informed by epidemiological data, but the results of epidemiological studies may be affected by several potential sources of bias, including selection bias, unmeasured and unknown confounders, and various forms of misclassification. A growing body of literature seeks to account for these potential sources of bias in a more rigorous and transparent manner by using quantitative, probabilistic methods. We have adapted quantitative bias analysis methods, such as probabilistic bias analysis and multiple bias modeling, for use in post-market studies of vaccine safety. The project examines the potential impact of several forms of misclassification and of missing and partially missing confounders on estimates of the possible relationship between vaccination and the risk of an adverse event. Quantitative bias analysis methods were developed for several study designs that are commonly used in post-market safety analyses, including cohort, case-control, self-controlled case series, and vaccinee-only risk interval studies. Quantitative bias analysis was used to generate an adjusted distribution of the risk based on both the statistical uncertainty and the other potential sources of bias. These approaches could be used to create inputs to probabilistic risk assessment approaches that better characterize the uncertainty associated with post-market safety data.
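
[Editor's note: a minimal sketch of one such method, simple probabilistic bias analysis for nondifferential exposure misclassification in a cohort 2x2 table. The cell counts and the sensitivity/specificity priors are invented, and the sketch omits the random-error component that full multiple bias modeling would add.]

```python
# Probabilistic bias analysis sketch: back-correct an exposed/unexposed 2x2
# table for exposure misclassification, sampling Se/Sp from priors.
# Counts and bias-parameter distributions are invented.
import numpy as np

rng = np.random.default_rng(1)
a, b = 40, 9_960      # observed exposed: cases, non-cases
c, d = 200, 99_800    # observed unexposed: cases, non-cases
draws = []
for _ in range(50_000):
    se = rng.uniform(0.80, 0.95)   # sensitivity prior
    sp = rng.uniform(0.95, 0.999)  # specificity prior
    # Corrected counts; same Se/Sp for cases and non-cases = nondifferential:
    #   observed a = Se*A + (1-Sp)*C, with A + C fixed at a + c
    A = (a - (1 - sp) * (a + c)) / (se - (1 - sp))
    B = (b - (1 - sp) * (b + d)) / (se - (1 - sp))
    C, D = (a + c) - A, (b + d) - B
    if min(A, B, C, D) > 0:        # discard impossible corrections
        draws.append((A / (A + B)) / (C / (C + D)))  # corrected risk ratio

print(f"median bias-adjusted RR = {np.median(draws):.2f}")
print(f"95% simulation interval: {np.percentile(draws, [2.5, 97.5]).round(2)}")
```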

T4-C.5 Foster, SA; Chrostowski, PC*; CPF Associates, Inc.; [email protected] Recent Findings from Human Health and Ecological Risk Assessments of Waste to Energy Technologies There has been limited new development of waste-to-energy (WTE) facilities in North America for many years, but interest in these projects is increasing due to costs of long-range transport of solid waste, efforts to divert waste from landfills, and heightened interest in developing more sustainable and energy-producing solid waste management solutions. Many communities are currently considering a variety of thermal treatment options for management of post-recycled municipal solid waste. Despite these motivating factors, and decades of positive experience in the operation of WTE facilities in North America and Europe, new thermal waste treatment projects typically encounter local opposition due to concerns about potential public health and environmental impacts. On a site-specific basis, human health and ecological risk assessments can help to address these concerns. This paper will summarize and compare the results of 10 risk assessment studies that have been conducted since 2001 for major WTE projects in North America. Elements that will be addressed include size and location of each facility, compounds selected for evaluation, potential pathways of exposure, the location and types of receptors addressed, risk metrics evaluated and compounds and pathways accounting for the majority of risks. The results will also be evaluated relative to applicable regulatory guidelines and current risks of morbidity and mortality associated with everyday life. Details will be provided from one comprehensive risk assessment project conducted for a WTE facility in Palm Beach County, Florida; this project has been approved and is under construction. This case study will elaborate on some unique risk assessment methods employed for fate and transport modeling. In addition, issues of community concern and the extent to which the risk assessments addressed these concerns will also be discussed.

P.9 Fowler, G*; Erikson, L; Ahern, R; Caton, B; Gutierrez, W; Griffin, R; United States Department of Agriculture; [email protected] Analyzing the effects of unintended uses of commodities on phytosanitary risk: The example of U.S. potato exports to Mexico Diversion of commodities from intended use is a recurring plant health issue in agricultural trade because consumption is generally low risk while other uses, e.g. propagation, may not be. Consequently, having mechanisms to characterize the risk of this occurring is highly relevant to policy makers and trade dispute arbitration. Here we analyze the risk of U.S. table stock potatoes being diverted for seed by Mexican potato producers as an example of characterizing the risk of unintended commodity use in a plant health context. We constructed probabilistic pathway models characterizing the movement of quarantine significant white, yellow and Russet potatoes from the United States into Mexico at current and double export volumes. We then modeled the likelihood of these potatoes being diverted for seed and the subsequent establishment of bacteria, nematode, surface feeder and virus pests in Mexico. This analysis can be adopted as a mechanism for modeling the unintended use of other commodities and informing trade policy.
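Pathway models of this kind are essentially chains of volumes and conditional probabilities propagated by Monte Carlo simulation. The sketch below shows the structure (export volume, diversion to seed, infestation, establishment) with entirely hypothetical distributions and lot sizes; it is not the USDA model itself.

    import numpy as np

    rng = np.random.default_rng(7)
    n = 100_000

    # Hypothetical inputs, not the values used in the USDA analysis
    volume_t = rng.triangular(80_000, 100_000, 120_000, n)  # tonnes exported/yr
    p_divert = rng.beta(2, 200, n)       # fraction of volume diverted to seed use
    p_infest = rng.beta(5, 500, n)       # fraction of diverted lots carrying a pest
    p_establish = rng.beta(1, 50, n)     # P(establishment | infested lot planted)
    lots_per_tonne = 0.05                # hypothetical 20-tonne lots

    # Expected pest-establishment events per year along the pathway
    events = volume_t * lots_per_tonne * p_divert * p_infest * p_establish

    print(f"median: {np.median(events):.3f} establishment events/yr")
    print(f"95th percentile: {np.percentile(events, 95):.3f}")
    # Doubling export volume simply rescales the pathway in this toy version
    print(f"median at double volume: {np.median(2 * events):.3f}")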

W2-A.4 Fraas, A*; Lutter, R; Resources for the Future, Washington D.C.; [email protected] Analysis of Regulatory Effectiveness: The Case of Mandatory Information Disclosure President Obama’s recent Executive Orders on regulation (i.e., E.O. 13563, E.O. 13579, and E.O. 13610) have elevated the importance of systematic retrospective review and analysis of federal regulations. E.O. 13563 states the regulatory system “must measure and seek to improve the actual results of regulatory requirements” and directs agencies to engage in “periodic review of existing significant regulations”. We conduct a comprehensive review of all economically significant final rules mandating the disclosure of information to third parties, such as consumers or private investors. We test whether rules issued by executive branch agencies operating under E.O. 13563 were accompanied by more rigorous analysis of the effects of information disclosure than rules exempt from E.O. 13563. Using methods of prior research assessing the effectiveness of mandatory information disclosure (e.g., Lacko and Pappalardo, 2010), we measure rigor by several criteria. These include whether agencies conducted studies of comprehension by consumers or other target audiences, such as investors, and whether such studies involved focus groups or surveys. If such studies involved surveys, we assess whether the surveys involved 1. control groups, 2. alternative types of information disclosure, and 3. random assignment of alternative forms of information disclosure to survey respondents. For rules issued without a survey, we assess whether agencies issued plans to evaluate in the future the effectiveness of mandatory information disclosure, and whether such plans included “periodic review” using controlled surveys with random assignment. We summarize our results by type of agency and also discuss their implications for the administration of the Paperwork Reduction Act by the federal Office of Management and Budget.

W3-A.1 Francis, RA*; Gray, GM; Tanir, JY; George Washington University; [email protected] Competing Considerations for Making Safer Chemical Decisions This paper discusses the opportunities at the intersection of life cycle impact analysis, risk analysis and exposure assessment, and environmental policy learning, which are becoming the focus of emerging regulations encouraging alternatives analysis for selected commercial chemicals. This research takes a multi-disciplinary view of this problem, growing out of discussions of the ILSI Health and Environmental Sciences Institute (HESI) subcommittee on Frameworks for Alternative Chemical Assessment and Selection of Safer, Sustainable Alternatives, Subgroup 2, and focusing on the role of risk from several dimensions. Each of the following topics will be discussed in greater detail in the companion papers in this symposium: First, chemical risk should be examined at each stage of the product life cycle. This requires a life-cycle informed view of hazard by considering exposures at each unit process in the production of the chemical or product. Second, regulatory stakeholders are beginning to introduce mandated decision processes based on a multi-attribute decision context. Third, we discuss how regulatory stakeholders are actively engaged in policy learning to improve environmental and public health outcomes relative to prevailing approaches to chemical risk evaluation. Finally, we discuss the potential economic impacts—both benefits and potential obstacles to innovation—faced by industrial actors affected by these emerging regulations.

W3-F.3 Frankel, MJ; Scouras, J*; Ullrich, GW; Johns Hopkins University, Penn State University, Shafer Corporation; [email protected] Assessing the consequences of nuclear weapons use: The challenge of incomplete knowledge The considerable body of knowledge on the consequences of nuclear weapons employment—accumulated through an extensive, sustained, and costly national investment in both testing and analysis over two-thirds of a century—underlies all operational and policy decisions related to U.S. nuclear planning. We find that even when consideration is restricted to the physical consequences of nuclear weapon employment, where our knowledge base on effects of primary importance to military planners is substantial, there remain very large uncertainties, in no small part because many questions, such as the impacts on the infrastructures that sustain society, were never previously asked or investigated. Other significant uncertainties in physical consequences exist because important phenomena were uncovered late in the test program, have been inadequately studied, are inherently difficult to model, or are the result of new weapon developments. Non-physical consequences that are even more difficult to quantify, such as social, psychological, political, and full economic impacts, were never on any funding agency’s radar screen. As a result, the physical consequences of a nuclear conflict tend to have been underestimated, and a full-spectrum, all-effects assessment is not within anyone’s grasp now or in the foreseeable future. The continuing brain drain of nuclear scientists and the general failure to recognize the post-Cold War importance of accurate and comprehensive nuclear consequence assessments, especially for scenarios of increasing concern at the lower end of the scale of catastrophe, do not bode well for improving this situation.

T1-A.7 Friedman, SM*; Egolf, BP; Lehigh University; [email protected] What has Google Reported about Nanotechnology Risks? Over time, coverage of nanotechnology risks has gradually disappeared from most traditional newspapers and wire services. Much of this coverage now appears on the Internet, and when people want to find out information about nanotechnology, Google is the first place they will probably look. Google will direct them to an ever-changing array of websites, blogs, online newspapers and news releases, all discussing various aspects of nanotechnology. This presentation will review Google alerts for nanotechnology risks for 2010 and 2011. These alerts were saved to provide a retrievable and unchanging set of articles for analysis because tracking information over time with Google is difficult for technical reasons. Various types of Google information sources that included nanotechnology risk news will be categorized to evaluate which ones were included in Google alerts most often. Discussions of information about nanotechnology health, environmental and societal risks in a randomly selected group of the Google alerts will be compared to coverage of similar risks that appeared in the New Haven Independent, an online newspaper that provided dedicated nanotechnology coverage during the same period. Comparisons also will focus on the types of nanotechnology materials covered, whether events, reports or news releases drove the coverage, if uncertainty was discussed, and whether positive information about nanotechnology was included in these risk articles. Discussions of regulation issues, plans and programs also will be compared.

T1-E.4 Furukawa, K; Radiation Effects Research Foundation; [email protected] A Bayesian semi-parametric dose response estimation in radiation risk assessment Characterizing the dose effect relationship and estimating acceptable exposure levels are the primary goals of quantitative risk assessments. In analysis of health risks associated with exposure to ionizing radiation, while there is a clear agreement that moderate to high radiation doses cause harmful effects in humans, we have insufficient information to understand the possible biological effects at low doses, e.g., doses below 0.1 Gy. A conventional approach to dose response analyses tends to base inference on an idealized model that is chosen primarily due to the lack of statistical power, which may be misleading in characterizing the low dose effects and, especially, estimating their uncertainties. As an alternative approach, this study proposes a Bayesian semi-parametric model that has a piecewise linear dose response function with auto-regressive priors as a smoother, applied to data grouped in closely spaced dose categories. A simulation study shows that the proposed approach can yield relatively robust and efficient estimations under various situations assuming dose response relationships that are often considered as possible models of radiation oncogenesis at low doses. The new approach is applied to the cancer incidence data of the Life Span Study cohort of Japanese atomic bomb survivors, which has long been an important data source for developing quantitative estimates of risk from exposure to radiation.
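In symbols, the general type of model described might be written as follows (a plausible sketch consistent with the abstract, not necessarily the author's exact specification): the excess relative risk is piecewise linear over closely spaced dose categories, with an autoregressive smoothing prior on the category-specific slopes,

    \[
    \mathrm{ERR}(d) \;=\; \sum_{j=1}^{J} \beta_j \,\Delta_j(d),
    \qquad
    \beta_{j+1} \mid \beta_j \;\sim\; \mathcal{N}\!\left(\beta_j,\, \tau^{2}\right),
    \quad j = 1, \ldots, J-1,
    \]

where \(\Delta_j(d)\) is the increment of dose \(d\) falling within the \(j\)-th category and the hyperparameter \(\tau^{2}\) controls the smoothness of the fitted curve; placing a hyperprior on \(\tau^{2}\) lets the data determine the degree of smoothing.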

W3-F.5 Fusco, MP; Global Catastrophe Research Institute; [email protected] Christian Apocalyptic Literature in Theological Scholarship & The 'Prepper' Movement There has been an increase in the number and variety of movements in the United States that are preparing for a global catastrophic event. The motivation inspiring these ‘preppers’ is as varied as their understanding of how future scenarios (political, social, environmental, extraterrestrial, technological, biological, etc.) will necessarily lead to devastation on the global level. Each of these approaches provides data points given its unique methods and interpretative frameworks. Preppers often take their interpretation and projection of hard scientific data as being commensurate with their religious presuppositions. In the presentation we will outline how theological scholarship understands the scriptural account of the last days as recorded in the book of Revelation, as one means to frame a conversation with preppers.

T3-E.1 Gaborek, BJ; Bellin, CA; Dellarco, M; Egeghy, P; Heard, N; Jensen, E; Lander, DR; Tanir, JY*; Zaleski, RT; Sunger, N; DuPont; [email protected] Optimizing a Tiered Exposure Framework to Aid Risk Assessment Decision-Making The science of risk assessment is utilized on a world-wide basis by regulatory authorities and industry to protect the health and welfare of humans and the environment from a broad range of chemicals, biological materials, and consumer products. Risk assessment is based on evaluating both the hazard (toxicological impact) and the exposure (likelihood and frequency of contact) components of risk, most often in a quantitative manner. During the last several years, exposure has been incorporated earlier and more prominently into the risk assessment process. In addition, numerous organizations across the globe have initiated a “tiered” approach to utilizing exposure data. Generally in a tiered exposure assessment, the first quantified estimates are deterministic and tend toward overestimation. These lower tier exposure estimates are often useful in screening or prioritizing additional efforts. With advancement to higher tiers, the estimates become progressively less conservative and more certain. Consequently, these exposure predictions often facilitate decision-making at a more chemical- or product-specific level. As part of the ILSI Health and Environmental Sciences Institute’s RISK21 initiative, the Exposure Science Sub-team focused on developing a novel, streamlined, and tiered approach for estimating exposure that maximizes use of readily available information (existing approaches, tools, and data) and that aligns with available hazard data. The goal of this effort was to identify efficiencies that, if implemented in risk assessment, would facilitate quicker decision-making and focus resources in areas with greatest information value. As an introduction to subsequent presentations in this symposium, this discussion introduces the proposed framework, briefly compares it to other tiered frameworks, and then describes the Tier 0 level of the framework in some detail. This abstract does not necessarily reflect U.S. EPA policy.

P.24 Gadagbui, B; Maier, A; Nance, P*; JayJock, M; Franklin, C; Toxicology Excellence for Risk Assessment; [email protected] A Decision Tool for Assessing Polymers and Polymeric Substances with Potential Hazards to Human Health Polymers display a wide variety of characteristics - e.g., presence of non-bound residual monomers, polymerization chemicals, degradation products, and additives - that may pose a potential health hazard. There is a paucity of direct testing data on many polymers to adequately evaluate their toxicity, but several regulatory agencies have provided guidance for assessing polymer safety. We evaluated each of these approaches and identified the strengths and weaknesses of each. No single published model appears to cover all characteristics of interest. This suggests the need to develop a comprehensive decision tool to identify polymeric substances that may pose potential toxicological hazards to human health. We developed a decision tool that incorporates a weight of evidence approach integrating information for many individual hazard flags. Hazard flags were placed into four broad categories: (1) empirical hazard information on the polymer or residual monomer; (2) evidence of toxicity based on structural properties (i.e., based on polymer class, monomer components, or reactive functional groups); (3) potential for significant tissue dose (i.e., based on molecular weight distribution or systemic bioavailability); and (4) hazard based on foreseeable special use considerations. Some of these hazard flags have not been considered previously by the regulatory agencies. We tested this approach for a number of polymers to demonstrate how the new tool incorporates all available regulatory approaches as well as the new features and provides a comprehensive decision framework for evaluating polymer safety.

T1-K.4 Galloway, L*; Dolislager, F; Stewart, D; Tucker, K; University of Tennessee, Knoxville; [email protected] Release of OSRTI’s Online Risk Calculator The U.S. Environmental Protection Agency Office of Superfund Remediation and Technology Innovation (OSRTI), through an interagency agreement with Oak Ridge National Laboratory (ORNL), developed an online risk calculator for assessment of environmental media. This new tool allows users to enter concentrations in environmental media (soil, sediment, groundwater, surface water, fish, produce, beef and milk) for the calculation of cancer and noncancer risks. The tool is analogous to, yet more comprehensive than, the existing Regional Screening Level (RSL) calculator. Chronic daily intakes (CDIs) are calculated and combined with toxicity data to produce risk results. Users can not only select chemicals from a pick list and enter concentrations by hand, but also upload a simple data file containing media designation, exposure point concentrations, chemical name, CAS number and detection status. Baseline exposure assumptions are provided as defaults; however, users can change and save their site-specific exposure parameters, chemical parameters and toxicity values in a user file for future use. Once risk results are obtained, they can be saved along with their site-specific data for later modification. Output will be formatted into a RAGS Part D template, including toxicity metadata. This new tool will be useful for concerned citizens, risk assessors and risk managers.
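The risk arithmetic behind such calculators follows the standard intake-times-toxicity pattern; a minimal sketch for one pathway (incidental soil ingestion) is shown below, with illustrative exposure parameters (not the calculator's defaults) and IRIS oral toxicity values for arsenic.

    def cdi(conc_mg_kg, ir_mg_day, ef_d_yr, ed_yr, bw_kg, at_days):
        # Chronic daily intake (mg/kg-day) for incidental soil ingestion,
        # following the generic RAGS-style intake equation; 1e-6 converts
        # mg of soil to kg of soil
        return (conc_mg_kg * ir_mg_day * 1e-6 * ef_d_yr * ed_yr) / (bw_kg * at_days)

    # Illustrative residential parameters (not calculator defaults)
    soil_as = 5.0                                                 # mg/kg arsenic
    intake_cancer = cdi(soil_as, 100, 350, 26, 80, 70 * 365)      # AT = lifetime
    intake_noncancer = cdi(soil_as, 100, 350, 26, 80, 26 * 365)   # AT = ED

    oral_sf = 1.5      # (mg/kg-day)^-1, IRIS oral slope factor for arsenic
    oral_rfd = 3.0e-4  # mg/kg-day, IRIS oral reference dose for arsenic

    print(f"cancer risk     = {intake_cancer * oral_sf:.1e}")
    print(f"hazard quotient = {intake_noncancer / oral_rfd:.2f}")

The same pattern repeats per chemical, medium, and route, with route-specific intake equations; summing risks and hazard quotients across chemicals and pathways gives the totals reported in a RAGS Part D table.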

M4-C.5 Gelyani, A.M.*; Abrahamsen, E.B.; Asche, F; Heide, B; University of Stavanger, Stavanger, Norway; Safetec Nordic, Norway; [email protected] A note on what the effects are of a safety measure It is well known that investments in new safety measures do not always give the intended effect, as new safety measures are sometimes offset by behavioural changes. In this article we show that another cause for a reduced effect is that competition for resources can lead new safety measures to crowd out existing measures. As the resources spent on safety measures are usually scarce, a new safety measure can lead to reduced resources allocated to other measures. If this aspect is not taken into consideration, the effect of a single measure might be overestimated. An overinvestment in new safety measures might then occur.

W3-K.4 Gelyani, A.M.*; Abrahamsen, E.B.; Selvik, J.T.; Authors 1 and 2: University of Stavanger, Norway; Author 3: International Research Institute of Stavanger (IRIS), Norway; [email protected] Decision Criteria for Updating Test Intervals for Well Barriers In this paper we discuss whether decisions on test intervals for well barriers should adopt the same decision criteria as those recently suggested for Safety Instrumented Systems (SIS). We conclude that the criterion suggested for halving the test intervals for SIS is appropriate to use also for well barriers. The criterion for doubling the test interval for SIS is, however, not considered appropriate, as this criterion does not give sufficient weight to the cautionary principle. A new type of criterion for doubling the test interval for well barriers that better reflects the uncertainties is suggested.

W4-D.4 Geraci, CL; National Institute for Occupational Safety and Health; [email protected] Closing research gaps for safer design principles for multiwalled carbon nanotubes; molecule, process, and products The multiwalled carbon nanotube (MWCNT) is one of the most widely studied materials to come out of nanotechnology. As a class of materials, the MWCNT has shown great promise in a wide range of applications; however, the research on applications has outpaced research on the potential implications for human health and the environment. Recent toxicological findings and dose-response based risk assessments support the need for an active risk-based approach to manage the development of MWCNT-based product applications. Unfortunately, there is little information available regarding actual human or environmental exposures to structure an informed risk characterization and develop a risk management approach. Research is needed to close key knowledge gaps that will, in turn, allow a more complete analysis of the risk associated with commercializing MWCNT-enabled products. More research is needed at the molecular level to identify changes in the physical and chemical characteristics of MWCNTs that can alter their biologic behavior and result in a ‘safer’ MWCNT. Actual release and exposure assessments are needed during the manufacture and use of MWCNTs so that more accurate risk characterizations can be performed. Research that develops and refines more specific and quantitative release and exposure data will allow for the development of more effective risk management approaches. Once MWCNTs are incorporated into an intermediate or product, the risk profile changes based on the potential for release of the MWCNT. Methods to identify and characterize release scenarios are needed to complete the life cycle risk analysis for MWCNTs. Until results from the research needs identified here are available, risk managers will have to rely on minimizing or eliminating potential human exposures. More progressive approaches would include designing safer MWCNTs, designing processes that have minimal releases, and designing products with a low potential for releasing MWCNTs.

P.132 Gernand, JM*; Casman, EA; Penn State University; [email protected] Treed Exponential Models for Evaluating Factors Affecting Nanomaterial Dose-Response and Setting Occupational Exposure Limits Existing research has demonstrated that some materials produce significantly increased toxic responses when particles are sized in the ultrafine or nano-range (<100 nm). Further investigation revealed that even small changes in the characteristics of these nanomaterials can result in divergent outcomes following exposure. Understanding which controllable properties of nanomaterials may be responsible for differences in toxicity is critical for appropriate risk assessment, setting regulatory policy on exposure limits for these materials, and understanding the biological mechanisms involved. Multiple regression techniques can provide some insight, but traditional linear models and even new machine learning models make assumptions of linear or constant dose-response relationships that violate the best current understanding in toxicology. This work presents a new modeling framework, treed exponential models, for evaluating the effects of changes in specific nanomaterial properties on dose-response. We demonstrate this modeling technique on a collection of published nanoparticle pulmonary toxicity experiments. This technique combines the benefits of machine learning regression tree (RT) models with the accumulated mechanistic knowledge contained in traditional dose-response exponential curve models. These models facilitate comparisons between different types of nanomaterials and other toxins, and provide quantitative guidance regarding when different types of nanomaterials should be considered distinct groups.
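A minimal version of the idea (partition experiments on a material property, then fit an exponential dose-response curve within each partition) can be sketched as follows. The simulated data, the single-split search, and the saturating-exponential form are all illustrative assumptions rather than the authors' published model, which grows full trees by applying the split search recursively.

    import numpy as np
    from scipy.optimize import curve_fit

    def expo(dose, r0, rmax, k):
        # Saturating exponential dose-response rising from r0 toward rmax
        return r0 + (rmax - r0) * (1.0 - np.exp(-k * dose))

    def leaf_sse(dose, resp):
        # Sum of squared errors of the best exponential fit within one leaf
        p0 = [float(resp.min()), float(resp.max()), 0.1]
        params, _ = curve_fit(expo, dose, resp, p0=p0, maxfev=10_000)
        return float(np.sum((resp - expo(dose, *params)) ** 2))

    def best_split(dose, resp, feature, min_leaf=10):
        # Greedy search for the property threshold that most improves the
        # summed SSE of leaf-level exponential fits
        best_t, best_sse = None, leaf_sse(dose, resp)
        for t in np.unique(feature)[1:]:
            left = feature < t
            if left.sum() < min_leaf or (~left).sum() < min_leaf:
                continue
            sse = leaf_sse(dose[left], resp[left]) + leaf_sse(dose[~left], resp[~left])
            if sse < best_sse:
                best_t, best_sse = float(t), sse
        return best_t, best_sse

    # Hypothetical pulmonary-toxicity data: smaller particles drive a steeper,
    # higher-plateau response than larger ones
    rng = np.random.default_rng(0)
    dose = rng.uniform(0.0, 10.0, 200)
    diameter_nm = rng.uniform(5.0, 150.0, 200)
    resp = np.where(diameter_nm < 50.0,
                    expo(dose, 1.0, 8.0, 0.8),
                    expo(dose, 1.0, 3.0, 0.3)) + rng.normal(0.0, 0.3, 200)

    print(best_split(dose, resp, diameter_nm))  # expect a threshold near 50 nm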

M2-F.2 Gerst, MD; Wang, P; Ding, P; Borsuk, ME*; Dartmouth College; [email protected] An integration of multiple paradigms for integrated assessment of climate policy Most models used to analyze international climate policy treat the problem as if it operates on only a single level. This is the result of a reliance on integrated assessment models (IAMs) that, for reasons of analytical tractability, typically employ assumptions that allow the economy to be modeled as if it is managed by a single, utility-maximizing central planner without regard to the influence of either lower-level actors or international pressures. While game theoretic models have been used to study international negotiations, they generally do not consider feedback from domestic actors who have heterogeneous beliefs and vulnerability to climate change. We provide an overview of a policy modeling framework, called ENGAGE, styled after the Putnam two-level game in which interactions among negotiators at the international level are linked with the preferences of constituents at the domestic level. Domestic constituents in our model include firms and households who function as agents within an evolutionary representation of economic growth, energy technology, and climate change, resulting in a two-way dynamic feedback between international agreements and domestic policy outcomes.

M3-H. Gibb, SK; The Scientific Consulting Group, Inc.; [email protected] Roundtable: Risk in Changed Circumstances: Views of the News Editors This session will focus on how risk assessment issues – whether emerging or long-standing concerns – are covered and communicated by key environmental news publications. A panel of science policy journalists will discuss trends in their coverage, whether risk assessment as a focus is being marginalized by other environmental concerns such as sustainability, and their view of the future evolution of risk approaches in light of recent National Academies’ reports on harmonizing cancer and non-cancer approaches, the initiation of new EPA Cumulative Risk Assessment Guidelines, and emerging toxicity testing technologies. The editors will reflect on the challenges of covering contentious issues such as Bisphenol A (BPA) and climate change, how agency press policies may be changing their access to scientists and their ability to gather information, and how recent budget cuts are affecting federal risk assessment efforts. Each editor will present for 10 minutes and a moderated question and answer session will follow. Science -- Erik Stokstad M.S. – Staff writer joined Science magazine in 1997. He covers environmental research and policy with a focus on natural resources and sustainability. Risk Policy Report – Maria Hegstad M.S.J. – Managing Editor joined the publication in 2008 and manages all aspects of coverage including researching, writing and editing stories, covering SRA conferences, and writing for the InsideEPA.com website. Chemical and Engineering News – Cheryl Hogue M.S. – Senior Correspondent, focuses on articles and social media regarding EPA regulation of chemicals and research, international climate change policy, and federal regulatory policies. Chair/Moderator: Steve Gibb M.S. – Project Manager, The Scientific Consulting Group, Inc. Worked over a decade as an award-winning environmental policy reporter, program officer at the National Academies, and currently supports client missions in the areas of knowledge mobilization, technical writing and communicating cumulative risks.

M2-E.1 Gibb, HJ; Tetra Tech Sciences; [email protected] Foodborne epidemiology reference group: Chemical and toxins task force In 2006, the World Health Organization (WHO) launched the Foodborne Epidemiology Reference Group (FERG). The purpose of the FERG is to provide an estimate of the global burden of disease from foodborne viruses, bacteria, parasites, chemicals and toxins. To accomplish this work, the FERG was divided into several task forces. These task forces initially included the Enteric Task Force (viruses and bacteria), the Parasitic Task Force, and the Chemicals and Toxins Task Force. The Source Attribution, Country Studies, and Computational Task Forces were subsequently added. The presentations at this symposium relate to the work of the Chemicals and Toxins Task Force (CTTF). The work of the CTTF has been a combination of contracted efforts by WHO to international experts and of in-kind contributions. The Chemicals and Toxins Task Force at its initial meeting discussed and evaluated criteria to prioritize the chemicals and toxins on which to base its global estimates. The chemicals and toxins that were eventually selected are aflatoxin, peanut allergen, dioxin, cyanide in cassava, methyl mercury, arsenic, lead, and cadmium. Age-specific estimates of disease incidence resulting from exposure to these foodborne chemicals or toxins have been developed or are being developed. These estimates of incidence and case-fatality rates for the disease will be used by WHO to make estimates of the Disability Adjusted Life Years (DALYs). The DALY estimates are expected to be published by WHO in 2014.

M3-J.3 Gilmore, EA*; Hendrickson, P; University of Maryland; [email protected] Evaluating Proliferation Resistance of Small Modular Nuclear Reactors Nuclear energy can make an important contribution to reducing greenhouse gas emissions. Small modular reactors (SMRs), defined as units with a generating capacity of less than 300 MW that are delivered to the site fully assembled, may represent a viable alternative to large reactors since they require smaller initial capitalization, better match existing demand for energy and capacity requirements, may be easier to site and may engender less public opposition. Thus, this configuration opens new opportunities for nuclear energy use, especially in developing countries. However, there has been little effort to evaluate the effects of SMR designs, manufacturing arrangements and fuel cycle practices on the risks of proliferation. Here, we evaluate which designs and fuel cycle arrangements, manufacturing and delivery systems, and policy regimes are most likely to result in proliferation-resistant SMRs. We start by reviewing the characteristics of existing SMRs. We compare this to a notional SMR with a sealed fuel compartment that would not give users access to nuclear materials. In addition, these SMRs would be produced in a “hub and spoke” arrangement with a small number of manufacturing facilities worldwide and end-of-life recovery of the modules for recycling and disposal. We then apply and adapt a number of existing methods used to evaluate the proliferation resistance of conventional light water reactors to assess the SMRs. Adapting these models, we find that the technological features of SMR systems can reduce the proliferation risks compared to light water reactors, although strong international regimes are required to support this outcome.

P.10 Gilmore, J*; Martinez, C; Pagliarulo, M; Ontario Ministry of Environment; [email protected] Interim action values for management of contaminants in soils for protection of human health risks The Ontario Ministry of the Environment has developed interim action values (IAVs) for several contaminants as part of its Soil Assessment Protocol, which informs investigation, analysis and risk reduction measures (RRMs) to address contaminated soil. An IAV represents the upper limit beyond which interim risk reduction measures should be considered. As the name implies, IAVs are intended to inform short-term risk management and mitigation decisions, which may need to be reconsidered over time as more information becomes available on the exposure conditions or on the science underpinning the IAV. IAVs can be developed for various media including soil, groundwater, soil vapour and indoor air, as needed. Interim action values are generally developed from generic soil standards (GSS) by: a) Reviewing the relevant human health component values (HHCVs) underpinning the GSS (e.g., direct contact for incidental ingestion and dermal contact); b) Adjusting the target risk levels for non-cancer and cancer effects to the selected range for risk management; c) Selecting the more stringent effect and ensuring that acceptable risk levels for other effects (e.g., acute effects) are not exceeded. The IAV may also be refined by reviewing available data from biomonitoring or other studies, if available. Using arsenic as an example, an IAV of 200 µg/g was developed. This value is within the range of soil concentrations studied that showed no significant elevated arsenic exposure, reflects a 2 in 10,000 (or 1 in 5,000) incremental cancer risk, is within the range of risks posed by total inorganic arsenic exposure in the general Canadian population (1 in 1,000 to 1 in 10,000), and is equivalent to approximately ten times an upper estimate of background soil arsenic concentrations of 18 µg/g.
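Under the usual linear low-dose assumption, moving a health-based soil value from its generic target risk to an interim-management risk level is a one-line rescaling. The numbers below are hypothetical, chosen only to mirror the order of magnitude of the arsenic arithmetic in the abstract; they are not the Ministry's published derivation.

    # Linear low-dose rescaling of a health-based soil value to an interim
    # action value at a relaxed target risk (illustrative numbers only)
    hhcv_ug_g = 1.0        # hypothetical direct-contact value at 1e-6 risk
    hhcv_risk = 1e-6
    iav_risk = 2e-4        # target selected for interim risk management
    iav_ug_g = hhcv_ug_g * iav_risk / hhcv_risk
    print(iav_ug_g)        # 200 ug/g: about 10x an 18 ug/g background estimate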

P.36 Glynn, ME*; Pierce, JS; Williams, B; Johns, LE; Adhikari, R; Finley, BL; Cardno ChemRisk; [email protected] Residential and occupational exposure to wood treating operations and bladder cancer: A meta-analysis The wood treating industry has operated for over 100 years in the United States, with sites commonly operating for decades. Over time, concerns have been raised regarding the potential chronic health effects associated with wood treating-related exposures. In at least one case it has been suggested that there might be an association between risk of bladder cancer and exposure to chemicals associated with historical wood treating operations (e.g., creosote, coal tar and associated polycyclic aromatic hydrocarbons [PAHs], and pentachlorophenol [PCP]). A literature search was conducted to identify all published and unpublished analyses that reported risk estimates for bladder cancer in (1) residents of communities surrounding wood treating operations, (2) wood treating workers, and (3) non-wood treating workers who were exposed to chemicals associated with wood treating operations (e.g., creosote/coal tar/PAHs and PCP). A total of 18 studies, including independent cohort, record-linkage, and case-control studies, were included in the meta-analysis. Using a random effects model, meta-relative risks (meta-RRs) were calculated for each exposure group. The summary relative risk (meta-RR) for bladder cancer overall was 1.04 (95% confidence interval [CI]: 0.93, 1.17). No statistically significant meta-RRs were observed among residents of communities in the vicinity of wood treating operations (meta-RR=0.99; 95% CI: 0.73, 1.34); wood treating workers (meta-RR=1.11; 95% CI: 0.53, 2.04); workers exposed to coal tar, creosote, and associated PAHs (meta-RR=1.04; 95% CI: 0.86, 1.27); and workers exposed to PCP (meta-RR=1.00; 95% CI: 0.82, 1.23). In conclusion, the studies reviewed provided no evidence of an association between residential and occupational exposure to wood treating operations and an increased risk of bladder cancer.
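Random-effects pooling of this kind is conventionally done with the DerSimonian-Laird estimator. A compact, self-contained version is sketched below with hypothetical study inputs (the actual analysis pooled 18 studies by exposure group).

    import numpy as np

    def random_effects_meta(rr, lo, hi):
        # DerSimonian-Laird random-effects pooling of relative risks given
        # point estimates and 95% CIs (a standard textbook implementation)
        y = np.log(rr)
        v = ((np.log(hi) - np.log(lo)) / (2 * 1.96)) ** 2  # var of log RR
        w = 1.0 / v
        ybar = np.sum(w * y) / np.sum(w)
        q = np.sum(w * (y - ybar) ** 2)                    # heterogeneity Q
        k = len(y)
        tau2 = max(0.0, (q - (k - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))
        w_star = 1.0 / (v + tau2)                          # random-effects weights
        mu = np.sum(w_star * y) / np.sum(w_star)
        se = np.sqrt(1.0 / np.sum(w_star))
        return np.exp(mu), np.exp(mu - 1.96 * se), np.exp(mu + 1.96 * se)

    # Hypothetical study-level estimates, not the studies in the abstract
    rr = np.array([0.9, 1.2, 1.1, 0.8, 1.3])
    lo = np.array([0.6, 0.9, 0.7, 0.5, 0.8])
    hi = np.array([1.35, 1.60, 1.73, 1.28, 2.11])
    print(random_effects_meta(rr, lo, hi))  # (meta-RR, lower CI, upper CI)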

M4-D.2 Gombas, D; United Fresh; [email protected] Produce Industry Perspective: Predicting the Unpredictable Protecting consumers is the top priority of the fresh produce industry. But, without a “kill step”, produce food safety must rely on prevention of contamination at every point in the supply chain, from field to fork. Good Agricultural Practices (GAPs) have been successfully used to prevent large scale contamination events in the field. Yet recalls and outbreaks linked to fresh produce demonstrate that GAPs cannot be the entire answer. So how does the industry proceed? This session will explore the industry’s current path, how speculations can divert food safety resources from more effective practices, and how opportunities have been missed in developing better approaches to predict, prevent and detect sporadic contamination events.

M3-B.1 Goodman, JE*; Rhomberg, LR; Gradient; [email protected] Bradford Hill Viewpoints and Hypothesis-Based Weight of Evidence In a seminal 1965 paper, Sir Austin Bradford Hill identified nine factors that suggest causation, particularly when evaluating epidemiology data: strength, consistency, specificity, temporality, biological gradient, plausibility, coherence, experiment, and analogy. These "viewpoints" were presented as aids for thinking through the evidence for causality – in more current terms, as guidance for evaluating the weight of evidence. They were not intended as "criteria" – the term most often used to describe them today. Hill said: "What [my nine viewpoints] can do, with greater or lesser strength, is to help us make up our minds on the fundamental question – is there any other way of explaining the set of facts before us, is there any other answer equally, or more, likely than cause and effect?" That is, Hill called for an evaluation of how well patterns among the whole "set of facts before us" can be accounted for. The degree of credence one should place in the causal role of a substance in question is a function of how much more likely it would be for the set of observations to occur if the substance were causal vs. if it were not. This is the guiding principle behind the Hypothesis-Based Weight-of-Evidence (HBWoE) approach. We will describe this approach, and the role of Bradford Hill's viewpoints in it, for evaluating epidemiology data. We will also describe how recently proposed "extended Hill criteria" can be used to conduct an HBWoE evaluation of epidemiology, toxicology, mechanistic, and other kinds of data in an integrated fashion.
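Read formally (our gloss, not wording from the presentation), this guiding principle is a likelihood-ratio statement: the credence given to causation should scale with how much better the causal hypothesis accounts for the full set of observations E,

    \[
    \underbrace{\frac{P(\text{causal}\mid E)}{P(\text{non-causal}\mid E)}}_{\text{posterior odds}}
    \;=\;
    \underbrace{\frac{P(E\mid \text{causal})}{P(E\mid \text{non-causal})}}_{\text{likelihood ratio}}
    \times
    \underbrace{\frac{P(\text{causal})}{P(\text{non-causal})}}_{\text{prior odds}} .
    \]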

T2-A.3 Goodman, JE*; Sax, SN; Thakali, S; Beyer, L; Gradient; [email protected] Rethinking Meta-analysis: Applications for Air Pollution Data and Beyond A meta-analysis is a type of systematic review that can be a powerful tool for integrating the results of individual studies. A variety of meta-analysis applications can be illustrated by evaluations that have been or could be conducted in the context of the National Ambient Air Quality Standards (NAAQS) mandated by the Clean Air Act. In its NAAQS evaluations, US EPA reviews data from different disciplines, including epidemiology, toxicology, atmospheric science, and exposure science. US EPA and others have used meta-analyses to some degree, particularly for epidemiology and controlled human exposure data, but there are further opportunities to use this methodology in air pollution research. We will discuss the utility of meta-analyses to integrate results from several individual studies, with a focus on determining novel applications of the methodology. More specifically, we will discuss the strengths and limitations of conducting meta-analyses and consider whether there may be opportunities, such as for toxicology and mechanistic data, where meta-analyses have not been widely used historically to evaluate associations. Although our focus is on air pollution evaluations, we will demonstrate when and with what kinds of data meta-analyses can be useful across a variety of disciplines and provide strategies for defining sets of studies to evaluate. We will also discuss how the results of separate meta-analyses can be brought together to address a larger question, where different kinds of data need to be brought to bear.

W4-A.3 Goodman, JE*; Prueitt, RL; Sax, SN; Bailey, LA; Rhomberg, LR; Gradient; [email protected] Incorporation of weight-of-evidence best practices in the National Ambient Air Quality Standards review process The National Academy of Sciences Formaldehyde Review Panel's report called on the United States Environmental Protection Agency (EPA) to undertake a program to develop a transparent and defensible methodology for weight-of-evidence (WoE) assessments. The report contains a proposed "roadmap" for reform and improvement of the risk assessment process. We followed the recommendation of the NAS roadmap and conducted a survey to evaluate best practices for WoE analyses based on almost 50 frameworks, including the National Ambient Air Quality Standards (NAAQS) Causal Framework, to draw insights about their methods, rationales, utility, and limitations. We found that the NAAQS WoE framework has many important features that are necessary for a balanced WoE evaluation. However, the framework needs to be more explicit in some cases, and it is missing some important features. Because of this, it has not been applied consistently in past evaluations of causality, and this has led to biased conclusions regarding causation, as we demonstrate with several ozone examples. We propose specific changes to the EPA NAAQS Causal Framework so that it is consistent with WoE best practices. The full and consistent application of this revised framework will ensure that future assessments of the potential health effects of criteria air pollutants will be thorough, transparent, and scientifically sound.

W3-C.2 Greco, SL*; Belova, A; Huang, J; Ghio, C; Abt Associates; [email protected] A global calculator for estimating the benefits of urban fine particulate matter reductions Data that can inform human health benefits assessments in developing countries are limited. We adapted existing benefits frameworks in order to readily calculate the benefits resulting from projects that reduce air pollution in select urban areas worldwide (e.g., mass transit) when data are limited. These screening-level calculations could be used to promote or to assist the allocation of resources for such projects. Our calculator estimates the monetized benefits of reductions in fine particulate matter (PM2.5) emissions in three steps by estimating: (1) the reduction in an urban area’s ambient PM2.5 concentration resulting from the project, (2) the avoidance of premature mortality in the area, and (3) the economic value of this avoided mortality. The reduction in ambient PM2.5 concentrations is estimated from a specified change in PM2.5 emissions using city-specific intake fractions and 2010 urban population estimates. The avoided cases of PM2.5-induced premature mortality are estimated from concentration-response functions that are based on U.S. populations and adjusted for local ambient PM2.5 levels and adult mortality rates. The value of the avoided mortality in locations other than the U.S. is estimated using a benefits transfer method that adjusts the U.S. value of a statistical life estimate for differences in per capita income (at purchasing power parity) and assumes that mortality risk reductions are a luxury good in developing countries. We illustrate the impact of a 1 metric ton reduction in PM2.5 emissions in 2010 for the 43 cities currently included in the calculator. The median number of premature deaths avoided over a 20 year period was 0.3 deaths (range: 0.02-2). The associated median present value benefit (using a 3% discount rate) was $0.6 million (range: $0.01-$5) in 2010 U.S. dollars. The authors plan to discuss the factors influencing these results as well as expand the calculator to include many more cities.
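The three steps map onto a short calculation. The sketch below uses a standard intake-fraction identity, a log-linear concentration-response function, and a power-law income transfer of the U.S. value of a statistical life; these functional forms are common in the literature, but they and all parameter values here are our assumptions, not the calculator's exact equations.

    import numpy as np

    def pm25_benefits(delta_emissions_kg, intake_fraction, pop, base_mort_rate,
                      beta, vsl_us, income_ratio, elasticity,
                      breathing_m3_day=14.5):
        # Step 1: annual-average concentration change implied by the intake
        # fraction (iF = mass inhaled / mass emitted)
        inhaled_ug = intake_fraction * delta_emissions_kg * 1e9
        delta_c = inhaled_ug / (pop * 365.0 * breathing_m3_day)   # ug/m3
        # Step 2: avoided premature deaths from a log-linear C-R function
        avoided = base_mort_rate * pop * (1.0 - np.exp(-beta * delta_c))
        # Step 3: monetize with an income-adjusted value of statistical life
        vsl = vsl_us * income_ratio ** elasticity
        return avoided, avoided * vsl

    # Hypothetical city: a 1 t emission cut, iF of 40 per million,
    # 5 million residents, VSL transferred at an income elasticity of 1.2
    deaths, usd = pm25_benefits(delta_emissions_kg=1000.0, intake_fraction=40e-6,
                                pop=5e6, base_mort_rate=0.008, beta=0.006,
                                vsl_us=9.0e6, income_ratio=0.2, elasticity=1.2)
    print(f"{deaths:.2f} avoided deaths/yr, about ${usd:,.0f}/yr")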

T1-C.4 Greenberg, MR; Rutgers University; [email protected] Predicting Individual Risk-Reducing Behaviors Before, During and After Major Hazard Events Much of the literature about major hazard events separates pre-event preparedness from during-event behaviors. The paper proposes and tests life-cycle disaster event hypotheses using data collected by the author four months after Superstorm Sandy struck New Jersey on October 29, 2012. The author first tests the ability to predict individual preparedness, and then the expectation that more preparedness leads to more proactive behaviors during and shortly after events. The author uses previous experiences with disasters, flashbulb memories of previous events, and other respondent attributes to predict preparedness. Then the author examines the relationship between preparedness and behaviors during and shortly after events. Of particular interest are age and pre-existing health as factors that lead some respondents to be victims of events, whereas others are helpers rather than victims, and some are both victims and helpers. The life-cycle perspective described and tested here is notably different from much of the medical literature, which does not view an event as a life cycle. A policy implication that follows is that communities contain a cadre of individuals who are part of community support groups; these active respondents are critical players before, during and after events due to their preparedness and practiced ability to respond to events rather than be victims of them.

W2-E.1 Greene, CW*; Wilkes, C; Koontz, M; Shubat, PJ; Minnesota Department of Health; Versar, Inc.; [email protected] Computer-based exposure modeling to support drinking water guidance The Minnesota Department of Health (MDH) develops health-based guidance values for contaminants of emerging concern (CECs) in drinking water. To account for non-drinking water exposures, MDH uses a chemical-specific Relative Source Contribution (RSC) factor to allocate only a fraction of the toxicological reference dose to drinking water exposure. Pharmaceuticals, personal care products, pesticides, and other CECs found in drinking water sources have complicated exposure profiles that may include ubiquitous exposure at low levels, deliberate self-exposure, high exposures to infants and children, and exposures approaching the reference dose. CEC-specific data to quantify these exposures and accurately understand cumulative and relative risks are not readily available and, until now, MDH has relied on default assumptions based on U.S. EPA guidance. Working with a contractor, MDH explored the potential for using computer models to improve upon the default approach by estimating multipathway, multiroute exposures. We have identified the key media and exposure routes of concern, evaluated numerous models that cover these key media and routes, and developed a set of preferred models based on the model’s fitness to meet our exposure evaluation needs, its strength in representing actual physical/chemical processes, its input demands, and its user friendliness. The preferred models include EPA’s Exposure and Fate Assessment Screening Tool (E-FAST) and Multimedia, Multipathway, and Multireceptor Risk Assessment (3MRA) model, the California Population Indoor Exposure Model (CPIEM), and the Total Exposure Model (TEM). We developed a set of procedures to apply these models to the problem of estimating RSC values for CECs. The procedures were evaluated using a test group of six CECs that cover a range of exposure pathways and routes. Beyond estimation of the RSC, the modeling process may also be of use to risk managers seeking to target resources towards reducing total exposures.
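One common way to turn modeled non-drinking-water exposures into an RSC is the subtraction approach, with the conventional 20%-80% bounds used in EPA ambient water quality methodology. The sketch below illustrates that general logic with hypothetical numbers; it should not be read as MDH's specific procedure.

    def rsc(rfd_mg_kg_day, nonwater_dose_mg_kg_day, floor=0.2, ceiling=0.8):
        # Subtraction-style relative source contribution: the share of the
        # reference dose left for drinking water after modeled non-water
        # exposures, bounded by conventional 20%-80% limits
        remaining = max(0.0, rfd_mg_kg_day - nonwater_dose_mg_kg_day)
        return min(max(remaining / rfd_mg_kg_day, floor), ceiling)

    # Hypothetical CEC: RfD of 0.002 mg/kg-day; multipathway models attribute
    # 0.0013 mg/kg-day to food, dust, and consumer-product routes
    print(rsc(0.002, 0.0013))  # -> 0.35, i.e., 35% of the RfD for water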

T4-D.2 Grieger, KD*; Laurent, A; Miseljic, M; Christensen, F; Baun, A; Olsen, SI; RTI International, Technical University of Denmark (DTU), COWI A/S; [email protected] Complementary use of life cycle assessment and risk assessment for engineered nanomaterials: Lessons learned from chemicals? Successful strategies to handle the potential health and environmental risks of engineered nanomaterials (ENM) often rely upon the well-established frameworks of life cycle assessment (LCA) and risk assessment (RA). However, current research and specific guidance on how to actually apply these two frameworks are still very much under development. Through an in-depth review, this study evaluates how research efforts have applied LCA and RA together for ENM, with a particular emphasis on past “lessons learned” from applying these frameworks to chemicals. Among other results, it appears that current scientific research efforts have taken into account some key lessons learned from past experiences with chemicals, at the same time that many key challenges remain to applying these frameworks to ENM. In that setting, two main proposed approaches to use LCA and RA together for ENM are identified: i) LC-based RA, similar to traditional RA applied in a life cycle perspective, and ii) RA-complemented LCA, similar to conventional LCA supplemented by RA in specific life cycle steps. This study finds that these two approaches for using LCA and RA together for ENM are similar to those proposed for chemicals, and hence there does not appear to have been much progress made specifically for ENM. We therefore provide specific recommendations for applying LCA and RA to ENM, for which the need to establish proper dose metrics within both methods is identified as an important requirement.

P.150 Guan, P*; Shan, X; He, F; Zhuang, J; University at Buffalo, SUNY; [email protected] Incentives in Government Provision of Emergency Preparedness and Disaster Relief The goal of this project is to help provide a solid foundation for motivating more comprehensive ways to assess the risk tradeoffs in multi-stakeholder disaster management and resource allocation. This will be accomplished by taking advantage of theoretical decision frameworks such as game theory and prospect theory, and will use robust optimization techniques to address the uncertainty that surrounds disasters. This project will address under-studied questions such as: (a) How should governments and private sectors balance between the funding for emergency preparedness and the funding for disaster relief, when they are uncertain about the disaster location and consequences? (b) How should governments distribute incentives to reduce vulnerability to disasters? and (c) How should decision makers balance equity, efficiency, and effectiveness when preparing for and responding to disasters? As a 2012 National Research Council report states, "there is currently no comprehensive framework to guide private-public collaboration focused on disaster preparedness, response, and recovery." If successful, this project will help to address this issue by providing insights, practical guidelines, and decision support tools to help save lives and property in the face of disasters. This research will engage many graduate, undergraduate, and high school students, including those from under-represented groups. The models, results, and insight gained will be shared with international, federal, and local representatives through seminars, conferences, publication, media coverage, and websites.

T2-C.3 Guan, PQ*; Zhuang, J; University at Buffalo-SUNY; [email protected] Modeling public-private partnerships in disaster management--A sequential game with prospect utilities Public and private sectors have invested significant efforts to fight against both man-made and natural disasters. The objective of this paper is to help enhance the social resilience of the community through efficient public-private partnerships. Optimal public-private partnerships (PPPs) are studied in disaster management using a sequential game where the government is the first mover. Both expected utility theory and prospect theory are used to evaluate the private sector's decision under uncertainty. We strive to identify (nearly) optimal PPPs when qualitative human behaviors and quantitative risk factors are considered. This paper evaluates the effectiveness of incentive provisions based on the various strategic responses of the private sector, making it possible to identify optimal public investment policies. This research provides insights on (a) how to construct optimal public-private partnerships when qualitative human behaviors and quantitative risk factors are considered; and (b) whether, when, and to what extent investment in preparedness and relief could form better public-private partnerships.
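The backbone of such a model is backward induction: the government chooses an incentive anticipating the private sector's prospect-theoretic best response. The sketch below is a deliberately small toy version; the mitigation function, payoff structure, and all numerical values are our assumptions (the prospect-theory parameters are the standard Tversky-Kahneman 1992 estimates), not the paper's model.

    # Prospect-theory value function (Tversky-Kahneman 1992 form)
    def pt_value(x, alpha=0.88, lam=2.25):
        return x ** alpha if x >= 0 else -lam * (-x) ** alpha

    P0, LOSS, MIT = 0.10, 200.0, 0.06   # hypothetical baseline risk, loss, mitigation rate

    def p_disaster(invest):
        # Investment is assumed to cut the loss probability proportionally
        return P0 * max(0.0, 1.0 - MIT * invest)

    def private_best_response(share, options=(0.0, 5.0, 10.0)):
        # Second mover: choose the investment maximizing prospect-theoretic
        # value of the (no-loss, loss) lottery, given the government's cost share
        def value(inv):
            cost = (1.0 - share) * inv
            p = p_disaster(inv)
            return (1 - p) * pt_value(-cost) + p * pt_value(-cost - LOSS)
        return max(options, key=value)

    def government_choice(shares=(0.0, 0.25, 0.5, 0.75)):
        # First mover: anticipate the private response and minimize
        # subsidy outlay plus expected disaster loss
        def social_cost(s):
            inv = private_best_response(s)
            return s * inv + p_disaster(inv) * LOSS
        return min(shares, key=social_cost)

    # With these toy numbers, a modest positive cost share induces private
    # investment that would not occur without the incentive
    print("equilibrium cost share:", government_choice())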

M2-J.1 Guidotti, TL; Medical Advisory Services; [email protected] Health, Risk, and Sustainability: A Taxonomy of Relationships Sustainability can be envisioned as the mirror image of conventional risk management, in that it seeks to construct conditions for high-probability outcomes with low risk of adverse consequences. The present work seeks to develop a taxonomy or typology of relationships between health and issues of sustainability, identifying fundamental principles for each type. These types should be understood as different ways of looking at problems, not mutually exclusive categories:
• “Catastrophic failure”: characterized by existential risk to viability on a large scale because effects are beyond the capacity to adapt or mitigate, and therefore by an “all or nothing” tipping point. (Global climate change.)
• “Pollution issues”: characterized by exposure-response relationships for severity and/or frequency of response; these follow principles of toxicology or pathogenicity as applied to populations. (Described by the toxicological and/or epidemiological exposure-response relationship.)
• “Ecosystem change and stochastic disease risk”: a class of sustainability-health interactions in which ecosystems are destabilized, with indirect consequences mediated by biological mechanisms other than direct toxicity and expressed as increased frequency of disease. (Infectious disease risk and the conventional “public health triad”.)
• “Ecosystem change and mediated health risk”: characterized by entropy and loss of access to resources or economic opportunity, mediated by social or economic factors. (Health effects of unemployment after depletion of fishing stocks.)
• “Degradation of environmental services”: cessation or diminution of natural functions of economic value that affect health. (Land-use decisions that affect airsheds and water reuse.)
• “Urban ecosystem management problems”: characterized by inadequate management of artificial ecological systems in human communities. (Wastewater disposal.)

M3-C.3 Guikema, SD; Johns Hopkins University; [email protected] Is Risk Analysis Predictive? Prediction, Validation, and the Purpose(s) of Risk Analysis Risk analysis methods make statements, often probabilistic, about future states of the world. There are many possible purposes behind conducting a risk analysis, including supporting risk management decision making, determining if a system or situation is safe enough, and meeting regulatory or other policy requirements. But is a risk analysis meant to be predictive? Does a risk analysis make predictive statements about future states of the world? If so, why and how? If not, why not? This talk will explore this question and discuss prediction and validation in the context of risk analysis done for different purposes.

P.63 Guo, M*; Buchanan, RL; Dubey, JP; Hill, D; Gamble, HR; Jones, J; Pradhan, AK; University of Maryland; [email protected] Risk factors identification for Toxoplasma gondii infection in meat products destined for human consumption Toxoplasma gondii is a parasite that is responsible for approximately 24% of all estimated deaths attributed to foodborne pathogens in the U.S. each year. The main transmission route for human infection is through consumption of raw or undercooked meat products that contain T. gondii tissue cysts. Risk assessment studies related to meat-borne T. gondii infection have so far been very limited. The objective of this study was to compare risk among different meat products, identify risk factors, and summarize risk assessment studies for human T. gondii infection through consumption of meat products, both conventional and organic, in the past twenty years. Relevant studies in the literature were searched in the PubMed and Google Scholar databases by the key word ‘Toxoplasma gondii’ in combination with ‘pig’, ‘pork’, ‘sheep’, ‘lamb’, ‘chicken’, ‘cattle’, ‘meat’, ‘organic meat’, ‘risk’, and ‘risk assessment’. This structured review focused on studies of T. gondii infection through the meat-consumption route. Risk factors identified on farm include outdoor access, farm type, feeding, presence of cats, rodent control, bird control, farm management, carcass handling, and water quality. Seroprevalence of T. gondii is greater in conventional pigs and sheep compared to cattle and poultry. Seroprevalence of T. gondii is greater in organic compared to conventional meat products, indicating higher risk of T. gondii infection from organic meats. To better understand the risk of toxoplasmosis in humans from meat consumption in the U.S., a quantitative microbial risk assessment of meat-borne toxoplasmosis based on data and information relevant to the U.S. is critically needed. This study would serve as a useful resource and information repository for informing quantitative risk assessment studies for T. gondii infection in humans through meat consumption.

P.25 Haber, LT*; Dourson, ML; Mohapatra, A; TERA; [email protected] Development of Chemical-Specific Adjustment Factors for Long-Lived Chemicals: PFOS as a Model Chemical Guidance for the development of chemical-specific adjustment factors (CSAFs) has been available for a number of years, and has been applied in assessments of several chemicals, such as boron and 2-butoxyethanol. Typical dose metrics considered for interspecies extrapolation include the area under the concentration times time curve (AUC) or maximal concentration (Cmax). The IPCS (2005) guidance provides some lines of evidence to aid in choosing the dose metric, but notes that “a reasonable assumption is that effects resulting from subchronic or chronic exposure would normally be related to the AUC, especially for chemicals with long half-lives, whereas acute toxicity could be related to either the AUC or the Cmax.” Despite this guidance, CSAFs have been derived primarily for chemicals with short half-lives. A challenge with using AUC for interspecies extrapolation is how to take into account the duration of exposure, particularly if steady state has not been reached. We investigated the development of a CSAF for long-lived chemicals, using perfluorooctanesulfonate (PFOS) as an example. We evaluated the data supporting AUC vs. Cmax as the appropriate dose metric, with particular attention to the relationship between various dose metrics and measures of toxicity in post-exposure recovery groups. We also considered the implications of non-steady state kinetics, as well as relating the exposure duration to the critical effect.
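
To make the duration-of-exposure question concrete: under a one-compartment model with first-order elimination, AUC keeps accumulating toward steady state while Cmax plateaus early. The Python sketch below is illustrative only; the half-life, dose, and volume are hypothetical placeholders, not PFOS values.

import numpy as np

half_life_days = 100.0                   # hypothetical long half-life
k_elim = np.log(2) / half_life_days      # first-order elimination rate (1/day)
dose, volume = 1.0, 1.0                  # daily bolus dose and volume (arbitrary units)
dt = 0.1                                 # time step (days)
steps_per_day = round(1 / dt)
t = np.arange(0, 365, dt)

amount, conc = 0.0, np.zeros_like(t)
for i in range(t.size):
    if i % steps_per_day == 0:           # one bolus dose per simulated day
        amount += dose
    amount *= np.exp(-k_elim * dt)       # first-order loss over the time step
    conc[i] = amount / volume

auc = (conc * dt).sum()                  # rectangle-rule AUC over the year
cmax = conc.max()
css_avg = dose / (k_elim * volume)       # average steady-state concentration
print(f"AUC = {auc:.0f}, Cmax = {cmax:.2f}, C(end)/Css = {conc[-1]/css_avg:.0%}")

Because the concentration is still climbing toward steady state after a year (roughly 90% of Css with these assumed values), the computed AUC depends strongly on where the exposure window ends while Cmax changes slowly, which is exactly the challenge the abstract describes.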

T3-F.4 Hakkinen, PJ; National Library of Medicine, National Institutes of Health; [email protected] Updating On-line Resources: New Tools and Approaches Being Used by NIH to Make Information More Accessible Human health risk assessors, risk managers, consumers, and others need to know where to look for various types of trustworthy and accurate information. Much of the online information is free and accessible around-the-clock globally. The (U.S.) National Library of Medicine (NLM) of the National Institutes of Health (NIH) compiles, reviews, and disseminates various types of human health risk assessment and management information as databases, special topics Web pages, downloadable computer resources, Smartphone apps, and Web pages optimized for mobile devices. NLM examples include numerous “Enviro-Health Links” Web pages developed to provide easy access to online content from Federal agencies and other sources on topics such as nanotechnology and indoor air. Another recent NLM example is the enhancement of the Hazardous Substances Data Bank (HSDB®) to include new materials (e.g., nanomaterials), state-of-the-science toxicology, exposure, and risk assessment information, and HSDB’s first sets of images (e.g., chemical structures). Efforts are ongoing to examine databases developed by other organizations that could enhance the types of content in NLM’s TOXNET® suite of databases. A more recent example is the addition in 2011 of the Comparative Toxicogenomics Database (CTD), which is funded by the National Institute of Environmental Health Sciences (NIEHS). This presentation will also mention key examples of efforts by other governmental and non-governmental organizations to develop and provide free online access to risk assessment and management information.

W3-I.4 Hallegatte, S; Senior Economist, The World Bank, Sustainable Development Network, Office of the Chief Economist and World Development; [email protected] Beyond the ideal – Obstacles to risk management and ways to overcome them Countless risk management measures have been shown to yield large benefits and to be highly cost-effective – yet individuals and societies often struggle to implement these measures and manage their risks effectively. The specific reasons for this vary from case to case, but are always related to the obstacles and constraints facing individuals and societies: for instance, the lack of resources and information, cognitive and behavioral failures, missing markets and public goods, and social and economic externalities. This chapter analyzes and discusses these obstacles to risk management, illustrates their relevance with examples from various sectors, and presents possible ways to overcome them. Overall, it is argued that the identification of risks is not enough for effective risk management: the obstacles to risk management must also be identified, prioritized, and addressed through private and public action. In doing so, decision makers need to adopt a holistic approach to risk management, one that coordinates across levels (from the individual to the government) and manages risks in an integrated manner.

P.64 Hallman, WK*; Cuite, CL; McWilliams, RM; Senger-Mersich, A; Rutgers, The State University of New Jersey; [email protected] Scald and food safety risks posed by unsafe water, refrigerator, and freezer temperatures in residences of Meals On Wheels recipients in 4 US states In the US, an increasing number of older adults live alone. Taking on-site measurements in the residences of 800 Meals On Wheels recipients in 4 US states (AR, IA, NJ and SC), this study examined the potential scald risks to older individuals posed by unsafe water temperatures and the food safety risks posed by unsafe refrigerator and freezer temperatures. Most water heater manufacturers have voluntarily adopted a 120°F standard for domestic hot water, as recommended by the US Consumer Product Safety Commission. However, the thinner skin of older adults burns more quickly than that of younger people, so older adults are at increased risk of scalding and burns even at 120°F. This study adopted a nominal acceptable water temperature range of 114.8 to 118.4°F, since studies have shown that this range ensures thorough removal of grease films (which may promote bacterial growth) yet reduces the risk of scalding. Only 27% of homes had hot water temperatures within this range. More than half (52%) were >120°F and 11% were >130°F (exposure to which can result in second-degree burns within seconds). The highest temperature recorded was 184.5°F. Since older adults have a worse prognosis than younger patients after scald burns, the potential health consequences are serious for a large percentage of this population. The USDA recommends a freezer temperature <0°F and a refrigerator temperature <40°F to minimize microbial growth. However, 71.6% of the homes surveyed had freezer temperatures >0°F, with homes in AR and SC at statistically significantly higher risk of out-of-range freezers. In addition, 26.3% had refrigerator temperatures >40°F, with homes in AR at statistically significantly higher risk of having an out-of-range refrigerator. The results suggest that the risks of scalding and microbial exposure are high for a large number of older individuals and highlight the need for surveillance and the development of prevention strategies to ensure safer water, refrigerator and freezer temperatures in the homes of these individuals.

M2-I.3 Hamilton, MC*; Lambert, JH; University of Virginia; [email protected] An iterative value of information approach using scenario-based preferences for risk analysis of infrastructure systems Risk analysis is too often applied ad hoc to factors that later turn out to be of little importance to the investment problem at hand. The selection of sources of risk for future analyses ought to be conditioned on knowledge that the risks have a significant influence on the priorities of stakeholders. This paper describes the integration of three parts of analysis to ensure that risk analysis is focused on significant factors in risk management of large-scale systems: (i) scenario-based preferences analysis combines factors into scenarios that influence priority-setting; (ii) systems analysis with multiple criteria ensures that the expertise and concerns of diverse stakeholders are represented; (iii) value of information analysis supports dialogue and negotiation in adaptive iterations. Together these steps focus risk and uncertainty analysis on the factors and stakeholder concerns with significant influence on decision-making. This research is unique in combining scenario-based preferences with a value of information philosophy to guide iterative steps of the decision analysis. The combination of factors provides a basis on which to update the investment alternatives, evaluation criteria, and future scenarios in subsequent analyses. The result aids agencies and industry in achieving a more holistic, risk-informed understanding of strategic planning problems. The integration of the three elements is demonstrated on a case study of energy systems of military and industrial installations considering investments in renewable energy, microgrids, and natural gas microturbines, among others. We provide a quantitative demonstration that addresses the cost-risk-benefit of several alternatives in multiple iterations of scenario analysis. Scenarios include combinations of future and emergent factors spanning technology, climate, economy, regulation, socio-economics, and others.
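
The value-of-information element can be illustrated with a minimal expected-value-of-perfect-information (EVPI) calculation in Python; the payoff matrix and scenario probabilities below are invented for illustration and are not the case-study values.

import numpy as np

# Rows: investment alternatives (e.g., renewables, microgrid, microturbines);
# columns: future scenarios. Entries are hypothetical net benefits.
payoff = np.array([[10.0, 4.0, 2.0],
                   [ 6.0, 6.0, 5.0],
                   [ 3.0, 5.0, 9.0]])
p = np.array([1/3, 1/3, 1/3])                    # assumed scenario probabilities

best_without_info = (payoff @ p).max()           # commit before uncertainty resolves
best_with_info = (payoff.max(axis=0) * p).sum()  # choose per scenario
print(f"EVPI = {best_with_info - best_without_info:.2f}")

A large EVPI flags factors worth carrying into the next iteration of analysis; a small EVPI suggests the decision is insensitive to resolving that scenario uncertainty.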

W3-D.3 Hamilton, KH*; Haas, CN; Drexel University; [email protected] Prioritization of roof-harvested rainwater pathogens to guide treatment and use Current efforts to reduce demands and detrimental impacts on the world’s water resources have led to the reevaluation of the practice of rainwater harvesting. However, limited information is available on actual end uses of roof-harvested rainwater (RHRW) and their frequencies in developed countries. Given the variable nature of rainwater quality due to catchment surface material, meteorological parameters, etc., it is challenging to designate appropriate uses. Despite these limitations, it is useful to identify high-priority human pathogens for guiding rainwater treatment and use. The ultimate goal is to encourage sustainable water use while not promoting a significantly increased exposure to disease-causing pathogens. Epidemiologic studies indicate that rainwater (re)used for drinking or domestic purposes has been associated with disease risks and several outbreaks. In light of these potential risks, this study (1) summarizes RHRW pathogen occurrence from eight North American/European studies identified through a systematic literature review; (2) develops risk model input distributions using maximum likelihood estimation; (3) uses a probabilistic risk model and sensitivity analysis to prioritize pathogens for further study; and (4) evaluates the potential for human health risk from RHRW by comparing the volumes necessary to incur an “unacceptable” risk. Given the limited concentrations reported from North America/Europe, exposure to even low RHRW volumes (median 10E-4 to 10E+2 L; 5th percentile 10E-6 to 10E-2 L) could plausibly exceed the current USEPA infection risk target for recreational water (1E-4). The priority pathogens identified (median volumes for exceeding the risk target are <1 L) are C. parvum, L. pneumophila, E. coli, and G. duodenalis. Uncertainty in the concentration distributions dominates dose-response uncertainty, highlighting the importance of obtaining additional occurrence information for priority pathogens in the United States.
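
The volume-to-"unacceptable"-risk comparison can be sketched with an exponential dose-response model, P(infection) = 1 - exp(-r*C*V); the parameter values below are hypothetical placeholders, not the study's fitted distributions.

import numpy as np

r = 0.02            # exponential dose-response parameter (assumed)
conc = 5.0          # pathogen concentration in RHRW, organisms/L (assumed)
target = 1e-4       # USEPA recreational-water infection risk target

# Critical volume: solve 1 - exp(-r*conc*V) = target for V
v_star = -np.log(1.0 - target) / (r * conc)
print(f"Volume reaching 1e-4 risk: {v_star:.2e} L")

# Propagating an assumed lognormal concentration distribution:
rng = np.random.default_rng(0)
samples = rng.lognormal(mean=np.log(conc), sigma=1.5, size=100_000)
v_med = np.median(-np.log(1.0 - target) / (r * samples))
print(f"Median critical volume under concentration uncertainty: {v_med:.2e} L")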

M4-J.5 Harper, S; Ruder, E*; Roman, HA; Geggel, A; Nweke, O; Payne-Sturges, D; Levy, JI; Industrial Economics, Incorporated; [email protected] Using inequality measures to incorporate environmental justice into regulatory analyses at the U.S. Environmental Protection Agency Environmental justice concerns are theoretically incorporated into all actions undertaken by the US Environmental Protection Agency (EPA). However, formally evaluating how specific policy measures influence environmental justice is challenging, especially in the context of regulatory analyses in which quantitative comparisons are the norm. We conducted a literature review to determine whether health inequality measures developed in other settings can be applied in a manner consistent with EPA concepts of environmental justice and the structure and data requirements of regulatory analyses. We concluded that an outcome-based assessment of environmental inequality, specifically considering minority and low-income populations but not restricted to between-group comparisons, would be consistent with EPA definitions and concepts. Appropriate application of these indicators would require thorough characterization of the baseline distribution of exposures or risks; exposure models stratified by both location and demographics; and dose-response models that account for vulnerability attributes that may be demographically patterned. The preferred indicators would incorporate both between-group comparisons and within-group inequality. Choosing among candidate indicators requires decisions regarding the appropriate reference point for comparisons, whether the indicators should reflect relative or absolute inequality, whether social groups of interest have inherent ordering, and whether an explicit inequality aversion parameter is preferred to make transparent any value judgments important to decision making. Overall, we found that quantitative measures of exposure or health risk inequality are theoretically justified and can provide valuable insight for regulatory analyses at EPA, provided that the input data are appropriately constructed and the indicators are selected according to explicit decisions by EPA regarding the dimensions of environmental justice of greatest interest.
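
One family of indicators meeting the between-group/within-group criterion is the generalized entropy class; the Theil index, for instance, decomposes exactly into those two components. The sketch below uses invented exposure values for two demographic groups, purely to show the arithmetic.

import numpy as np

def theil(x):
    x = np.asarray(x, dtype=float)
    mu = x.mean()
    return np.mean((x / mu) * np.log(x / mu))

groups = {"group_A": np.array([1.0, 1.2, 0.9, 1.1]),   # hypothetical exposures
          "group_B": np.array([2.0, 2.5, 1.8, 2.2])}
all_x = np.concatenate(list(groups.values()))
mu, n = all_x.mean(), all_x.size

# Share-style weights s_g = (n_g/n)*(mu_g/mu) give an exact decomposition:
between = sum((g.size / n) * (g.mean() / mu) * np.log(g.mean() / mu)
              for g in groups.values())
within = sum((g.size / n) * (g.mean() / mu) * theil(g) for g in groups.values())
print(f"T_total = {theil(all_x):.4f} = between {between:.4f} + within {within:.4f}")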

M2-C.3 Hartford, W*; Hartford, D; Hartfit Division of Nutritional Health Education; [email protected] Involuntary Personal, Individual and Societal Risk in relation to Risk Control Policies The idea of involuntary personal risk is introduced to explore the role of the choices that individuals at risk make in determining the outcome of undesirable events that follow from policies aimed at controlling individual and societal risk levels. Such choices are typically made under conditions of uncertainty, rendering them suitable for examination in terms of the principles of risk analysis and the methods of probabilistic decision theories. To examine the problem from a general decision-making-under-uncertainty perspective, a broad spectrum of circumstances is considered in which the outcome of an involuntary risk condition depends on both historical and real-time choices of the individuals at risk, whose actions during the risk event influence the outcome to some degree. The investigation is bounded by two situations: those where the threat is known, response plans have been established, and the attributes of the risk can be modeled scientifically; and those where the threat arises from harmful combinations of individually un-harmful things that are recognizable only to individuals who are sufficiently well informed. For the former bound, the knowledge of what to do is institutionalized in the response plan, and this knowledge can be imparted to the individuals at risk in a conventional way such that they know generally what to do, and the uncertainty around the choices that they make is somewhat limited. For the latter bound, the knowledge of what to do is either tacit knowledge held by individuals or learned knowledge about the cause-effect relations between the various considerations and entities that create the risk. Using the Life Safety Model simulation process, the paper identifies a framework whereby the models for the former bound can be adapted to address the more complex latter bound in terms of a generalized Bayesian risk modeling environment.

W2-K.1 Hassel, H; Johansson, J*; Lund University; [email protected] Mapping societal functions, flows and dependencies to strengthen community resilience – results from an initial study Communities are becoming increasingly dependent on critical societal functions and flows, such as energy supply, transportation of goods and people, schools and health care. Dependencies between these functions and flows are also increasing. These trends imply that disruptions in the functions and flows may quickly lead to large societal consequences through domino effects that may be very difficult to understand and foresee. Understanding the functions, flows and their dependencies is key to increasing the resilience of communities. Gaining such an understanding requires systematic mapping of relevant information and methods for analysis. This paper presents a new method for mapping societal functions, flows, and their dependencies. Such a method is very useful as part of a comprehensive community-level risk and vulnerability assessment. The mapping is conducted by eliciting information from actors, both public and private, that represent critical societal functions. The mapping method is divided into three main parts: 1) system-level mapping, including e.g. mapping values/objectives for the system and which societal functions contribute to their achievement; 2) function-specific mapping, including e.g. mapping values/objectives for a specific function, what activities must be performed to achieve them, what flows depend on those activities, and what the actor depends on to be able to perform the activities; 3) system-level aggregation, i.e. aggregating and synthesizing the information from step 2 into a holistic picture of a community or a region. The mapping method has been applied in an initial study of public and private actors in a specific geographic region (two different municipalities) in Sweden. From the initial results a complex picture of functions, flows and dependencies emerges, which stresses the importance of these types of methods to guide public and private policy makers in governing risk and vulnerability.

M4-H.5 Hawkins, NL*; Elkins, DA; Janca, A; Simons, J; Montezemolo, M; Piper, J; Lesely, T; Cox, P; Susel, I; Brzymialkiewicz, C; Department of Homeland Security; [email protected] DHS’ Risk-Informed Quadrennial Homeland Security Review (QHSR) The Department of Homeland Security is executing its Second Quadrennial Homeland Security Review in 2013, as required by law. As part of the initial Preparatory phase of the review, top risks and risk insights were identified to inform leadership discussion and guide QHSR studies and analysis. Risk analysis is also required in the Study and Analysis phase of the review, wherein the study teams must conduct risk characterizations to achieve a baseline understanding of the risks within the study mission space, and then analyze the costs and benefits of alternative strategic approaches. Study groups will follow the DHS Strategy Development and Analysis Process to arrive at conclusions. We will discuss some of the lessons learned, challenges, and innovative approaches developed to ensure the continued advancement of DHS’s analysis to inform strategic decisions.

W4-E.1 Hearl, FJ; National Institute for Occupational Safety and Health; [email protected] Pandemic Response for Workers: Controlling Occupational Risk Protecting the health of the workforce during a pandemic is important to ensure the health of the nation. Protecting the workforce involves selecting appropriate control measures and begins with risk analysis. Control banding, used by industrial hygienists typically for control of chemical substances, presents a structured framework to guide control selection using available data, observations, and assumptions based on past experience and decision logic. The process described in this paper is an adaptation of the control banding model applied to infectious disease. It is a tool to be used by risk assessors and occupational health professionals to guide policymakers and employers in the selection of control options and responses over a wide range of disease entities and job or task settings. The tool is not designed to be applied mechanically without separately giving full consideration to the peculiarities of individual situations and other mitigating factors. These approaches should be generalizable to other novel pathogens if sufficient information is available on transmissibility and virulence.

W4-E.4 Hearl, F; Boelter, F; Armstrong, T; Rasmuson, J*; Meier, A; CHEMISTRY & INDUSTRIAL HYGIENE, INC.; [email protected] Risk of Occupational Asbestos Disease Based on Biomarkers The reliability of industrial hygiene estimation of cumulative asbestos exposure to quantitatively predict risk of asbestos-related mesothelioma and lung cancer, especially at low exposure levels, is evaluated. This is accomplished by examining the linearity and precision of industrial hygiene cumulative asbestos exposure estimates via regression and ANCOVA correlation analysis with pathological lung tissue asbestos fiber burden and asbestos body analysis, including the evaluation of the effect of fiber type. The results are reviewed in terms of the most commonly applied quantitative asbestos risk assessment models for mesothelioma and lung cancer, with an emphasis on mesothelioma. The use of asbestos exposure biomarkers, in general, to qualitatively characterize past asbestos exposure and risk of disease is also reviewed, with special emphasis placed on asbestos fiber type. Considerations in the use of biomarkers, in general, to verify and validate cumulative asbestos exposure estimates are offered.

P.59 Henning, CC*; Overton, AJ; Marin, K; Cleland, JC; Turley, AT; ICF International; [email protected] DRAGON: A Single Risk Assessment Database to Promote Transparency and Data Sharing With the availability of large volumes of data for risk assessment and a greater emphasis on consistency and transparency in federal agencies, data management and data sharing are of keen interest to risk assessors. The DRAGON tool is a database that stores risk assessment data and allows nimble management of the overall assessment process. Within DRAGON, risk assessors can implement systematic review of the literature, manage the assessment of the quality of key studies and store related decisions, manage the data entry process, perform dose-response modeling, and rapidly generate reports in a variety of formats. The database itself has a unified structure that allows data sharing across agencies and risk assessors with similar interests in a given chemical. Data-entry forms, reports, and assessment decision logic can nevertheless be tailored for each agency to meet different internal priorities and needs. Included in the database is an evolving standard vocabulary of health outcomes that can be crosswalked to any other vocabulary if needed. The vocabulary is based on a system-based classification for each endpoint. Specific endpoints can also be mapped to custom categories for each assessment as desired. DRAGON also provides a framework for coordinating the work of multiple people working on assessments of chemicals with large databases, improving consistency and facilitating quality assurance procedures.

M3-F.2 Henry, AD; Dietz, T*; University of Arizona; [email protected] Co-Evolution of Beliefs and Networks in Environmental Risk Policy: An Advocacy Coalition Framework Approach Effectively managing issues of environmental risk requires collaboration within policy networks. Within the environmental policy process, networks of information sharing, resource exchange, and other forms of interaction allow organizations to synthesize information and work towards shared goals. Ultimately, this allows policy actors to collectively learn how to deal with complex, uncertain, and emerging risks. Despite the importance of policy networks, however, the forces that shape these structures — and possible interventions to promote more effective networks — are not well understood. According to the Advocacy Coalition Framework, the dynamics of policy network formation lead to structures exhibiting belief-oriented segregation—that is, a high correspondence between shared policy beliefs and voluntary collaborative relationships. These structures may be produced through at least two pathways: belief homophily, where actors actively seek out connections with others sharing their belief system, and organizational learning, where policy beliefs diffuse through collaborative ties between organizations involved in risk policy. The cross-sectional design of many policy network studies precludes an explicit examination of these potentially complementary forces. This paper explicitly examines these dynamics using a reanalysis of data on policy beliefs and networking in U.S. environmental risk policy across two time periods, 1984 and 2000 (N = 223). Results indicate strong homophily effects, but relatively weak learning effects, in the evolution of this policy network. This research helps pave the way for additional research on the dynamics that shape policy networks and beliefs, and also helps to clarify the differences between individual and organizational contributions to policy network evolution.

M4-B.6 Henry, SH; Aungst, J*; Castoldi, AF; Rhomberg, L; Butterworth, J; Retired Food and Drug Administration, Food and Drug Admin., European Food Safety Authority, Gradient Corp., Science journalist/investigative reporter; [email protected] Panel Discussion for A new look at the toxicity of bisphenol A and public health policy This panel discussion, moderated by Sara Henry and Jason Aungst, will allow presenters to interact with each other and then with the audience on the symposium topic of the toxicity of bisphenol A and public health policy.

M4-B.5 Henry, SH; Aungst, J; Castoldi, AF; Rhomberg, L; Butterworth, T; Fitzpatrick, J*; Retired Food and Drug Admin., Food and Drug Admin., European Food Safety Authority, Gradient Corp., Science journalist/investigative reporter; [email protected] A new look at the toxicity of bisphenol A and public health policy making A question and answer session with the presenters of this symposium and the audience will follow the panel discussion. The session will be moderated by Sara Henry and Julie Fitzpatrick.

T1-A.4 Herrera, DA; Toulouse School of Economics; [email protected] To fortify or not: a structural analysis of the public advisory policy on folic acid in France This paper analyzes consumers’ response to the French advisory policy on folic acid. The advisory was issued in 2005 to warn pregnant women, and women who plan to become pregnant, about the perils of a poor diet and the adverse health consequences for their offspring. The advisory specifically mentions the neural tube defects (NTDs) that would result from a deficiency of folic acid. We are interested in how consumers responded to the advisory policy. We investigate consumers’ responses in terms of their demand for fortified ready-to-eat breakfast cereals using structural difference-in-differences (DiD) methods. Even though the prevalence of NTDs did not change after the introduction of the advisory policy, we find evidence of a positive effect on at-risk consumers’ demand for fortified cereals. Fortification is, however, controversial because it may have potential side effects, including cancer and neurotoxic effects in individuals aged 50 or older. We therefore provide a welfare assessment of a fortification policy in France. We find that the benefits of fortification outweigh the costs by a large amount.
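
The DiD logic (though not the paper's structural demand model) can be sketched on simulated purchase data; the group labels, effect sizes, and noise below are invented for illustration.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 2000
at_risk = rng.integers(0, 2, n)      # 1 = at-risk consumers (treated group)
post = rng.integers(0, 2, n)         # 1 = observation after the 2005 advisory
# Simulated demand for fortified cereal with a true interaction effect of 0.3:
y = 1.0 + 0.2*at_risk + 0.1*post + 0.3*at_risk*post + rng.normal(0, 1, n)

X = sm.add_constant(np.column_stack([at_risk, post, at_risk*post]))
fit = sm.OLS(y, X).fit()
print(fit.params)   # the last coefficient is the DiD estimate (~0.3)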

T4-B.5 Hertzberg, RC*; Burkhardt, EA; MacDonell, MM; Emory University; [email protected] Special Considerations for Risk Characterization in a Cumulative Risk Assessment The final step in a cumulative risk assessment (CRA) is the characterization of the risks. This step synthesizes the information gathered in evaluating exposures to stressors together with dose-response relationships, inherent characteristics of those being exposed, and external environmental conditions. A cumulative risk characterization differs from a classical chemical risk characterization in five important ways: the focus is on a population instead of a pollution source; exposures include chemicals along with environmental conditions and nonchemical stressors; the stressors and health effects are multidimensional; the complexity is partly handled by simplifying procedures and assumptions; and the uncertainty analysis is critical. CRAs also emphasize the involvement of not just risk scientists but also stakeholders and risk managers, especially in the planning, scoping and problem formulation step where assessment goals are decided. The characterization also includes an evaluation of how successful the assessment was in achieving those goals. This presentation illustrates these differences with case study examples, and shows how uncertainty analysis can be applied to indices and maps, two of the more common ways of communicating cumulative risks.

P.58 Heyl, ME*; Moyano, E; Cornejo, F; Cifuentes, LA; Faculty of Engineering, Pontifical Catholic University of Chile; [email protected] Environmental attitudes and behaviours of university students: a case study at a Chilean university Encouraging the adoption of pro-environmental behaviour is critical to reducing environmental impacts and moving toward a more sustainable future. Higher education plays an important role in educating and forming professionals who will in turn play an important role in protecting the environment through their decisions and behaviours in their personal and professional lives. The aim of this study is to identify whether there are significant differences between university students depending on their diploma of specialisation (related to the environment or not), the year in which they are studying, and gender, and to investigate which factors (perceived effort, positive environmental attitudes, or perceived positive consequences) significantly influence the frequency of students’ pro-environmental behaviours. The sample consisted of 383 students in their first, third and sixth years, for whom two instruments were designed to measure environmental attitudes and behaviours. Significant differences were noted between those studying environment-related diplomas and those who were not, as opposed to variations across the different stages of the course. However, although students hold positive environmental attitudes, these are not reflected in the performance of environmental behaviour. Regression analysis showed that all three factors significantly influence the frequency of pro-environmental behaviour, with perceived effort (a negative influence) the most influential variable.

T1-D.2 Hill, AA*; Kosmider, RD; Dewe, T; Kelly, L; De Nardi, M; Havelaar, A; Von Dobscheutz, S; Stevens, K; Staerk, K; Animal Health and Veterinary Laboratories Agency, Royal Veterinary College, Istituto Zooprofilattico Sperimentale delle Venezie; [email protected] Modelling the species jump: spatially ranking influenza A virus ability to cross species barriers and infect humans One of the most notorious groups of zoonotic pathogens is the influenza A viruses. Historically, avian influenza viruses have been of primary concern as they were responsible for the pandemics of 1918 (“Spanish flu”), 1968 (“Hong Kong flu”) and 2009 (“Swine flu”). The latter outbreak has challenged the ethos of influenza A pandemic preparedness and highlights the importance of taking a broader approach to the problem. The European Food Safety Authority (EFSA) has commissioned an Article 36 research project to develop a more formal approach to the identification of animal influenza A strains that have the potential to make the species jump into humans. We propose a prototype risk assessment framework to spatially rank influenza A viruses circulating in animal populations for their potential to jump the species barrier to humans. We use a modified version of the classical epidemiological risk equation to estimate the risk of at least one human infection (given an infected livestock population) within a 5 km2 area. The output will be a list of ranked animal viruses that could have the potential to infect humans in certain areas of the globe and, hence, may have pandemic potential. The intention is for the model to eventually be used regularly to prioritise research/surveillance for animal influenzas in higher-risk areas of the globe. We will present initial spatial ranking results of the framework model for historic strains (validation) and hypothetical strains to illustrate the important issues and drivers of the model.
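
A minimal version of such a risk equation for a single grid cell, with all parameter values hypothetical rather than the project's calibrated inputs, might look like:

# Risk of at least one human infection in one 5 km^2 cell, given an
# infected livestock population (all values are illustrative assumptions).
humans_exposed = 200          # people in contact with infected livestock
contacts_per_person = 10      # exposure events per person
p_transmit = 1e-4             # P(human infection per exposure event) for a strain

n_events = humans_exposed * contacts_per_person
p_at_least_one = 1.0 - (1.0 - p_transmit) ** n_events
print(f"P(>=1 human infection in cell) = {p_at_least_one:.3f}")
# Ranking: repeat per strain and per cell, then sort strains by cell-level risk.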

P.27 Ho, WC*; Lin, MH; Caffrey, JL; Lin, YS; Fan, KC; Wu, TT; Chen, PC; Wu, TN; Sung, FC; Lin, RS; China Medical University, 15F, No. 91 Hsueh-Shih Road, Taichung City, Taiwan; [email protected] Birth weight, household smoking and the risk of wheezing in adolescents: a retrospective cohort study OBJECTIVE: Low birth weight (LBW) and environmental tobacco smoke (ETS) exposure are each associated with respiratory difficulties (wheezing) in children. This study was designed to examine the combined association of LBW and ETS with wheezing. METHODS: A retrospective birth cohort linked with a national survey of allergic disorders among 1,018,031 junior high school students in Taiwan (1995–96) was analyzed. The reported incidence of wheezing (yes or no) and ETS exposure (4 categories: 0, 1–20, 21–40 and ≥41 household cigarettes per day) were obtained from validated questionnaires. Logistic regression models were used to assess the associations of interest. RESULTS: Within the smoke-free cohort, LBW was associated with higher odds ratios (ORs) of reporting ever wheezing (1.08, 95% confidence interval 1.01 to 1.16), current wheezing (1.09, 95% confidence interval 1.00 to 1.20) and wheezing with exercise (1.11, 95% confidence interval 1.02 to 1.21). Higher ETS exposure correlated with a higher risk of wheezing (ever, current and with exercise). With ETS exposure, adolescents from the lowest birth weight cohorts were more likely to report wheezing (ever, current and with exercise). CONCLUSIONS: ETS and LBW each constitute a public health risk for respiratory symptoms in children, and LBW may amplify the risk among those exposed to ETS. LBW, ETS and associated respiratory impairments deserve special attention as part of a comprehensive environmental health risk assessment directed toward prevention and intervention.
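
The OR estimation step can be reproduced in miniature with a logistic regression on simulated data; the effect sizes below are invented and do not reproduce the study's estimates.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 10_000
lbw = rng.integers(0, 2, n)            # low birth weight indicator
ets = rng.integers(0, 4, n)            # ETS exposure category (0-3)
logit_p = -2.0 + 0.08*lbw + 0.15*ets   # assumed true log-odds model
y = (rng.random(n) < 1.0/(1.0 + np.exp(-logit_p))).astype(int)

X = sm.add_constant(np.column_stack([lbw, ets]))
fit = sm.Logit(y, X).fit(disp=False)
or_lbw = np.exp(fit.params[1])
ci = np.exp(fit.conf_int()[1])         # 95% CI on the odds-ratio scale
print(f"OR(LBW) = {or_lbw:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f})")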

M4-D.5 Hoelzer, K*; Pouillot, R; Food and Drug Administration, Center for Food Safety and Applied Nutrition; [email protected] Listeria monocytogenes and produce – a previously discounted public health risk Traditionally, fresh produce has been regarded as an unlikely vehicle for listeriosis, but large produce-related outbreaks have led to a paradigm shift in recent years. Notably, in 2003, the U.S. Department of Health and Human Services and the U.S. Department of Agriculture published a quantitative assessment of the relative risk to public health from foodborne Listeria monocytogenes among selected categories of ready-to-eat foods. This risk assessment identified fresh fruits and vegetables as posing an overall low to moderate risk of listeriosis, and highlighted important data gaps. In response to these findings, the U.S. Food and Drug Administration commissioned a risk profile to gain a more comprehensive understanding of the available data on fresh fruits and vegetables as a vehicle for L. monocytogenes, and to evaluate the effectiveness of current and potential interventions. In this presentation, the key findings of the risk profile will be summarized, followed by a discussion of lessons learned about the predictive ability of risk assessments and the inherent challenges of predicting thus-far unrecognized public health risks.

M2-A.3 Hoffmann, SA*; Hald, T; Cooke, R; Aspinall, W; Havelaar, A; USDA Economic Research Service, Technical University of Denmark, Resources for the Future, University of Bristol, Utrecht University; [email protected] Adapting Expert Elicitation Methods for a Global Study of Foodborne Disease There is increasing interest in using large expert panels to elicit probability judgments about global health problems, and several such studies are in planning. This paper reports on the methodological advances and results from a global expert elicitation study conducted for the WHO Global Burden of Foodborne Disease Initiative. The study elicited expert judgment on the relative contributions of different exposure pathways to microbiological, parasitic, and chemical hazards that can be foodborne; attribution estimates are elicited for each WHO region. This study provides both an important application of expert elicitation and an opportunity to test new methods that will enhance the feasibility of using expert elicitation in large global studies. Providing source attribution estimates for each WHO region, for multiple hazards and exposure pathways, requires assembling multiple panels of experts whose knowledge captures food production, processing, and marketing conditions, water quality, and infectious disease transmission in all parts of the world. The effort pushes on existing methods in multiple ways. First, cost and logistical concerns impose limitations on conducting elicitations in person, as has been done in the past. Second, the elicitation and calibration must be structured in ways that capture global variation while providing meaningful comparisons around the globe. Third, the need for transparency and reproducibility creates an opportunity to explore the implications of alternative criteria for the selection and recruitment of expert panel members. A major methodological contribution of this study is formal testing of the impact of conducting elicitations by phone with facilitator and computer assistance versus allowing panelists to complete the elicitation on their own after a phone orientation.

W3-D.1 Hoffmann, SA*; Ashton, L; Berck, P; Todd, J; USDA Economic Research Service, University of California, Berkeley; [email protected] Using Time Series Analysis to Investigate Food Causes of Foodborne Illnesses Information about the food sources of foodborne illness provides the foundation for targeting interventions under the new Food Safety Modernization Act. Current foodborne illness source attribution estimates are based on outbreak investigations, yet outbreaks account for less than 5% of total foodborne illnesses in the U.S. Case control studies suggest that attribution estimates from outbreak data do not reflect the role of different foods in causing sporadic foodborne illnesses equally well for all pathogens. FoodNet active surveillance data capture sporadic illness, but historically have not directly linked these illnesses to foods. This is the first study to use time series modeling methods to explore the relationship between food consumption and foodborne illness. The study relies on FoodNet illness data and food purchase data from Nielsen HomeScan. The method uses lag structure, seasonality, and geographic variability as well as exogenous controls to identify and estimate the relationship between illness from Campylobacter and non-O157 STEC and consumption of different foods from 2000 to 2008. HomeScan allows very detailed categorization of foods, including form of packaging. Detailed categories were developed based on known risk profiles, including categories like ground beef, intact beef, ready-to-eat leafy greens, other ready-to-eat produce, pre-packaged deli meats, and other deli meats. We find a significant positive relationship between campylobacteriosis and purchases of berries, other fruit that is eaten without peeling, tomatoes, and ground beef, and a significant positive relationship between salmonellosis and purchases of ground beef and fruit that is not peeled before eating. For both pathogens there is a significant negative relationship with breakfast cereals and fruit that is peeled before eating, and a positive relationship between illness and temperature. Coefficients on regional dummies are highly significant and large.
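
The flavor of such a model can be sketched as a count regression of monthly illness counts on lagged purchases with seasonal controls (one plausible stand-in for the paper's time-series specification, which is not reproduced here); the data below are simulated, not FoodNet or HomeScan data.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
months = 108                                 # 2000-2008, monthly
purchases = rng.gamma(2.0, 1.0, months)      # e.g., a ground-beef purchase index
month_of_year = np.tile(np.arange(12), months // 12)
lag1 = np.roll(purchases, 1); lag1[0] = purchases[0]   # one-month purchase lag

# Simulated truth: purchases raise cases with a one-month lag, plus seasonality
mu = np.exp(0.5 + 0.3*lag1 + 0.2*np.sin(2*np.pi*month_of_year/12))
cases = rng.poisson(mu)

X = sm.add_constant(np.column_stack([lag1,
                                     np.sin(2*np.pi*month_of_year/12),
                                     np.cos(2*np.pi*month_of_year/12)]))
fit = sm.GLM(cases, X, family=sm.families.Poisson()).fit()
print(fit.params)    # the coefficient on lag1 recovers the purchase effect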

T2-K.3 Holman, E*; Francis, R; Gray, G; U.S. Environmental Protection Agency (author 1), George Washington University (authors 1-3); [email protected] Comparing Science Policy Choices in Chemical Risk Assessments Across Organizations Environmental and public health organizations including the World Health Organization (WHO) and the U.S. Environmental Protection Agency (USEPA) develop human health risk values (HHRV) that set ‘safe’ levels of exposure to non-carcinogens. This analysis evaluates specific science policy choices made in the context of setting HHRV and differences in these decisions observed across organizations. These choices include the selection of principal study, critical effect, the point of departure (POD) approach and numerical estimate, and the use of uncertainty factors (UF). By systematically evaluating each choice while recognizing connections among choices, the goal is to elucidate the most common sources of agreement and disagreement across organizations. In setting the UF, organizations typically use default 10X values, reduced values (often 3X), or chemical-specific adjustment factors. A common reason for using a reduced UF with a LOAEL POD is that the observed critical effect is considered minimally adverse. If chronic studies indicate that subchronic PODs are more sensitive, a full 10X UF may not be required for a subchronic principal study. While older assessments often use default values, the use of PBPK modeling and human study data is becoming increasingly common, resulting in reduced UFs to account for interspecies and intraspecies extrapolations. To account for database deficiencies, organizations may invoke a database UF for concerns such as the lack of a specific study type or potential carcinogenicity. This analysis also examines cases where, given the same or similar toxicological data, one or more organizations set an HHRV but other organizations do not. Included in the analysis are HHRV from the following organizations: USEPA, WHO, Health Canada, RIVM (Netherlands), and the U.S. Agency for Toxic Substances and Disease Registry. (The opinions are those of the authors and do not necessarily reflect policies of USEPA or the U.S. government.)
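
The arithmetic consequences of these UF choices are easy to make explicit; the point of departure and factor selections below are hypothetical, with a "3X" factor treated as a half-log unit (10**0.5 ~ 3.16), a common convention.

# Reference value = POD / product of uncertainty factors (illustrative values).
pod = 10.0                                # point of departure, mg/kg-day (assumed)

uf_default = {"interspecies": 10, "intraspecies": 10,
              "subchronic_to_chronic": 10, "database": 10}
uf_refined = {"interspecies": 10**0.5,    # reduced, e.g., via PBPK modeling
              "intraspecies": 10,
              "subchronic_to_chronic": 1, # chronic data judged adequate
              "database": 1}

def reference_value(pod, ufs):
    total = 1.0
    for f in ufs.values():
        total *= f
    return pod / total

print(f"Default UFs: {reference_value(pod, uf_default):.1e} mg/kg-day")
print(f"Refined UFs: {reference_value(pod, uf_refined):.1e} mg/kg-day")

The two results differ by more than two orders of magnitude, which is why UF decisions are a major source of the cross-organization disagreement examined here.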

P.126 Holser, RA; Russell Research Center; [email protected] Microbial contamination in poultry chillers estimated by Monte Carlo simulations The risk of contamination exists in meat processing facilities where bacteria that are normally associated with the animal are transferred to the product. If the product is not stored, handled, or cooked properly the results range from mild food poisoning to potentially life-threatening health conditions. One strategy to manage risk during production is the practice of Hazard Analysis and Critical Control Points (HACCP). In keeping with the principles of HACCP, a key processing step to control bacterial growth occurs at the chiller. The risk of microbial contamination during poultry processing is influenced by the operating characteristics of the chiller. The performance of air chillers and immersion chillers was compared in terms of pre-chill and post-chill contamination using Monte Carlo simulations. Three parameters were used to model the cross-contamination that occurs during chiller operation. The model used one parameter to estimate the likelihood of contact and a second parameter to estimate the likelihood of contamination resulting from that contact. A third parameter was included to represent the influence of antimicrobial treatments to reduce bacterial populations. Results were calculated for 30%, 50%, and 80% levels of contamination in pre-chill carcasses. Air chilling showed increased risk of contamination in post-chill carcasses. Immersion chilling with 50 mg/L chlorine or 5% trisodium phosphate added to the chiller water as antimicrobial treatments reduced contamination to negligible levels in post-chill carcasses. Simulations of combination air/immersion chiller systems showed reductions of microbial contamination, but not to the extent of immersion chillers; this is attributed to the reduced exposure time to antimicrobial treatments. These results show the relation between chiller operation and the potential to mitigate risk of microbial contamination during poultry processing.
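
A minimal sketch of this kind of simulation, using the three parameter types described (contact, contamination given contact, antimicrobial effect) but invented parameter values, follows.

import numpy as np

rng = np.random.default_rng(3)
n_carcasses, n_iter = 1000, 500
pre_chill_prev = 0.50        # 50% of carcasses contaminated entering the chiller
p_contact = 0.30             # P(contact with contaminated water or carcass)
p_transfer = 0.20            # P(contamination | contact)
p_inactivation = 0.95        # P(decontamination | antimicrobial treatment)

post_prev = []
for _ in range(n_iter):
    contaminated = rng.random(n_carcasses) < pre_chill_prev
    new_hits = (~contaminated) & (rng.random(n_carcasses) < p_contact) \
                               & (rng.random(n_carcasses) < p_transfer)
    contaminated |= new_hits                # cross-contamination in the chiller
    # Antimicrobial treatment (e.g., chlorinated immersion chill water):
    contaminated &= ~(rng.random(n_carcasses) < p_inactivation)
    post_prev.append(contaminated.mean())

print(f"Post-chill prevalence: {np.mean(post_prev):.3f} "
      f"(95% interval {np.percentile(post_prev, 2.5):.3f}"
      f"-{np.percentile(post_prev, 97.5):.3f})")

Setting p_inactivation to zero reproduces the air-chill case in this toy model, and prevalence then rises above its pre-chill level, mirroring the direction of the results reported above.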

T2-B.1 Honeycutt, ME*; Haney, JT; State Government; [email protected] IRIS improvements: meeting the needs of Texas With a large land area, population, and concentration of industry, Texas has a need for scientifically defensible and meaningful Toxicity Values (TV; e.g. RfD, RfC, Cancer Slope Factors) to prioritize scarce resources. Relying on conservative defaults in response to ever-present uncertainty, as opposed to data, can result in assessments yielding safe values less than background in certain media. While EPA states that IRIS chemical assessments are not risk assessments, the risk assessment implications of an IRIS TV are far reaching (e.g., implying background arsenic soil, fish, rice, and groundwater levels exceed acceptable risk levels). Toxicologically predictive TVs are important to properly prioritize the 3,700+ remediation sites in Texas so that agency actions and limited funds can be focused on the sites which realistically pose the greatest public health threat and thereby achieve the greatest real health risk reduction. If a great multitude of sites exceed target risk or hazard limits due to overly conservative TVs, then it is difficult to properly prioritize sites for action to achieve the greatest public health benefit. EPA has flexibility for post-Baseline Risk Assessment risk management decisions when calculated media risk-/hazard-based comparison values are exceeded at a site, whereas TCEQ does not. For the Texas Risk Reduction Program (TRRP) rule, individual-chemical and cumulative risk and hazard limits which trigger action have been included in the rule a priori, which, while straightforward, does not offer risk management flexibility. For example, the individual-chemical excess risk limit which triggers action for a chemical in surface soil is 1E-05. This often triggers action under TRRP when EPA would often have the flexibility at an individual-chemical risk <= 1E-04 to determine that no further action is needed. Risk management risk/hazard triggers for action under TRRP highlight the critical importance of toxicologically realistic and predictive TVs.

P.95 Hosono, H*; Kumagai, Y; Sekizaki, T; the University of Tokyo; [email protected] Two years since the Fukushima accident: Are people still willing to support the affected area? We implemented three waves of a web-based consumer survey, in Nov. 2011 (N=4,363), Mar. 2012 (N=5,028) and Jan. 2013 (N=6,357), to investigate how Japanese consumers regard food produced in the area affected by the Fukushima accident. Across the surveys, risk perceptions of 7 beef-related hazards declined as time passed. Only a limited relationship was observed between knowledge and either risk perception or willingness to pay for food from the disaster-affected area, while the intention to support the recovery of the affected area significantly increased willingness to pay for such food. Another notable finding was that in the third survey, implemented about two years after the disaster, 22.5% of respondents did not want to accept food from the affected area even when radioactive cesium was below the standard level, whereas the corresponding ratios in the first and second surveys were 13.0% and 9.8%, respectively. We therefore developed and applied a web-based lottery to identify individual risk aversion levels, as well as a donation experiment with the respondents of the third survey, to relate willingness to accept food from the disaster-affected area to willingness to support its recovery.

M2-C.4 Hoss, F*; Vaishnav, P; Carnegie Mellon University; [email protected] What guides spending on risk mitigation: Perceptions or statistics? People perceive risks on numerous dimensions. They care about different risks more or less than the expected losses from those risks would justify. This puts policymakers in a quandary: should they prioritize their risk mitigation efforts and resources based on people’s perceptions, or based on some ‘objective’ assessment? Policymakers do not spend equally on mitigating different risks. For example, a wide range of values of statistical life can be inferred from proposed and actual policy interventions that would reduce the risk of premature mortality. Our research will compare what different countries – in the first instance, the US and the countries of the EU – spend on mitigating different risks. Based on published data, we will work out what the expected value of losses associated with these risks is in each of the countries. Based on survey data, we will find out how residents of these countries perceive each of these risks: for example, to what extent they think the risk is voluntary. We will then work out whether spending priorities are more closely correlated with the expected value of losses or with perceived risks in one country or another, and try to explain why these correlations differ. For example, the US and Europe have different approaches to risk mitigation in the detection and treatment of cancer. Europeans are more likely to die of certain cancers than Americans. A possible explanation is that there is a greater emphasis on regular scans in the US. Surveys suggest that Europeans dread cancer, and place a greater value on preventing the loss of life due to cancer than due to other causes. Why, then, do Europeans place less emphasis on scanning for cancer? Is this a historical artefact? Is this a choice policymakers have made despite public opinion? Do Europeans dread cancer less than Americans do? Do policymakers with different professional backgrounds reach different decisions? Our research will answer such questions.

T1-C.3 Hoss, F; Carnegie Mellon University, Pittsburgh; [email protected] The clients of the National Weather Service: Does the current use of river forecasts fully exploit their potential to decrease flood risk? For thousands of river gages, the National Weather Service publishes daily the expected river stage for the next few days. Good river forecasts have skill for up to two days ahead, meaning that they perform better than assuming that the water level will not change. At longer lead times, especially when extreme events are concerned, the forecast error grows rapidly. It is not uncommon for forecasts of river stages in the 90th percentile of observed river stages to have average errors of several feet. The river forecast centers of the National Weather Service do not yet publish the uncertainty or expected error associated with these short-term forecasts. It follows that successful use of river forecasts depends heavily on understanding forecast uncertainty. For example, ignorance of the average errors of forecasts for the Red River led to insufficient preparation of Grand Forks, ND and its subsequent flooding in 1997. This research focuses on the users of river forecasts in emergency management. The central questions are whether emergency managers have the knowledge to correctly use the forecasts and what the benefits of river forecasts, as they are used today, are. Do river forecasts as they are published today reduce flood risk, e.g. the likelihood of suffering damage from flooding? To investigate these questions, 17 emergency managers of mostly small and medium-sized communities along rivers in Pennsylvania and Oklahoma were interviewed. The analysis is structured as follows. First, the emergency managers themselves are described: what education and experience do they have, and how much do they know about the forecasting process and the resulting uncertainty? Second, the process of preparing for an approaching flood and the role of river forecasts therein is analyzed. Third, the research zooms in on the use of the forecasts: which forecasts are being used and through what channels are they accessed? The research is rounded off with a discussion of how river forecasts could reduce flood risk if they were used more effectively.

T4-D.5 Hristozov, DH*; Wohlleben, W; Steinfeldt, M; Nowack, B; Scott-Fordsmand, J; Jensen, KA; Stone, V; Costa, A; Linkov, I; Marcomini, A; University Ca' Foscari Venice; [email protected] Sustainable nanotechnologies (SUN) Our understanding of the environmental and health risks associated with nanotechnology is still limited and may result in stagnation of growth and innovation. Other technologies have revealed unexpected ecological and health effects only several years after their broader market introduction; in the worst cases this caused tremendous costs for society and for enterprises in the form of lock-in effects, over-balancing regulations and demolished consumer confidence. The new European Seventh Framework Programme project SUN (Sustainable Nanotechnologies), worth 14 million euro, is based on the hypothesis that the current knowledge on environmental and health risks of nanomaterials, whilst limited, can nevertheless guide nanomanufacturing to avoid future liabilities. SUN’s goal is to develop and apply an integrated approach that addresses the complete lifecycle of production, use and disposal of nanomaterials to ensure holistic safety evaluation. The project will incorporate scientific findings from over 30 European projects, national and international research programmes and transatlantic co-operation to develop methods and tools to predict nanomaterial exposure and effects on humans and ecosystems, implementable processes and innovative technological solutions to reduce their risks, and guidance on best practices for securing both nano-manufacturing processes and nanomaterials’ ultimate fate, including development of approaches for safe disposal and recycling. The results will be integrated into tools and guidelines for sustainable manufacturing, easily accessible by industries, regulators and other stakeholders. The industrial partners in the SUN consortium will evaluate and “reality-check” these tools in case studies in terms of cost/benefit and insurance risk. The project involves major stakeholders such as the OECD, ECHA and US EPA in implementing the SUN results into practice and regulation.

P.62 Huang, Y*; Anderson, S; Yang, H; The US Food and Drug Administration; [email protected] Modeling the relationship between post-vaccination hemagglutination inhibition (HI) titer and protection against influenza The objective of this research is to evaluate the relationship between post-vaccination HI titer in the host and protection against influenza using modeling approaches. The HI titer is currently used as a surrogate endpoint for protection against disease in FDA’s regulatory review of influenza vaccine products. We expect that the results of this research will provide insight into whether HI titer is a good predictor of protection against influenza and, if it is, what level of HI titer is needed for sufficient protection. We first searched available data from human challenge studies that reported post-vaccination HI titer, challenge dose, and post-challenge influenza infection. Five large-scale studies were identified; four used single challenge doses while one reported multiple-dose challenge. We grouped the volunteers based on their HI titer levels. We assumed the relationship between challenge dose and infection rate (response) could be described by exponential or beta-Poisson dose-response models, which have been widely used for a number of infectious disease agents. We estimated the model parameters for each HI titer group, and examined the dependency between host susceptibility, represented by the model parameters, and post-vaccination HI titer. The dose-response models were further modified by incorporating this dependency and fit to the data set with graded challenge doses using maximum likelihood estimation. An exponential dependency between the model parameters and HI titer was identified, and the dose-response models incorporating this dependency provided a statistically acceptable fit to the data while the original models failed to do so. The modified models can potentially be used to identify the critical level of post-vaccination HI titer required for sufficient protection against influenza and, therefore, enhance our ability to evaluate the efficacy and protection offered by future candidate influenza vaccines.
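
A sketch of this kind of modified model, an exponential dose-response whose parameter declines exponentially with HI titer, fit by maximum likelihood to invented challenge data (not the study's data), follows.

import numpy as np
from scipy.optimize import minimize

# Invented challenge records: (log2 HI titer, challenge dose, n, infected)
data = [(2, 1e3, 20, 15), (4, 1e3, 20, 5), (6, 1e3, 20, 1),
        (2, 1e4, 20, 20), (4, 1e4, 20, 19), (6, 1e4, 20, 8)]

def neg_log_lik(theta):
    log_r0, beta = theta                  # r(titer) = exp(log_r0 - beta*titer)
    nll = 0.0
    for titer, dose, n, k in data:
        r = np.exp(log_r0 - beta * titer)
        p = 1.0 - np.exp(-r * dose)       # exponential dose-response model
        p = np.clip(p, 1e-12, 1 - 1e-12)  # guard the binomial log-likelihood
        nll -= k*np.log(p) + (n - k)*np.log(1.0 - p)
    return nll

fit = minimize(neg_log_lik, x0=[-5.0, 0.5], method="Nelder-Mead")
print("MLE (log r0, beta):", fit.x)       # beta > 0 means titer is protective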


T3-F.3 Huerta, MF; National Library of Medicine, National Institutes of Health; [email protected] The NIH BD2K initiative: Enabling biomedical research & raising the prominence of data Biomedical research generates vast amounts of complex and diverse data, increasingly in digital form. Despite some important exceptions, many of these data never leave the labs in which they are generated. Rather, the major public products of this research are concepts described in scientific publications, not the data upon which those concepts are based. The NIH Big Data to Knowledge (BD2K) initiative will advance the science and technology of data and big data, enabling the research community to harness the transformative power of large scale computation to advance understanding of health and illness. Significantly, BD2K will also raise the prominence of data in biomedicine. It will establish an ecosystem for research data that will bring data into the routine processes of science and scholarship by making it more available, discoverable, usable, and citable, and by linking data to the scientific literature. This presentation will provide an overview and status report of the initiative, including: findings from a series of recently held workshops, synopses of funding opportunities and a vision of the manner in which BD2K will affect the scientific landscape in biomedicine.

W2-D.3 Humblet, MF; Vandeputte, S; Albert, A; Gosset, C; Kirschvink, N; Haubruge, E; Fecher-Bourgeois, F; Pastoret, PP; Saegerman, C*; University of Liege; [email protected] A multidisciplinary and evidence-based methodology applied to prioritize diseases of food-producing animals and zoonoses Objectives: to optimize financial and human resources for the surveillance, prevention, control and elimination of infectious diseases, and to target surveillance for early detection of any emerging disease. Material and methods: The method presented here is based on multi-criteria analysis and consists of listing the criteria to assess pathogens, evaluating the pathogens on these criteria (scores), determining the relative importance of each criterion (weight), and aggregating scores and weights into one overall weighted score per pathogen. The method draws on multi-criteria decision making, including the opinions of multidisciplinary international experts (N=40) for the weighting process and evidence-based data for the information corresponding to each criterion/disease (>1,800 references). One hundred diseases were included in the process (OIE-listed diseases and emerging diseases in Europe) and five categories of criteria (N=57) were considered. An overall weighted score was calculated for each disease using Monte Carlo simulation to estimate the uncertainty, and the resulting ranking was established. A classification and regression tree (CART) analysis allowed the classification of diseases into subgroups with minimal within-variance (grouping diseases of similar importance). Results: A final ranking of diseases was presented according to their overall weighted scores using a probabilistic approach. Few differences were observed between the deterministic approach (mean of each weight) and the probabilistic approach (distribution function of weights) (Pearson correlation coefficient = 0.999; p-value < 0.0001). This is probably linked to few subjective interpretation problems or to the dilution of individual discordances among the high number of experts. The CART analysis differentiated 4 groups of diseases according to their relative importance. Conclusions: The present methodology is a generic and predictive tool applicable to different contexts.
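
The aggregation at the core of this method, an overall weighted score per disease with Monte Carlo sampling over expert weights, can be sketched as follows. The three diseases, four criteria, Dirichlet weight model, and all scores are hypothetical stand-ins for the study's 100 diseases, 57 criteria and 40 experts.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical scores for 3 diseases on 4 criteria (0-5 scale)
scores = np.array([
    [4, 3, 5, 2],   # disease A
    [2, 5, 1, 4],   # disease B
    [3, 3, 3, 3],   # disease C
])

# Expert weights: one row per "expert", one column per criterion
expert_weights = rng.dirichlet(np.ones(4), size=40)

n_iter = 10_000
ranks = np.zeros((n_iter, len(scores)), dtype=int)
for i in range(n_iter):
    w = expert_weights[rng.integers(len(expert_weights))]  # sample one expert's weights
    overall = scores @ w                                    # overall weighted score per disease
    ranks[i] = (-overall).argsort().argsort() + 1           # 1 = highest priority

for d, name in enumerate("ABC"):
    print(f"disease {name}: P(ranked first) = {(ranks[:, d] == 1).mean():.2f}")
```

The spread of ranks across iterations is exactly the uncertainty the probabilistic approach reports alongside the deterministic (mean-weight) ranking.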

W4-B.2 Irons, RD*; Kerzic, PJ; Fudan University Shanghai, China; Cinpathogen, University of Colorado Health Sciences Center; [email protected] Predicting the risk of acute myeloid leukemia (AML) using peripheral blood cells or cells in culture has questionable biological relevance The hematopoietic stem cell (HSC) compartment gives rise to all blood and lymphoid cells and is the origin of acute myeloid leukemia-initiating cells (AML-IC). Recent advances in stem cell biology reveal that the characteristics traditionally ascribed to HSC, i.e. the capacity for self-renewal, maintenance of hematopoiesis, and the quiescence required for longevity, are the result of complex cell interactions in the bone marrow microenvironmental niche. HSC in bone marrow are typically found at frequencies between 10^-6 and 10^-7 and cannot form colonies in semisolid media. In isolation, HSC possess the intrinsic characteristics of primitive cells: they rapidly proliferate, accumulate cytogenetic damage, and assume the phenotype of AML-IC. Studies based on actual disease outcomes reveal that AML following exposure to benzene, a prototype chemical leukemogen, possesses cytogenetic features consistent with de novo AML, not those typical of therapy-related disease, nor those found in circulating lymphocytes of exposed subjects or induced in bone marrow cells in vitro. Consequently, measuring cytogenetic abnormalities in surrogate cells or even CD34+ cells in culture is not useful for predicting the risk of AML developing in vivo. Presently, the definitive basis for predicting the risk of AML in humans is evidence-based medicine.

P.28 Ishimaru, T*; Yamaguchi, H; Tokai, A; Nakakubo, T; Osaka University; [email protected] Development of a practical quantification method applicable to risk assessment of metabolic inhibition during co-exposure in workplaces by applying a PBPK model in humans At present, chemical substances in workplaces are managed based on an administrative control level for each single substance. When a number of chemical substances are used in a workplace, they are managed on the assumption that risks increase additively. The Hazard Index is calculated as the sum of the ratios of each chemical’s exposure level to its administrative control level, such that values larger than 1 are of concern (see the sketch below). However, management based on this assumption cannot appropriately address compounds subject to metabolic inhibition. Based on these considerations, we aim to develop a method to quantify the effect of metabolic inhibition in order to support risk management in occupational workplaces. In particular, we construct a method to derive dose-response curves by applying a PBPK model for metabolic inhibition, and we assess the effect of co-exposure using the case of toluene and n-hexane. By integrating a PBPK model applicable to co-exposure to toluene and n-hexane into a hierarchical model that evaluates dose-response relations separately for pharmacokinetics (PK) and pharmacodynamics (PD), we derived a dose-response curve that includes metabolic inhibition. By quantifying the difference in risk levels such as BMD10 between the dose-response curve excluding metabolic inhibition and the curve including it, the effect of metabolic inhibition was quantified for every administered concentration of the competing chemical substance, and we evaluated the threshold of co-exposure interaction. Moreover, this method could be applied to other combinations of chemicals that cause mutual metabolic inhibition if the inhibition mechanism is clear. For further development of the method, we therefore deem it necessary to classify compounds that may cause mutual metabolic inhibition in workplaces and to clarify the mechanism of metabolic enzyme competition.
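
For reference, a minimal sketch of the additive Hazard Index the abstract begins from, with hypothetical exposure and control-level values. The study's point is that this additive form can misstate risk when one chemical inhibits the metabolism of another.

```python
# Additive Hazard Index: HI = sum_i (exposure_i / administrative control level_i).
# A value above 1 flags the workplace for concern. All concentrations hypothetical (ppm).
exposures = {"toluene": 15.0, "n-hexane": 25.0}        # measured exposure levels
control_levels = {"toluene": 20.0, "n-hexane": 40.0}   # administrative control levels

hazard_index = sum(exposures[c] / control_levels[c] for c in exposures)
print(f"Hazard Index = {hazard_index:.2f}")  # 0.75 + 0.625 = 1.38 -> of concern
```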


M4-F.2 Jardine, CG*; Driedger, SM; University of Alberta; [email protected] Communicating Environmental Health Risk Uncertainty: A Systematic Review of the Literature Communicating the uncertainty associated with environmental health risks is a continuing challenge within risk communication theory and practice. A systematic review of quantitative and qualitative empirical research published between 1985 and 2008 was conducted to consolidate and integrate the knowledge currently available in this area. The review involved searching more than 30 databases, representing various health-related and communication disciplines, using multiple combinations of keywords and synonyms. In addition, a search was made of key journals (such as Risk Analysis and Journal of Risk Research) and 109 key risk communication authors. Of the initial 29,499 potential articles identified, 28 met the inclusion criteria (22 of which were empirical evaluations conducted since the year 2000). Lessons learned for best practice include: (1) the importance of describing the nature, source, and consequences of the uncertainty; (2) the need to avoid vague or ambiguous descriptions of uncertainty; (3) the influence of different risk communication formats on understanding and uptake of uncertainty information; (4) the importance of the spokesperson; (5) how to work with the media to avoid inaccurate information being put ‘into the information void’; and (6) the critical importance of communicating risk uncertainty at the beginning of a developing risk issue. Notable gaps in the literature include the lack of research on different cultural understandings of uncertainty and the impact of new channels and sources of information (such as social media) on communicating uncertainty. The findings from this review will provide insights on needs for risk communication research in this area that will help guide new paths forward.

T4-G.4 Jenkins, F*; Rowan, KE; George Mason University; [email protected] Ecological risk communication and environmental values: Predicting public interest in participating in federal rulemaking concerning pesticide risk Two impediments to public participation in regulatory decision making regarding ecological risks may be complex findings and frequent failure to make ecological risk relevant to the public’s concerns. Consequently, this study used the mental models approach to risk communication to develop pesticide ecological risk assessment materials accessible to lay audiences. Thirty-six university students were interviewed to determine their understanding of a pesticide’s risk to non-target plants and animals. Their understanding was compared to scientific characterizations of this hazard. Results showed participants were not familiar with 21 of 29 elements found in characterizations of pesticide ecological risk from the scientific literature. These results were used to design an account of pesticide effects accessible to lay audiences. Participants (N=180) were randomly assigned to treatment or control conditions and asked to complete an environmental attitudes survey. Those in the treatment condition received a more accessible account of a pesticide’s ecological risk than did those in the control group. Results showed that those who endorsed pro-environmental attitudes were likely to be interested in participating in a hypothetical US EPA rule-making effort concerning a pesticide’s ecological risks. This association was significant at the .05 level; however, there was no association between condition assignment and interest in participating in US EPA rule making. This set of findings suggests that environmental values guide interest in participating in environmental rule-making but leaves open the question of whether knowledge about ecological pesticide risk is associated with stakeholders’ interest in participating in rule making about ecological risk. The authors discuss factors that may affect lay interest in managing ecological risks, especially hazards apt to have implications for plants and animals near dwellings.

T3-E.4 Jensen, E; Bellin, C; Embry, M*; Gaborek, B; Lander, D; Tanir, JY; Wolf, D; Zaleski, R; Dow Corning Corporation; [email protected] Water Chemicals Case Study Using the RISK21 Tiered Exposure Framework As part of the approach for advancing risk assessment in the 21st century, the HESI RISK21 project considered a case study focused on health risk assessment for chemicals that may be found in surface water and therefore raise the potential for human exposure. The problem was formulated such that a risk manager would have one year to decide whether risk management is required for any or all chemicals on a large list of potential drinking water contaminants. For simplicity, the only route of exposure considered was consumption of drinking water. This case study highlighted rapid screening methods for risk prioritization, identified readily available sources of suitable data for use in risk assessment, and examined challenges in considering cumulative exposure assessment. The RISK21 case study participants identified key questions to address and parameters to inform the assessment, and described assessment results progressing through tiers of increasing complexity. Another important output was a graphical presentation of exposure and hazard that could be useful in communicating risk to decision makers.

M2-A.4 Jessup, A*; Sertkaya, A; Morgan, K; Department of Health and Human Services/OASPE; [email protected] A novel approach to attributing illness to food using consumption data and expert elicitation Policy analysts are often asked to answer questions where there are no published studies or where published studies are only peripherally related to the policy question. Often, the policy questions are complex and the data needed to provide a useful answer are limited, conflicting, unavailable, or (as is sometimes the case) unobtainable. Further, seldom are analysts given large budgets in time or money to find answers. Meta-analysis and systematic reviews are methods to combine results from a body of established studies. Expert elicitation synthesizes expert opinions when data are lacking, but is typically limited to characterizing uncertainty around generally accepted parameters. Where data and studies are completely lacking, however, innovative methods must be applied. Information linking foodborne illness (FBI) cases to the source of the illness and/or specific food vehicle with sufficient specificity to guide policy is one such area. Previous food attribution research has used microbiological approaches (e.g., microbial sub-typing), epidemiological approaches (e.g., the analysis of outbreak and other surveillance data), and expert elicitation approaches (Pires et al., 2009). None of these studies, however, appears to have produced sufficiently detailed information to produce groupings of foods that are homogeneous with respect to risk. We examine the challenges of answering scientific questions needed for policy analysis, moving beyond characterizing uncertainty. First, we discuss the applicability of different research methods (i.e., expert elicitation [EE], meta-analysis, and systematic reviews) to the objective of generating FBI attribution rates for highly disaggregated food categories. Next, we develop and apply a novel hybrid method combining EE and food consumption data to estimate FBI cases by highly disaggregated food categories.


M3-E.2 Jiao, W; Frey, HC*; North Carolina State University; [email protected] Measurement and Comparison of PM2.5 and CO Microenvironmental Exposure Concentrations for Selected Transportation Modes Daily commutes may contribute disproportionately to overall daily exposure to urban air pollutants such as fine particulate matter (PM2.5) and carbon monoxide (CO). The on-road and near-road microenvironments are of concern because of proximity to on-road traffic emissions. A field data collection study design was developed based on factors that may affect variability in in-transit concentration, including transportation mode, time of day, traffic volume, weather, vehicle ventilation conditions, road geometry and traffic control, traffic vehicle mix, and proximity to intersections. PM2.5 and CO concentrations were measured and compared across pedestrian, bus, and car modes during lunchtime and afternoon rush hour within a three-week time period on pre-selected round trip routes in Raleigh, NC. Variability in the transportation mode concentration ratios of PM2.5 and CO is quantified. Factors affecting variability in PM2.5 and CO concentrations are identified. The average pedestrian concentration is compared with fixed site monitor (FSM) data to determine if FSM is an appropriate surrogate for near-road concentration. Preliminary results indicate that on-road or near-road microenvironmental concentrations are sensitive to transportation mode, traffic volume, and proximity to on-road emission sources. In general, pedestrians experienced the highest PM2.5 concentrations among all measured transportation modes. Peaks in pedestrian PM2.5 concentration are typically associated with a passing truck. In comparison, the average in-car PM2.5 concentration is the lowest because the selected ventilation conditions helped to prevent ingress of particles. A positive association was found between traffic counts and average CO concentrations. Field studies such as this are needed to develop data for input to population-based stochastic exposure simulation models to more accurately predict transportation mode exposure concentrations.

M3-E.1 Jiao, W*; Frey, HC; North Carolina State University; [email protected] Comparison of predicted exposures versus ambient fine particulate matter concentrations Persons 65 and older are particularly susceptible to adverse effects from PM2.5 exposure. Using the Stochastic Human Exposure and Dose Simulation model for Particulate Matter (SHEDS-PM), distributions of inter-individual variability in daily PM2.5 exposures are estimated for Bronx, Queens and New York Counties in the New York City area for the years 2002 to 2006 based on ambient concentration, air exchange rate (ACH), penetration factor, deposition rate, indoor emission sources, census data, and activity diary data. Three research questions are addressed: (1) how much is variability in estimated daily average exposure to ambient air pollution influenced by variability in ambient concentration compared to other exposure factors?; (2) what is the inter-annual variation in daily average exposure?; and (3) what key factors and values of these key factors lead to high exposure? In comparison with CMAQ input ambient concentrations, daily average exposure estimates have more variability. Variation in estimated exposure to pollutants of ambient origin is mostly affected by variation in ambient concentration, air exchange rate, and human activity patterns. Estimated daily average exposure to ambient PM2.5 is about 30 to 40 percent less than the ambient concentration. Seasonal differences in estimated exposure are mainly caused by seasonal variation in ACH. There was relatively little estimated inter-annual variation in the daily average exposure-to-concentration ratio (Ea/C), since factors affecting exposure such as ACH, housing type and activity patterns were assumed to be relatively stable across years. The distribution of inter-individual variability in the Ea/C ratio can be used to identify highly exposed subpopulations to help inform risk management strategies and to provide advisory information to the public.
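
The indoor part of this attenuation can be illustrated with the steady-state infiltration mass balance used in exposure models of this kind. The penetration and deposition values below are hypothetical, and a full Ea/C estimate would also weight time spent in each microenvironment.

```python
def infiltration_factor(ach, penetration=0.8, deposition=0.2):
    """Steady-state fraction of ambient PM2.5 remaining indoors:
    F_inf = P * ACH / (ACH + k), with air exchange rate ACH (1/h),
    penetration factor P (dimensionless) and deposition rate k (1/h)."""
    return penetration * ach / (ach + deposition)

ambient = 12.0  # ug/m3, hypothetical daily ambient PM2.5
for ach in (0.3, 0.7, 1.5):
    indoor = ambient * infiltration_factor(ach)
    print(f"ACH {ach:.1f}/h: indoor concentration of ambient origin "
          f"~ {indoor:.1f} ug/m3 (ratio {indoor / ambient:.2f})")
```

Because people spend most of the day indoors, ratios in this range are consistent with daily exposures sitting well below the ambient concentration, as the abstract reports.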

T4-H.3 John, RS*; Rosoff, HR; University of Southern California; [email protected] Validation of Adversary Utility Assessment by Proxy Most adversaries are not available for, or willing to allow, direct elicitation. Such adversaries have a strong interest in countering or foiling others; instances range from criminal organizations and terrorist organizations to corporations seeking to gain a market advantage, political organizations seeking to promote their views and hinder rivals’ progress, and sports rivalries. In such cases it is necessary to construct a representation of preferences using information that is known about adversary motivations, objectives, and beliefs. Such information comes from a variety of sources, including past adversary behavior, public statements by the adversary, adversary web sites, and intelligence. An adversary objectives hierarchy and MAU model based on this information can be constructed by proxy, using judgments from an adversary values expert (AVE). The construction of value models by proxy raises the question of whether such models can accurately capture adversary preferences using only secondary and tertiary sources. There is no published research to date on the validity of utility models constructed by proxy. In this paper, we report two validation studies comparing MAU models for two different politically active non-profit organizations that utilize civil disobedience to achieve political objectives. In both cases, we constructed an objectives hierarchy and MAU model using AVEs who had access to publicly available information about the organizations’ motives, objectives, and beliefs, but no direct contact with organization leaders. We then independently compared these MAU model parameters and constructed preferences to those based on direct assessment from a representative of the organization. In both cases, we demonstrate good convergence between the proxy model and the model assessed by direct contact with a decision maker. The proxy MAU models provided a complete and accurate representation of the organizations’ values, including objectives, trade-offs, risk attitudes, and beliefs about consequence impacts.
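
The comparison described, proxy versus directly assessed parameters, can be pictured with the simple additive MAU form U(x) = sum_i w_i u_i(x_i). The objectives, weights, and single-attribute utilities below are invented for illustration, not the study's elicited values.

```python
# Minimal additive multi-attribute utility (MAU) sketch.
def mau(weights, utilities):
    """Additive MAU: U(x) = sum_i w_i * u_i(x_i), with weights summing to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * utilities[a] for a in weights)

# Hypothetical proxy (AVE) vs directly elicited weights for one candidate action
weights_proxy  = {"publicity": 0.50, "policy_impact": 0.30, "legal_cost": 0.20}
weights_direct = {"publicity": 0.45, "policy_impact": 0.35, "legal_cost": 0.20}
u_action = {"publicity": 0.8, "policy_impact": 0.4, "legal_cost": 0.6}

print(f"proxy  U = {mau(weights_proxy, u_action):.3f}")   # 0.640
print(f"direct U = {mau(weights_direct, u_action):.3f}")  # 0.620
```

Convergent validity in this setting amounts to the two parameter sets producing similar utilities, and hence similar preference orderings, across candidate actions.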

M4-H.3 John, RS*; Scurich, N; University of Southern California and University of California, Irvine; [email protected] Public perceptions and trade-offs related to randomized security schedules Although there are theoretical advantages to randomized security strategies, and they have been adopted in several major areas, there has been no research evaluating the public’s perception of such measures. Perhaps the most challenging hurdle for randomized security strategies is the potential for perceived unfairness by the public. Randomization is clumpy, and random selections for search will often appear nonrandom to an individual who observes some relatively small number of searches while waiting in line. Individuals observing random searches may exhibit a confirmation bias that magnifies the perception of unfair search patterns in short sequences due to (1) biased observation seeking to confirm hypothesized inequities, and (2) biased interpretation and recollection of observed searches. In short, the perception of safety can be as important as the reality of safety. Likewise, the perception of fairness can weigh as heavily as the reality of fairness. If randomized security schedules are perceived as inefficacious and/or unfair, potential patrons might protest their use and pursue alternatives that actually increase the net societal risk. In the present experiment, over 200 respondents were asked to make choices between attending a venue that employed a traditional (i.e., search everyone) or a random (i.e., a probability of being searched) security schedule. The probability of detecting contraband was manipulated (i.e., 1/10; 1/4; 1/2) but equivalent between the two schedule types. In general, participants were indifferent to either security schedule, regardless of the probability of detection. The randomized schedule was deemed more convenient, but the traditional schedule was considered fairer and safer, suggesting a perceived trade-off between safety, fairness, and convenience. There were no differences between traditional and random search in terms of effectiveness or deterrence.


M2-J.3 Jones, SM*; Smith, DW; Conestoga-Rovers & Associates; [email protected] Ecological Risk and Hydraulic Fracturing: Perception, Assessment, and Reality The environmental impacts and risks of hydraulic fracturing (fracking) to human and ecological receptors are highly controversial. Opponents of fracking contend that chemicals in fracking fluids, as well as the fracking process itself, impact groundwater supplies, surface waters, and other human and natural resources. In an effort to evaluate the risks associated with fracking, federal and state agencies, industry, academic institutions, environmental groups, and Hollywood have produced a number of impact statements, analyses, and position papers. One notable example is the Draft Supplemental Generic Environmental Impact Statement (DSGEIS) prepared by the New York State Department of Environmental Conservation. Instead of settling the issues and calming the storm, these reviews are, unfortunately, often derided for alleged bias, misinformation, and inadequate analysis. In this talk, we identify the key ecological issues associated with fracking, identify major impact assessments and analyses that have been conducted, and present, as best we can, an objective evaluation of the major points of contention, focusing on ecological receptors and natural resources.

T1-C.2 Jongman, B*; Hochrainer-Stigler, S.; Rojas, R.; Feyen, L.; Bouwer, L.M.; Botzen, W.J.W.; Aerts, J.C.J.H.; Ward, P.J.; Institute for Environmental Studies, VU University Amsterdam; [email protected] Challenging disaster risk financing capacities: probabilistic flood risk assessment on a pan-European scale River flooding is the most frequent and damaging natural hazard currently affecting European countries, causing annual losses of more than 9 billion USD. The upward trend in damages reported over the past three decades across the continent is expected to continue in the future as a result of changes in flood hazard and increases in exposure. Several mechanisms are in place to distribute and compensate flood losses at both country and European Union levels. These mechanisms can be divided into three main categories: insurance, national government funding and the EU-funded Solidarity Fund. The sustainability and viability of these risk financing sources is subject to changes in disaster risk, and in particular flood risk. To analyse the pressure on the financing mechanisms, a probabilistic approach is required that assesses changes in the exceedance probabilities of high losses. In this paper we (1) present probabilistic trends and projections in flood losses for different parts of Europe using empirical loss data and high-detail risk modelling techniques; (2) analyse the public and private mechanisms in place for financing flood recovery and adaptation; (3) assess the expected required and available funds for the period 2010–2050; and (4) propose policy on flood risk financing going forward. The results are important for researchers, policy makers and (re-)insurance firms concerned with natural disaster costs and financing mechanisms. The novel probabilistic methods can be relevant for scholars and analysts working on risk assessment across different regions.
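
The probabilistic quantity underlying such an assessment, expected annual damage, is the area under the loss-exceedance curve. A minimal sketch with hypothetical exceedance points:

```python
import numpy as np

# Hypothetical loss-exceedance points: annual exceedance probability vs loss (billion EUR)
aep  = np.array([0.5, 0.1, 0.02, 0.01, 0.002])
loss = np.array([0.2, 2.0, 8.0, 15.0, 40.0])

# Expected annual damage = integral of loss over exceedance probability,
# approximated by the trapezoidal rule with probability increasing.
order = np.argsort(aep)
p, l = aep[order], loss[order]
ead = float(np.sum(0.5 * (l[1:] + l[:-1]) * np.diff(p)))
print(f"Expected annual damage ~ {ead:.2f} billion EUR")  # ~1.18 with these numbers
```

Shifts in hazard or exposure move the curve outward, and the pressure on insurance, government funds and the Solidarity Fund shows up directly as growth in this integral and in the tail probabilities.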

T4-H.2 Jose, VRR*; Zhuang, J; Georgetown University; [email protected] Beyond Risk-Neutrality in Attacker-Defender Games: Expected Utility and Cumulative Prospect Theories Traditional models of attacker-defender games typically assume that players are risk-neutral. Behavioral research, however, has shown that individuals often violate risk-neutrality. In this talk, we consider how the use of alternative decision-making frameworks, namely expected utility and prospect theories, affects the equilibrium behavior of players in attacker-defender games. We then demonstrate how certain results may no longer hold when risk-neutrality is dropped in favor of these theories. For example, although the literature posits that the more risk-averse a defender is, the more likely she is to defend, and the less risk-averse an attacker is, the more likely he is to attack, we find that this behavior may not always hold when other factors such as loss aversion are considered.
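
As one concrete illustration of dropping risk neutrality, payoffs can be transformed with the prospect-theory value function before computing best responses. The sketch below uses the standard Tversky-Kahneman (1992) parameter estimates as illustrative defaults; they are not values from this talk.

```python
def pt_value(x, alpha=0.88, beta=0.88, lam=2.25):
    """Prospect-theory value function:
    v(x) = x^alpha for gains, v(x) = -lam * (-x)^beta for losses."""
    return x ** alpha if x >= 0 else -lam * (-x) ** beta

# Loss aversion: a 100-unit loss looms much larger than a 100-unit gain
print(pt_value(100.0))   # ~  57.5
print(pt_value(-100.0))  # ~ -129.4
```

A defender or attacker maximizing such a transformed payoff can choose differently from both a risk-neutral player and a uniformly risk-averse expected-utility player, which is the kind of equilibrium reversal the abstract describes.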

T1-F.2 Jovanovic, A. S.; ZIRIUS, University of Stuttgart & EU-VRi, Stuttgart, Germany; [email protected] Aligning approaches to management of emerging risks – the new European CEN CWA pre-standard The contribution highlights practical aspects of the development and application of the EU approach set out in the European pre-standard document issued by CEN (European Committee for Standardization) entitled “CWA 67 General Framework and Guidelines for Early Recognition, Monitoring and Integrated Management of Emerging New Technology Related Risks” (http://www.cen.eu/cen/Sectors/TechnicalCommitteesWorkshops/Workshops/Pages/WS67-IntegRisks.aspx). The document is based on the results of the iNTeg-Risk project (www.integrisk.eu-vri.eu). The main goal of the pre-standardization action has been to provide a better basis for the consensus needed for acceptance of new technologies. This acceptance can be reached only if stakeholders are convinced that possible or perceived emerging risks related to these technologies can be managed in a safe, responsible and transparent way. The role of the CEN document is hence also to improve the management of emerging risks, primarily by proposing a procedure for early recognition and monitoring of emerging risks and by decreasing reaction times if major accidents involving emerging risks happen (improved resilience). The CWA consists of a general Guideline for the Emerging Risk Management Framework (ERMF) supported by additional parts dealing with emerging risks related in particular to (a) New Technologies, (b) New Materials and Products, (c) New Production and Production Networks, (d) Emerging Risks Policies and (e) Uncertainties in testing procedures. The alignment of positions and the development and approval of the CWA have involved European industry, research institutions and academia, as well as representatives of the European standardization bodies, under the leadership of the German national standardization body DIN. Liaison with ISO, the ISO 31000 standard and the respective ISO committee (TC 262) has been established. The CWA is not intended for certification.


P.65 Jyothikumar, V; University of Virginia; [email protected] Biological weapons and bioterrorism threat assessment. Present-day terrorists do not officially represent countries or states; often they represent a religious ideology expressed through violence and death. The use of weapons of mass destruction (WMD) against civilian noncombatants is not novel or unique to present times. Mankind has exploited diseases, toxins, and poisons since the earliest days of recorded history to wage warfare, commit crimes, and coerce others. However, accessibility to biological weapon agents, their enhanced capacity to cause morbidity and mortality, and improved tactics for their employment have significantly increased the need for more effective means of detecting and countering such weapons. Infectious diseases have unquestionably played a significant and defining role in the overall progression of mankind, religions, and cultures, and in the structure and organization of economies and governments. Bacterial pathogens that could potentially be used as biological threats are typically 1–2 microns in size and are most effective when inhaled or ingested into the body. Antibiotic-resistant strains of anthrax, plague, and tularemia are known to exist naturally and may be exploited for weapons. Bacillus anthracis, for example, can be weaponized by attaching spores to carrier particles. A new, economical way to screen such samples in nature is autofluorescence, which is based on the detection of natural intrinsic fluorescence emitted by endogenous molecules such as co-enzymes, collagen and flavins. After excitation by a short-wavelength light source, these fluorophores emit light of longer wavelengths. The overall fluorescence emission patterns differ among bacterial species due to corresponding differences in fluorophore concentration and metabolic state. Bacterial spores contain a high concentration of endogenous fluorophores, which may allow proper discrimination of spores from other suspicious particles.

T4-G.5 Kain, NA*; Jardine, CG; Wong, J; University of Alberta; [email protected] Learning from SARS and H1N1: A comparison of survey data from nurses in Alberta, Canada Over the past decade, two major communicable respiratory disease outbreaks affected Canada: the Severe Acute Respiratory Syndrome (SARS) outbreak in 2003 and the pandemic H1N1 influenza outbreak in 2009. Playing the dual role of both front-line caregivers and information conduits, nurses were an integral component in the public health and healthcare response to both outbreaks. In the Canadian province of Alberta, surveys of nurses were conducted after these public health crises (2004/5 and 2010, respectively) to assess nurses’ perspectives on the risk communication issues related to the SARS and H1N1 outbreaks in Canada. Both surveys were components of larger research projects examining the various risk communication processes surrounding these public health events, so that communications strategies in Alberta and Canada might be improved in the event of a future infectious disease outbreak. A total of 361 Alberta nurses completed the SARS self-administered paper-based mailed survey in 2004/5, and 1,953 Alberta nurses completed the H1N1 anonymous internet-based online survey in 2010. Both surveys included open and closed-ended questions covering demographic information, nurses’ information sources about the outbreak/disease, access to information about the disease/outbreak, understanding of the disease symptoms, their perceived roles and responsibilities during the outbreak, and whether or not the information they received about the disease/outbreak was confusing or conflicting. Similarities and differences between nurses’ responses post-SARS and post-H1N1 are outlined, compared and contrasted. Recommendations from nurses in Alberta relating to future risk communications regarding infectious disease outbreaks are summarized.

P.104 Kain, NA; University of Alberta; [email protected] Crisis and Emergency Risk Communication to Family Physicians in Canada Family physicians play the unique role in the Canadian health care system of being both recipients and translators of complex health risk information. This role is especially highlighted during times of public health crisis, such as the SARS outbreak in 2003, the Maple Leaf Foods Listeria outbreak in 2008, and the H1N1 pandemic influenza outbreak in 2009. Reviews of these crises outline the necessity for improved risk communication of appropriate and timely information to family physicians. Public health and professional agencies need to better understand the information-seeking behaviours, knowledge requirements and trust relationships of this community in order to maximize the potential of knowledge dissemination to this group, and to improve the risk communication of crisis/emergency information to family physicians in Canada. This paper outlines an original research study that will: 1) explore the way in which Canadian family physicians access information during times of public health crisis/emergency, 2) acquire a clear understanding of whom family physicians trust for timely, accurate and credible information, and 3) assess the knowledge requirements and risk communication needs of family physicians. Using a phenomenological approach, individual interviews will be conducted with family physicians from various practice settings and locations across Canada. The interviews will be audio-recorded, transcribed verbatim, and coded to identify descriptions of the phenomenon of risk communication of crisis/emergency information to family physicians; these descriptions will then be clustered into categories to describe the “essence” of this phenomenon. A set of recommendations for public health and professional agencies in Canada to improve risk communication strategies for family physicians relating to crisis/emergency information will be proposed.

P.107 Kajihara, H; National Institute of Advanced Industrial Science and Technology; [email protected] Selection of next-generation low global-warming-potential refrigerants by using a risk trade-off framework Because the refrigerants currently used in air-conditioners have high global-warming-potential (GWP), substances with lower GWP, such as R-1234yf, are being sought as candidate next-generation refrigerants. However, low-GWP substances often have comparatively high chemical reactivity and may carry increased risks of combustibility, toxicity, generation of degraded products, and CO2 emission increase caused by poor energy-saving performance. It is therefore possible that there is a risk trade-off between currently used refrigerants and low-GWP ones. In this research, I proposed a framework for evaluating this risk trade-off in the following five categories: (1) environmental characteristics; (2) combustion characteristics; (3) toxicity; (4) volume of greenhouse gas emissions; and (5) applicability to air-conditioning equipment. I then selected substances well suited as next-generation refrigerants in accordance with a specific screening process. I showed the importance of clearly specifying the combination of a number of end points and assessment criteria in the process of decision-making based on risk trade-off. This yields a rapid understanding of the necessary data, as well as flexible decision-making that is relevant to the social conditions.
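
One way to picture the screening step is a conjunctive filter: a candidate advances only if it passes a criterion in every category (the abstract names five categories; three are sketched here). This is a hedged illustration of the mechanics only: the thresholds and occupational exposure limits are hypothetical, while the GWP figures are approximate published values.

```python
# Conjunctive screening sketch: a candidate must pass every category.
criteria = {
    "low GWP":          lambda p: p["gwp"] < 150,
    "low flammability": lambda p: p["burning_velocity_cm_s"] < 10.0,
    "low toxicity":     lambda p: p["oel_ppm"] >= 400,   # hypothetical exposure limit test
}

candidates = {
    "R-410A":   {"gwp": 2088, "burning_velocity_cm_s": 0.0, "oel_ppm": 1000},
    "R-1234yf": {"gwp": 4,    "burning_velocity_cm_s": 1.5, "oel_ppm": 500},
}

for name, props in candidates.items():
    failed = [c for c, test in criteria.items() if not test(props)]
    print(f"{name}: {'passes screening' if not failed else 'fails: ' + ', '.join(failed)}")
```

The trade-off the abstract describes is visible even in this toy version: tightening the GWP criterion tends to admit candidates that sit closer to the flammability and toxicity limits.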


T4-I.4 Kajitani, Y*; Yuyama, A; Central Research Institute of Electric Power Industry; [email protected] Operational Reliability of Power Plants and Energy Shortage Risk in Japan after the March 11 Earthquake and Tsunami The power shortages following the Great East Japan Earthquake have had persistent impacts on Japanese society. Firms located both inside and outside the severely damaged area had to reduce both peak and total power usage and spent considerable effort to meet the target levels of power consumption. After the event, thermal and hydro power plants substituted for nuclear power, with considerable effort devoted to shortening shut-down periods, but they remain exposed to internal and external incidents. In addition, the supply capacity and demand for electricity can be influenced by many other factors, such as the temperature-dependent energy-saving capacity of households and industry, and rainfall in the case of hydropower plants. This presentation introduces these supply and demand uncertainties based on the experience of the March 11 disaster in Japan and estimates the risk of supply shortages using statistical models.
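
The shortage-risk estimation described at the end can be illustrated with a loss-of-load style Monte Carlo: sample uncertain supply (unit outages) and uncertain demand, then count how often demand exceeds supply. This framing and every number below are hypothetical, not the authors' model.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000  # simulated peak days

# Hypothetical supply side: 10 thermal/hydro units of 1 GW each, independently
# available with probability 0.9 (internal/external incidents force outages)
units_up = rng.binomial(n=10, p=0.9, size=n)
supply = units_up * 1.0  # GW

# Hypothetical demand: weather-driven peak demand, GW
demand = rng.normal(loc=8.0, scale=0.8, size=n)

shortfall = demand - supply
print(f"P(peak-day supply shortage) ~ {(shortfall > 0).mean():.3f}")
print(f"Mean shortfall when short   ~ {shortfall[shortfall > 0].mean():.2f} GW")
```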

T2-J.3 Kashuba, R*; Fairbrother, A; Kane Driscoll, S; Tinsworth, R; Exponent; [email protected] Probabilistic Methods to Address Ecological Risk of Secondary Ingestion Exposure to Chemicals The goal of risk assessment is to evaluate the potential of a future adverse event based on current information, which necessarily wrestles with the interpretation of uncertainty. Such uncertainty includes (1) how well available data characterize “true” values now and in the future, (2) how well mathematical simplifications approximate “true” relationships between variables, overlaid with (3) natural variability in attributes within a population (e.g., body weight). Probabilistic risk assessment methods enable a risk assessor to incorporate these different sources of uncertainty into the characterization of risk. Not only does this create a more complete picture of the risk profile, it also quantifies the proportion of the population likely affected and the certainty with which that prediction can be made. In contrast to deterministic risk, estimated for a constant average or worst-case scenario, probabilistic risk is assessed across the range of possible scenarios, and the likelihood of each occurring. This process is examined through calculation of the ecological risk associated with hypothetical potential ingestion of chemical-exposed prey by non-target predators. Distributions of possible secondary exposures to a predator are calculated via Monte Carlo sampling from distributions of input variables, such as body weight, food ingestion rate, percent of predator diet composed of a particular prey, and percent of prey population containing chemical residues (from primary exposure). These dose-to-predator distributions are then propagated through a distribution of possible dose-response curves, resulting in distributions of expected mortality rates, which integrate uncertainty associated with both exposure and effects estimation. Risk is then reported as the likelihood of exceeding different mortality thresholds. Changes to reported risk as a result of different modeling assumptions are evaluated, and challenges of communicating probabilistic risk are explored.
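
A minimal sketch of the sampling chain described above: draw exposure-factor distributions, compute the dose to the predator, then push each dose through an uncertain dose-response curve. Distribution shapes and parameters are hypothetical, and the log-logistic curve stands in for whatever family an actual assessment would fit.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000  # simulated predators

# Hypothetical exposure-factor distributions
body_weight  = rng.lognormal(mean=np.log(0.3), sigma=0.3, size=n)  # kg
ingestion    = 0.2 * body_weight                                    # kg food/day
diet_frac    = rng.beta(2, 5, size=n)       # fraction of diet that is target prey
prey_exposed = rng.beta(3, 7, size=n)       # fraction of prey carrying residues
residue      = rng.lognormal(np.log(5.0), 0.5, size=n)              # mg/kg in prey

# Daily dose to predator, mg per kg body weight per day
dose = ingestion * diet_frac * prey_exposed * residue / body_weight

# Propagate through an uncertain log-logistic dose-response (LD50 itself uncertain)
ld50 = rng.lognormal(np.log(20.0), 0.4, size=n)
p_mortality = 1.0 / (1.0 + (ld50 / dose) ** 4.0)

print(f"P(dose > 1 mg/kg-d)             = {(dose > 1.0).mean():.4f}")
print(f"P(expected mortality rate > 10%) = {(p_mortality > 0.10).mean():.4f}")
```

Reporting the probability of exceeding mortality thresholds, as in the last lines, is the communication step the abstract contrasts with a single deterministic worst-case number.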

T1-G.2 Kasperson, RE; Clark University; [email protected] Social trust and fracking The call is out for a rapid worldwide expansion of a new energy supply option to meet the global threats of climate change. At the same time, in a number of countries (prominently including the U.S.), a continued erosion of social trust in those who will develop and manage fracking projects is painfully evident. This presentation seeks to identify and explore the relevant issues. The long-term erosion of social trust has both structural and behavioral roots. Indeed, the trend appears to have become a central ingredient in the socialization process, as changes in confidence and trust in a wide array of social institutions and sectors (note the historically low trust in the U.S. Congress as an example) become apparent. This trend, together with the work of Slovic, Renn, and other social scientists, suggests that social trust, once lost, cannot easily be regained, even with highly meritorious behavior. Accordingly, decisions will need to be made under conditions of high social distrust. The scientific and social uncertainties that permeate fracking exacerbate this reality. Moreover, risks, community impacts, and local benefits vary widely from place to place. Societies are still early in the deployment of this energy option as an antidote to climate change, and so accumulating experience will be important in assessing and managing the risks.

T4-K.2 Kasperson, RK; Clark University; [email protected] Opportunities and dilemmas in managing risk uncertainty Uncertainty is an inescapable ingredient of life. Even familiar situations, such as crossing a street, involve some level of uncertainty. Past experience is relevant for decisions involving the future, but contexts change and new elements affecting risk may unexpectedly occur. It is not surprising that in a world of complex systems involving technological change, highly coupled human and natural systems, and a kaleidoscope of social and political institutions, high levels of uncertainty challenge existing assessment methods and familiar decision procedures. This paper explores opportunities and dilemmas in confronting these issues.


T4-A.3 Kazemi, RK*; Rahaman, FR; Urban, JU; William Carter, WC; USFDA; [email protected] A Probabilistic Risk Analysis (PRA) Framework for Modeling Risk in the Global Drug Supply Chain To make the drug supply chain safer and more secure, we need models that can realistically and comprehensively characterize the risks associated with the global drug supply chain. Most prescription (Rx) and over-the-counter (OTC) drugs consumed in the U.S. are manufactured in foreign facilities. The industry imports drugs as bulk active pharmaceutical ingredients (APIs), as bulk finished dose form (FDF) drug products for final packaging in the US, or as FDF products in final packaging for wholesale or retail marketing. It is estimated that 40 percent of finished drugs used by U.S. patients, and 80 percent of active ingredients and chemicals used in U.S. drugs, are produced abroad. Globalization and the underlying complexities of the pharmaceutical supply chain have created opportunities for criminal actors to exploit its weaknesses and introduce substandard drugs that can reach and harm patients. A well-publicized instance of a substandard drug that entered the supply chain and claimed many lives is the case of adulterated heparin. To ensure the quality and safety of drugs consumed by patients, it is crucial that drug component manufacturers and finished drug product manufacturers adhere to quality standards. It is also critical to ensure the security of the supply chain, from raw material manufacturers through to patients. Additionally, meaningful, efficient and effective oversight of the supply chain by the regulatory agencies is required to strengthen drug supply chain safety and security. In this study, a probabilistic risk analysis (PRA) framework has been adapted to formally identify and assess the vulnerabilities in the global drug supply chain and to model the risk of counterfeit or substandard drugs reaching patients.

P.18 Kazemi, R*; Rahaman, F; Urban, J; USFDA; [email protected] A Bayesian Belief Network (BBN) for Modeling Risk of Adverse Events Due to Particulate Matter in Injectables Particles in injectable medications chiefly come from two sources: intrinsic contaminants that result from manufacturing and packaging processes, and extrinsic particles that are introduced at the time of administration to patients. These particles are generally considered harmful and can have many shapes or types (e.g. glass, metal, rubber, lamellae) and many sizes. Many factors play a role in determining whether these particles will have an immediate effect on a patient's health, and whether the particulates could cause serious health problems or death, temporary health problems, or no serious adverse health reaction at all. Among these factors are the particulate profile (e.g. type and size), the amount of particulates administered to the patient, the route of administration, whether barriers such as filters will effectively screen out the particulates, and the patient's resistance factors. Due to the uncertainties involved in how these particles may influence the risk of adverse events in patients, the Bayesian Belief Network (BBN) formalism has been utilized to assess the risk of adverse events due to the injection of particulates. BBNs are probabilistic in nature, and the uncertainty in assessing the risk of adverse events, given the different states of all influencing factors, can be explicitly expressed and modeled. Given a patient's condition and the characteristics of the particles, the model will be able to assess the likelihoods of adverse events with their respective uncertainties.
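
A toy two-layer network in the spirit described, where particle size and filter use drive the dose, and the dose combines with patient resistance to drive the adverse-event probability, can be evaluated by enumerating the joint distribution. The structure and all probabilities below are hypothetical; a real BBN would have more nodes and states and would typically use a dedicated inference engine.

```python
# All probabilities hypothetical; structure: (size, filter) -> dose -> adverse event
p_large = 0.3          # P(particle is large)
p_filter = 0.6         # P(in-line filter used)
p_resistant = 0.7      # P(patient has high resistance)

p_high_dose = {        # P(high dose | particle size, filter used?)
    ("large", True): 0.05, ("large", False): 0.60,
    ("small", True): 0.30, ("small", False): 0.50,
}
p_adverse = {          # P(serious adverse event | dose, resistant patient?)
    ("high", True): 0.10, ("high", False): 0.40,
    ("low", True): 0.01, ("low", False): 0.05,
}

total = 0.0
for size, ps in (("large", p_large), ("small", 1 - p_large)):
    for filt, pf in ((True, p_filter), (False, 1 - p_filter)):
        p_hd = p_high_dose[(size, filt)]
        for dose, pd in (("high", p_hd), ("low", 1 - p_hd)):
            for res, pr in ((True, p_resistant), (False, 1 - p_resistant)):
                total += ps * pf * pd * pr * p_adverse[(dose, res)]
print(f"Marginal P(serious adverse event) ~ {total:.3f}")
```

Conditioning on evidence (e.g., filter not used, low-resistance patient) simply restricts the enumeration, which is how such a model supports what-if queries about individual patients.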

M4-G.4 Keller, C*; Sütterlin, B; Siegrist, M; ETH Zurich; [email protected] A comparison of spontaneous associations with nuclear power underlying its acceptance before and after the Fukushima disaster, and of associations with nuclear and solar energy resources After the nuclear accident in Fukushima, energy policies changed in some European countries. In Switzerland, for example, the government decided to phase out nuclear power in the near future and to promote and expand renewable energy resources. To better understand public acceptance of the changed energy policy, spontaneous associations with nuclear power as well as with solar power and the affective evaluations of these associations were examined. A study was conducted among Swiss residents in 2012 (N=1211), after the Fukushima disaster. Slightly fewer than half of the sample (n=561) had already participated in a study in which associations with nuclear power were assessed in 2009, before the Fukushima disaster. The present study compared the spontaneous associations with nuclear power plants between 2009 and 2012, and examined the relationship with the acceptance of nuclear power. In addition, the spontaneous associations with nuclear power were compared with associations with solar power. The affective evaluation of nuclear power was more negative in 2012 compared to 2009. However, the general pattern of associations with nuclear power underlying the acceptance of nuclear power did not change. Almost all associations with solar power were rated positively. People did not associate risk with solar power. Practical implications will be discussed.

M4-I.1 Kenney, L*; Arvai, J; University of Calgary; [email protected] Confronting risks and benefits of energy system improvements in developing communities: The case of Canada’s Northwest Territories Decisions about energy development and delivery are inherently complex: they must address the technical challenges of energy production and transmission and, because they involve and affect people, they must also address a range of often conflicting values. Decision-making processes cannot proceed without input from multiple stakeholders. Ultimately, untangling this complexity means making tradeoffs across economic, social, and environmental concerns, while accounting for risk and uncertainty related to dynamic coupled natural-human systems. In developing communities these kinds of decisions are made more challenging by the unique characteristics that typify these locations: e.g., poorly developed infrastructure; limited government budgets; political systems lacking transparency; economic vulnerability; and low education levels. Moreover, the social, economic, political, and environmental systems in these areas are tightly linked; changes in one system can easily affect the others. Decisions about energy have significant consequences in terms of quality of life and, therefore, are often part of larger development agendas. A case in point is Canada’s Northwest Territories (NWT), which is home to significant variations in geography, infrastructure development, economic activity, cultural traditions, and governance arrangements. Under these challenging conditions the NWT government is attempting to reform the region’s energy systems. This presentation reports the results of case study research conducted in the NWT between 2012 and 2013. We discuss the risks, challenges and benefits related to improving energy systems in developing communities. We then provide recommendations about how to structure decisions related to energy development and delivery in the NWT so as to effectively meet a range of stakeholders’ objectives in a transparent and inclusive manner.


M2-F.3 Kiker, GA*; Linhoss, A; Munoz-Carpena, R; Frank, K; Fischer, R; Linkov, I; University of Florida, Mississippi State University, US Army Corps of Engineers; [email protected] Florida, sea level rise and decision analysis: choosing between the devil and the deep blue sea Climate change (through sea-level rise and altered weather patterns) is expected to significantly alter low-lying coastal and intertidal areas. The State of Florida has significant coastal lands that serve as an economic engine for growth as well as a refuge for ecological communities. Decisions concerning sea level rise present significant challenges at both regional and local scales for adapting to potentially adverse effects. This research effort presents two examples of multi-criteria decision analysis applied to coastal Florida areas at regional and local scales using the Sea Level Affecting Marshes Model (SLAMM) to simulate potential land cover changes under a variety of scenarios. At a Florida Gulf Coast scale, we link SLAMM simulations with a habitat suitability model (Maxent) and a metapopulation model (RAMAS-GIS) to simulate decision alternatives to diminish adverse effects of sea-level rise on Snowy Plover populations. At a more local planning scale, we use SLAMM and decision analysis to explore trade-offs and risk among several sea-level rise scenarios in the Matanzas River Watershed on the northeastern Florida Atlantic coast. In each of these examples, selecting a coastal adaptation strategy under sea-level rise is a complex task that entails the consideration of multiple streams of information, stakeholder preferences, value judgments, and uncertainty.

P.84 Kim, S-J; Colorado State University; [email protected] Utilizing Need for Affect and Need for Cognition from a Dual-Processing Framework: Measuring Environmental Policy Preference by Experimental Design Studies Haddock, Maio, Arnold, and Huskinson (2008) reported that an affective message had stronger effects on attitude change amongst those high in need for affect (NFA) and low in need for cognition (NFC). On the other hand, a cognitive message was found to elicit more positive changes in attitudes in those categorized as high in NFC and low in NFA. Based on a review of the literature, the present study proposes several ways to more effectively utilize both NFA and NFC. This paper suggests H1: Individuals exposed to a cognitive message who are high in NFC and low in NFA will be more likely to support an environmental policy issue (a new 54.5 MPG standard by 2025), compared to individuals exposed to other messages (i.e. affective, both cognitive and affective, and neither cognitive nor affective messages); H2: Individuals exposed to an affective message who are low in NFC and high in NFA will be more likely to support an environmental policy issue, compared to individuals exposed to other messages; H3: Individuals exposed to a combined (both cognitive and affective) message who are high in NFC and high in NFA will be more likely to support an environmental policy issue, compared to individuals exposed to other messages; H4: Individuals exposed to a neutral (neither cognitive nor affective) message who are low in NFC and low in NFA will be more likely to support an environmental policy issue, compared to individuals exposed to other messages. In addition, this study adds involvement and endorser as moderators of these relationships; furthermore, it looks at opinion leadership on the climate change issue and behavior/intention towards the adoption of a new 54.5 MPG standard vehicle as additional dependent variables. A series of experimental design studies (Studies 1, 2, and 3) will be introduced and their strengths/limitations discussed.

T2-G.2 Kirby-Straker, R.*; Turner, M.; University of Maryland, College Park; George Washington University; [email protected] Climate Change and Related Risks: Personal or Impersonal? The degree to which people perceive a risk as personal or impersonal will determine their response to the risk. This dichotomous classification of risks is related to perceived personal relevance, and an important challenge for environmental risk communicators is determining how to increase personal relevance, in other words, how to make impersonal risks more personal. Before risk communicators address this challenge, however, they must first gain a better understanding of how their audience views the risks of interest. A survey (N = 170) conducted at the University of Maryland, College Park, investigated student perceptions of eight risks: climate change, drought, flood, global warming, heat wave, high pollen count, tornado, and West Nile Virus. Participants were asked to indicate, on a scale of zero to 100, their perceptions of these risks based on characteristics such as personal relevance, severity, susceptibility, immediacy, and abstractness. Demographic data, including gender, political philosophy, and home state, were also collected. The data reiterated the complexity of risk perceptions and the need to unpack subjective judgments of risk before attempting to develop risk communication strategies to change public perceptions. The data, however, revealed promising results for climate change and global warming: both were considered the least abstract of the eight risks and the most severe, and participants considered themselves more susceptible to these two risks than to the other six, despite perceiving these risks as the least immediate and as more likely to affect people in other countries. Although these results bode well for social change regarding climate change, they suggest a disconnection between perceptions of climate change and the other risks.

P.11 Kirk, M; Hakkinen, P*; Ignacio, J; Kroner, O; Maier, A; Patterson, J; University of Virginia; [email protected] Toxidromes - A decision-making tool for early response to chemical mass exposure incidents A common language to describe and recognize clinical manifestations of toxic chemical exposures is essential for emergency responders and hospital first receivers to be prepared to provide rapid and appropriate medical care for victims of industrial chemical mass exposures and terrorist attacks. In these situations, when the identity of the chemical is not known, first responders need a tool to rapidly evaluate victims and identify the best course of treatment. Military and civilian emergency response communities use a “toxic syndrome” (toxidrome) approach to quickly assess victims and determine the best immediate treatment when information on chemical exposures is limited. Toxidromes can be defined by a unique group of clinical observations, such as vital signs, mental status, pupil size, mucous membrane irritation, and lung and skin examinations. Data on over 20 toxidrome systems were evaluated to identify salient features and develop a consistent lexicon for use by state, local, tribal, territorial, and federal first responders and first receivers. A workshop of over 40 practitioners and experts in emergency response, emergency medicine, and medical toxicology developed names and definitions for 12 unique toxidromes that describe and differentiate the clinical signs and symptoms of exposures to chemicals. These toxidromes focus on acute signs and symptoms caused by inhalation and dermal exposures. Each toxidrome is characterized by exposure routes and sources, organs/systems affected, initial signs and symptoms, underlying mode of action, and treatment/antidotes. Toxidrome names and definitions are designed to be readily understood and remembered by users. Communication in a crisis requires accurate and succinct terms that can quickly convey the health conditions of patients. These toxidromes lay the foundation for a consistent lexicon that, if adopted widely, will improve response to chemical mass exposure incidents.

M3-F.3 Klinke, A*; Renn, O; Memorial University of Newfoundland, University of Stuttgart; [email protected] New conceptual considerations on dynamic governance handling risks in public policy Public policy is confronted with the new task of coping with challenges and risks emerging in transition and transformation periods, e.g. in domains such as energy, natural resources, and global warming. Since current governance structures often lack institutional and procedural capacities, new governance institutions may become essential. We propose that transitions from “old” to “new” systems that are challenged by institutional change and transformation require a dynamic form of governance. We reflect on these issues from different perspectives within new institutionalism that emphasize flexible and responsive processes of institutional change and the need for sustainable institutional reform. Rational choice theorizes the logic of instrumental rationality, historical institutionalism offers the logic of path dependency, and sociological institutionalism relates the logic of appropriateness to both policy goals and the public support that lends legitimacy to the process. These “older” new institutionalisms assume a rather static view, which is why we turn to discursive institutionalism, a “new” approach recently added to the “older” ones, with its more dynamic treatment of change and its explanatory emphasis on discursive problem-solving capacity. We attempt to glean how the institutional and procedural incapacity of current public policy courses and mechanisms can be overcome in the face of transition and transformation. We conceptualize a framework for a dynamic configuration consisting of three major performative capacities, namely integrative, adaptive, and deliberative capacities, characterized by active, flexible, and extensible structures and functions intertwining at multiple levels. Finally, we draw conclusions about how dynamic governance can change the existing course of public policy in the energy sector and how a new governance regime could be shaped that is essentially different in design from the traditional organizations and institutions that govern and control the energy system.

T3-A.3 Koks, EE*; Bockarjova, M; De Moel, H; Aerts, JCJH; VU University Amsterdam; [email protected] Development and sensitivity analysis of an indirect risk model for the port of Rotterdam As transportation hubs and vital economic lifelines, ports play a critical role within the local, regional, and global economy. At the same time, port cities are inherently at risk from both river and coastal flooding, aggravated by pressures such as sea-level rise. Due to their complex infrastructure and extended regional linkages, a comprehensive flood risk assessment is required to understand what impacts potential flooding may bring to the economy at large. In this paper, we propose a framework for a comprehensive, integrative flood damage model for port regions and conduct an extensive sensitivity analysis of its underlying assumptions. The framework consists of multiple steps. First, a direct damage assessment is conducted in the port region, specifically differentiating between industrial sectors. Second, we show how direct losses in capital and labor can be translated into the loss of production per sector in a consistent manner by making use of a Cobb-Douglas production function. Finally, the recovery from this production shock is modeled using a hybrid input-output model. The model is applied to the port region of Rotterdam, using seven different inundation scenarios. Results show that indirect production losses can form a share of total flood risk similar to that of direct losses, in terms of expected annual damage. As a contribution to the literature, we perform an extensive sensitivity analysis of the model. We explore parameter uncertainty using a global sensitivity analysis, and adjust critical assumptions in the modeling framework related to, amongst others, import and export restrictions and financial liability, using a scenario approach. The results exemplify the uncertainties intrinsic to indirect damage modeling, offer explanations for differences found between modeling tools, and provide critical insights for the methodological and empirical domains of damage modeling.
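
To make the Cobb-Douglas step concrete, here is a minimal sketch, not the authors' model; all numbers, including the capital share and the loss fractions, are hypothetical:

```python
# Hypothetical illustration of the Cobb-Douglas step: Y = A * K**a * L**(1 - a).
def production_loss(A, K, L, a, cap_loss, lab_loss):
    """Pre-flood output, post-flood output, and the resulting production shock."""
    y_pre = A * K**a * L**(1 - a)
    y_post = A * (K * (1 - cap_loss))**a * (L * (1 - lab_loss))**(1 - a)
    return y_pre, y_post, y_pre - y_post

# Example: a sector losing 20% of its capital stock and 5% of its labor.
y0, y1, shock = production_loss(A=1.0, K=100.0, L=200.0, a=0.3,
                                cap_loss=0.20, lab_loss=0.05)
print(f"output falls from {y0:.1f} to {y1:.1f} (shock = {shock:.1f})")
```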

P.76 Kovacs, DC*; Thorne, SL; Butte, GE; Wroblewski, MJ; Decision Partners; United States Census Bureau; [email protected] Applying Mental Modeling Technology™ to Developing the Communications Research and Analytics Roadmap for the Census Bureau The U.S. Census Bureau (CB) serves as the leading source of quality data and information about the nation's people, places, and economy. More than just numbers, this information shapes important policy decisions that help improve the nation’s social and economic conditions. The Center for New Media and Promotions (CNMP), within the Communications Directorate of the Census Bureau, coordinates, develops, and implements integrated communications and promotional campaigns with consistent messaging and branding about the Census Bureau. This includes exploring innovative ways of communicating through the web, digital, and social media; promotional activities; and evolving communications channels and platforms in support of the Data Dissemination Initiative and other CB programs. To support these activities, the CNMP sponsored development of a Communications Research and Analytics Roadmap (CRAR) to provide the insight needed to guide the development of effective integrated communications services. The Roadmap was informed by Foundational Research comprising an assessment of existing research and Mental Models research, a key component of Mental Modeling Technology; the Mental Models research is the topic of this presentation. First, an Expert Model of “Influences on Integrated Communication for Data Dissemination” was developed, based on a review of background materials and discussions with a select group of CB employees. Using the model as an analytical basis, an interview protocol was designed and in-depth mental models interviews were conducted with 26 key internal CB stakeholders. The results of these interviews provided critical insight to support the development of the CRAR, which will provide the guidance needed to improve the effectiveness of CB communications and data collection at a time of significantly escalating CB data collection costs and ever-tightening government budgets.

M2-G.4 Kowal, SP*; Jardine, CG; Bubela, TM; University of Alberta; [email protected] Transition, Trauma, and Information: Immigrant Women’s Relationship with Immunization Risk Communication Effective vaccine risk communication strategies by health agencies increase compliance with immunization programs. Unfortunately, current strategies do not address the needs of specific target groups, such as recent immigrants in Canada, who have lower vaccination rates than non-immigrants. Our study examined how foreign-born women access and use information to make personal and childhood immunization decisions. We conducted interviews with recently immigrated women from South Asia, China, and Bhutan who were pregnant or new mothers living in Edmonton, Alberta, Canada. Using NVivo qualitative software, we generated an inductive coding scheme through content analysis of the interview transcripts. Results showed that the transitional traumas associated with immigration affect women’s desire to access or critically assess immunization information. These transitional traumas included political marginalization, as experienced by Bhutanese refugees, and the loss of a strong traditional family system, as for South Asian women. Such hardships affected the women’s information gathering practices. Additionally, the degree to which women had exercised agency in their health decisions in their countries of origin influenced how they accessed information in Canada, with a high proportion of participants demonstrating passive information gathering. Finally, there were widespread misconceptions amongst the study participants about Canadian vaccination programs (e.g. whether vaccines are mandatory) and about whether women should be vaccinated before or during pregnancy. Our research uncovered the shortfalls of current risk communication strategies for immigrant women in Edmonton. Risk communicators must respond to the passive information gathering practices of these individuals to prevent misunderstandings about immunization policy and its importance. The lack of access to culturally relevant immunization risk communication for immigrant women in Canada potentially limits their ability to make effective decisions to protect themselves and their children from communicable diseases.

P.130 Kowalek, Denna; Howard University; [email protected] Understanding risk: Applying the CAUSE model in a content analysis of emergency management organizations' coverage of Hurricane Sandy Andersen and Spitzberg (2010) state that by many measures, the world is becoming a more dangerous place. The fact that there are more people in more places, from more cultures, often at greater levels of density, means that when disasters occur they have the potential to affect more people, and more organizations and institutions are responsible for managing such disasters. Examining emergency management organizations' communicative messages during Hurricane Sandy in the fall of 2012 allows risk and crisis communicators to determine how hurricane information was disseminated, thus informing further precautionary and preparedness actions. Understanding how people view disasters and precautionary and preparedness actions will help generate more effective risk communication campaigns. This research utilizes Rowan et al.'s (2009) CAUSE model as a framework to understand how the National Weather Service (NWS), the American Red Cross, and FEMA incorporated precautionary and preparedness actions into their coverage of Hurricane Sandy, and thereby to determine how emergency management organizations created understanding through their messages, which is crucial for comprehending precautionary and preparedness actions regarding disasters. A content analysis will be conducted of messages from the National Weather Service, the American Red Cross, and the Federal Emergency Management Agency (FEMA) to understand how messages addressed the U (understanding) in the CAUSE model while discussing precautionary and preparedness actions regarding Hurricane Sandy.

W2-E.4 Kponee, K*; Vorhees, D; Heiger-Bernays, W; Boston University School of Public Health; [email protected] Exposure to highly contaminated drinking water in a rural Nigerian village In 2011, the United Nations Environment Programme (UNEP) reported findings from its study of petroleum hydrocarbon contamination in the Ogoniland region of Nigeria, where frequent and massive oil spills have destroyed vast areas, including farmland, fisheries, and water supplies. UNEP recommended research and remedial action to protect human health, with some recommendations focused on the Ogale community, where residents rely on a drinking water supply contaminated with benzene and other petroleum hydrocarbons. Benzene alone has been detected at concentrations almost 2,000 times higher than the USEPA drinking water standard. UNEP staff observed people using the contaminated drinking water and noted its strong odor. Because such elevated exposures are likely to be associated with a range of acute and chronic effects, UNEP recommended emergency provision of clean drinking water, medical surveillance, and implementation of a prospective cohort study to investigate the effects of exposure to the contaminated drinking water in Ogale. There are no reports indicating that these recommendations have been implemented, except for the provision of clean water in late 2011. This study implements some of UNEP’s recommendations for Ogale. Based on an investigator-administered questionnaire in one hundred households, this study provides (1) a detailed assessment of exposure to the contaminated water supply; and (2) a preliminary comparison of self-reported symptoms and health outcomes in the community served by the contaminated drinking water supply and in a nearby comparable community served by a relatively clean drinking water supply. This study represents the logical next step in determining whether the extremely high levels of exposure in Ogale might be associated with acute and chronic adverse health effects. It might also improve understanding of how oil spills affect human health, a question that has eluded those investigating oil spills involving lower levels of exposure.

P.91 Kroner, O*; Wullenweber, A; Willis, AM; Toxicology Excellence for Risk Assessment (TERA); [email protected] Rethinking Risk Data: ITER 2.0 For 17 years, the International Toxicity Estimates for Risk (ITER) database (www.tera.org/iter) has been a centralized source of peer-reviewed chronic human health risk values. ITER is a free resource, providing access to international risk information with a side-by-side comparison of risk values. Since 1996, the database has grown to cover over 700 chemicals, with risk values derived by organizations from around the world. During this time, the world has seen major advancements in computational processing power, database mining and programming, and user interface design; in short, we are learning to extract more knowledge from the available data. With an eye to the future, a series of stakeholder surveys was conducted to evaluate how ITER and risk information are currently used by risk assessors, and how ITER might be restructured to best meet the risk assessment community’s needs. Survey results indicated several areas for improvement and spurred a call for ITER 2.0, which is currently underway. The redesigned system will be built to support additional problem formulations (NAS, 2009), offering flexibility to include additional data types such as acute values, occupational exposure levels, biomonitoring equivalents, and possibly ecological risk values. Possible user-interface enhancements would allow searching and sorting by chemical class, target endpoint, date of derivation, and uncertainty factors, and would allow cross-chemical comparisons and meta-analyses. The development of ITER 2.0 based on user feedback will help organizations share their risk work and help risk scientists navigate the available data to streamline public health protection efforts.

P.90 Kugihara, N; Graduate School of Human Sciences, Osaka University; [email protected] Effects of changing frequency of heterogeneous stimuli over time on estimation of frequency Two kinds of stimuli (photos and words) were shown repeatedly to participants for twenty minutes. The photos had neutral emotional valence (e.g. spoon, tissue paper, and dish) and the words were nonsense syllables (e.g. nuse, heyo, and rue). These stimuli were shown according to two schedules, HF (high frequency) and LF (low frequency). Under HF, the presentation frequency increased rapidly, reached its peak (60 times per minute) at two minutes, then decreased gradually. Under LF, it increased gradually, reached its peak (6 times per minute) at two minutes, then decreased. The HF and LF schedules ran at the same time. In one condition the HF stimuli were words and the LF stimuli were photos; in the other condition the assignment was reversed. When a disaster occurs, mass media tend to try to identify and pursue a target held responsible for the tragic event. The frequency of newspaper articles pursuing such targets varies and fluctuates through time, and some research indicates that transitions in scapegoats (targets) occur as time advances, from persons or groups to society or to our culture. My past laboratory studies showed that these transitions were mainly caused by memory bias: the frequency of rare targets picked up by articles is overestimated, and the degree of overestimation increases as time advances. However, those experiments used the same kind of stimuli, so participants may have failed to discriminate among them. To avoid this problem, the present study used heterogeneous stimuli (words and photos) as HF and LF. Results showed that the perceived frequency of LF stimuli was overestimated, and the subjective peak of LF appeared later than the actual peak. On the other hand, the frequency of HF stimuli was underestimated, and the estimated peak nearly corresponded to the presentation peak. These results indicate that even when the presented stimuli are heterogeneous, subjective biases in frequency and peak estimation persist.
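
The presentation schedules can be visualized with a small sketch; the abstract fixes only the peak rates (60/min for HF, 6/min for LF) and the peak time (two minutes) over a twenty-minute session, so the gamma-shaped rise and decay below is an assumption:

```python
import numpy as np

t = np.linspace(0.01, 20, 400)               # minutes into the session

def schedule(t, peak_rate, t_peak=2.0, shape=2.0):
    # Gamma-like curve normalized so its maximum equals peak_rate at t_peak.
    return peak_rate * (t / t_peak) ** shape * np.exp(shape * (1 - t / t_peak))

hf = schedule(t, peak_rate=60.0)             # high-frequency stimuli per minute
lf = schedule(t, peak_rate=6.0)              # low-frequency stimuli per minute
print(f"HF peaks at {hf.max():.0f}/min, t = {t[hf.argmax()]:.1f} min")
```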

SRA 2013 Annual Meeting Abstracts P.138 Kuiken, T*; Quadros, M; Woodrow Wilson Center, Virginia Tech; [email protected] Keeping track of nanotechnology in your everyday life: The Nanotechnology Consumer Products Inventory 2.0 The Woodrow Wilson International Center for Scholars and the Project on Emerging Nanotechnologies created the Nanotechnology Consumer Product Inventory (CPI), in 2005. This first-of-its-kind inventory tracks consumer products claiming to contain nanomaterials and has become one of the most frequently cited resources showcasing the widespread applications of nanotechnology. The CPI now contains 1,628 consumer products that have been introduced to the market since 2005, representing a 24 percent increase since the last update in 2010. In the years since its launch, the CPI has been criticized because of its lack of scientific data. To address some of these concerns, this update adds qualitative and quantitative descriptors, such as size, concentration, and potential exposure routes for the nanomaterial’s contained in consumer products. It also includes published scientific data related to those products, where available, and adds a metric to assess the reliability of the data on each entry. In addition, the newly re-launched inventory seeks to address scientific uncertainty with contributions from those involved with nanomaterial production, use, and analysis. This is the first major overhaul of the inventory, since it was launched in 2005. The re-launched inventory seeks to “crowd source” expertise in an effort to create an inventory with more accurate information on consumer products. Registered users are encouraged to submit relevant data pertaining to nanoparticle function, location, and properties; potential exposure pathways; toxicity; and lifecycle assessment, as well as add product data and information on new products. The Virginia Tech Center for Sustainable Nanotechnology worked with the Wilson Center to redevelop the inventory to improve the reliability, functionality, and scientific credibility of this database. Virginia Tech’s Institute for Critical Technology and Applied Science provided funding for the effort.

P.89 Kumagai, Y*; Hosono, H; Sekizaki, T; The University of Tokyo; [email protected] Investigating “consumer awareness” in evaluating food safety hazards related to beef in Japan In an emergency situation, effective risk communication may reduce unnecessary public concern and consequent behaviors. In 2011, Japan experienced two food safety crises involving beef-related health hazards: enterohemorrhagic Escherichia coli O111 and O157, and radioactive contamination. In this situation, Japanese consumers became increasingly aware of the risks of health hazards related to beef. The aim of this study is to investigate how “consumer awareness” influenced the evaluation of health hazards in an emergency situation. We conducted an internet-based questionnaire survey in October 2011, with 3,957 respondents. Respondents were asked (1) to rank the health hazards related to beef (“enterohemorrhagic E. coli (EHEC)”, “Salmonella spp.”, “Campylobacter spp.”, “bovine spongiform encephalopathy (BSE)”, “radioactive substances”, “antibiotic residues”, and “cloned animals”) in descending order of risk, and (2) to give the reasons why they ranked their chosen hazard highest. We analyzed the words in the free descriptions of these reasons, categorized them into 8 broad types of “consumer awareness” (“severity”, “probability of occurrence”, “anxiety and fear”, “adverse effects for infants”, “reliability of governmental management”, “avoidance by oneself”, “attention to media”, and “production area”), and explored the factors influencing the evaluation of health hazards. The following types of “consumer awareness” were confirmed in the risk ranking: (1) for EHEC and Salmonella spp., “severity”, “probability of occurrence”, and “anxiety and fear”; (2) for BSE, “anxiety and fear”, “severity”, and “avoidance by oneself”; (3) for radioactive substances, “reliability of governmental management”, “anxiety and fear”, “attention to media”, and “adverse effects for infants”. The results imply that “reliability of governmental management” is a very important factor for emerging hazards such as radioactive substances.

T2-D.1 Kundu, A; University of California, Davis; [email protected] Estimating risk of intestinal nematode infection from exposure to ambient waters using quantitative microbial risk assessment (QMRA) in Salta, Argentina The main objectives of this research were to establish risks related to (i) direct or indirect incidental ingestion of water from the Arenales river in three scenarios: primary contact, children, and secondary contact; (ii) consumption of uncooked vegetables irrigated with A. lumbricoides-contaminated water; and (iii) incidental ingestion of irrigation water by farmers working in the fields. The study area focused on the Arias-Arenales river, in the northwestern region of Salta province in Argentina. Eleven locations on the Arenales river were selected for a thirteen-month monitoring study. A quantitative microbial risk assessment (QMRA), based on observed nematode density, was performed to calculate the annual probability of helminth infection in various scenarios associated with direct or indirect exposure to Ascaris lumbricoides in surface waters. The highest estimated mean risk from direct or indirect ingestion of surface water per person per year (pppy) from a single exposure scenario was found in children (21.23%), followed by adults (12.49%) and secondary exposure (1.54%). We estimated the mean annual risks from consumption of raw (uncooked and unpeeled) vegetables irrigated with polluted water as a worst-case scenario. The mean risk was highest for lettuce (0.659%) and lowest for cucumber (0.011%) for the total population. A similar pattern was found for the consumer-only population, where the risks from consumption of vegetables were 0.27% for cucumbers and 1.54% for lettuce. The annual risk of infection in farmers due to accidental ingestion of irrigation water was 1.06%, varying from as low as 0.005% to as high as 41% for farmers working in the field without protective clothing such as gloves and rain boots. We also estimated risks under four scenarios for treating left-censored observations: (i) assuming one-half the detection limit (DL) for the censored data; (ii) assuming the DL for the censored data; (iii) bounding the censored observations; and (iv) replacing the censored observations with zeroes.
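
For readers unfamiliar with the QMRA mechanics, the sketch below shows the generic form of such a calculation: a single-exposure dose-response model aggregated into an annual infection probability. The exponential dose-response form and every parameter value are placeholders, not the study's fitted model:

```python
import math

def p_single(dose, r):
    """Generic exponential dose-response: infection probability per exposure."""
    return 1.0 - math.exp(-r * dose)

def p_annual(dose, r, n_events):
    """Annual risk over n independent exposure events."""
    return 1.0 - (1.0 - p_single(dose, r)) ** n_events

# Hypothetical example: 0.05 infective eggs ingested per event, 50 events/year.
print(f"annual infection risk: {p_annual(0.05, r=0.04, n_events=50):.2%}")
```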

P.88 Kuroda, Y*; Iwamitsu, Y; Takemura, K; Ban, N; Sakura, O; Sakata, N; Tsubono, K; Nakagawa, K; The University of Tokyo, Kitasato University, and Waseda University; [email protected] Effect of information trustworthiness on cancer risk perception after a nuclear disaster This study examines the effect of trustworthiness on risk perception among residents of Tokyo and Fukushima after the Fukushima Daiichi nuclear disaster. A cross-sectional study was conducted among 2,000 residents of Tokyo and Fukushima (1,000 per city), selected by stratified random sampling. Participants anonymously filled out a questionnaire on 8 cancer risk factors: Smoking, Drinking, Poor Diet and Lack of Exercise, Obesity, Stress, Food Additives, Low Vegetable Consumption, and Radiation Exposure. From these 8 factors, participants were asked to select and rank the top 3 (1 = highest risk). They also rated the trustworthiness of sources of information about radiation on a 5-point scale (1 = Not Reliable, 5 = Reliable). The study was approved by the Institutional Review Board at the University of Tokyo. Responses were obtained from 554 participants from Fukushima (mean age = 52.8 ± 16.3 y) and 465 participants from Tokyo (mean age = 51.6 ± 15.8 y). Participants from both cities rated Smoking (40.7%), Radiation Exposure (31.5%), and Stress (17.7%) as the factors with the highest risks for cancer. Radiation was rated significantly higher as a risk factor by participants from Fukushima than by participants from Tokyo (χ2 = 6.21, df = 1, p < .01). Sources of information about radiation were classified as “Reliable” (score of 5 or 4) or “Unreliable” (score of 3, 2, or 1). A chi-square test revealed that those receiving information from an unreliable source were more likely to report radiation as a higher risk factor (χ2 = 6.81, df = 1, p < .01). Trustworthiness is significantly related to perception of radiation risk; thus, building trustworthiness is the key issue to address for effective radiation risk communication.
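
The reported comparison is a standard chi-square test on a 2x2 table; a sketch of that computation follows, using invented cell counts since the abstract reports only the test statistics (χ2 = 6.81, df = 1):

```python
from scipy.stats import chi2_contingency

#              radiation ranked high | not ranked high   (counts are hypothetical)
table = [[120, 180],    # information source rated unreliable
         [ 90, 220]]    # information source rated reliable

chi2, p, dof, expected = chi2_contingency(table, correction=False)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p:.4f}")
```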

T4-G.3 Kuttschreuter, M.*; Hilverda, M.D.; Pieniak, Z.; Department of Psychology of Conflict, Risk and Safety, University of Twente; [email protected] Actively seeking versus taking notice of risk information: the case of food risks Food safety incidents in the Western world still take lives. Examples are the contamination of cantaloupes with Listeria in the US and of fenugreek seeds with the EHEC bacterium in Europe. Such incidents typically lead to anxiety among consumers, a need for additional information, and potentially information seeking behaviour. How these processes function precisely is as yet unknown. This presentation focuses on the applicability of the Risk Information Seeking and Processing (RISP) model to food risks, and on the channels that consumers use to find relevant information. Building on prior research, the RISP model was refined and a distinction was made between two modes of information seeking: taking notice of information that one encounters accidentally versus actively taking steps to search for and find additional information. Data were gathered through a cross-sectional survey carried out in 8 European countries as part of the FoodRisC project (n = 6,400). This survey focused on responses to the risks of fresh vegetables. Participants were questioned regarding their information seeking behaviour and a number of potential determinants such as risk perception and trust in the safety of food products. Their use of information channels such as traditional mass media, internet resources, and social media was also measured. Television and search engines such as Google were found to be the channels most often used to learn more about the risks involved. Taking notice of risk information and actively taking steps to learn more were indeed distinct forms of information seeking behaviour (r = .61). Regression analysis showed that there were differences as well as similarities in the determinants of the two modes of information seeking. Structural equation modelling in AMOS was applied to test a model describing the determinants of both modes of information seeking. Results will be presented and the consequences for risk communication will be discussed.

W4-F.1 Kuzma, J; North Carolina State University; [email protected] Global Risk Governance for Genome Editing Recently the field of biotechnology has been transformed by the introduction of promising new technologies that can be collectively called “targeted genetic modification techniques” (TagMo). Unlike traditional genetic engineering technologies, which introduced changes into genomes randomly, these new methods allow scientists to modify DNA sequences at precise locations, making the engineering process for plants, animals, and bacteria faster and allowing multiple site-directed modifications to organisms in a short period of time. These TagMo methods have collectively been referred to as “genome-scale engineering,” “genome editing,” or “genomic rewriting.” They represent a transition between old recombinant DNA (rDNA) genetic engineering and synthetic biology. In turn, they present significant governance challenges, including ambiguity about the potential risk analysis issues and about whether existing regulatory definitions and systems can accommodate the rapid changes in these technologies. In this paper, the landscape of the genome editing field is first explored using "tech mining" techniques based on bibliometric analysis of a key set of articles from the Web of Science. This analysis lays the groundwork for discussions of particular sub-fields of TagMo and associated risk assessment issues, including gene flow, gene stability and migration, and off-target impacts. A subset of the literature on the stability and specificity of genomic insertions, deletions, or edits using TagMo will be reviewed. Then, select models for risk analysis of the first generation of products of genetic modification will be evaluated for their appropriateness for emerging products of TagMo, based on findings from an expert-stakeholder interview process, a workshop, and the existing literature. Finally, the state of risk governance for genome editing in three key regions of development (the U.S., the EU, and Japan) will be examined using a comparative policy analysis approach.

T1-A.8 Lachlan, KA*; Spence, PR; Lin, X; University of Massachusetts Boston (Lachlan), University of Kentucky (Spence & Lin); [email protected] Getting Information to Underserved Communities using Twitter: Lessons from Hurricane Sandy With its capability for real-time updating and reports from the scenes of accidents and disasters, Twitter has emerged as a medium that may be especially useful for emergency managers communicating the risks associated with impending events. Yet little is known about the ways in which Twitter is being utilized during widespread disasters, or the ways in which government agencies are using Twitter to reach at-risk populations. Given past research suggesting that members of underserved communities may be especially at risk during crises and natural disasters, this is especially problematic. The current study involves an automated content analysis of over 20,000 tweets collected in the days leading up to the landfall of Hurricane Sandy, and a human-coded subsample of 1,785 tweets that were evaluated in greater depth. Tweets containing the hashtag #sandy were examined, since this was the official hashtag used by the National Weather Service in its communication efforts. Tweets were collected in four-hour intervals over the course of four days. The results indicate that in the days leading up to landfall, risk information became less prevalent and expressions of negative affect became more common. Tweets from relief agencies and emergency managers were all but absent in the sea of information. Tweets in languages other than English were largely absent during the developing stages of the crisis, and very few of these contained actionable information. The results are discussed in terms of best practices for emergency managers in conveying risk information to historically underserved communities.
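
The time-series part of such an analysis reduces to binning tweets into four-hour windows and scoring each for risk content. Below is a pandas sketch under stated assumptions: a hypothetical data frame and keyword lexicon, far cruder than the study's actual coding scheme:

```python
import pandas as pd

tweets = pd.DataFrame({
    "time": pd.to_datetime(["2012-10-26 03:15", "2012-10-26 07:40",
                            "2012-10-27 01:05", "2012-10-28 22:30"]),
    "text": ["evacuation routes posted #sandy", "so scared #sandy",
             "storm surge warning #sandy", "power out everywhere #sandy"],
})
risk_terms = ["evacuation", "warning", "surge", "shelter"]   # hypothetical lexicon
tweets["risk_info"] = tweets["text"].str.contains("|".join(risk_terms))

# Tweet volume and the share carrying risk information, per four-hour window.
by_window = tweets.set_index("time").resample("4h")["risk_info"].agg(["size", "mean"])
print(by_window)
```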

T1-D.3 Lambertini, E*; Buchanan, RL; Narrod, C; Pradhan, AK; University of Maryland, Joint Institute for Food Safety and Applied Nutrition; [email protected] Zoonotic diseases from companion animals: risk of salmonellosis associated with pet food Recent Salmonella outbreaks associated with dry pet food highlight the importance of these foods as vehicles for zoonotic pathogens. The need to characterize the risk profile of this class of products is currently not supported by data. Moreover, the relative impact of industry practices and household behavior in mitigating risk is unknown. This study aims to: 1) model the microbial ecology of Salmonella in the dry pet food production chain, 2) estimate pet and human exposure to Salmonella through pet food, and 3) assess the impact of mitigation strategies on human illness risk. Data on Salmonella contamination levels in pet food ingredients, production parameters, bacterial ecology on food and surfaces, and transfer by contact were obtained through a systematic literature review and from industry data. A probabilistic quantitative microbial risk assessment model was developed to estimate exposure of pets and their owners to Salmonella in pet food, and the associated illness risk. Model outcomes highlight that human illness risk due to handling pet food is minimal if contamination occurs before the heated extrusion step (10^-15 CFU/kg of food at the point of exposure, even with an initial 10^15 CFU/kg in the protein meal ingredient). Risk increases significantly if contamination occurs in the coating fat, with a mean probability of illness per exposure (P_ill) of approximately 1/1,000 at mean Salmonella levels of 10^7 CFU/kg fat. In this case, an additional post-coating 3-log reduction is needed to limit P_ill to 10^-6 (mean of 0.03 CFU/kg in finished product). Recontamination after extrusion and coating, e.g. via dust or condensate, can lead to even higher risk (P_ill of approximately 1/100 at mean Salmonella levels of 0.4 CFU/kg in intermediate product). In this scenario, hand washing after handling food would reduce P_ill by only 30%. The developed risk model provides a tool to estimate health impacts under a range of production and household handling scenarios. Model results provide a basis for improvements in production processes, risk communication to consumers, and regulatory action.
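
The scenario logic can be illustrated with a back-of-envelope sketch: carry a concentration through log-reduction steps and map the per-serving dose to an illness probability. The dose-response parameter r and the serving size are hypothetical placeholders, chosen only so that 0.03 CFU/kg lands near the abstract's 10^-6 target:

```python
import math

def after_kill_steps(cfu_per_kg, log_reductions):
    """Concentration remaining after a sequence of log10 reduction steps."""
    return cfu_per_kg * 10 ** (-sum(log_reductions))

def p_ill(cfu_per_kg, serving_kg=0.1, r=3e-4):
    dose = cfu_per_kg * serving_kg        # expected CFU ingested per exposure
    return 1.0 - math.exp(-r * dose)

conc = after_kill_steps(30.0, log_reductions=[3.0])   # 3-log post-coating step
print(f"{conc:.2f} CFU/kg -> P_ill per exposure = {p_ill(conc):.1e}")
```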

T3-E.3 Lander, DR; Heard, NE; Dellarco, M*; DuPont, Syngenta, NIH; [email protected] Product Stewardship for a New Product: The RISK21 Tiered Exposure Framework in Practice This presentation demonstrates, through a hypothetical case study, how the RISK21 tiered exposure framework can be applied for product stewardship or regulatory compliance. RISK21 uses a tiered framework for refining the exposure and hazard estimates independently, based on which refinement will yield the biggest reduction in uncertainty. The hypothetical problem formulation for this case study was: your company is developing a mosquito bed netting containing a pyrethroid to prevent transmission of West Nile Virus, for use at children's outdoor summer camps in the United States. The company has product stewardship requirements to verify that there is no unwarranted risk to workers making the product or to consumers using it. The marketing department is ready to sell the product but is waiting for the completed risk assessment. As the RISK21 framework is applied, the decisions made by the risk assessor are discussed. The Tier 0 approach used worker and consumer banding methods as well as the Environmental Background Database approach. The Tier 0 risk assessment matrix indicated that further refinement was needed in both the exposure and toxicity estimates. For a deterministic Tier 1 exposure assessment, more information on product use was needed; the World Health Organization’s 2004 document “A Generic Risk Assessment Model for Insecticide Treatment and Subsequent Use of Mosquito Nets” was used to characterize mosquito net use. With further exposure refinement within Tier 1, a risk assessment matrix was developed showing minimal risk from use. If confidence were not high enough, Tier 2 could be used to ascertain the probabilities of the risk and further inform the decision. Finally, since deltamethrin is an insecticide with other uses, there could be concern about cumulative exposures; Tier 3 therefore examined NHANES biomonitoring data to compare current background human pyrethroid exposure with the new use and verify that cumulative uses raise no concern.

T2-J.2 Landis, WG*; Johns, A; Western Washington University; [email protected] Analysis of the exposure-effects relationships from concentration-response curves for ecological risk assessment Recently there has been an intense discussion regarding the proper analysis tools for describing the exposure-response relationship in environmental risk assessment. We have developed an alternative analysis that relies on curve fitting but generates a distribution around a selected ECx value, where x is the boundary of the unacceptable effect size. We have explored a variety of exposure-response datasets from two laboratories and for a diverse group of chemicals. The drc package in the R programming environment was used to calculate the regression and the 95 percent confidence intervals. At concentrations at and surrounding what would be estimated as a 20 percent effect, a distribution was derived to capture the likelihood of different levels of effects. A triangular distribution was created via a Monte Carlo technique, with the mode at the point corresponding to the estimated exposure-response relationship and the upper and lower limits corresponding to the range of the confidence interval. Analysis of a variety of exposure-response curves illustrated the importance of the slope of the curve and the breadth of the confidence interval. For example, the concentration of parathion corresponding to an EC20 value for immobilization of Daphnia magna was bounded by effect levels from 8 to 42 percent. The description of the EC value thus becomes a distribution bounded by the upper and lower bounds on the effect axis, the upper and lower bounds along the concentration axis, and the likelihood of each exposure and effect combination. This kind of probabilistic information is not available from point estimates derived from conventional hypothesis testing or from reliance on a point ECx. We present several examples to demonstrate the critical value of having the exposure-response curve, its confidence bounds, and the exposure-effects distribution when estimating risk.
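
Below is a Python re-implementation sketch of the described procedure (the authors used the drc package in R; the data, starting values, and the placeholder confidence limits here are invented): fit a log-logistic curve, locate the EC20, then sample a triangular distribution spanning the confidence band on the effect, echoing the 8 to 42 percent parathion example:

```python
import numpy as np
from scipy.optimize import curve_fit

def loglogistic(c, ec50, b):
    """Two-parameter log-logistic curve for the fraction of organisms affected."""
    return 1.0 / (1.0 + (c / ec50) ** (-b))

conc = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])          # hypothetical data
resp = np.array([0.02, 0.08, 0.22, 0.55, 0.85, 0.97])
(ec50, b), _ = curve_fit(loglogistic, conc, resp, p0=[3.0, 1.5])

ec20 = ec50 * (0.2 / 0.8) ** (1.0 / b)      # concentration giving a 20% effect
lo, mode, hi = 0.08, 0.20, 0.42             # placeholder CI on effect at the EC20
effects = np.random.default_rng(1).triangular(lo, mode, hi, size=10_000)
print(f"EC20 = {ec20:.2f}; 95% band of simulated effects: "
      f"{np.percentile(effects, [2.5, 97.5]).round(3)}")
```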

T1-H.4 Lathrop, JF; Innovative Decisions, Inc.; [email protected] Applying Concepts of Quality of Position to Terrorism Risk Management Last year at this conference, I presented the concept of Quality of Position as a decision-guiding metric for terrorism risk management. The underlying logic is that Probabilistic Risk Assessment (PRA) is a powerful tool for organizing and processing attack-scenario information for terrorism risk assessment, but it is intrinsically limited to assessing risks that can be characterized by anticipated scenarios. The problem is that in terrorism risk management, an important part of the risk space consists of unanticipated scenarios, i.e., “Black Swans,” especially since some terrorists (“Reds”) are smart enough to deliberately design attacks “not on Blue’s list” (attacks that would appear, to Blue, as Black Swans) and so not addressed well, if at all, by PRA. Hence the concept of “Quality of Position” (QOP): a multiattribute utility (MAU) metric capturing how well Blue is positioned in the “game” against Red, with attributes including a PRA metric as well as broader concepts of resilience, leverage, deterrence, and disincentivization. The concept of QOP is lifted from guidance for chess players, who face a similar (but importantly different) problem of playing a game in which it is beyond their capability to predict how all possible futures may unfold. This paper takes those concepts and fleshes them out with actual MAU attribute scales, and with mathematical demonstrations of how a QOP metric can advise terrorism risk management in ways superior to PRA alone. Developing QOP concepts brings with it broader concepts of how analysis can guide terrorism risk management. The most important of these is to lift the role of analysis from guiding essentially tactical decisions, such as allocating detectors among target cities, to more strategic decisions, i.e., to playing the game at a more strategic level than simply defending a large number of (but not all) targets. The paper concludes with recommendations for implementation.
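
As a toy illustration of the MAU form such a metric might take (the attribute list comes from the abstract, but the weights, scores, and additive aggregation are invented for this sketch):

```python
def qop_score(scores, weights):
    """Weighted-additive multiattribute utility; weights must sum to one."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * scores[a] for a in weights)

weights = {"pra_risk": 0.35, "resilience": 0.25, "leverage": 0.15,
           "deterrence": 0.15, "disincentivization": 0.10}
blue_posture = {"pra_risk": 0.6, "resilience": 0.7, "leverage": 0.4,
                "deterrence": 0.5, "disincentivization": 0.3}   # 0-1 utilities
print(f"QOP = {qop_score(blue_posture, weights):.2f}")
```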

M3-A.1 Lathrop, JF; Innovative Decisions, Inc.; [email protected] Applying terrorism risk management concepts to enhance ISO 31000 risk management We take big steps when we move from widely accepted risk management, such as that specified in ISO 31000, to broader risk management concepts informed by other fields. This talk will briefly synthesize risk management concepts from terrorism risk management, and other new thinking, with the classic risk management concepts specified in ISO 31000. We have learned from our work in terrorism risk assessment and management that in many arenas we need to orient our thinking around managing risk in ways that specifically address unanticipated scenarios. Risk management can be improved by taking advantage of three key concepts we have developed in our terrorism risk management work: robustness, resilience, and quality of position. We then examine concepts provided by Nassim “Black Swan” Taleb in the latest book he has inflicted upon us, Antifragile, and develop ideas informed by combining his latest work with our terrorism work. The paper combines concepts from both of these areas into recommendations for an additional set of risk management principles, to be added to the considerable set of principles, already widely accepted, in ISO 31000.

W3-E.3 Le, HQ*; Lander, DR; Starks, SE; Kreckmann, KH; Symons, JM; DuPont Epidemiology Program (1,3,4,5), DuPont Haskell Global Centers (2); [email protected] Evaluation of population-based biomonitoring data for risk assessment: An environmental-wide association study approach Background: A biomonitoring equivalent (BE) is the concentration of a chemical or its metabolite in a biological matrix that is estimated from an existing exposure guidance value, such as the U.S. EPA reference dose (RfD). A BE allows evaluation of population-based biomonitoring data relative to estimates from chemical risk assessments. An environmental-wide association study (EWAS), a data-mining method adapted from the genome-wide association study, provides an epidemiologic screening tool that can comprehensively evaluate multiple health outcomes, including self-reported medical conditions and diseases as well as clinical parameters, for potential associations with a biomarker expressed as its BE value. Methods: We evaluated over 120 health outcomes for associations with two selected substances, blood benzene and urinary arsenic, for adults in the 2005-06 National Health and Nutrition Examination Survey. Logistic regression models included each substance classified as a dichotomous variable (above or below the BE) and were adjusted for age, sex, race, smoking, and socioeconomic status. Estimated odds ratios (OR) indicated the association for those with elevated BE values relative to those with concentrations at or below BE values. Results: 2,179 adults (>19 years of age) with biomarker concentrations for benzene and arsenic were included in the analyses. The proportions of adults with blood benzene and urinary arsenic that exceeded BE values were 13% and 68%, respectively. Benzene was positively associated with elevated albumin (OR=2.91, p<0.01) and 7 components of a complete blood count (p<0.05). Arsenic was positively associated with having a self-reported liver condition (OR=2.82, p=0.04) and elevated uric acid (OR=1.15, p=0.04). Conclusion: The EWAS approach can assess the distribution of a wide range of health outcomes among general populations in relation to environmental biomarkers corresponding to an RfD. Biomarker concentrations expressed as BE values allow for efficient analyses of population-based biomonitoring data for evaluating current risk characterization values.
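
Schematically, the screening step amounts to looping logistic regressions over outcomes with the above/below-BE indicator as the exposure of interest. The sketch below uses statsmodels on an invented NHANES-style data frame; variable names and data are hypothetical:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "above_be": rng.integers(0, 2, n),     # 1 = biomarker above its BE
    "age": rng.uniform(20, 80, n),
    "male": rng.integers(0, 2, n),
    "smoker": rng.integers(0, 2, n),
})
df["liver_condition"] = rng.binomial(1, 0.10 + 0.05 * df["above_be"])

results = {}
for outcome in ["liver_condition"]:        # in practice, loop over ~120 outcomes
    fit = smf.logit(f"{outcome} ~ above_be + age + male + smoker", df).fit(disp=0)
    results[outcome] = np.exp(fit.params["above_be"])   # OR for elevated BE
print(results)
```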

T3-I.3 Lee, E*; Dunwoody, S; University of Pennsylvania; [email protected] Information processing modes and risk judgment quality The current study concerns dual information processing modes and risk judgment quality. In the context of two risk issues (talking on a cell phone while driving, and driving under the influence of alcohol), a Web-based experiment was conducted to learn whether systematic or heuristic processing improves participants’ risk judgment. Under the systematic processing condition, participants were asked to think about the risk information presented, while participants assigned to the heuristic processing condition were distracted from thinking about the risk information by an anagram task. The results demonstrated that the effect of current thinking mode on risk judgment is moderated by previous thinking mode. Among both those who had thought a lot and those who had not thought a lot about the two risk issues, systematic processors judged the two risks more scientifically than did heuristic thinkers. Conversely, among those who had thought a lot about cell phone use while driving but not about driving after drinking, heuristic thinkers made more scientifically based judgments on the two risks than did systematic thinkers. Those who had thought a lot about driving after drinking but not about cell phone use while driving, however, did not differ statistically between systematic and heuristic processing. The findings indicate a need for careful analysis of information processing modes and risk judgment rather than conclusive interpretations.

T2-K.2 Lemay, JC*; Prueitt, RL; Hixon, ML; Goodman, JE; Gradient; [email protected] Distinguishing between Risks and Hazards: A Case Study of Bisphenol A Some recently proposed chemical regulations in the United States rely on hazard-based assessments to make determinations about the safety of chemicals in commerce. By definition, these analyses do not take into account exposure, which is a critical component of assessing human health and ecological risk. If exposure is low, even a substance with a high hazard may be associated with low risk. We conducted a hazard assessment of bisphenol A (BPA) to evaluate its potential human health and environmental hazards, based primarily on evaluations conducted by government agencies and systematic critical reviews in the peer-reviewed literature, as well as original research studies conducted more recently than these reviews. We concluded that BPA may pose a hazard for endocrine-mediated effects, so we evaluated estimated human exposure to BPA to assess risk-based levels of concern. Taking exposure into account, we found that BPA is of low concern for human health and environmental effects, including endocrine disruption and reproductive and developmental effects, but of moderate concern for explosivity and flammability. This evaluation demonstrates that when a potential hazard is identified, an exposure and risk assessment can be conducted to determine whether, and at what dose, a chemical is likely to cause adverse human or environmental effects.

T2-A.1 Levy, JI*; Fabian, MP; Peters, JL; Boston University School of Public Health; [email protected] Strengths and limitations of meta-analytic approaches for developing multi-stressor dose-response functions For many policy analyses, including but not limited to cumulative risk assessments, it is important to characterize the individual and joint health effects of multiple stressors. Often, this involves synthesizing epidemiological evidence using meta-analytic techniques. This approach has limitations if epidemiological studies do not include all of the stressors of interest, making it challenging to pool evidence across studies. In addition, studies may include multiple stressors in multivariate epidemiologic models, but these models may not provide outputs in the format required for specific risk assessment applications. Given these limitations, novel analytical methods are often needed to synthesize the published literature or to build upon available evidence. In this presentation, we discuss three recent case studies that highlight the strengths and limitations of meta-analytic approaches and other research synthesis techniques for human health risk assessment applications. In the first example, a conventional meta-analysis was used to inform the design of a new epidemiological investigation of the differential toxicity of fine particulate matter constituents. In the second example, discrete event simulation modeling was used to synthesize complex relationships among environmental pollutants, lung function, and asthma exacerbations. In the third example, the focus on an effects-based cumulative risk assessment of cardiovascular disease, a complex multi-factorial health outcome, led to a decision to apply structural equation modeling to publicly available datasets rather than relying on the published literature. These case studies emphasize the importance of conducting epidemiology with a risk assessment application in mind, the need for interdisciplinary collaboration, and the value of advanced analytical methods for synthesizing epidemiological and other evidence for risk assessment applications.

T2-E.4 Levy, JI*; Fabian, MP; Peters, JL; Korrick, SA; Boston University School of Public Health; Channing Division of Network Medicine, Brigham and Women's Hospital; [email protected] Geographic and demographic patterns of health risks associated with chemical and non-chemical stressor exposures in a low-income community Evaluating environmental health risks in communities requires models characterizing geographic and demographic patterns of exposure to multiple stressors, with dose-response models that account for the effects of joint exposures and can be applied to all community residents. In this study, we used simulation methods and structural equation models (SEMs) to construct integrated models of exposure and health risk for a low-income community (New Bedford, Massachusetts) facing multiple chemical and non-chemical stressors. We focused on ADHD-like behavior and blood pressure, two outcomes of interest in New Bedford. We first constructed simulated microdata with multivariate characteristics for all New Bedford residents, applying probabilistic reweighting using simulated annealing to data from the American Community Survey. We then developed exposure models using exposure and demographic data from the Behavioral Risk Factor Surveillance System, the National Health and Nutrition Examination Survey, and a long-standing birth cohort in New Bedford, and applied these models to the microdata to predict exposures to multiple stressors associated with health outcomes of interest. Finally, we developed SEMs of the simultaneous relationships among exposure predictors represented in the microdata, chemical and non-chemical stressor exposures, and health outcomes. SEMs for ADHD-like behavior and blood pressure indicated that multiple demographic variables (e.g., income, country of origin, education) predicted both exposures and outcomes, and that multiple chemical and non-chemical stressors were directly associated with outcomes. Linkage of our SEMs with our exposure models yielded spatially and demographically refined characterization of high-risk subpopulations. This combination of statistical and simulation methods provides a foundation for community-based cumulative risk assessment studies that leverage epidemiological findings.
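
A highly simplified sketch of the reweighting idea, under stated assumptions (one categorical margin, integer replication weights, an invented target; the study matches many multivariate ACS margins simultaneously):

```python
import numpy as np

rng = np.random.default_rng(42)
income_cat = rng.integers(0, 3, size=200)        # sample households, 3 strata
target = np.array([5000, 3000, 2000])            # hypothetical ACS margin
weights = np.full(200, 50)                       # initial replication weights

def margin_error(w):
    counts = np.array([w[income_cat == k].sum() for k in range(3)])
    return np.abs(counts - target).sum()

temp = 100.0
for _ in range(20_000):
    i, delta = rng.integers(200), rng.choice([-1, 1])
    if weights[i] + delta < 0:
        continue
    before = margin_error(weights)
    weights[i] += delta
    after = margin_error(weights)
    # Metropolis rule: always keep improvements, keep worsenings with
    # probability exp(-increase / temperature), then cool.
    if after > before and rng.random() > np.exp((before - after) / temp):
        weights[i] -= delta
    temp *= 0.9997

print("final margin error:", margin_error(weights))
```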

T1-J.3 Lew, N*; Nardinelli, C; Schick, A; Ashley, E; U.S. Food and Drug Administration; Office of Management and Budget; [email protected] A Retrospective Cost-Benefit Analysis of the Bar Code Rule With the objective of reducing the number of medication administration errors that occur in hospitals and other healthcare settings each year, FDA published a final regulation in 2004 requiring pharmaceutical manufacturers to place linear bar codes on certain human drug and biological products. At a minimum, the linear bar code must contain the drug’s National Drug Code (NDC) number, which represents the product’s identifying information. The intent was that bar codes would be part of a system in which healthcare professionals use bar code scanning equipment and software to verify electronically, against a patient’s medication regimen, that the correct medication is being given before it is administered. By requiring commercial drug product packages and unit-dose blister packages to carry bar code labels, the rule was expected to stimulate widespread adoption of bar code medication administration (BCMA) technology among hospitals and other facilities, thereby generating public health benefits in the form of averted medication errors. We use the 2004 prospective regulatory impact analysis as the basis for reassessing the costs and benefits of the bar code rule. Employing the most recent data available on actual adoption rates of BCMA technology since 2004 and on other key determinants of the costs and benefits, we examine the impacts of the bar code rule since its implementation and identify changes in technology that have occurred. In this retrospective study, we also use alternative models of health information technology diffusion to improve estimates of the counterfactual scenarios against which we compare the effects of the bar code rule.
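
The abstract does not name its diffusion models; one standard candidate for such a counterfactual adoption path is the Bass model, sketched below with made-up coefficients p (innovation) and q (imitation):

```python
import math

def bass_adoption(t, p=0.01, q=0.4):
    """Cumulative adoption fraction F(t) under the Bass diffusion model."""
    e = math.exp(-(p + q) * t)
    return (1 - e) / (1 + (q / p) * e)

for year in range(0, 11, 2):   # hypothetical BCMA adoption path over a decade
    print(year, f"{bass_adoption(year):.1%}")
```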

M2-B.2 Lewis, RJ*; Money, C; Boogaard, PJ; ExxonMobil Biomedical Sciences, Inc.; [email protected] Judging the Quality of Evidence for REACH An approach has been developed for systematically assessing and categorizing the quality of human data. The approach mirrors that applied for animal data quality considerations (Klimisch scores), so that human data can be addressed in a complementary manner to help facilitate transparent (and repeatable) weight-of-evidence comparisons. The proposed definition of quality accounts for both absolute quality and interpretability, thereby enabling consistent and transparent incorporation of data into human health risk assessments. Using examples, the presentation will illustrate how the scheme can be applied for weight-of-evidence comparisons, building from a systematic assessment of available data quality and adequacy. The utility of the scheme for describing data reliability, especially when contributing entries to the IUCLID database, will also be shown.

T3-D.1 Li, X*; Lovell, RA; Proescholdt, TA; Benz, SA; McChesney, DG; Division of Animal Feeds, Office of Surveillance and Compliance, Center for Veterinary Medicine, Food and Drug Administration; [email protected] Surveillance of Salmonella Prevalence in Pet Food, Pet Treats and Pet Nutritional Supplements by the United States Food and Drug Administration in 2002-2012 The Center for Veterinary Medicine, FDA, conducted a surveillance study to monitor the trend of Salmonella contamination in pet food, pet treats, and pet nutritional supplements. Samples were randomly collected from pet food, pet treats, and pet nutritional supplements in interstate commerce and at United States ports of entry for surveillance purposes, and were tested for the presence of Salmonella. Of the 2,026 samples collected in 2002-2012, 100 were positive for Salmonella (4.94%). This total included 524 pet food samples, of which 28 were positive for Salmonella (5.34%); 1,328 pet treat samples, of which 61 were positive (4.59%); and 174 pet nutritional supplement samples, of which 11 were positive (6.32%). A significant overall reduction in Salmonella prevalence (p ≤ 0.05) across all three types of samples was observed in the samples collected in 2012 (2.21%) compared to those collected in 2002 (12.40%). This study does not include the results of samples collected in follow-up to consumer complaints and violations or to Salmonella outbreak investigations associated with pet food, pet treats, and pet nutritional supplements, which averaged approximately 30 additional positive Salmonella cases for the past three years (2010-2012). The findings of the FDA Salmonella surveillance study provide Salmonella prevalence information that can be used to address Salmonella contamination problems, and to educate pet owners on handling pet food, pet treats, and pet nutritional supplements at home to prevent salmonellosis.
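
The 2002-versus-2012 comparison is a two-proportion test; since the abstract reports prevalences (12.40% vs 2.21%) but not per-year sample sizes, the counts below are hypothetical stand-ins used only to illustrate the computation:

```python
from statsmodels.stats.proportion import proportions_ztest

positives = [31, 4]       # hypothetical: ~12.4% of 250 vs ~2.2% of 181
samples = [250, 181]
z, p = proportions_ztest(positives, samples)
print(f"z = {z:.2f}, p = {p:.4f}")
```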

M2-C.2 Li, N; Brossard, D*; Scheufele, D. A.; University of Wisconsin-Madison; [email protected] What do government and non-profit stakeholders want to know about nuclear fuel cycle? A semantic network analysis approach Effective risk analysis is critical to improving policy decisions on complex technologies with a potential for catastrophic consequences. To accurately assess technological risks, it becomes increasingly important for policymakers to incorporate stakeholder beliefs into the policymaking process. Current debates on the merits and drawbacks of different nuclear fuel cycle scenarios in the U.S. and abroad present a challenge to integrate various concerns of stakeholders with distinct interests. In this study, we adopted a semantic network analysis approach to analyze the different sets of beliefs held by government and non-profit stakeholders about the risks associated with different aspects of the nuclear fuel cycle. In particular, we conducted in-depth cognitive interviews with six stakeholders working for the federal government and six working for non-profit organizations. Participants were asked to talk freely about the key issues related to the nuclear fuel cycle (e.g., economics, safety, resource recycling, and non-proliferation). An artificial neural network program, CATPAC II, was used to analyze the 42-page transcripts of the twelve one-hour-long interviews. Results showed that the major concerns of government stakeholders significantly differ from those of non-profit stakeholders. While government stakeholders had salient concerns about the security of transporting nuclear materials and the implications of the nuclear fuel cycle at the state level, non-profit stakeholders did not assign priority to these issues. Moreover, although both groups highlighted the importance of the back end of the nuclear fuel cycle, government stakeholders focused on the feasibility of recycling nuclear material, whereas non-profit stakeholders emphasized the challenges presented by reprocessing. These differences are illuminated through the use of a hierarchical cluster analysis of 47 unique concepts for each group of stakeholders and a visual representation of the associated mental concepts. Implications for risk analysis and policymaking related to the nuclear fuel cycle are discussed.
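As an illustration of the clustering step (though not of CATPAC II itself), the sketch below builds a small concept co-occurrence matrix and clusters it hierarchically; the concepts and counts are invented for illustration.

```python
# Hierarchical clustering of a concept co-occurrence matrix, in the spirit
# of the cluster analysis described. Concepts and counts are hypothetical.
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster
from scipy.spatial.distance import squareform

concepts = ["safety", "transport", "recycling", "reprocessing", "cost"]
C = np.array([[0, 8, 2, 1, 3],      # symmetric co-occurrence counts
              [8, 0, 1, 1, 2],      # (e.g., within interview text windows)
              [2, 1, 0, 6, 4],
              [1, 1, 6, 0, 5],
              [3, 2, 4, 5, 0]], dtype=float)

D = C.max() - C                     # convert similarity to distance
np.fill_diagonal(D, 0.0)            # a distance matrix needs a zero diagonal
Z = linkage(squareform(D), method="average")
labels = fcluster(Z, t=2, criterion="maxclust")
for name, lab in zip(concepts, labels):
    print(name, lab)
```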

P.26 Lin, MH*; Ho, WC; Caffrey, JL; FAN, KC; WU, TT; CHEN, PC; LIN, CC; WU, TN; SUNG, FC; LIN, RS; China Medical University; [email protected] Ambient air pollution and Attention Deficit Hyperactivity Disorder (ADHD) among children Attention Deficit Hyperactivity Disorder (ADHD) is the most commonly diagnosed neurobehavioral disorder of childhood. Animal studies suggest that traffic-related air pollution may have adverse neurologic effects, but studies of neurobehavioral effects in children are still needed. The purpose of this study is to assess the potential adverse health effects of air pollution exposure during maternal pregnancy in relation to childhood ADHD. Two databases are used in this study: 1) the Longitudinal Health Insurance Database 2005 (LHID2005) and 2) the Environmental Protection Agency (EPA) air monitoring database. Geographic Information Systems (GIS) will be used to estimate air pollution exposure. Furthermore, Cox proportional hazard regression models will be used to adjust for sex, geographic area, urbanization level, household Environmental Tobacco Smoking (ETS) exposure and lead concentrations in air. All statistical analyses will be performed with SAS version 9.2 (SAS Institute, Cary, NC, USA). A p-value of less than 0.05 is set to declare statistical significance. The results showed that air pollution could be related to childhood ADHD, especially traffic-related air pollution. Trimester-specific air-pollutant effects were found. Further research is suggested.
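A minimal sketch of the planned Cox proportional hazards step is shown below, on simulated data rather than the LHID2005 cohort, using the lifelines package; the exposure effect sizes and follow-up length are assumptions for illustration only.

```python
# Cox PH sketch on simulated data (not LHID2005); effect sizes are assumed.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(6)
n = 2_000
exposure = rng.lognormal(3.0, 0.4, n)           # pregnancy exposure proxy
male = rng.binomial(1, 0.5, n)                  # one adjustment covariate
hazard = 0.01 * np.exp(0.004 * exposure + 0.3 * male)   # assumed true model
time = rng.exponential(1.0 / hazard)            # time to ADHD diagnosis
event = (time < 10).astype(int)                 # observed within follow-up

df = pd.DataFrame({"T": np.minimum(time, 10.0), "E": event,
                   "exposure": exposure, "male": male})
cph = CoxPHFitter()
cph.fit(df, duration_col="T", event_col="E")
cph.print_summary()                             # hazard ratios per covariate
```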

P.29 Lin, YS*; Caffrey, JL; Ho, WC; Bayliss, D; Sonawane, B; U.S. Environmental Protection Agency; [email protected] The Role of Dietary Zinc in Cadmium Nephrotoxicity Background: Animal studies have shown that cadmium (Cd) and zinc (Zn) are associated with increased and decreased renal risk, respectively. Goal: To examine the joint effect of Cd exposure and Zn intake on renal risk. Methods: The data were obtained from 5,205 adults aged 50 years and older from the Third National Health and Nutrition Examination Survey (NHANES III, 1988-94). Results: Logged urinary Cd is positively associated with albuminuria (odds ratio=1.29; p=0.01). Zn intake showed an apparent protective effect, but the finding was not significant. However, when considered jointly with Cd, there was a significant inverse association between albuminuria and the Zn-to-urinary Cd ratio (p<0.01). Discussion: While Cd is an important risk factor for albuminuria in older Americans, the positive role of Zn in moderating Cd-associated renal risk remains to be investigated.
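Odds ratios like the one reported come from models in this spirit: a logistic regression of albuminuria on logged urinary Cd and the Zn-to-Cd ratio. The sketch below fits such a model to simulated data; the coefficients and distributions are assumptions, not NHANES III estimates.

```python
# Logistic regression sketch on simulated data (not NHANES III).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_205
log_cd = rng.normal(0.0, 1.0, n)              # logged urinary cadmium
zn_cd_ratio = rng.lognormal(2.0, 0.5, n)      # dietary Zn / urinary Cd
logit_p = -2.5 + 0.25 * log_cd - 0.01 * zn_cd_ratio   # assumed true effects
y = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))       # albuminuria indicator

X = sm.add_constant(np.column_stack([log_cd, zn_cd_ratio]))
fit = sm.Logit(y, X).fit(disp=0)
print("OR per unit log Cd:", np.exp(fit.params[1]))
print("p-values:", fit.pvalues)
```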

P.30 Lin, HC*; Guo, YL; Wu, KY; Institute of Occupational medicine and Industrial Hygiene, College of Public Health, National Taiwan University, Taipei, Taiwan; [email protected] Assessment of the Occupational Exposure Limit of p-Phenylenediamine for Hairdressers p-Phenylenediamine (PPD) is an ingredient of permanent oxidative hair colouring products. PPD has been reported to cause severe allergic contact dermatitis, and hairdressers' exposures to PPD have been of concern. Previous studies have been carried out to elicit allergic skin reactions as a function of the concentration applied on skin and the exposure duration. To protect hairdressers, PPD was banned by Sweden and France, but many others, including the European Cosmetics Toiletry and Perfumery Association and the U.S.A., regulate PPD content in hair colouring products. Currently, there is no occupational exposure limit for PPD. Therefore, the aim of this study was to propose an occupational exposure level of PPD for hairdressers by using health risk assessment. Previously, 16 volunteers were tested with patches containing 1%, 0.3%, 0.1% and 0.01% PPD in petrolatum for 15 min, 30 min and 120 min. These data were used for dose-response modeling with the Benchmark Dose Software; only the log-probit and log-logistic models fit this data set. A BMDL10 of 0.57 %·min was adopted, and an uncertainty factor of 20, accounting for inter-individual differences and the incompleteness of the data, was used to derive a proposed exposure level of 1.2 ug/cm2 of PPD in petrolatum on skin over an 8-hr work shift.
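The benchmark-dose step can be sketched as follows: fit a log-logistic model to quantal patch-test data and solve for the dose (in %·min) giving 10% extra risk. The dose groups and response counts below are hypothetical, not the study's data, and only the central BMD10 is computed, not the BMDL (its lower confidence limit).

```python
# Log-logistic benchmark-dose sketch on hypothetical quantal data.
import numpy as np
from scipy.optimize import brentq, minimize

dose = np.array([0.15, 1.5, 4.5, 15.0, 36.0, 120.0])   # hypothetical %min
n    = np.array([16, 16, 16, 16, 16, 16])               # subjects per group
resp = np.array([0, 1, 2, 5, 9, 14])                    # reacting subjects

def p_loglogistic(d, a, b):
    return 1.0 / (1.0 + np.exp(-(a + b * np.log(d))))

def negloglik(theta):
    a, b = theta
    p = np.clip(p_loglogistic(dose, a, b), 1e-9, 1 - 1e-9)
    return -np.sum(resp * np.log(p) + (n - resp) * np.log(1 - p))

a_hat, b_hat = minimize(negloglik, x0=[-3.0, 1.0]).x
# BMD10: dose at 10% extra risk (background is ~0 in this parameterization).
bmd10 = brentq(lambda d: p_loglogistic(d, a_hat, b_hat) - 0.10, 1e-6, 1e3)
print(f"BMD10 ~ {bmd10:.2f} %min")
```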

M3-F.4 Lin, L; Lund University; [email protected] Mapping the municipal risk information flow: A study based on the practice of risk and vulnerability analysis in Lund, Sweden Many risks that societies face today are “systemic”. These risks require a holistic approach, a coordinated effort from stakeholders that cut across functional sectors and geographical boundaries to foresee, prepare for and respond to them. Effective risk communication is always considered essential to successfully dealing with adverse events. This is even more critical when stakeholders from different levels and various disciplines are involved in the assessment and management of risks. Whether risk information is being effectively communicated among those stakeholders will affect risk assessment results and management decisions, thus influencing the ultimate performance of risk governance. In Sweden, all municipalities are obligated by law to conduct risk and vulnerability analysis (RVA), to ensure decision makers are aware of risk and vulnerability, and to help them make appropriate decisions. Conducting RVA and incorporating it into practice are largely based on exchanging and disseminating information about risks among authorities and organizations who work together in assessing and governing risks. This paper focuses on risk communication among actors throughout the municipal risk-governance chain. Using the Swedish municipality of Lund as a case, the author explores how issues concerning risk are communicated among municipal departments and their external service providers. A risk governance perspective was used to analyze the empirical data from 18 semi-structured interviews with all of Lund’s municipal departments, as well as with representatives of the coordinating external service providers: the water company, electricity company, fire service company and the police. Specific attention was directed to potential barriers to information sharing. This study is especially relevant for actors from all areas of society to reflect on their practices of risk communication, thus improving their performance of risk assessment and the quality of decision making.

P.31 LIN, YS*; GINSBERG, G; SONAWANE, B; US EPA; [email protected] The Relationship of Mercury Exposure, Omega-3 Intake, and Risk of Chronic Kidney Disease Background: It remains unclear whether environmental exposure to mercury (Hg) is associated with increased renal risk, and whether omega-3 fatty acid (FA) intake could affect Hg nephrotoxicity. Goal: To examine the relation of chronic kidney disease (CKD) to blood Hg and omega-3 FAs. Methods: The data were obtained from 1,046 adults aged 40 yrs or older from the National Health and Nutrition Examination Survey 2003-4. Results: The adjusted odds ratio for increased CKD risk in the highest tertile of blood Hg compared with the lowest was 2.96 (95% confidence interval = 1.05-8.39). Despite the suggested role of omega-3 FAs in modulating Hg nephrotoxicity, there was only a marginal association between omega-3 FAs and CKD. Discussion: Hg exposure is associated with CKD risk, and additional studies are needed to assess the role of omega-3 FAs and the sources, exposure routes, and forms of Hg responsible for Hg nephrotoxicity.

T1-G.3 Linkov, I*; Eisenberg, DA; Bates, ME; US ARMY ENGINEER RESEARCH AND DEVELOPMENT CENTER, VICKSBURG, MS, USA, CONTRACTOR TO THE US ARMY RESEARCH AND DEVELOPMENT CENTER, VICKSBURG, MS, USA; [email protected] Resilience policies and applications to climate change Escalating adverse events associated with climate change, such as natural disasters, are reorienting national and local policies to support managed resilience. Resilience is defined by Merriam-Webster as: an ability to recover from or adjust easily to misfortune or change. Advancing general knowledge and application of resilience in complex socio-techno-ecological systems (e.g., a city or military installation) offers a means to address the unforeseen and cascading damages that climate change can cause. While adaptive management strategies are a key component of improving system resilience, other system features, such as network structure, data acquisition methods, and the decision models employed, also determine how well the system will adjust to adverse events. Therefore, improving the resilience of these systems requires a combination of measures associated with both the system’s engineered performance under stress and the policy and management structures that support adverse event response and recovery. However, as resilience has moved from an ecological and engineering concept to application, there has been limited work addressing whether policies or programs are successful at generating climate change resilience. Moreover, linkages between the impacts of climate change and damages to critical infrastructure and cyber systems require that policies developed outside the usual climate change purview adopt a broader spectrum of strategies to support resilience. In this presentation, we discuss policies to address resilience at the national and local level and compare perceived successes and failures with respect to both engineering performance and adaptive management strategies.

T1-I.2 Liu, X*; Serrano, JA; Saat, MR; Christopher, CPL; University of Illinois at Urbana-Champaign; [email protected] Managing the Risk of Crude Oil Transportation by Rail U.S. crude oil production has recently increased significantly, in part attributable to advances in horizontal drilling and hydraulic fracturing technologies. This has resulted in considerable growth of crude oil shipments by rail. U.S. Class I railroads originated 9,500 carloads of crude oil in 2008; by 2011 this number had increased to 66,000 carloads. This growth may also increase the risk of hazardous materials release incidents. Risk management of rail crude oil transportation has become increasingly urgent in light of several recent severe release incidents in North America. To assist the railroad industry in understanding the risk of transporting crude oil and the effectiveness of various risk reduction options, the Association of American Railroads sponsored our research to enable estimation and reduction of crude oil release risk. This presentation will describe a novel risk analysis model involving the estimation of the probability and consequence of crude oil release from train accidents. The probability distribution of the number of tank cars released is estimated from a series of conditional probabilities accounting for a variety of principal infrastructure-related and rolling-stock-related factors (such as train length, speed, number and location of tank cars in the train, tank car specification and accident cause). The consequence of a release incident can be measured by the affected population, which is related to the number of tank cars releasing and the amount released. We model release consequence using simulation methods on a geographic information system (GIS) platform. Finally, we identify and evaluate the safety effectiveness of several potential risk reduction strategies. The methodology and analysis provide a quantitative approach that enables more effective management of the environmental risk of transporting hazardous materials.
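The layered conditional-probability structure described (accident, then derailment, then release, then consequence) can be illustrated with a small Monte Carlo sketch; all rates and distributions below are placeholders, not the study's calibrated values.

```python
# Monte Carlo sketch of a chained accident -> release -> consequence model.
# Every rate and distribution here is hypothetical.
import numpy as np

rng = np.random.default_rng(1)
trips = 100_000
p_accident = 1e-4                 # hypothetical accident prob. per shipment
cars_per_train = 100

consequences = []
for _ in range(trips):
    if rng.random() > p_accident:
        continue                                    # no accident this trip
    derailed = rng.binomial(cars_per_train, 0.10)   # cars derailed | accident
    released = rng.binomial(derailed, 0.25)         # cars releasing | derailed
    spill = released * rng.lognormal(3.0, 0.8)      # release-quantity proxy
    pop_density = rng.lognormal(4.0, 1.0)           # along-route variation
    consequences.append(spill * pop_density * 1e-4) # affected-population proxy

print("expected consequence per shipment:", sum(consequences) / trips)
```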

P.113 Liu, LH*; Chan, CC; Wu, KY; National Taiwan University; [email protected] Probabilistic Risk Assessment for 2-Amino-1-Methyl-6-Phenylimidazo[4,5-b]Pyridine (PhIP) through Daily Consumption of High-Temperature Processed Meats and Fishes in Taiwan 2-Amino-1-methyl-6-phenylimidazo[4,5-b]pyridine (PhIP) has been reported to be present in many pan-fried, oven-broiled, and grilled meats and fishes, which are important sources of nutrients. It caused colon, prostate and mammary cancer in animal bioassays and is classified as a possible human carcinogen. PhIP forms DNA adducts and is a mutagen, and its genotoxic mode of action is relevant to humans. Daily dietary intakes of PhIP through consumption of high-temperature processed meats and fishes have therefore been of great concern. This study aimed to perform a probabilistic cancer risk assessment of PhIP due to daily consumption of these meats and fishes for the Taiwanese population. Dose-response modeling was performed with the Benchmark Dose Software; a BMDL10 of 0.248 mg/kg-day was adopted for species extrapolation, yielding a cancer slope factor of 0.4029 (kg-day/mg). PhIP concentrations in meats and fishes cooked by different methods were taken from the literature. Questionnaires were used to collect the frequency of consuming these meats and fishes from 123 study subjects, to adjust the intake rates of these meats and fishes from national survey data. Probabilistic assessment of cancer risk and margin of exposure (MOE) was conducted using Monte Carlo simulation with the Crystal Ball software. The results reveal a mean cancer risk of 4.64 x 10^-6 with an upper 95% confidence bound of 1.58 x 10^-5; the mean MOE is 270,000 with a lower 95% confidence bound of 10,000.
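A numpy re-creation of the Crystal Ball-style calculation is sketched below, using the slope factor and BMDL10 quoted in the abstract but hypothetical distributions for concentration, intake, and body weight.

```python
# Monte Carlo sketch: risk = dose x CSF, MOE = BMDL10 / dose.
# The CSF and BMDL10 are from the abstract; input distributions are assumed.
import numpy as np

rng = np.random.default_rng(2)
n = 100_000
conc = rng.lognormal(np.log(2.0), 0.7, n)      # ng PhIP per g food (assumed)
intake = rng.lognormal(np.log(80.0), 0.5, n)   # g meat/fish per day (assumed)
bw = rng.normal(60.0, 10.0, n).clip(40, 100)   # body weight, kg (assumed)

dose = conc * intake / bw * 1e-6               # mg/kg-day
csf = 0.4029                                   # kg-day/mg, from the abstract
bmdl10 = 0.248                                 # mg/kg-day, from the abstract

risk = dose * csf
moe = bmdl10 / dose
print("mean risk:", risk.mean())
print("95th percentile risk:", np.percentile(risk, 95))
print("mean MOE:", moe.mean())
```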

M3-I.2 Liu, X; Saat, MR*; Barkan, CPL; University of Illinois at Urbana-Champaign; [email protected] Alternative Strategies to Positive Train Control (PTC) for Reducing Hazardous Materials Transportation Risk The Rail Safety Improvement Act of 2008 requires railroads to implement Positive Train Control (PTC) on most lines transporting Toxic Inhalation Hazard (TIH) materials before 31 December 2015. The motivation for this requirement is the belief that doing so will reduce the likelihood of accidents in which TIH materials would be released. However, the particular types of accidents that PTC can prevent comprise only a small percentage of the total accidents with the potential to result in a TIH release; the Association of American Railroads (AAR) estimates that these PTC-preventable accidents (PPA) are less than 4% of total mainline accidents. Meanwhile, implementation of PTC is extremely costly. Cost-benefit analysis of the PTC rule by the Federal Railroad Administration (FRA) indicates that the railroads will incur approximately $20 in PTC costs for each $1 in PTC safety benefit. Consequently, the industry believes that there are other, more cost-effective means of reducing the risk of TIH accidents. This study identified a set of the most promising potential alternative strategies to PTC and quantitatively assessed their potential to reduce TIH transportation risk.

P.54 Liu, SY*; Chang, CS; Chung, YC; Chen, CC; Wu, KY; National Taiwan University; [email protected] Probabilistic Cancer Risk Assessment for Aflatoxin B1 with Bayesian Statistics Markov Chain Monte Carlo Simulation Aflatoxins are found in nuts, peanuts, corn, spices, traditional Chinese medicine, maize and rice. In particular, aflatoxin B1 has been shown to induce liver cancer (hepatocellular carcinoma, HCC) in many species of animals and is classified as a human carcinogen by IARC. Exposure to aflatoxin B1 through food consumption is considered a risk factor for HCC and could act synergistically with hepatitis B virus infection. However, the available residue data in foods are very limited, and the intake rates of foods containing aflatoxin B1 are very uncertain. Therefore, the aim of this study was to perform a probabilistic cancer risk assessment for aflatoxin B1 with Bayesian statistics coupled with Markov chain Monte Carlo simulation (BS-MCMC) to reduce uncertainty in the distributions of aflatoxin B1 residues and intakes. The aflatoxin B1 residue data were cited from official reports of routine monitoring data published by the Taiwan Food and Drug Administration. Questionnaires were used to collect the frequency of consuming foods containing aflatoxin B1 from 124 study subjects. A cancer slope factor, 0.128 (g/kg/day)^-1, was assessed with the Benchmark Dose Software and linear extrapolation. These data were used as prior information for BS-MCMC modeling. Our results reveal that the cancer risk was 2.64 ± 2.07 x 10^-7 for the HBsAg(-) population and 6.75 ± 5.29 x 10^-6 for the HBsAg(+) population. These results suggest that reduction of aflatoxin B1 exposure is necessary for the HBsAg(+) population.
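The Bayesian updating step (BS-MCMC) can be illustrated with a bare-bones Metropolis sampler for the mean of log-scale residue concentrations; the prior, the "observed" data, and the fixed sigma below are all hypothetical, not the monitoring data used in the study.

```python
# Minimal Metropolis sketch: posterior for the mean of log residue
# concentrations, normal prior, known sigma. All numbers are hypothetical.
import numpy as np

rng = np.random.default_rng(3)
log_resid = np.log(rng.lognormal(np.log(0.5), 0.6, 50))  # "observed" log ppb
prior_mu, prior_sd = np.log(0.8), 0.5                    # prior on the mean
sigma = 0.6                                              # fixed, for simplicity

def log_post(mu):
    lik = -0.5 * np.sum((log_resid - mu) ** 2) / sigma**2
    pri = -0.5 * (mu - prior_mu) ** 2 / prior_sd**2
    return lik + pri

mu, chain = prior_mu, []
for _ in range(20_000):
    prop = mu + rng.normal(0, 0.05)                      # random-walk proposal
    if np.log(rng.random()) < log_post(prop) - log_post(mu):
        mu = prop                                        # accept
    chain.append(mu)

post = np.exp(np.array(chain[5_000:]))                   # discard burn-in
print("posterior mean residue (ppb scale):", post.mean())
```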

W3-E.1 Liu, CL*; Luke, NL; CDM Smith; [email protected] Application of Lead and Arsenic Bioavailability in Human Health Risk Assessment for a Sediment Site Lead smelting slag was used to construct seawalls and jetties in a waterfront park approximately 40 years ago, and battery casing wastes were disposed of at the site. As a result, soil, groundwater, surface water and sediment have been contaminated with lead and arsenic. To evaluate the physical and chemical characteristics of the lead and arsenic detected at the site and to determine the degree to which lead and arsenic are available for uptake into the human body, a site-specific in-vitro bioavailability and speciation study was performed. The in-vitro bioavailability test involved a laboratory procedure designed to mimic some of the conditions of the human digestive tract. Quantitative electron microprobe analysis (EMPA) was used to determine chemical speciation, particle size distribution, association of the metal-bearing forms, frequency of occurrence, and relative mass. Results of the in-vitro study indicated that the bioavailability of lead and arsenic varied widely among different samples, both within and across each area. The relative bioavailability for lead ranged from 12% to 84% and averaged about 56%. Similar to lead, in-vitro bioaccessibility for arsenic was also highly variable, ranging from 0.3% to 63%, with an average of 14%. The EMPA study revealed that lead and arsenic were mainly associated with iron oxyhydroxide, which is expected to be moderately bioavailable. Site-specific bioavailability factors were developed that resulted in more realistic human health risk characterization results and remediation goals.

M3-I.1 Locke, MS; Pipeline and Hazardous Materials Safety Administration; [email protected] A case study in estimating mitigated risk for safety regulators: Hazardous materials transportation Government oversight of industrial hazards is focused on the prevention and diminution of negative consequences (chiefly harm to humans, as well as environmental and property damage or loss), at times evaluating systemic performance only up to the point of judging whether the total magnitude of said consequences was greater or less than the previous year. In fields such as commercial transportation of dangerous goods, it can be difficult to assess how much of this annual variation would be preventable with further investment versus what is simply attributable to the inherent, residual risk of a multimodal, human-designed and human-run system. Presenting this data relative to trends in economic activity provides more context but is complicated by the often-fragmentary nature of exposure data, with volumes of material shipped being broadly surveyed only once every five years. Altogether, however, this information only gives part of the picture: failures of risk management. The U.S. Pipeline and Hazardous Materials Safety Administration, in the continual process of self-assessment, is developing methods for capturing estimates of successfully mitigated risk—that is, the potential consequences averted through purposeful intervention. This initiative intends to explore questions including: How or from when do we determine baseline risk? Can we decompose or partition the effects of government interventions such as regulations, outreach, and enforcement? What is the return on investment of a dollar spent on safety oversight? And how do we avoid double-counting benefits, i.e., saving the same life over again with every new regulation?

T2-K.1 Lofstedt, R; Kings College London; [email protected] The substitution principle in chemical regulation: A constructive critique The substitution principle is one of the building blocks of modern-day chemical regulation, as highlighted in the Registration, Evaluation, Authorisation and Restriction of Chemicals (REACH) regulation. But what is the substitution principle, what is the history of its use, and how do relevant authorities and regulatory actors view it? This article addresses these questions and is based on a grey literature review and 90 in-depth face-to-face formal and informal interviews with leading policy makers in Europe, with a specific focus on Scandinavia. The paper shows that the substitution principle is a surprisingly under-researched topic and that there is no clear consensus on how best to apply the principle. The penultimate section puts forward a series of recommendations with regard to the use of the substitution principle which European policy makers and regulators may wish to adopt.

W4-C.4 Long, KL; Nielsen, JM; Ramacciotti, FC*; Sandvig, RM; Song, S; ENVIRON International Corporation; [email protected] A Risk Assessment Approach that Facilitates Site Redevelopment and Remediation When Future Site Uses are Uncertain Traditional approaches to site risk assessment are often not capable of supporting cleanup decisions for sites where future uses are still being decided, because these approaches are based on defined exposure scenarios and exposure units that correspond to the future uses. At the same time, site owners or developers are often reluctant to make decisions about future uses without knowing the potential health risks and remediation costs associated with different future use options. This presentation describes a risk assessment approach that we have developed and applied at a number of sites to overcome these barriers to site redevelopment and remediation decisions. Our approach facilitates site decisions by conceptualizing every sample location as a potential exposure unit and evaluating the health risks and remediation costs for all future site uses under consideration at every potential exposure unit. The large volume of risk and remediation estimates generated by this approach is mapped in GIS to help stakeholders visualize the spatial distribution of health risks and the areas that would need remediation under all possible future use scenarios under consideration. By making the key consequences of future use options accessible to stakeholders, this approach not only overcomes barriers to decisions but also helps stakeholders optimize the trade-off between site redevelopment benefits and remediation costs.

W4-A.2 Loomis, D*; Straif, K; International Agency for Research on Cancer; [email protected] Evaluation of Causality in the IARC Monographs The IARC Monographs identify causes of cancer in the human environment. Since 1971 over 900 agents have been evaluated, with more than 100 classified as carcinogenic to humans and over 300 as probably or possibly carcinogenic. Initially the Monographs focused on environmental and occupational exposures to chemicals, but they have expanded to other types of agents, such as personal habits, drugs and infections. The process of causal inference used for IARC’s evaluations is laid out in the Preamble to the Monographs. Working groups of invited experts evaluate human, animal and mechanistic evidence and reach a consensus evaluation of carcinogenicity. Human and animal cancer data are first assessed separately according to criteria established in the Preamble. The strength of the evidence for causation is categorised as Sufficient, Limited, Inadequate, or Suggesting lack of carcinogenicity. To arrive at an overall evaluation, the Working Group considers the totality of the evidence and assigns agents to one of five causal groups: 1, Carcinogenic to humans; 2A, Probably carcinogenic to humans; 2B, Possibly carcinogenic to humans; 3, Not classifiable as to carcinogenicity to humans; or 4, Probably not carcinogenic to humans. The evaluation criteria reflect the Precautionary Principle in that sufficient evidence of carcinogenicity in animals can be used to classify an agent as possibly carcinogenic to humans when human data are inadequate. Mechanistic evidence can also be invoked to upgrade an evaluation in the absence of adequate human cancer data. Alternatively, strong evidence that a relevant mechanism is absent in humans can downgrade an evaluation based on animal cancer data. Comprehensive assessment of the evidence according to established causal criteria has made the Monographs an authoritative source for agencies and researchers worldwide. While the process described here is one of hazard identification, the Preamble also has scope for characterising risk quantitatively.
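As a much-simplified illustration (the Preamble's actual criteria are richer and are applied by expert consensus, not by mechanical rule), the overall-evaluation logic sketched in the abstract might be rendered like this:

```python
# Illustrative, heavily simplified rendering of the overall-evaluation logic
# described above; NOT the Preamble's full decision rules.
def overall_group(human, animal, mechanism=None):
    """human/animal evidence: 'sufficient' | 'limited' | 'inadequate' | 'lack'."""
    if human == "sufficient":
        return "1: Carcinogenic to humans"
    if human == "limited" and animal == "sufficient":
        return "2A: Probably carcinogenic to humans"
    if animal == "sufficient":
        # Precautionary default when human data are inadequate.
        if mechanism == "not operative in humans":
            return "3: Not classifiable"           # mechanistic downgrade
        if mechanism == "strong, relevant to humans":
            return "2A: Probably carcinogenic to humans"  # mechanistic upgrade
        return "2B: Possibly carcinogenic to humans"
    return "3: Not classifiable"

print(overall_group("inadequate", "sufficient"))   # -> 2B, per the abstract
```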

P.96 Lu, H; Marquette University; [email protected] Burgers or tofu? Eating between two worlds: Risk information seeking and processing during dietary acculturation As the fastest growing ethnic student group in the United States, Chinese international students are becoming a prominent part of society. Studies have shown that with acculturation, Chinese international students have consumed a more Americanized diet (increased fat and cholesterol and less fiber than their traditional diet) and that this acculturated diet has been associated with a higher prevalence of chronic diseases. The understanding of what factors, cognitive, affective, and/or socio-demographic, might motivate Chinese international students to seek and process risk information about potential health risks from adopting a more Americanized diet is the primary focus of the current study. Guided by the Risk Information Seeking and Processing (RISP) model, an online 2 (severity: high vs. low) x 2 (coping strategies: present vs. absent) between-subjects experiment was conducted among 635 participants. Data were analyzed primarily using structural equation modeling. Some highlights of this study include the situational roles that informational subjective norms (ISN) and affective responses play in risk information seeking and processing. More specifically, ISN had direct relationships with systematic processing and heuristic processing, while working through information insufficiency to affect information seeking and information avoidance. Negative affect was positively related to information seeking but also worked through information insufficiency to influence information seeking. Positive affect was positively related to information avoidance and heuristic processing. Future implications include boosting perceived social pressure and using appropriate fear appeals in healthy eating education programs, and creating more awareness about potential risks of tasty, convenient, inexpensive but unhealthy food. The author would like to be considered for the Student Travel Award.

W2-B.2 Luben, TJ*; Milhan, G; Autrup, H; Baxter, L; Blair, A; Kromhout, H; Ritter, L; Stayner, L; Symanski, E; Wright, JM; U.S. EPA; [email protected] Evaluating uncertainty due to exposure assessment in epidemiologic studies used in risk assessment The results of epidemiologic studies make important contributions to risk assessment because they provide direct information on disease risks in humans and at relevant exposure levels. The ability to characterize and quantify the uncertainty associated with assessment of exposure in environmental and occupational epidemiologic studies is essential to fully utilize epidemiologic data for risk assessment and decision-making processes. A multidisciplinary team of experts was assembled to identify sources of uncertainty and to propose approaches for reducing uncertainty related to exposure assessment in epidemiologic studies. Recommendations that would help to move the state of the science forward and further enhance the use of results from epidemiologic studies in quantitative risk assessments included: (1) Use of an interdisciplinary team to develop an optimized exposure assessment approach with quantitative exposure-response information for every epidemiologic study in order to inform risk assessments; (2) Quantify exposure measurement error and examine the actual impact of this uncertainty on health effect estimates from epidemiologic studies; (3) Encourage the development and application of techniques to quantify and adjust for exposure measurement error in epidemiologic studies; and (4) Develop improved methods for assessing exposure to individual substances from multiple routes of exposures and for dealing with multiple environmental pollutants (including mixtures) which is common to all observational epidemiologic studies. Finally, the group recommends that a new generation of cross-disciplinary scientists be trained to create optimized exposure assessment methods for epidemiologic studies that include quantifying and reducing the uncertainty in the quantitative risk assessment process attributable to exposure assessment limitations. Disclaimer: The views expressed are those of the authors and do not necessarily reflect the views or policies of the US EPA.
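Recommendation (2) can be illustrated with a few lines of simulation: classical measurement error attenuates a regression slope by the reliability ratio, and dividing by that ratio (a regression-calibration style correction) recovers it. The true slope and error variance below are arbitrary; in practice the reliability ratio would come from a validation substudy rather than from the unobservable truth, as it does in this sketch.

```python
# Simulation sketch: attenuation from classical exposure measurement error
# and a regression-calibration style correction. All values illustrative.
import numpy as np

rng = np.random.default_rng(4)
n = 10_000
true_x = rng.normal(10, 2, n)               # true exposure
obs_x = true_x + rng.normal(0, 2, n)        # classical (independent) error
y = 0.30 * true_x + rng.normal(0, 1, n)     # true health-effect slope = 0.30

naive = np.polyfit(obs_x, y, 1)[0]          # slope from the error-prone exposure
lam = np.var(true_x) / np.var(obs_x)        # reliability ratio (here: known truth;
                                            # in practice, from validation data)
print("naive slope:", naive)                # attenuated toward zero (~0.15)
print("calibrated slope:", naive / lam)     # ~0.30 recovered
```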

T1-H.1 Lundberg, RP*; Willis, HH; Pardee RAND Graduate School; [email protected] Prioritizing homeland security using a deliberative method for ranking risks Managing homeland security risks involves balancing concerns about numerous types of accidents, disasters, and terrorist attacks. These risks can vary greatly in kind and consequence; how people perceive them influences the choices they make about activities to pursue, opportunities to take, and situations to avoid. Reliably capturing these choices is a challenging example of comparative risk assessment. The National Academy of Sciences review of Department of Homeland Security risk analysis identifies developing methods of comparative risk assessment as an analytic priority for homeland security planning and analysis. The Deliberative Method for Ranking Risks incorporates recommendations from the risk perception literature into both the description of the risks and the process of eliciting preferences from individuals and groups. It has been empirically validated with citizens, risk managers, and policy makers in the context of managing risks to health, safety, and the environment. However, these methods have not yet been used in addressing the challenge of managing natural disaster and terrorism hazards. Steps in this method include first conceptualizing the risk, including both how to differentiate risks and the characteristics needed to describe them in a comprehensive manner, and then developing concise summaries of existing knowledge about the hazards. Using these materials, relative concerns about the hazards are elicited in a deliberative process. These relative concerns about hazards provide a starting point for prioritizing solutions for reducing risks to homeland security. We identify individuals' relative concerns about homeland security hazards and the characteristics which influence those concerns. The consistency and agreement of the rankings, as well as individual satisfaction with the process and results, suggest that the deliberative method for ranking risks can be appropriately applied in the homeland security domain.

T3-B.2 MacDonell, M*; Garrahan, K; Hertzberg, R; Argonne National Laboratory (author 1), U.S. EPA (author 2), Emory University (author 3); [email protected] Stakeholder Involvement and Risk Communication in CRA Planning, Scoping and Problem Formulation Cumulative risk analysis has long been rooted in stakeholder involvement and risk communication. The cumulative risk concept is central to the national environmental policy established more than forty years ago, and assessment approaches and tools have continued to evolve since that time. Recent advances in information and communication technologies have greatly increased community awareness of risk issues and also greatly simplified community involvement in the process of assessing those issues. One key aspect of problem formulation and goal setting for a cumulative risk analysis is deciding what metrics are well suited to the given assessment, or which units of risk to use. Considering the wide array of stakeholders, stressors, exposures, temporal and spatial scales, sensitivities, and priorities commonly at play, collective inputs to the core metrics can help maintain a focus on practical decision needs. Among the measures being used are disability-adjusted life years (DALYs), e.g., the number of years of healthy life lost because of the given problem (such as chemical and microbial pollution). Questions include: how many DALYs would be unacceptable for a large city? For an individual? Jointly developing these risk measures can anchor the assessment process, recognizing that planning, scoping, and framing the problem are iterative activities and the original metrics will likely change as the assessments progress. Cloud computing, mobile apps, and gamifying approaches are among the technologies and techniques being tapped to enhance stakeholder involvement in articulating risk problems and scoping assessments, as well as collecting and sharing data and communicating risk estimates to guide effective responses. This paper will highlight examples of stakeholder involvement and risk communication for different types of cumulative risk assessments.
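For concreteness, here is a minimal worked DALY calculation (DALY = YLL + YLD) with hypothetical case counts and disability weight, ignoring age-weighting and discounting:

```python
# Worked DALY example; every input below is a hypothetical illustration.
cases_fatal = 40            # deaths attributable to the stressor
life_years_lost = 25.0      # average years of life lost per death
cases_ill = 3_000           # non-fatal cases
duration = 2.0              # years lived with disability per case
disability_weight = 0.15    # 0 = perfect health, 1 = equivalent to death

yll = cases_fatal * life_years_lost          # years of life lost
yld = cases_ill * duration * disability_weight   # years lived with disability
print("DALYs =", yll + yld)                  # 1000 + 900 = 1900
```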

W3-I.1 MacGillivray, BH; Cardiff University; [email protected] Heuristics in policy relevant science: an analytical framework for characterising the strengths and limits of formal risk and decision analysis Scholars long argued that one of the defining features of science was that it is systematic and methodical. That is, science was conceived as being governed by generally agreed upon rules for designing experiments, analysing data, and discriminating between conflicting hypotheses. Yet today this is a rather unfashionable view amongst historians and philosophers, with rules now seen as a somewhat peripheral feature of scientific practice. However, focussing on formal risk and decision analysis, this paper argues that the obituary of rules has been prematurely written. It develops an analytical framework for studying a particular class of rules: heuristics. Risk research tends to view heuristics as lay reasoning devices that are generally sub-optimal. In contrast, we follow Pólya and Pearl in defining heuristics as rules of search, classification, inference, and choice that fall short of formal demonstration, yet are indispensable for domains that do not admit of formal logic or proofs. Analysing various domains of environmental and public health governance, we identify structurally similar rules of thumb used to screen potential risk objects, to discriminate between signal and noise, to weight evidence, to select models, to extrapolate beyond datasets, and to make decisions based on agreed upon facts. We analyse the origins of these heuristics, their justifications, the functions that they play, their empirical adequacy, and the biases that they introduce. Our basic claim is that heuristics play a central role in how we collect, interpret, and act upon scientific knowledge in governing risk; that this role is not necessarily problematic; and that, above all, it needs to be taken seriously if we are concerned with robust, evidence-based public policy.

T3-I.4 MacKenzie, CA; Naval Postgraduate School; [email protected] Deploying Simulation to Compare Among Different Risk Reduction Strategies for Supply Chains A firm can choose among several strategies to reduce the risk of disruptions in its supply chain, including holding inventory, buying from multiple suppliers, and helping suppliers recover. Because these risk management strategies may be non-continuous and depend on each other, simulation can be used to select a near-optimal combination of strategies. We develop a conceptual framework for a firm that wishes to select resilience strategies that maximize its profit. Because the risk management strategies may be dependent upon each other, finding the optimal mix of strategies becomes a difficult combinatorial problem. For example, holding inventory may maximize profit among the strategies examined in isolation, but a combination of helping the supplier recover and buying from an alternate supplier may result in a higher profit than holding inventory alone. Simulating multiple strategies together can account for these interdependencies among the risk management strategies. This increases the number of scenarios the simulation will explore, which increases simulation time. The trade-off between increasing the number of simulations to assess the profit for a given scenario and exploring more scenarios is similar to the trade-off that occurs when using simulation to solve stochastic optimization problems.
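A toy version of the framework is sketched below: each strategy combination (inventory level, dual sourcing, recovery aid) is simulated jointly, so interdependencies, such as dual sourcing making inventory redundant, show up in the comparison. All parameters are invented placeholders, not values from the paper.

```python
# Joint simulation of strategy combinations; all parameters hypothetical.
import numpy as np
from itertools import product

rng = np.random.default_rng(5)

def expected_profit(inventory, dual_source, aid, n=20_000):
    margin = 1.0                                  # profit lost per day of outage
    disrupted = rng.random(n) < 0.05              # 5% disruption chance
    downtime = rng.exponential(10.0, n)           # days supplier is down
    if aid:
        downtime = downtime * 0.6                 # recovery aid shortens outage
    covered = np.minimum(downtime, inventory)     # days bridged by stock
    if dual_source:
        covered = downtime                        # alternate supplier fills all
    lost = np.where(disrupted, (downtime - covered) * margin, 0.0)
    cost = 0.02 * inventory + 0.3 * dual_source + 0.1 * aid
    return (100.0 - lost - cost).mean()

best = max(product([0, 15], [0, 1], [0, 1]),      # 8 strategy combinations
           key=lambda s: expected_profit(*s))
print("best (inventory, dual_source, aid):", best)
```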

P.139 Madden, M*; Young, B; Datko-Williams, L; Wilkie, A; Dubois, JJ; Stanek, LW; Johns, D; Owens, EO; ORISE, U.S. EPA-ORD, U.S. CDC-NIOSH; [email protected] Review of Health Effects and Toxicological Interactions of Air Pollutant Mixtures Containing Oxides of Nitrogen The U.S. EPA sets National Ambient Air Quality Standards (NAAQS) to protect against health effects from criteria air pollutants with the recognition that human populations are exposed to complex air pollutant mixtures. Exposure to these mixtures may differentially affect human health relative to single pollutants as a result of biological interactions between constituents of the mixture. If the effects of a mixture are equal to the sum of the effects of individual components, the effects are additive and the interaction effect is zero; additivity is often assumed as the null hypothesis for interaction effects. Alternatively, synergism (effects greater than additive) and antagonism (effects less than additive) are possible interactions, although the definitions and usage of these terms are not consistent across studies. To understand the potential biological interactions of exposure to air pollutant mixtures, we reviewed toxicological evidence (animal and controlled human exposure) from mixture studies cited in EPA’s Integrated Science Assessments (ISAs) and Air Quality Criteria Documents (AQCDs). We used quantitative and qualitative methods to determine the effects of pollutant mixtures on all health-related endpoints evaluated in these studies, specifically focusing on mixtures containing oxides of nitrogen. Many studies could not be analyzed quantitatively using our statistical model due to incomplete reporting of data. Instead, studies with incomplete response data were evaluated qualitatively for evidence of interaction effects and relevant vocabulary such as “additivity,” “synergism,” and “antagonism.” Although a number of studies reported deviations from additivity, there was no discernible pattern to the relationship between similar exposure scenarios and the direction or magnitude of the biological response. The views expressed in this abstract are those of the authors and do not necessarily represent the views or policies of the U.S. EPA.
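The additivity screen at the heart of the review can be stated in a few lines: compare the observed mixture response with the control plus the summed single-pollutant increments. The 2x2 group means and the tolerance below are hypothetical.

```python
# Additivity check sketch: observed mixture effect vs. sum of single-pollutant
# effects relative to control. Group means and tolerance are hypothetical.
control, no2, o3, mix = 1.00, 1.30, 1.45, 2.10   # assumed response means

expected_additive = control + (no2 - control) + (o3 - control)
deviation = mix - expected_additive
if abs(deviation) < 0.05:                        # tolerance is a judgment call
    verdict = "approximately additive"
elif deviation > 0:
    verdict = "greater than additive (synergism)"
else:
    verdict = "less than additive (antagonism)"
print(expected_additive, deviation, verdict)     # 1.75, 0.35, synergism
```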

P.57 Maeda, Y*; Marui, R; Yamauchi, H; Yamaki, N; Shizuoka University; [email protected] Comparative study of risk with nursing work in Japan and China Risk associated with nursing work, particularly medical incidents, in hospitals in China and Japan was compared. Data about medical incidents in Japan were obtained from the survey of 1,275 Japanese hospitals operated by the Japan Council for Quality Health Care in 2011. As for China, a questionnaire survey was conducted from December 2012 to January 2013 among 631 nurses in Nanjing Drum Tower Hospital. As a result, the situations related to medical incidents, the factors behind medical incidents, the length of service of nurses who found the incidents, and the season when incidents occurred frequently were common to Japan and China, whereas the frequency of medical incidents was different. In addition, satisfaction with nursing work schedules was investigated through several questions in the survey in China. Satisfaction with the schedule was very high on average, although dissatisfaction was also found for some questions. The independence of medical incident reporting and schedule satisfaction was tested. For some questions, a significant relationship between dissatisfaction with the schedule and the frequency of medical incidents was obtained. This suggests that medical incidents are related to the busyness of nursing work.

W4-E.3 Maier, A; Nance, P*; Ross, CS; Toxicology Excellence for Risk Assessment; [email protected] Lessons for Information Exchange in Occupational Risk Science: The OARS Initiative The demand is growing for occupational exposure guidance to support risk assessment for workplace exposures. The demand is fed by the expectation that risk analysis will more fully address the global portfolio of chemicals in commerce. A range of traditional and new dose-response techniques are being applied to fill the void. However, differences in risk science and risk policy result in OEL guides that differ among organizations, which can produce a confusing landscape of OELs and related health benchmarks. One concept for addressing this transitional phase in occupational risk assessment is to develop programs that foster harmonization of methods. Harmonization is viewed as an approach to foster shared understanding and ultimately increase the utility of the range of OELs and OEL methods available. Currently, numerous sources of information are available, but no unified source of occupational risk methods documentation has been compiled. In addition, decision guides to sort through the confusing landscape of guidance have been shown to add value for risk assessment practitioners. Such tools have been compiled in the context of a collaborative, the Occupational Alliance for Risk Assessment (OARS). The impacts of the information exchange techniques and the education and outreach strategies employed are described, based on lessons learned through a two-year implementation program.

P.43 Makino, R*; Takeshita, J; AIST; [email protected] Can game theory predict human behavior on safety? From the viewpoint of an economic experiment Estimating risk through Probabilistic Risk Analysis (PRA) has primarily been a non-behavioral, physical engineering approach. To assess the reliability of a system, however, the behavioral dimension must be taken into account. The theoretical model of Hausken (2002), which merges PRA and game theory, treats safety as a “public good” in the economic sense [1]. It is well known in economic theory that the supply of public goods falls below its optimal level without corrective interventions. Using a game-theoretic analysis, Hausken (2002) described the situation where the safety level remains low, that is, where the supply of safety as a public good is below its optimal level. Although his model is valuable, it has not yet been empirically validated. We validate Hausken's model using the techniques of experimental economics, basically following the method for public good experiments employed by Fehr and Gachter (2002) [2]. In our study 48 participants take part in the experiments, divided into 12 groups of n = 4 participants. They work virtually on PCs that are inter-connected in series or parallel in a computerized laboratory. We establish safety rules for the virtual work, and all participants are asked to follow these rules when they work. The experimental design is such that if some, not necessarily all, of the participants follow the safety rules, the risk of occupational/industrial accidents remains low, while observing the rules costs the rule followers. The costs are designed to give participants an incentive to break the safety rules. We examine the conditions under which participants break the safety rules or, put differently, take unsafe actions. [1] Hausken, K. (2002), “Probabilistic Risk Analysis and Game Theory,” Risk Analysis 22, 17-27. [2] Fehr, E. and Gachter, S. (2002), “Altruistic Punishment in Humans,” Nature 415, 137-140.
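A payoff function in the spirit of the design described (following the rules lowers group accident risk but costs the follower) might look like the sketch below; the endowment, compliance cost, and risk parameters are invented, not the experiment's actual values.

```python
# Public-goods safety game sketch for a group of n = 4; all parameters
# (endowment, compliance cost, accident loss, risk function) are hypothetical.
endowment, cost, accident_loss = 20.0, 5.0, 40.0

def payoff(follows, others_following):
    k = follows + others_following           # rule-followers in the group
    p_accident = max(0.0, 0.5 - 0.1 * k)     # more followers, lower risk
    return endowment - cost * follows - p_accident * accident_loss

# The social dilemma: breaking the rule pays individually, yet everyone
# following beats everyone breaking.
print(payoff(1, 3), payoff(0, 3))   # 11.0 vs 12.0 -> breaking pays by 1
print(payoff(1, 3), payoff(0, 0))   # all-follow: 11.0 each; all-break: 0.0 each
```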

W4-J.2 Mandel, M*; Carew, D; Progressive Policy Institute; [email protected] Regulatory Improvement Commission: A Politically Viable Approach to U.S. Regulatory Reform The natural accumulation of federal regulations over time imposes an unintended but significant cost to businesses and to economic growth, yet no effective process currently exists for retrospectively improving or removing regulations. This paper first puts forward three explanations for how regulatory accumulation itself imposes an economic burden, and how this burden has historically been addressed with little result. We then propose the creation of an independent Regulatory Improvement Commission (RIC) to reduce regulatory accumulation. We conclude by explaining why the RIC is the most effective and politically viable approach.

T3-J.4 Mannix, BF; George Washington University; [email protected] Employment and Human Welfare: Do Jobs Count in Benefit-Cost Analysis? Economists think of a benefit-cost analysis of a public project or policy as summarizing the real effects of the decision on public welfare: the well-being of real people, measured according to people’s own preferences. Their methods are designed to look past the familiar aggregate statistics of economic activity – the National Income, GDP, and so forth – to isolate net changes in economic surplus. To a lay person, however, this can all look too obscure and reductionist: the net benefits take the form of disembodied dollars that appear in no budget, cannot be spent, and are “owned” by no one in particular. In contrast, people who know nothing else about the economy know, and care about, the level of unemployment. It gets enormous political attention – particularly during an election season when unemployment may be high and economic growth seemingly stalled. The first question senior policymakers ask economists about a public decision is very often: What will be the effect on jobs? And the answer from economists is often: “zero,” or “negligible,” or “ambiguous,” or “that is the wrong question.” This paper explores some of the reasons why the standard human welfare metric appears to be insensitive to the employment effects that loom so large in the perceptions of the public. It argues that, to a first approximation, employment effects are already counted in a BCA as a component of compliance costs. Any attempt to include additional categories of welfare effects must confront the problem of potentially counting these effects more than once. The paper concludes that, in most cases, employment effects should be treated as they conventionally have been treated – implicitly part of the calculation of compliance costs. There may indeed be employment-related distributional considerations that are important and deserve analytical attention, but these are outside the scope, or at least beyond the reach, of the traditional BCA methodological framework.

T4-F.2 Marchant, GE*; Lindor, RA; Arizona State University; [email protected] Managing Pharmacogenomic Risks Through Litigation Liability is an important, although often under-appreciated, mechanism for managing the risks of new technologies. Key actors in the development and application of technologies are often motivated by potential liability concerns. This is certainly the case in the medical field, where past experiences such as bendectin, DES, silicone breast implants, Vioxx, and medical malpractice have made both product developers and health care providers very sensitive to, and aware of, liability risks. The development of personalized medicine, in which diagnostics and therapeutics are targeted at the pharmacogenomic profile of the individual patient, will present new or enhanced liability risks to both the product developers and providers for a number of reasons including: (i) failure to accurately convey risks of false positives and false negatives; (ii) inexperience and lack of training/expertise in applying new technologies; (iii) unrealistic or inflated patient expectations; (iv) disparities in the conduct of different manufacturers and providers in designing and applying new, rapidly evolving technologies; and (v) novel liability claims such as violation of emerging duties such as a duty to disclose individual information to at risk relatives. This presentation will review three recent lawsuits that present different theories of liability relating to pharmacogenomic testing (one relating to cancer predisposition, one relating to drug metabolism, and one relating to prenatal testing), and will evaluate both the potential positive and negative implications of such liability lawsuits for managing the risks of personalized medicine.

T3-F.2 Marks, PD; Law Firm; [email protected] Existing Tools for Accessing Federal Data A number of different tools can be used to seek data that resides with a Federal agency or is created with the use of Federal funding. Sometimes data can be obtained directly and informally as the result of a cooperative dialogue. When data is not voluntarily made available, other mechanisms may be pursued. The Freedom of Information Act is frequently used to obtain data, and recent developments in the case law clarify when a requester may turn to the courts to enforce the Act. Where the data was generated by institutions of higher education, hospitals and other non-profit organizations through grants or agreements with the Federal government, Office of Management and Budget Circular A-110 may shed light on the availability of data. In addition, data disclosure is fundamental to the Information Quality Act. This talk will discuss the process for obtaining data and some of the strengths and limitations of A-110, FOIA, and the IQA.

T3-B.1 Martin, LR*; Teuschler, LK; O'Brien, W; U.S. Environmental Protection Agency; [email protected] Overview of the Environmental Protection Agency (EPA) Cumulative Risk Assessment (CRA) Guidelines Effort and Its Scientific Challenges The EPA Administrator directed EPA to account for CRA issues in planning, scoping and risk analysis and to integrate multiple sources, effects, pathways, stressors and populations into its major risk assessments. EPA established a Risk Assessment Forum CRA Technical Panel to develop Guidelines. The Guidelines will articulate broad underlying principles and provide a set of descriptive, science-based procedures and methods specific to CRA for use by EPA’s program offices and regions. Significant scientific challenges remain to be addressed in the development of the CRA Guidelines. The introduction of non-chemical stressors and unconventional initiating factors (such as public health outcomes) in a CRA results in additional complexities for all facets of CRA planning, communication, analysis and uncertainty. This presentation will describe proposed methods for addressing these challenges. Science questions addressed in this presentation will include: identification and evaluation of non-chemical stressors and psychosocial factors that influence risks posed by environmental exposures under EPA’s purview; use of epidemiology study data and traditional or high-throughput toxicology data to inform health impacts of stressor combinations; and communication with stakeholders regarding the CRA scope, analysis plan and results.

T1-B.4 Marty, MA*; Zeise, L; Salmon, AG; Cal/EPA, Office of Environmental Health Hazard Assessment; [email protected] IRIS Improvements: Meeting the Needs of California The National Research Council’s recommendations to U.S. EPA to improve the IRIS process included measures to increase the clarity of the steps in the risk assessment process used by EPA for chemical-specific quantitative risk assessments. The states, including California, rely on the IRIS assessments for making decisions on the need for controls of pollution sources for various media (air, water, soil, food, wildlife). While California is somewhat unique in having resources and mandates to conduct quantitative risk assessments for many chemicals, we do not have resources and expertise of the same extent as U.S. EPA. Thus, we utilize EPA assessments and look to IRIS as an important resource. This presentation will focus on the utility of, and potential improvements to, the changes EPA is making in establishing guidelines and templates for literature search and screening, evaluation and display of study data, including evidence tables, integration of the database, and presentation of dose-response modeling. We will also provide input on areas where the discussion of potentially sensitive subpopulations, human heterogeneity and cumulative impacts could be expanded.

P.77 Marynissen, H*; Ladkin, D; Denyer, D; Pilbeam, C; Cranfield University; [email protected] The constitutive role of communication for coordinated safety behavior in an organization managing high-hazard processes The dissemination of safety procedures and guidelines is perceived as pivotal to keeping organizations managing high-hazard technologies incident-free. The role of clear communication is seen as essential in transferring these procedures and guidelines. However, previous research in a gas-receiving terminal clearly indicates that every single individual in that organization holds divergent perceptions of the present risks. This suggests that transmitting information through various forms of communication fails to create a uniformly perceived interpretation of the potential risks in an organization. Hence, these variable risk perceptions might actually endanger safe operations. On the other hand, the gas terminal that was the subject of this research has been operating accident-free for more than twenty years. This is at odds with the average fatal accident rates in onshore gas companies. Therefore it might be argued that this gas terminal achieves some form of coordinated safety behavior, based on a different way of relating within the organization. In an attempt to uncover this coordinated safety behavior, this research explores the interactive processes among all staff. Based on Social Network Analysis and qualitative interviews, it indicates how the ongoing conversations about safety and risk-avoidant behavior constitute a safety culture in this gas-receiving terminal. Furthermore, it adds new insights to the existing knowledge in the field of “communication constitutes organization” research (CCO), and more specifically to the use of CCO in High Reliability Organizations. Finally, recommendations for practice and viable directions for further research are indicated.

T3-F.1 Mason, AM*; Risotto, S; Wise, K; American Chemistry Council; [email protected] The importance of access to underlying data Many stakeholders make substantial investments in research to support product development, health, safety and environmental protection, and to comply with product stewardship and regulatory policies. This research is often conducted to provide input to federal agencies and programs such as EPA, NIEHS, NTP, NCI and ATSDR. In addition, stakeholders have an interest in many of the influential scientific assessments conducted by these agencies. While not regulations, these assessments are often the trigger or basis for regulatory requirements. While publications summarizing the results of federally funded research may be available, the underlying data and details on study design and statistical analysis generally are not. Without these details, it is difficult to thoroughly understand critical information that forms the basis for federal decisions related to a specific substance. It is similarly difficult to verify analyses independently or conduct alternative statistical evaluations. With limited openness, the opportunity for robust stakeholder engagement and independent scientific analysis suffers. This talk will summarize the importance of access to underlying data, with examples, and discuss some of the challenges that have been confronted.

P.42 Convertino, M*; Munoz-Carpena, R; Kiker, G; Perz, S; University of Florida (on leave), and Emerging Pathogens Institute at the University of Florida; [email protected] Design of Ecosystem Monitoring Networks by Value of Information Optimization: Experiment in the Amazon Effective monitoring of ecosystems is crucial for assessing and possibly anticipating shifts, quantifying ecosystem services, and making decisions based on these shifts and services. The selection of monitoring sites is typically suboptimal, following local stakeholder or research interests that fail to capture whole-ecosystem patterns and dynamics. Here we propose a novel model for the design of optimal monitoring networks for biodiversity based on the concept of the value of information (VoI). We consider the trinational frontier among Brazil, Peru, and Bolivia as a case study. Using a multiresolution texture-based model, we estimate species richness and turnover on satellite imagery as a function of different sets of information coming from plot data organized in network topologies. The optimal monitoring network is the network that minimizes the integrated VoI, defined as the variation of the VoI over the 28 years considered; this is equivalent to minimizing the sum of the species turnover of the ecosystem. We identify the small-world network as the optimal and most resilient monitoring network, whose nodes are the hotspots of species richness. The hotspots are identified as the sites whose VoI is the highest for the whole period considered. Hence, the hotspots are the most valuable communities for inferring biodiversity patterns and the most ecologically valuable according to the richness-resilience hypothesis. The small-world monitoring network has an accuracy ∼50% higher than other network topologies in predicting biodiversity patterns. The network that results from the optimal trade-off among data value, uncertainty, and relevance has deep implications for understanding ecosystem function and for management decisions. Hence, because of the optimal integration of environmental, social, and economic factors, the model enables sustainable monitoring and planning of biodiversity for the future.
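
To make the selection rule concrete, here is a minimal Python sketch of the idea, assuming a crude proxy in which the integrated VoI of a candidate network is the species turnover summed over its nodes and over the years considered; the site names, turnover values, and candidate topologies are invented for illustration and are not the authors' actual estimator.

def integrated_voi(network, turnover_by_year):
    # Sum the yearly species turnover observed at the network's nodes.
    return sum(turnover_by_year[year][site]
               for year in turnover_by_year
               for site in network)

def best_network(candidates, turnover_by_year):
    # The optimal network minimizes the integrated VoI proxy.
    return min(candidates, key=lambda net: integrated_voi(net, turnover_by_year))

# Toy data: two candidate topologies over three sites, three years of turnover.
turnover = {
    1996: {"A": 0.30, "B": 0.10, "C": 0.25},
    1997: {"A": 0.28, "B": 0.12, "C": 0.22},
    1998: {"A": 0.31, "B": 0.09, "C": 0.27},
}
candidates = [("A", "B"), ("B", "C")]
print(best_network(candidates, turnover))  # -> ('B', 'C')

In the study the candidates would be network topologies (e.g., small-world graphs over plot locations) and the turnover values would come from the texture-based richness model, but the minimization step has the same shape.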

T4-D.3 Bates, ME*; Keisler, JM; Wender, BA; Zussblatt, N; Linkov, I; US Army Corps of Engineers; [email protected] Prioritizing hazard research for three nanomaterials through value of information analysis Nanotechnologies are potentially hazardous, and environmental, health and safety (EHS) research will continue to be needed as new materials are developed and new products transition to market. High uncertainty about basic material properties and limited time and research funding suggest that nanotechnology research needs to be prioritized. This prioritization can be done through Value of Information (VoI) analysis, which we apply to nanomaterial research related to hazard identification. We implement VoI as global Monte Carlo sensitivity analyses on the uncertainty in a hazard-banding space to rank the parameters whose resolution is most expected to change inferred material hazard scores. This provides a path towards prioritizing hazard-related research strategies, both in their own right and per dollar spent, in terms of expected improvements in classification. We implement this capability with the hazard-banding assumptions of CB Nanotool to prioritize hazard research for MWCNT, nano-Ag, and nano-TiO2 particles based on material properties estimated from the literature and expert judgment. Anticipated improvements in hazard classification accuracy are compared with possible losses from misclassification and research costs to discuss which research strategies seem most promising, with implications for future research policy.
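
As a rough illustration of the ranking step, the sketch below runs a global Monte Carlo analysis over invented property ranges, using a toy banding rule in place of CB Nanotool's scoring, and counts how often resolving each parameter to a point value changes the sampled hazard band; the property names, weights, and thresholds are all assumptions made for this example.

import random

def hazard_band(props):
    # Toy banding rule (not CB Nanotool's): weighted score cut into bands 0-3.
    score = (2.0 * props["solubility"]
             + 3.0 * props["aspect_ratio"]
             + 1.0 * props["surface_reactivity"])
    return min(3, int(score // 2))

def rank_by_band_changes(uncertain, n=10_000, seed=1):
    # For each parameter, count how often fixing it at its range midpoint
    # changes the hazard band of a random sample (a crude VoI proxy).
    rng = random.Random(seed)
    changes = dict.fromkeys(uncertain, 0)
    for _ in range(n):
        sample = {p: rng.uniform(lo, hi) for p, (lo, hi) in uncertain.items()}
        base = hazard_band(sample)
        for p, (lo, hi) in uncertain.items():
            if hazard_band({**sample, p: (lo + hi) / 2}) != base:
                changes[p] += 1
    return sorted(changes.items(), key=lambda kv: -kv[1])

uncertain = {"solubility": (0.0, 1.0),
             "aspect_ratio": (0.0, 1.0),
             "surface_reactivity": (0.0, 1.0)}
print(rank_by_band_changes(uncertain))

Parameters whose resolution most often flips the band are natural candidates for research funding; the study additionally weighs these gains against misclassification losses and research costs.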


T1-G.4 McCright, AM; Michigan State University; [email protected] The Politics of Climate Science and Policy Effective policy response to climate change may require substantial public support for greenhouse gas emissions reductions. Since the mid-1990s, there has been significant political polarization on climate change among U.S. political elites. Such polarization extended to the general public in the early 2000s, to the point that liberals and Democrats are more likely to report beliefs consistent with the scientific consensus and to express personal concern about global warming than are conservatives and Republicans. Several studies find that political orientation moderates the relationship between educational attainment and climate change beliefs. That is, the effects of educational attainment on global warming beliefs and concern are positive for liberals and Democrats, but weaker or negative for conservatives and Republicans. Given the well-documented campaign in the USA to deny the reality and seriousness of climate change (a major goal of which is to 'manufacture uncertainty' in the minds of policy-makers and the general public), I examine the influence that perception of the scientific agreement on global warming has on the public's beliefs about global warming and support for government action to reduce emissions. Using nationally representative survey data from March 2012, I find that misperception of scientific agreement among climate scientists is associated with lower levels of support for government action to reduce emissions. This confirms the crucial role of perceived scientific agreement in views of global warming and support for climate policy. Further, I show that political orientation has a significant influence on perceived scientific agreement, global warming beliefs, and support for government action to reduce emissions. These results suggest the importance of improving public perception of the scientific agreement on global warming, but in ways that do not trigger or aggravate ideological or partisan divisions.

T3-J.3 McGartland, Al*; Ferris, Ann; Environmental Protection Agency; [email protected] Employment Impacts in Benefit-Cost Analyses Benefit-cost analysis (BCA) is one of the dominant paradigms for evaluating regulatory decisions. In 2011, President Obama reaffirmed BCA’s role with Executive Order 13563. Traditionally, employment impacts have not been incorporated into benefit-cost analysis, and many noneconomists are surprised to learn that benefit-cost analysis methods generally do not account for possible employment impacts. Surprisingly, this issue has not been given sufficient attention by labor, macro or environmental economists: we do not have a generalized theory to provide guidance on how to account for employment impacts in benefit-cost analysis. The current underperforming economy has motivated renewed questions about how economists account for employment changes in benefit-cost analysis (and other regulatory analyses). The author will discuss the various conceptual and empirical approaches for accounting for job impacts in regulatory analyses.

W3-A.2 McKone, TE; University of California, Berkeley; [email protected] Comparative risk, life-cycle impact, and alternatives assessments: Concepts and perspectives In order to identify, characterize, and compare opportunities for increasing the sustainable use of energy, resources, chemicals, and materials, we need reliable and informative environmental, health and economic impact assessments. In this talk I consider the different objectives and approaches used in comparative risk assessments, life-cycle impact assessments, and alternatives assessments as tools to support chemical decision-making. The goal of a risk assessment is to quantify the likelihood of harm in a format that assists decision makers who must act to tolerate, mitigate, or eliminate the potential harm. This goal is distinct from impact studies that strive to develop indicators of harm, measures of hazard, or ranking schemes. These latter activities focus more on characterizing the possibility of harm. Both hazard and risk relate to some measure of harm, such as number of deaths or diseases, financial loss, species loss, resource privation, etc. Life cycle assessment (LCA) has become an important tool for the environmental impact assessment of products and materials. Businesses are increasingly relying on it for their decision-making. The information obtained from an LCA can also influence environmental policies and regulations. Life-cycle impact assessment is the phase of LCA aimed at understanding and evaluating the magnitude and significance of the potential environmental impacts of a product system. Alternatives assessments (AA) came out of the US EPA Design for Environment program and focus on hazard characterization based on a full range of human health and environmental information. AA strives to inform technology choices so as to minimize the potential for unintended consequences. In comparing these different approaches, I will consider the extent to which they address the possibility of harm and/or the probability of harm and whether and how the approaches confront uncertainty.

M2-B.1 Meek, ME; University of Ottawa; [email protected] Evolution of Weight of Evidence Assessment in Mode of Action Analysis The World Health Organization (WHO)/International Programme on Chemical Safety (IPCS) mode of action/human relevance (MOA/HR) framework has been updated to reflect evolving experience in its application and to incorporate recent developments in toxicity testing and non-testing methods. The modified framework is incorporated within an iterative roadmap, encouraging continuous refinement of problem formulation, mode of action based testing strategies and risk assessment. It can be used as originally intended, where the outcome of chemical exposure is known, or in hypothesizing potential effects resulting from exposure, based on information on putative key events in established modes of action from appropriate in vitro or in silico systems and other evidence. It clarifies the conceptually synonymous terms of MOA and adverse outcome pathways (AOPs) and builds on experience in an ever-increasing number of case studies to further inform weight of evidence analysis for hypothesized modes of action. The modified Bradford Hill (BH) considerations have been further articulated to simplify application, and a template has been developed for comparative assessment of weight of evidence and associated uncertainty for various modes of action. The contribution of these developments to weight of evidence considerations in related initiatives will be addressed.


T3-B.4 Menzie, C; Kashuba, R*; Exponent; [email protected] Developing Effect-Based Conceptual Models for Cumulative Risk Assessment (CRA) that can Accommodate Diverse Stressors In contrast to stressor-driven risk assessment, which focuses on identifying the potential effects of a particular stressor, effects-driven risk assessment aims to characterize the various interacting stressors and pathways that can lead to an effect of concern. This causal web of influences can be mapped using a conceptual model framework of nodes and arrows, representing concepts and relationships. In identifying assessment endpoints, stressors, and the connections among them, a conceptual model helps stakeholders visualize the problem context, facilitating communication and understanding of the contributing causes to an adverse condition. The evaluation of risk factors leading to cardiovascular disease (CVD) is a particularly good example of a complex system of mechanisms resulting in an undesired health effect. Literature review identifies upward of 50 different factors whose influence on CVD has been studied or postulated, including biological considerations such as lipid status and genetics, behaviors such as smoking and shift work, environmental exposures such as mercury and arsenic, and socioeconomic correlations. Conceptual model development involves (1) defining the model endpoint (e.g., CVD as physiological changes affecting heart function, excluding congenital heart defects), (2) identifying proximal biological mechanisms that lead to that endpoint (e.g., tissue injury, inflammation, atherosclerosis), (3) sorting and summarizing risk factors and categorizing direct versus indirect pathways of influence, (4) soliciting feedback from subject-matter experts as part of the iterative model-amending process, and (5) ensuring aesthetics that most clearly communicate the modeled system (e.g., color-coding stressor categories). A conceptual model diagram can then be used as a starting point to evaluate the magnitude and certainty of (information available for) each causal pathway, and to prioritize both potential interventions and future research directions.

W4-C.5 Merad, M*; Marcel, F; INERIS; [email protected] Is it possible to assess the quality of governance? Conclusions of the national working group on the governance of sustainability within public organizations Economic constraints on organizations can threaten their sustainability. Past and present disasters and scandals, such as the BP Deepwater Horizon oil disaster in the Gulf of Mexico in 2010, the Servier Mediator (benfluorex) scandal in 2009, and the Enron affair and collapse in 2001, have revealed significant weaknesses in governance. By governance, we mean more than corporate governance. In addition to the rules governing the functioning of the Board of Directors, aspects such as the functioning of the organization, risk management policy and how risks are prevented, relationships with stakeholders, communication and reporting on Corporate Social Responsibility and Sustainable Development, and the way the organization's missions and projects are conducted must be considered. INERIS is a member of the Sustainable Development (SD) Club of public institutions and companies and contributes, in this framework, to animating and coordinating a working group on the governance of public organizations in response to SD challenges. We have argued that there is a need to develop methods and tools to diagnose and assess the governance of organizations with respect to Sustainable Development. However, this task remains difficult because it is complicated to appraise the quality of governance. This paper will present the conclusions of a three-year national working group on the governance of public organizations in response to Sustainable Development challenges, together with a protocol to diagnose and analyze the governance of SD based on the use of multiple-criteria decision-aiding methods.

P.61 Merad, M*; Marcel, F; INERIS; [email protected] A pragmatic way of achieving the High Sustainable Organization: governance and organizational learning in action in the French public sector Sustainability is increasingly the key challenge for organizations. French public organizations are currently working on issues related to both the assessment and the governance of sustainability. In this paper we propose a “proactive-based assessment” that helps make clear and explicit the different stages of progress of organizations in terms of sustainability and their responsiveness in that respect. Based on a research-in-action approach, three new concepts are proposed to deal with the problem of sustainability for a public organization: the concept of “critical capital”, the concept of the High Sustainable Organization (HSO), and the concept of learning stages within the HSO. Our contribution is mainly based on investigation and pragmatic observation within the French working group on the “governance” of public organizations.

T3-H.4 Meyer, AK*; Groher, DM; Cain, LG; AM: Environmental and Munitions Center of Expertise, Army Corps of Engineers; DG and LC: New England District, Army Corps of Engineers; [email protected] Health Risk Assessment/Risk Management Case Study: Managing Project Uncertainty Presented by the Trichloroethylene Reference Concentration Published in IRIS In the fall of 2011, the IRIS Toxicological Review of Trichloroethylene (TCE) was finalized and published. However, questions remain regarding use of the chronic reference concentration (RfC), whose basis includes, as a critical endpoint, developmental effects with an exposure window of concern of less than thirty days. Implementing this value to assess risk leads to issues over sampling and analysis, exposure assessment and risk management, due to the subacute concern embedded within the chronic toxicity value. This presentation will describe Department of Defense research efforts and practical site approaches that may be used to address these issues.


M2-H.1 Middleton, JK*; Stoeckel, DM; Nilsen, M; Winkel, D; Anderson, D; Pals, T; 1-4: Battelle; 5-6: Department of Homeland Security, Science and Technology Directorate; [email protected] A second look at bioterrorism scenarios for the Bioterrorism Risk Assessment (BTRA) The Department of Homeland Security’s Bioterrorism Risk Assessment (BTRA) includes a large number of scenarios to provide a comprehensive assessment of risk to the nation. Advancements in technology and the emergence or re-emergence of pathogenic organisms require that the scenarios considered within the assessment be revisited on a regular basis. To reduce the burden associated with full implementation in the BTRA, and to ensure that including a pathogen, technology, or target in the BTRA is worth the investment, DHS has tested the applicability of an analytic approach used in the historic biological research program and has developed a method for assessing the potential consequences of a new scenario as a preliminary assessment. The method minimizes the input data needed for the scenario and simplifies the target considerations to a few specific targets that provide a reasonable assessment of potential consequences. This new methodology will be presented along with an overview of how the method will help guide the inclusion of scenarios within the BTRA.

T1-A.1 Miles, S*; Dalrymple, K. E.; Madsen, P; Krajewski, J; University of Iowa; [email protected] Finding Words That Work: Assessing Media Coverage of Water Issues Across Iowa The past decade has seen a rise in concern for water resource management and policies under the increasing strain from domestic, agricultural, industrial and recreational use of our nation’s aquatic resources. Iowa, like many other states, is investing in research and technology to improve sustainability practices in hopes that proper management today will provide abundance in the future. Studies suggest that policy initiatives created by university researchers and government officials find more success through engagement with the public to promote understanding of water sustainability concerns and participation in community efforts (Gleick, 2003). Crucial antecedents to awareness for any environmental concern are knowledge and understanding of the processes, terminology, and science surrounding the issue. Previous research suggests that the public obtains basic understandings of science and technology through exposure to mass media messages (Friedman, Dunwoody, & Rogers, 1986). This study builds upon this research by investigating how Iowa newspapers cover water-related topics with a goal of informing future research concerning public knowledge of water sustainability issues and policies. We examine these research questions using a content analysis of daily Iowa newspapers with a circulation between 8,470 and 120,654 covering the region of interest. Results shed light upon the ways that journalists address water issues in relation to topics covered, such as drought, toxicity problems, utility sources, recreational uses, and ecological perspectives. These findings not only contribute to our understanding of water sustainability coverage, but also provide insight as to what water related media content Iowa residents are exposed to in their daily lives. Implications of this research regarding the potential effects that such media exposure may have on knowledge of water issues, public opinion towards water use and regulation, and future support of water policies are discussed.

T3-I.2 Miller, MK*; Baker, JW; Stanford University; [email protected] Simulation approaches for assessing the impacts on equity in a region due to earthquakes Earthquakes impact different regions and different demographic groups differently, as evidenced, for example, in the 1989 Loma Prieta Earthquake by the relatively slow recovery of low-income communities in Watsonville, CA compared to that of the wealthier surrounding communities. Up to now, seismic risk analysis has focused on individual locations or on aggregate measures of network performance. In this article we present a method for assessing the impacts on equity in terms of changes in accessibility for different geographic and demographic groups in a region. First, we explain how to employ an activity-based traffic model to answer a new question, namely how transportation patterns will shift after parts of the highway and transit networks are damaged. This provides a tool for assessing changes in accessibility for each community, and it allows policy makers to quantify the costs and benefits of different mitigation strategies. Second, we illustrate the framework with a case study of the San Francisco Bay Area using a state-of-the-art stochastic catalog of earthquake events. Our findings suggest that local infrastructure, such as the redundancy of highway links, as well as geographical proximity to active faults, are key predictors of the communities most likely to experience high losses of accessibility, but we also identify critical outliers at particularly high risk, such as San Rafael, CA. Finally, by analyzing the demographics in travel demand, we compare the expected equity consequences from the full stochastic catalog of earthquake events in the current state of the infrastructure and under two simulated mitigation strategies, one focusing on mitigating overall average regional risk and one focusing on reducing extreme demographic-group risks. This framework will assist private and public policy makers in improving decisions about risk mitigation by incorporating an understanding of the impacts on equity.
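
As a simplified illustration of the accessibility comparison (not the authors' activity-based model), the sketch below computes a gravity-style accessibility score per community before and after simulated network damage and reports the loss by demographic group; travel times, opportunity counts, and group labels are invented.

import math

def accessibility(travel_times, opportunities, beta=0.1):
    # Opportunities reachable from a community, discounted by travel time.
    return sum(opp * math.exp(-beta * t)
               for t, opp in zip(travel_times, opportunities))

opportunities = [1000, 500]  # e.g., jobs at two destination zones
communities = {
    "Watsonville": {"group": "low-income", "before": [10, 20], "after": [25, 60]},
    "San Rafael":  {"group": "mixed",      "before": [15, 25], "after": [40, 70]},
}

for name, c in communities.items():
    loss = (accessibility(c["before"], opportunities)
            - accessibility(c["after"], opportunities))
    print(f"{name} ({c['group']}): accessibility loss = {loss:.1f}")

Equity impacts then emerge from comparing these losses across groups, and mitigation strategies can be scored by how much they narrow the gap.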

P.66 Mishra, A*; Lambertini, E; Pradhan, AK; University of Maryland, College Park, MD; [email protected] Foodborne pathogens in leafy greens: data, predictive models, and quantitative risk assessments In the last few years, technological innovations in the production, harvesting, processing, and packaging of fresh produce have increased tremendously in the U.S., as has fresh produce consumption. Consumers are eating more fresh produce, purchasing a broader variety, and demanding more convenience products such as ready-to-eat salads. Fresh produce is generally consumed raw, making it a high-risk food in terms of pathogen contamination. A recent study by the Centers for Disease Control and Prevention indicated that between 1998 and 2008, leafy greens outbreaks accounted for 22.3% of foodborne outbreaks. Pathogen contamination of fresh produce, including leafy greens, has been a major concern to stakeholders such as the food industry, regulatory agencies, and consumers. In this study, we performed a structured review of the literature to gain insight into the available data and information on contamination sources, predictive microbial models, and quantitative risk assessment models for pathogens such as Listeria monocytogenes, Salmonella, and Escherichia coli O157:H7 in leafy greens in the farm-to-table continuum. We observed that microbial contamination mostly originated in the pre-harvest environment. Contamination can be effectively controlled by storing leafy greens at appropriate temperatures for appropriate times, and by applying intervention steps such as washing and irradiation. Several research studies on predictive modeling and quantitative microbial risk assessment of pathogens in fresh produce, focused on one or more steps such as irrigation, harvesting, processing, transportation, storage, and washing, have been reported in the last few years. We divided these into three categories: pre-harvest models, storage models, and models related to intervention steps, primarily washing. Overall, our study provides valuable information to inform future quantitative microbial risk assessment studies of leafy greens.


T3-D.3 Moez, S; ANSES, French Agency for Food, Environmental and Occupational Health & Safety; [email protected] Risk assessment model for Shiga-toxin-producing Escherichia coli and Salmonella in ground beef in France: efficiency of different intervention strategies and of sampling beef trim In France, 0.48% and 0.27% of the 1,878 ground beef samples tested in 2011 by the food safety authority were positive for pathogenic STEC (the best known being E. coli O157:H7) and Salmonella spp., respectively. Contaminated ground beef products have been implicated in a number of human cases and foodborne outbreaks. In this study we developed a quantitative microbial risk assessment (QMRA) model to assess the public health risks associated with the consumption of ground beef contaminated with Salmonella or STEC, and to evaluate the relative efficiency of interventions at different steps of the food chain and of different sampling strategies. The model considers typical ground beef plants producing batches of 10,000 to 50,000 ground beef patties from carcasses slaughtered in France. The QMRA model is based on previous QMRA studies and combines four modules: a farm module, a slaughterhouse module, a ground beef production and retailing module, and a consumer module. The model incorporates recent data on ground beef handling, preparation and consumption collected in France. In particular, data on thermal behavior during cooking and on the heat resistance of STEC and Salmonella strains were generated for different types of ground beef cooking. The risk associated with the consumption of contaminated ground beef was then estimated using the most recent and appropriate dose-response models. Three types of intervention were implemented in the model: primary prevention measures against STEC contamination of meat, which are expected to decrease the prevalence or concentration of STEC in feces; secondary prevention measures, which include slaughtering and ground beef production process hygiene measures; and tertiary prevention measures, which are interventions during ground beef handling, preparation and cooking just before consumption. Finally, the model compares different strategies for sampling beef trim or ground beef using a cost-benefit analysis.
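
For readers unfamiliar with the module-chaining logic of such models, here is a deliberately generic QMRA-style sketch, not ANSES's model: one Monte Carlo draw propagates a raw contamination level through a cooking log-reduction and an exponential dose-response relation; every parameter value below is a placeholder.

import math
import random

def per_serving_risk(rng, r=1e-3):
    # One Monte Carlo draw of illness risk for a single cooked patty.
    conc = rng.lognormvariate(0.0, 1.0)   # CFU/g in raw ground beef (placeholder)
    serving_g = 100.0                     # serving size in grams (placeholder)
    log_red = rng.uniform(3.0, 6.0)       # cooking log10 reduction (placeholder)
    dose = conc * serving_g * 10 ** (-log_red)
    return 1.0 - math.exp(-r * dose)      # exponential dose-response model

rng = random.Random(42)
risks = [per_serving_risk(rng) for _ in range(100_000)]
print("mean per-serving risk:", sum(risks) / len(risks))

Interventions enter such a chain as shifts in the input distributions (e.g., lower prevalence at the farm module, larger log-reductions at cooking), which is how their relative efficiency can be compared.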

P.12 Mohri, H*; Takeshita, J; Waseda University, National Institute of Advanced Industrial Science and Technology; [email protected] Risk analysis for networks with cooperative games We propose a risk measure based on cooperative game theory. Hausken (2002) introduced a risk measure based on non-cooperative game theory. In the social sciences, non-cooperative game theory is a standard tool of modern microeconomics (e.g., industrial organization theory) and is now widely taught in economics and management faculties. Cooperative game theory is less popular by comparison, because economists typically regard agents, like companies in markets, as competitive. In risk settings, however, agents may well be cooperative. Consider a chemical plant: it is usually operated by one company and consists of many sections, yet all sections must cooperate to make chemical products in one plant under one company. Some economists note that, if agents are cooperative, the cooperative game can be converted into a non-cooperative game via the Nash program. As noted above, it is easy to imagine a cooperative game situation within one chemical plant, and agents in infrastructure networks should likewise be regarded as cooperative. In this study, we first argue how cooperative games should be introduced into risk analysis. Second, we work through concrete examples on graphs with simple mathematical structures: series (tandem), parallel, and combined graphs. We then discuss how more complicated structures should be treated.
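
A natural cooperative-game tool for such graphs is the Shapley value, which allocates the system's "worth" across components by their average marginal contributions. The toy example below is an illustrative assumption rather than the authors' formulation: it treats components as players and sets the characteristic function to 1 when a coalition contains a working source-to-sink path in a combined parallel/series graph.

from itertools import permutations
from math import factorial

def v(coalition, paths):
    # Characteristic function: 1 if the coalition contains a full working path.
    s = set(coalition)
    return 1.0 if any(set(p) <= s for p in paths) else 0.0

def shapley(players, paths):
    # Average each player's marginal contribution over all join orders.
    phi = dict.fromkeys(players, 0.0)
    for order in permutations(players):
        coalition = []
        for p in order:
            before = v(coalition, paths)
            coalition.append(p)
            phi[p] += v(coalition, paths) - before
    n_orders = factorial(len(players))
    return {p: val / n_orders for p, val in phi.items()}

# Component 1 in parallel with the series (tandem) pair {2, 3}:
# the working paths are {1} and {2, 3}.
print(shapley([1, 2, 3], paths=[(1,), (2, 3)]))
# -> phi(1) = 2/3, phi(2) = phi(3) = 1/6: player 1 carries most of the value.

Such allocations give one way to apportion responsibility for network risk among cooperating sections, the question the abstract raises for series, parallel, and combined graphs.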

M3-A.3 Mojduszka, EM; USDA/OCE/ORACBA; [email protected] An overview of applications of risk management principles in food safety and nutrition Effective and efficient food safety risk management is of great importance to the U.S. food industry, consumers, and the government. The Centers for Disease Control and Prevention (CDC) report that over 3,000 people die each year in the U.S. from foodborne illnesses and 128,000 are hospitalized. The cost to the U.S. economy (including productivity losses) from foodborne illnesses is approximately $77 billion per year (CDC website). In addition, in the 1990s and 2000s, several indicators of the healthfulness of the American diet deteriorated, including an increase in the percentage of adults and children who are obese or overweight. The cost of this epidemic to the U.S. economy by 2020 is expected to be in the hundreds of billions of dollars (CDC website). These estimates emphasize the significance of nutrition risks to the U.S. population as well as the importance of utilizing effective and efficient, private and public, nutrition risk management approaches. In this paper, I provide an overview of applications of risk management principles in food safety and nutrition. I compare and contrast the ISO 31000:2009 standard published in the general risk management area with standards published in specific food safety and nutrition risk management areas, including ISO 22000:2005 (Food Safety Management Systems), Codex Hazard Analysis and Critical Control Points (HACCP), and ISO 22002-1:2009 (Prerequisite Programs, PRPs, on Food Safety). I identify and evaluate advantages, but also important shortcomings, of the standards for food safety and nutrition risk management applications. My specific focus is on evaluating the interdependence of the standards as substitutes or complements; this information is necessary for improving the effectiveness and efficiency of the standards in practice. Finally, I propose how applications of the standards could be further extended in private and public food safety and nutrition policy making.

T1-K.2 Mokhtari, A; Beaulieu, SM; Lloyd, JM; Akl, S*; Money, ES; Turner, MB; Al Hajeri, K; Al Mehairi, A; Al Qudah, A; Gelle, K; RTI International, Environment Agency-Abu Dhabi; [email protected] Development of a practical approach to rank the relative health and environmental risks of industrial facilities in Abu Dhabi In pursuing its initiative to develop state-of-the-art environmental protection programs, the Environment Agency-Abu Dhabi (EAD) faces a daunting challenge: identifying the industrial sectors and facilities that pose the most significant risks. The EAD needed a practical tool to rank relative risks, establish a standard framework for evaluations (i.e., minimize subjective judgment), and capture multiple dimensions of risk while avoiding the types of errors that can be associated with Risk Assessment Matrix approaches. In collaboration with EAD, RTI International developed a methodological framework for relative risk ranking that is facilitated by a tablet-based Data Collection Tool (DCT). The tool maps facility-specific data to risk scenarios that describe the probability and the potential severity of adverse events. The DCT represents a simple yet innovative approach for gathering data related to four risk dimensions: first responders, workers, nearby residents, and local ecosystems. The “hazard evaluator” answers specific questions covering a wide range of “risk factors” including, for example, the types and volume of chemicals, the management and storage of chemicals, the availability of emergency response plans, and employee safety training. The evaluator need not be an experienced risk assessor to gather this information; the responses and help screens significantly reduce the need for subjective judgment, and the DCT allows for notes, drawings, and photographs to support follow-up questions with subject matter experts. The facility data are automatically uploaded into a central database, and relative risk scoring algorithms produce risk scores for each of the four risk dimensions. The system supports relative risk ranking across facilities, sectors, and risk dimensions. It identifies facilities that represent an imminent threat, primary risk drivers, recommended risk mitigation actions, and facilities that require a higher inspection frequency.
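
The abstract does not publish RTI's scoring algorithm, so the sketch below is a hypothetical illustration of what such an aggregation can look like: questionnaire-derived probability and severity factor scores are combined per risk dimension, and the dimensions are then ranked. The factors, scales, and combination rule are all invented.

def dimension_score(factors):
    # Combine 1-5 probability/severity factor scores into one value:
    # mean likelihood times worst-case severity.
    prob = sum(f["p"] for f in factors) / len(factors)
    sev = max(f["s"] for f in factors)
    return prob * sev

facility = {
    "first responders": [{"p": 3, "s": 4}],
    "workers":          [{"p": 4, "s": 3}, {"p": 2, "s": 5}],
    "nearby residents": [{"p": 2, "s": 2}],
    "local ecosystems": [{"p": 1, "s": 3}],
}
scores = {dim: dimension_score(f) for dim, f in facility.items()}
print(sorted(scores.items(), key=lambda kv: -kv[1]))

Keeping the four dimensions separate, rather than collapsing them into one number, is what lets a system of this kind flag, say, a facility that is unremarkable overall but an imminent threat to first responders.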


P.2 Money, C; Corea, N; Rodriguez, C; Ingram, J; Zaleski, RT; Lewis, J*; ExxonMobil, SC Johnson, Procter and Gamble, Unilever; [email protected] Developing Specific Consumer Exposure Determinants (SCEDs) for assessing risk from chemicals used in consumer products The EU REACH Regulation requires that an assessment of the risks to consumers be undertaken when a substance is registered and is classified as hazardous. The assessment must cover all known consumer uses of the substance. The ECETOC TRA tool has proved to be the preferred lower-tier exposure tool under REACH, providing conservative estimates of consumer exposure. To ensure that TRA users are able to reliably and efficiently refine exposure predictions, while maintaining the degree of conservatism inherent to lower-tier models, it should be possible to enter relevant habits and practices data into the tool. However, in the absence of standardized descriptions of consumer use, different TRA users are likely to describe common consumer uses differently. European industry has therefore undertaken an activity, in cooperation with EU regulators, to describe typical habits and practices information for commonly encountered uses of consumer products in Europe. For each use, the habits and practices information is compiled in a transparent manner and described in a structured form (the Specific Consumer Exposure Determinant, SCED) that is capable of electronic data transfer and manipulation. The core elements within the SCED template are specifically designed for use with the TRA algorithms. The process for developing the SCEDs, together with examples, will be described.
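
The abstract does not list the SCED fields themselves, so the sketch below is a hypothetical illustration of the kind of structured, machine-transferable record the template describes, with field names chosen to match determinants that lower-tier exposure algorithms typically consume.

from dataclasses import dataclass

@dataclass
class SCED:
    product_use: str              # e.g., "all-purpose cleaner, trigger spray"
    amount_per_event_g: float     # product used per use event
    events_per_day: float         # frequency of use
    exposure_duration_min: float  # duration of one event
    room_volume_m3: float         # volume of the use environment
    weight_fraction: float        # fraction of the substance in the product

    def substance_per_day_g(self):
        # Grams of substance handled per day, a typical lower-tier input.
        return self.amount_per_event_g * self.events_per_day * self.weight_fraction

cleaner = SCED("all-purpose cleaner, trigger spray", 10.0, 2.0, 15.0, 20.0, 0.05)
print(cleaner.substance_per_day_g())  # -> 1.0

Because every record shares one schema, different TRA users describing the same consumer use would feed the algorithms identical determinants, which is the standardization problem the SCED activity targets.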

T4-J.1 Morgan, KM*; Bertoni, MJ; US Food and Drug Administration; [email protected] Risk Management to Achieve Priorities: Linking risk interventions to outcomes There are many models used in government agencies to develop estimates, simulate uncertainties, select priorities, differentiate among decision options, and provide insights into possible future scenarios. There are fewer models for evaluating whether interventions were successful and for linking those interventions to higher-level outcomes. There has been a shift in the government, driven by the Office of Management and Budget, to identify performance measures for government work, with special interest in outcome measures. The development of performance measures is increasing, but expertise and knowledge about best practices in measurement and evaluation is sometimes lacking, especially among the scientists and engineers who often know the programs best. In this talk, I will propose one part of a framework for linking risk management activities to outcome measures using logic models. In order to establish a decision process that makes progress on the outcomes we care about, a shift in resources is needed to strengthen research in, and the development and validation of, models for linking outcome data to agency actions. Data are missing, and these gaps will have to be addressed if we are to continue down the path of linking program activities to outcomes. Conceptual models for policy analysis and risk-informed decision making typically include an evaluation component, but in reality those steps are often overlooked when attention turns to the development of the next estimation model for the next problem. I propose that the risk community can play a role in increasing and improving the use of program evaluation, thereby strengthening the links from risk analysis work to outcomes. If the risk community moves in that direction, it could lead to better alignment of resources with successful risk reduction efforts and to larger opportunities for improving and validating risk models.

T4-H.4 Morral, AR*; Price, CC; Ortiz, DS; Wilson, B; LaTourrette, T; Mobley, BW; McKay, S; Willis, HH; RAND Corporation; [email protected] Modeling Terrorism Risk to the Air Transportation System: An Independent Assessment of TSA's Risk Management Analysis Tool and Associated Methods Following a National Research Council (2010) report urging DHS to better validate its terrorism risk models, the Transportation Security Administration asked RAND to validate one of its principal terrorism risk modeling tools developed by TSA and Boeing to help guide program planning for aviation security. This model — the Risk Management Analysis Tool, or RMAT — was designed to estimate the terrorism risk-reduction benefits attributable to new and existing security programs, technologies, and procedures. RMAT simulates terrorist behavior and success in attacking vulnerabilities in the domestic commercial air transportation system, drawing on estimates of terrorist resources, capabilities, preferences, decision processes, intelligence collection, and operational planning. It describes how the layers of security protecting the air transportation system are likely to perform when confronted by more than 60 types of attacks, drawing on detailed blast and other physical modeling to understand the damage produced by different weapons and attacks, and calculating expected loss of life and the direct and indirect economic consequences of that damage. Key findings include: 1) RMAT has proven to be of great value to the Transportation Security Administration (TSA) in driving a more sophisticated understanding of terrorism risks to the air transportation system. 2) RMAT may not be well suited for the kinds of exploratory analysis required for high-stakes decision support, because of its reliance on a large number of uncertain parameters and conceptual models. 3) TSA should not treat RMAT results as credible estimates of terrorism risk to the aviation system but can use those results to better understand the components of terrorism risk and to explore possible influences of system changes on that risk.

W4-D.3 Morris, J; Sayre, P; Alwood, J*; US Environmental Protection Agency; [email protected] Risk Assessment and Management of Multiwalled Carbon Nanotubes: Recent Developments in Regulatory Approaches Within US EPA’s Office of Pollution Prevention and Toxics (OPPT), the current risk management approach for multi-walled carbon nanotubes (MWCNT) includes respiratory protection and engineering controls to prevent inhalation exposures, requiring testing for physical/chemical properties and subchronic inhalation toxicity testing to better inform the assessment of potential risks, and minimizing water releases to limit environmental exposures. Efforts are also underway to better understand the effects of MWCNT using high-throughput and high-content screening approaches through research and development within EPA’s National Center for Computational Toxicology, and similar efforts funded by EPA grants. Knowledge is being gained through examination of these results, testing results seen in the OPPT new chemicals program and the open literature, and information generated through international efforts such as the Organization for Economic Cooperation and Development (OECD). The EPA is moving towards developing streamlined testing strategies for MWCNT based on findings from these activities. This work is being leveraged by cooperation with Canada through the Regulatory Cooperation Council by comparing and sharing information on risk assessment and risk management of nanomaterials, including a case study on a MWCNT that was assessed and regulated by both countries. Within the OECD, work has begun to amend OECD test guidelines and develop guidance to address nanomaterials. Priorities for risk assessment and risk management of MWCNT include developing screening tests, identifying key physical and chemical properties for risk assessment, and producing or amending test guidelines for endpoints of interest. The views expressed in this abstract are those of the authors and do not necessarily represent the views or policies of the U.S. Environmental Protection Agency.


M3-J.2 Morrow, WR*; Gopal, A; Lawrence Berkeley National Laboratory; [email protected] Large-Scale Biomass Feedstocks: A Potentially Intermittent Renewable Resource with Economic Risk for Biofuel Producers The high cost of technology is seen as the primary barrier to full commercialization of cellulosic biofuels, with the broad expectation that once conversion technology breakthroughs occur, policy support will be needed only to accelerate cost reductions through “learning by doing” effects. In this study, we hypothesize that droughts pose a significant economic risk to biofuel producers and consumers regardless of the rate at which technology costs fall, especially once biofuels production reaches the upper threshold of biomass feedstock availability. We model a future switchgrass-derived cellulosic biorefinery industry in Kansas based on spatially resolved historical (1996 to 2005) weather data, representing a rainfall regime that could reflect drought events predicted to occur throughout the U.S. We find that droughts reduced modeled biorefinery capacity factors in four years out of ten. Interestingly, we find that two logical strategies for planning for drought, (a) building large biorefineries to source feedstock from a larger area and (b) storing switchgrass in good production years for use in drought years, are not very effective in reducing drought risks. In conclusion, we propose that biofuels be viewed as an intermittent renewable energy source, much like solar and wind electricity production, although on a longer timeframe.

M3-G.2 Morss, RE*; Demuth, JL; Lazo, JK; Dickinson, K; Lazrus, H; Morrow, BH; National Center for Atmospheric Research; [email protected] Understanding public responses to hurricane risk messages Recent events such as Hurricanes Isaac and Sandy illustrate that weather forecasters and public officials still face major challenges in communicating about hurricane risks with members of the public. This project seeks to improve hurricane risk communication by building understanding of how members of the public respond to hurricane forecast and warning messages. This includes investigating the extent to which different hurricane messages help motivate people to take appropriate protective action, and how and why different people respond to messages differently. To begin addressing these issues, we conducted a survey of members of the public who reside in areas of coastal south Florida that are at risk from hurricanes and storm surge. The survey presented respondents with different risk messages about a hypothetical hurricane situation, based on modifications of the National Hurricane Center’s track forecast cone product. Respondents were then asked about their intentions to take different protective actions (e.g., evacuate, withdraw cash from the bank) as well as questions about their perceptions of the risks and the risk message. The survey also included questions on respondents’ sociodemographics, worldviews, past hurricane experiences, and perceived barriers to evacuation. The analysis examines how respondents’ intended protective responses were influenced by their characteristics, their experiences and perceptions, and the different message elements tested. We will discuss results from this analysis, as well as initial work applying risk theories (e.g., Cultural Theory of Risk, Extended Parallel Process Model) to help understand the findings.

M2-D.2 Mukherjee, D*; Botelho, D; Sarkar, S; Gow, AJ; Schwander, SS; Chung, KF; Tetley, TT; Zhang, J; Georgopoulos, PG; Chemical Engineering, Rutgers University; [email protected] Multiscale mechanistic modeling of the respiratory toxicodynamics of engineered nanoparticles Engineered nanoparticles (ENPs) are increasingly becoming constituents of consumer and industrial products. A major exposure route for ENPs is inhalation, making the respiratory system the first biological target. The aim of this work is to develop a multiscale computational model for the physiologically and biochemically based simulation of toxicodynamic processes associated with ENP inhalation. Computational modules for different biological effects have been developed as components of a dose-response risk information analysis system, which provides a mechanistic framework utilizing data from both in vitro and in vivo studies designed specifically for this purpose. Modules are developed for multiple biological scales (from cell to tissue to organ) within the respiratory system, providing explicit characterization of processes such as surfactant dynamics and the uptake and elimination of ENPs by cells. The respiratory system was decomposed into functional modules, with alveolar surfactant dynamics, cellular dynamics, cellular inflammation and ENP-surfactant interactions considered separately. The model, first developed and parameterized for mice, was extended to rats using additional species-specific data and will ultimately be extrapolated to humans. The model also considers the mechanistic pathways involved in pulmonary cellular inflammation, with parameters estimated from in vitro measurements involving each cell type and ENPs. Surfactant levels and composition, and the resultant changes in alveolar surface tension, were linked to lung function using a pulmonary mechanics model. The model predictions compared well with lung function measurements following forced oscillatory breathing maneuvers and with in vivo measurements of cell counts, cytokines and surfactants in the lung lavage fluid of ENP-exposed rodents. Ongoing work focuses on adapting the mechanistic modules to support risk analysis of human exposures to ENPs.

W2-H.1 Munns, J; Schafer Corporation; [email protected] A vector approach to measuring deterrence in adversary-informed, scenario-based risk analyses When a scenario-based risk model involves an adversarial actor that behaves to some extent outside of the system being modeled, deterrence is generally a critical consideration. It is often difficult to quantify, and therefore describe, the value of any deterrent effects realized by defensive measures tested in the risk model. This presentation will propose a novel method for measuring deterrence in a relative sense, with the goal of improving the decisions informed by a risk analysis. The approach relies on the likelihood vector generated by the risk scenario tree: each scenario has a likelihood, and the collection of likelihoods can be considered a vector of n dimensions, where n is the number of scenarios being evaluated. If the components of that likelihood vector are determined, or at least influenced, by an adversary, and if we assume that there is deterrent value in forcing an adversary to change its plans, then any change to the likelihood vector can be measured and attributed a corresponding deterrent value. The change in the likelihood vector can be measured as the angle between the old and new vectors, as the magnitude of the vector that results from subtracting one from the other, or with reference to some other point in the decision space. This approach could also be applied to a utility vector if an adversary calculates a utility for every scenario or group of scenarios. This presentation considers several basic examples of the proposed vector-based approach, suggests which forms of risk analysis the method is best used with, and provides several suggestions for how these metrics are best communicated and visually represented. Additional consideration is given to handling the case where scenario likelihoods are uncertain and are represented by probability functions.
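
Both proposed measures are straightforward to compute. The minimal sketch below, with invented likelihood numbers, shows the angle between the old and new likelihood vectors and the magnitude of their difference:

import math

def angle_between(old, new):
    # Angle in radians between two likelihood vectors.
    dot = sum(a * b for a, b in zip(old, new))
    norm = (math.sqrt(sum(a * a for a in old))
            * math.sqrt(sum(b * b for b in new)))
    return math.acos(max(-1.0, min(1.0, dot / norm)))

def shift_magnitude(old, new):
    # Euclidean norm of the difference vector (old minus new).
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(old, new)))

old = [0.50, 0.30, 0.20]  # scenario likelihoods before the defensive measure
new = [0.20, 0.45, 0.35]  # likelihoods after the adversary adapts
print(angle_between(old, new), shift_magnitude(old, new))

Either quantity can then be attributed a deterrent value; which one communicates better to decision makers is part of what the presentation takes up.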


M3-J.4 Murphy, PM; George Washington University; [email protected] Electricity and development: a risk-based analysis of grid extension and distributed energy resources More than 1.3 billion people, 20% of the world’s population, lack access to an electricity grid, and fewer than 4% of people and enterprises in the rural developing world are connected to one. As energy access is critical for economic development, this has huge impacts on current and potential development. There has been much analysis and effort towards determining when and where grid extension or distributed energy resources (DER) are more economical for addressing this challenge, but what all of these analyses lack is a recognition that even where the grid reaches in the developing world, it is not reliable: blackouts are common and lengthy. The trade-off is not necessarily just between bulk grid and DER, but whether both are necessary, affordable and optimal for supporting near-term and long-term development. This research will (1) provide a brief examination of the importance of electricity for development; (2) review some of the past work on grid power vs. DER; (3) provide some context for the lack of reliability in the developing world’s electricity infrastructure; and (4) provide recommendations for modifying grid-vs.-DER analysis to consider an unreliable grid. Analyses will focus on Northern Uganda, using three sites where we are installing kilowatt-scale solar-hybrid microgrids.

W4-D.2 Nadadur, S*; NIEHS; [email protected] Toxicological and Health Effects Assessment Efforts for MWCNTs in the NCNHIR Consortium The mechanical strength and excellent physicochemical properties of carbon nanotubes (CNTs) have led to their widespread industrial use and application. It is estimated that, with diverse surface and functional modifications, about 50,000 different types of carbon nanotubes can be produced. The expanded use of CNTs in flame-retardant coatings for textiles and upholstery increases potential human exposure. Animal toxicological studies suggest that CNTs can induce frustrated phagocytosis, granulomatous inflammation and fibrosis. Hence, toxicological assessment approaches intended to provide biological plausibility and a scientific basis for health risk assessment should incorporate systematic high-throughput in vitro screening with further in vivo validation. This presentation will detail current research efforts within the NIEHS Centers for Nanotechnology Health Implications Research (NCNHIR) consortium to integrate in vitro and in vivo studies on a select set of commercially produced multiwall carbon nanotubes (MWCNTs) across different laboratories. These efforts are expected to provide a detailed understanding of the contributions of MWCNT purity, length, state of aggregation, and surface chemistry to biological and statistical modeling for determining hazard potential and health effects in support of risk assessment. The views and opinions expressed here do not necessarily represent the opinions or conclusions of NIEHS, NIH or the US Government.

T4-I.5 Naito, W*; Yasutaka, T; National Institute of Advanced Industrial Science and Technology; [email protected] Cost and Effectiveness of Decontamination Options in the Special Decontamination Areas in Fukushima The objective of this study is to evaluate the cost and effectiveness, in terms of external doses, of decontamination options in the Special Decontamination Areas in Fukushima. A Geographic Information System (GIS) was used to relate the predicted external dose in the affected areas to the number of potential inhabitants and the land use in those areas. A comprehensive review of the costs of different decontamination options was conducted and the results compiled for use in the analyses. The results suggest that areal decontamination in the Special Decontamination Areas would reduce the air dose rate to the target level within a short period for some, but not all, of the area. In a standard option scenario, the cost-effectiveness analysis implied that decontamination costs for agricultural areas account for most of the total decontamination costs, and that within the agricultural total, storage costs account for approximately 60%. In addition, the decontamination costs per person are estimated to vary greatly across unit areas. Our analysis implied that the choice of decontamination options could significantly decrease the costs of decontamination, and that to make the most of the limited budget, decontamination options should be determined according to air dose rates and future land use or town planning.
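
A toy version of the comparison, with invented costs, dose-rate reductions, and population density, shows how options can be ranked by cost per person-weighted reduction in air dose rate; this is only a sketch of the logic, not the study's GIS-based model.

options = [
    # (name, cost per km2 in yen, fraction of dose rate removed)
    ("topsoil removal", 5.0e8, 0.60),
    ("deep plowing",    1.0e8, 0.35),
]

baseline_rate = 10.0   # microSv/h before decontamination (placeholder)
inhabitants_km2 = 120  # potential inhabitants per km2 (placeholder)

for name, cost_km2, reduction in options:
    averted = baseline_rate * reduction * inhabitants_km2
    print(name, "cost per unit of person-weighted reduction:",
          round(cost_km2 / averted))

Run over a GIS grid, the same ratio varies with local dose rate and population, which is why the study finds per-person costs differing greatly across unit areas and argues for matching options to air dose rates and future land use.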

P.134 Nakayachi, K; Doshisha University; [email protected] Trust in a wide variety of risk managers after a catastrophic disaster The results of an opinion survey suggest that the public’s trust in technological and scientific experts deteriorated overall after the 2011 Tohoku earthquake in Japan. It is not surprising that trust in risk managers responsible for nuclear technology declined after the Level 7 accident at the Fukushima Daiichi nuclear power plant. Similarly, public confidence in experts responsible for disaster protection against earthquakes and tsunamis also decreased, given that the disaster claimed more than twenty thousand lives. Does the public, however, distrust experts who were not directly involved in predicting and responding to these disasters? The opinion survey mentioned above used the general terms “scientists” and “technical experts” as the job descriptions to be evaluated by the respondents. An expert, however, is a specialist with competency in a particular area. Therefore, if the public’s trust in experts had deteriorated overall, risk managers in other areas (e.g., BSE, new infectious diseases, agrochemicals) would also have been mistrusted more than before the Tohoku earthquake. This research empirically examines the fluctuation in public trust in a wide variety of risk managers after the 2011 Tohoku earthquake, comparing survey data collected before and after the earthquake. The results revealed that, of the fifty-one risk management systems examined, only two (earthquakes and nuclear power) showed decreased levels of trust. There was no significant difference in the trust scores for the thirty hazards unrelated to earthquakes and nuclear power. Interestingly, for the other nineteen hazards, trust actually increased without any evidence that risk management had improved. The finding that public trust in risk managers unrelated to earthquakes and nuclear power increased after the earthquake is well explained by the finite-pool-of-worry hypothesis and supported by the findings of Nakayachi et al. (in press).


P.87 Nance, P; Kroner, O; Dourson, M*; Toxicology Excellence for Risk Assessment; [email protected] Kids + chemical safety: a tool for educating the public about chemicals With an increasingly “plugged-in,” connected, and informed public, there is an evolving need for rapid availability and global dissemination of accurate information. Important decisions about personal health and public health and safety are made daily by the scientific and medical community, legislators, the public, and the media, often based on inaccurate, incomplete, or biased information from the internet. The exposure of children to chemicals in their environment, and the possible effects on childhood growth and development, are a paramount social concern. Many websites dedicated to children and chemical exposures are available. However, these websites can generally be characterized as either government sites that are technically dense, not interactive, and primarily targeted at the scientific community, or sites developed by special interest groups that lack technical depth, may or may not accurately represent the toxicology of the subject chemicals, and may or may not be interactive, but that are nevertheless written at a level understandable to a broad public audience. A challenge for protecting children's health from chemical exposures is to develop a website that can rapidly communicate independent, scientifically accurate information needed to make important decisions in a way that a broad user audience can understand and apply. Kids + Chemical Safety is a scientifically accurate website, staffed by experts in toxicology, public health protection, and science communication, that represents perspectives evenly, provides current information, and is interactive and understandable, serving a broad audience of scientists, parents, and the media.

T1-J.2 Nardinelli, C; Food and Drug Administration; [email protected] Good Practices, Bad Practices, Benefits and Costs The Food and Drug Administration enforces regulations calling for current good manufacturing practices for food, feed, pharmaceuticals, medical devices, and dietary supplements. The regulations prescribe manufacturing practices that must be followed, as well as certain recommended but not required practices. Although the regulations in different forms have been in place for several decades, the underlying risks targeted by the regulations remain uncertain. In addition to uncertainty about the risks, the relation between the provisions of the rules and the risks to be mitigated is also uncertain. These uncertainties lead to the following questions: What are the risks to be mitigated? How do these regulations address them? Why are design standards mixed with performance standards? Is the mix of design and performance standards compatible with the risks? The difficulty of answering these questions has undermined efforts to estimate benefits and costs. In order to develop methods to estimate the benefits and costs of good manufacturing practices, I look at several justifications for these regulations and assess their plausibility in light of the risk management strategy. The justifications considered include preventing bad practices, information differences between experts and lay persons, enforcement costs, and differences in risk aversion. These various theories underlying the need for good manufacturing practices can be analyzed in light of existing evidence on the risks to be mitigated and the value placed on mitigating those risks. An additional assessment will deal with whether general good manufacturing practices are the most efficient way to address the different risks associated with different products.

W3-K.2 Nateghi, R; Johns Hopkins University; [email protected] Reflections on how to conceptualize and assess the performance and risk of different types of complex systems This talk provides some reflections on how to conceptualize and assess the performance and risk of complex systems that are impacted by a wide range of endogenous and exogenous stressors, such as energy systems. Different performance and risk metrics are discussed, both qualitative and quantitative, using probability and alternative ways of expressing uncertainties. The suitability of these metrics is studied under different frame conditions, such as the assessment's timeline; uncertain future regulations and policies; and technological, environmental, and demographic changes.

M4-E.3 Nemickas, H; Sager, S*; Navon, D; Hubbard, T; ARCADIS; [email protected] Evaluation of a simple steady-state model: estimating resuspended particles in indoor air This presentation evaluates a screening-level methodology used to quantify potential inhalation of resuspended constituents present on surfaces within buildings. The U.S. Army Center for Health Promotion and Preventive Medicine, Technical Guide 312, Health Risk Assessment Methods and Screening Levels for Evaluating Office Worker Exposures to Contaminants on Indoor Surfaces Using Surface Wipe Data (TG 312), includes methodology to estimate particulate concentrations in indoor air based on resuspension from contaminated surfaces. The simple steady-state model accounts for the rate at which surface contaminants are resuspended into the air, loss due to deposition, and air exchanges in the building or room. TG 312 documents the uncertainties associated with estimating resuspended particulate concentrations, recognizing that numerous factors affect resuspension of surface particles. While no quantitative relationships between surface wipe concentrations and concurrent air concentrations have been reported, the most common way to estimate particles resuspended from a surface is to use empirical or estimated resuspension factors (i.e., the ratio of air concentration to surface concentration) from the literature. Differences between surface wipe data and concurrent air concentrations ranging over six orders of magnitude have been reported. The simple steady-state model, while incorporating deposition loss and air exchange, ultimately relies on a resuspension factor to estimate air concentrations. Empirical data collected during renovation work at one site indicated that measured air concentrations closely matched (i.e., within an order of magnitude) air concentrations estimated using the simple steady-state model. Surface wipe and indoor air data from multiple sites will be used to evaluate this simple steady-state model for different constituents and under different building conditions. Potential implications for application of the methodology will be discussed.
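The kind of single-compartment mass balance such a model embodies can be sketched as follows; this is a minimal illustration, not TG 312's actual equations, and every default parameter value is an assumption:

    # Steady state in one well-mixed room: resuspension source balanced by
    # air exchange and deposition losses. All parameter values are illustrative.
    def steady_state_air_conc(surface_loading_ug_m2,    # wipe-sample loading, ug/m2
                              resuspension_rate_per_h=1e-4,  # fraction resuspended/h
                              floor_area_m2=50.0,
                              room_volume_m3=150.0,
                              air_exchange_per_h=1.0,   # air changes per hour
                              deposition_loss_per_h=0.5):
        source_ug_h = surface_loading_ug_m2 * resuspension_rate_per_h * floor_area_m2
        loss_m3_h = room_volume_m3 * (air_exchange_per_h + deposition_loss_per_h)
        return source_ug_h / loss_m3_h  # ug/m3 at steady state

    print(steady_state_air_conc(100.0))  # about 0.0022 ug/m3 for these inputs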


W2-H.3 Nilsen, M*; Hawkins, B; Cox, J; Gooding, R; Whitmire, M; Battelle Memorial Institute and the Department of Homeland Security Chemical Security Analysis Center; [email protected] Conquering the Iron Triangle of SME Elicitation The Chemical Terrorism Risk Assessment (CTRA) is a DHS CSAC-funded program that estimates the risk among chemical terrorism attack scenarios and assists in prioritizing mitigation strategies. The CTRA relies on Subject Matter Expert (SME) elicitation of the Law Enforcement and Intelligence Communities to quantify the threat and vulnerability parameters associated with various attack scenarios, as well as the capabilities of terrorist organizations. The term “Iron Triangle” is frequently used to describe situations with three interconnected constraints in which one constraint cannot be changed without changing at least one of the others. One canonical example of this concept is the Iron Triangle of project management: Scope, Schedule, and Budget. The challenges of SME elicitation can likewise be characterized as an Iron Triangle of Level of Detail, Data Quality, and SME Fatigue. Level of Detail in this context can be defined as the resolution of the elicitation results with respect to capturing intricacies and dependencies among the components of attack scenarios. Data Quality in elicitation terms can be defined as the degree to which the results accurately capture the analysts' opinions. SME Fatigue is a commonly used term in elicitation referring to the level of effort, measured in both hours and number of in-person meetings, required for an SME to participate. Innovative techniques, such as interactive tools, utility models, and YouTube-style video clips, are discussed as methods to conquer the Iron Triangle of elicitation and improve both the level of detail and the quality of data while reducing SME fatigue. The impact of these techniques will be highlighted by comparing past CTRA elicitation procedures with more recent CTRA elicitation techniques.

W3-K.1 North, DW; NorthWorks, Inc.; [email protected] Probability theory for inductive reasoning: The "necessarist" viewpoint as an alternative, and supplement, to subjective probability Many practitioners of decision analysis (applied Bayesian decision theory) regard its foundation for the use of probability theory as the axiomatization by Leonard J. Savage, and the procedure for assessing probabilities as subjective judgment in choosing among wagers. Another tradition, the “necessarist” approach, views probability theory as the logic for scientific inference. That is, judgments are in principle built up from available information using an explicit formal process, rather than depending on the individual cognition of subjects as the source of the probability judgments, with the multiple potentials for biases that have been explored in psychological research. The basis for the necessarist approach can be found in the writings of Laplace, two centuries ago. Early twentieth-century scholars include economist John Maynard Keynes and geophysicist Harold Jeffreys. Later contributors include Bruno de Finetti, physicist Richard T. Cox, and physicist E.T. Jaynes, whose 2003 book, Probability Theory: The Logic of Science, provides one of the best recent expositions of the necessarist viewpoint. We review this history and provide an overview of how symmetry or invariance, generalizations of Laplace's Principle of Insufficient Reason, can provide the basis for choosing probabilities in a way that avoids recourse to subjective judgments about wagers.

P.86 Ohkubo, C; Japan EMF Information Center; [email protected] Risk Communication Activities on Health Risks by the Japan EMF Information Center In response to the World Health Organization (WHO)'s publication of Environmental Health Criteria monograph (EHC) No. 238 and WHO Fact Sheet No. 322 on extremely low frequency (ELF) electromagnetic fields, the Japanese Ministry of Economy, Trade and Industry (METI) set up a Working Group on Electric Power Facility and EMF Policy in June 2007. The Working Group compiled a report incorporating its recommendations to the METI. To address issues related to potential long-term effects of ELF-EMF, the Working Group recommended that a neutral and permanent EMF information center be established to promote risk communication and facilitate people's understanding based on scientific evidence. In response to this recommendation, the Japan EMF Information Center (JEIC) was established in July 2008. JEIC is financed by donations from stakeholders. An Administration Audit Committee was founded to ensure and monitor the neutrality and transparency of JEIC operations. JEIC is determined to develop into a world-class risk communication center with expertise in EMF. JEIC's philosophy and purpose are to provide easy-to-understand scientific information on EMF and its health effects, to narrow the gap in risk perception among stakeholders, and to promote risk communication from a fair perspective. JEIC's activities toward these purposes include (1) creating an EMF information database, including a database of EMF research; (2) communicating with the mass media; (3) organizing public meetings; and (4) answering questions by telephone and email.

P.117 Ohno, K*; Asami, M; Matsui, Y; National Institute of Public Health, National Institute of Public Health, Hokkaido University, Japan; [email protected] Questionnaire survey on water ingestion rates for various types of liquid and the seasonal differences between summer and winter Water ingestion rates, not only from community water supplies but also from commercial beverages, were surveyed to obtain ingestion rates for various types of water and to investigate seasonal differences. The surveys were conducted during winter and summer of 2012. From the general population aged 20–79 in Japan, members of an Internet research company were invited to participate; 1188 individuals responded in winter and 1278 in summer. Respondents were asked to record the daily intake volume of each kind of water on two arbitrary working days and one non-working day during the research period. The kinds of water were non-heated and heated tap water as a beverage, soup made from tap water, bottled water, and commercial beverages including alcoholic drinks. There was little difference in water ingestion rates among the three days, so the first day's results are presented hereafter. The mean water intakes from all kinds of water in summer and winter were 1936 and 1638 mL/day, respectively (95th percentiles: 3570 and 2900 mL/day); the summer mean is 1.2 times the winter mean. Nevertheless, the mean intakes from tap water, including soup, in summer and winter were 1159 and 1124 mL/day, respectively (95th percentiles: 2400 and 2200 mL/day), showing only small seasonal differences. The main component causing seasonal differences was commercial beverages, with mean intakes in summer and winter of 635 and 437 mL/day, respectively (95th percentiles: 2500 and 1200 mL/day). With respect to tap water, large seasonal differences were observed for non-heated water: the mean intake in summer was 2.1 times that in winter (545 vs. 255 mL/day; 95th percentiles: 1676 and 950 mL/day). In conclusion, seasonal differences in water intake should be considered in risk assessment, especially microbial risk assessment; the default water ingestion rate of 2 L/day may not always be the most appropriate, depending on the types of water taken into account.


P.44 Oka, T; Fukui Prefectural University; [email protected] Cost-effectiveness of the decontamination activities in the evacuation zones due to the Fukushima nuclear accident Under the Act on Special Measures Concerning the Handling of Radioactive Pollution, the areas contaminated with radioactive materials due to the accident at Fukushima Daiichi Nuclear Power Station are broken down into two categories: i) the Special Decontamination Area, which consists of the areas within a 20 km radius of Fukushima Daiichi Nuclear Power Station and areas where the annual cumulative dose could exceed 20 mSv, and ii) the Intensive Contamination Survey Area, in which more than 1 mSv/y of additional exposure dose was observed. We have estimated the costs and effectiveness of the decontamination work being implemented in the Special Decontamination Area by specifying the decontamination method for houses and buildings, agricultural lands, forests, and roads, by assuming efficiencies for those methods, and on the basis of land-use data for each 1 km mesh. The effects of decontamination appear as reductions in air dose rate, from which reductions in annual cumulative dose are calculated assuming a value for the occupancy/shielding factor. Taking into account the return ratio of the evacuated people as a function of annual cumulative dose, estimated from questionnaire surveys of the municipalities in the affected area, the effects of decontamination are represented by return-ratio-weighted reductions in cumulative dose: the integral of the return ratio with respect to dose, taken from the dose after decontamination to the dose before it. These values are converted into reductions in the loss of life expectancy due to exposure over the next 30 years, which, combined with the costs of decontamination, yield values for the cost per life-year saved. The resulting cost per life-year saved averages 5 billion yen over the total area (ranging from 10^8 to 10^12 yen) when the occupancy/shielding factor is 0.6. How cost-effectiveness relates to the level of contamination depends on the value of the occupancy/shielding factor.
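In symbols (notation mine, not the author's): writing r(D) for the return ratio at annual cumulative dose D, and D_0 and D_1 for the doses before and after decontamination, the return-ratio-weighted effect E and the resulting ratio are

    \[
      E = \int_{D_1}^{D_0} r(D)\, dD,
      \qquad
      \text{cost per life-year saved} = \frac{C_{\text{decon}}}{\Delta \text{LE}(E)},
    \]

where C_decon is the decontamination cost and Delta LE(E) is the reduction, implied by E, in the loss of life expectancy from exposure over the next 30 years.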

P.97 Okada, T*; Inaba, T; Hitotsubashi University; [email protected] Exploring the impact of negative emotions on information seeking about radioactive food contamination in Japan after March 11, 2011 Although previous research shows that negative emotions can increase information seeking, few studies have examined the qualitative outcomes of this behavior. The nature of these outcomes is crucial, for if people receive biased information or acquire incorrect knowledge, the uninformed may suffer. Thus, we explore the extent of information seeking, the viewpoints (negative or positive) of the information sought, and the knowledge gleaned through anxiety- and anger-induced information search on measures of radioactive food contamination. Since the March 2011 nuclear power plant accident in Japan, the government has monitored the radioactivity of foods produced in affected areas, banning shipments when levels exceed scientifically established legal limits. But widespread skepticism remains as to whether food within those limits is truly safe. Survey data (N=800) provided the following results. First, anxious people were more likely to seek news from many sources. Next, anxiety increased exposure to negative views about legal standards for food radioactivity, while anger led to more exposure to positive views; no significant effect of emotions was found on knowledge of food safety. Although anxiety was positively related to knowledge of the impact of low-dose radiation on health (government-provided information frequently reported shortly after the accident), no relationship was found between emotions and knowledge of the rationales for changed legal standards or their implementation. Finally, the model was not significant in predicting whether or not one's understanding of the changing legal standards was accurate. These results imply that each emotion influences the quality of information seeking differently, and that, irrespective of the number of news sources sought, the sources people turn to tend to be biased toward negative views. Further, while anxiety may direct attention to news about potential harm, the timing and characteristics of the reported news may be critical to the economic consequences of that information seeking.

T4-E.4 Okelo, PO*; Hooberman, B; Graber, G; Bartholomew, MJ; Stewart, KN; FDA Center for Veterinary Medicine; FDA Office of Foods and Veterinary Medicine; AFSS Consulting; FDA Center for Veterinary Medicine; FDA Center for Veterinary Medicine; [email protected] Risk-Ranking Model for Hazards in Animal Feed The FDA Center for Veterinary Medicine is developing a semi-quantitative model that ranks health risks to animals and humans associated with animal feed hazards. Explanatory variables representing the two components of the risk model, namely the health consequence and exposure components, were selected as follows: Risk Score = Health Consequence Score (HCS) x Exposure Score (ES), where HCS = Severity of Illness Score (SIS) x Likelihood of Illness Score (LIS). The variables were then characterized and rated as follows. Severity of illness variables: 1) organ system (e.g., renal, respiratory, cardiovascular), 2) severity of signs and symptoms (e.g., low, moderate, high), and 3) duration of signs and symptoms (e.g., < 1 day, 2-5 days, > 5 days). Likelihood of illness variables: 1) safe exposure level (e.g., high, moderate, low), and 2) agent stability (e.g., low, moderate, high). Exposure variables: 1) contamination rate (e.g., low, moderate, high), 2) manufacturing effect (e.g., high, moderate, low, no effect), 3) post-processing control (e.g., in control, out of control), 4) proportion of animals consuming the feed (e.g., small, medium, large), and 5) feed consumption frequency (e.g., once a day, 3 times a day, ad libitum). The model was implemented on a user-friendly Excel spreadsheet platform that prompts the user to select from a list of qualitative statements and/or to provide quantitative data concerning the identified explanatory variables for a specific species-feed-hazard combination. An algorithm written into the spreadsheet converts the qualitative inputs into numerical values, which are combined with the quantitative inputs in a series of logical steps using standard mathematical spreadsheet functions. The results are used to generate indices of animal and human health risk. We will walk through the model to develop a risk score for one animal feed hazard.
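A minimal sketch of the multiplicative scoring arithmetic above, assuming a hypothetical 1-3 numeric rating for every variable in which 3 is always the most risk-increasing choice (so a low safe exposure level rates 3); the spreadsheet's actual scales and qualitative-to-numeric mapping may differ:

    # Each argument is a 1-3 rating; 3 always denotes the most risk-increasing
    # qualitative choice. Scales are assumptions, not the model's own.
    def risk_score(severity, duration, safe_exposure, stability,
                   contamination, proportion):
        sis = severity * duration        # Severity of Illness Score
        lis = safe_exposure * stability  # Likelihood of Illness Score
        hcs = sis * lis                  # Health Consequence Score = SIS x LIS
        es = contamination * proportion  # Exposure Score
        return hcs * es                  # Risk Score = HCS x ES

    # Example species-feed-hazard combination (all ratings illustrative):
    # high severity (3), moderate duration (2), low safe exposure level (3),
    # high agent stability (3), moderate contamination (2), large proportion (3).
    print(risk_score(3, 2, 3, 3, 2, 3))  # -> 324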

T4-F.3 Okwesili, P*; Mazzuchi, T; Sarkani, S; George Washington University; [email protected] A Risk Assessment for Informed Decision-making for Non-traditional Pharmacy Compounding The recent multi-state outbreak of fungal meningitis and other infections has killed 53 people and sickened more than 733 across 20 states to date. The outbreak, linked to a non-traditional compounding pharmacy in Framingham, Massachusetts, demonstrates the need for more rigorous regulatory oversight of compounding pharmacies. It is one of over 240 incidents associated with improperly compounded products reported to FDA over the last two decades. The regulation of pharmacy compounding has proven challenging because non-traditional compounders act more like pharmaceutical manufacturers than traditional compounding pharmacies. Some of these pharmacies prepare medicines in large-volume production runs and ship across state lines while claiming the protections of traditional compounding pharmacies, which are subject to state regulations and not directly regulated by federal oversight. In her congressional testimony in November 2012, FDA Commissioner Dr. Margaret Hamburg proposed a three-tiered policy framework for a risk-based regulatory system that would make non-traditional compounding subject to federal safety and quality standards. The tiers are 1) Manufacturers, 2) Traditional Pharmacy Compounders, and 3) Non-traditional Pharmacy Compounders. To address the policy-making challenge for the non-traditional pharmacy compounding tier, the authors identify the critical risk factors of non-traditional pharmacy compounding, assess the degree of influence of these risk factors on the likelihood of an adverse event, and analyze their impact on public health. This is accomplished using a Hierarchical Holographic Modeling (HHM) approach and Multi-Objective Decision Trees (MODT) to create an integrated method for analyzing the critical risk factors that can be used in a decision-making framework.


T1-B.1 Olden, K*; Vandenberg, J; Kadry, A; Deener, K; US Government; [email protected] Advancing Human Health Risk Assessment at the United States Environmental Protection Agency There are currently more than 80,000 chemicals in commerce, and an additional 1,000 new chemicals are introduced each year. Only a small fraction of these chemicals have been adequately assessed for potential risk. Human health risk assessment (HHRA) is a critical tool for characterizing the potential risk to human health of exposure to environmental contaminants. To produce defensible, scientific assessments of risk, and therefore make defensible decisions to manage or mitigate risk, risk assessors and managers need scientifically rigorous information about hazard, dose-response, and exposure. EPA's HHRA research program develops health assessments that include hazard identification and dose-response information; exposure and risk assessment tools; and methods to advance the science of human health risk assessment. This information is used by EPA and others to make decisions, develop regulatory standards for environmental contaminants, and manage cleanups. The HHRA program is committed to generating timely, credible human health assessments of individual chemicals and chemical mixtures to support priority EPA risk management decisions, thereby enabling EPA to better predict and prevent risk. The program comprises four research themes: (1) Integrated Risk Information System (IRIS) health hazard and dose-response assessments; (2) Integrated Science Assessments (ISAs) of criteria air pollutants; (3) Community Risk and Technical Support for exposure and health assessments; and (4) methods, models, and approaches to modernize risk assessment for the 21st century. The HHRA research program collaborates with Agency, federal, state, regional, national, and international partners in the scientific and risk assessment communities to ensure that assessment products are scientifically rigorous, developed in a transparent manner, and relevant to stakeholder needs. The views expressed in this abstract do not necessarily represent the views or policies of the U.S. Environmental Protection Agency.

P.146 Ollison, W*; Capel, J; Johnson, T; 1 - American Petroleum Institute, 1220 L Street, NW, Washington, DC 20005; 2 - Consultant, 1005 Demerius Street, Durham, NC 27701; 3 - TRJ Environmental, Inc., 713 Shadylawn Road, Chapel Hill, NC 27914; [email protected] Sensitivity of regulatory ozone risk assessment to improved exposure and response models We evaluate the sensitivity of EPA’s current ozone (O3) exposure model (APEX) to (1) alternative pulmonary function response models, (2) attainment AQ rollback approaches, (3) altitude effects, and (4) newly measured O3 penetration/deposition rates and microenvironmental (ME) factors, corrected for O3 measurement error. Results are provided for Denver air quality (AQ) scenarios representing 2006 “as is” conditions and the attainment of the current ozone NAAQS. We test recently published pulmonary function models that incorporate realistic O3 response thresholds and subject response variability proportional to the level of response. A CAMx model is used to adjust 2006 Denver AQ to simulate NAAQS attainment conditions. The modeled rollback projections account for NOx control-related increases in urban and background O3 levels from reduced NO-O3 titration that are not addressed by EPA’s quadratic rollback approach. Inhaled O3 mass is adjusted to account for altitude acclimation among Denver residents. Impacts of newly measured indoor O3 penetration-deposition rates on estimated responses are compared to projections using current APEX indoor mass-balance model assumptions. APEX ME factors are also adjusted according to recent field measurements made using new interference-free O3 photometers. Cumulative impacts of these updated components in the APEX exposure analysis are tabulated and compared to those of the current APEX model.

W2-H.4 Olson, KC*; Karvetski, CW; George Mason University; [email protected] Probabilistic Coherence Weighting for Increasing Accuracy of Expert Judgment In this presentation, we provide a new way to generate accurate probability estimates from a group of diverse judges. The experiments we report involve elicitation of probability estimates that belong to sets in which only one probability is of primary interest; the other estimates serve to measure the individual judges' coherence within sets. These experiments extend previous efforts to increase accuracy of aggregate estimates by weighting probabilistic forecasts according to their coherence. Our method shows that asking for only two additional judgments can achieve significant increases in accuracy over a simple average. In the aggregation, we adjust the judgments of each participant to be coherent and calculate weights for the judgments based on the earlier incoherence across all participants. More generally, our findings provide insight into the trade-off between methods that aim to increase accuracy of judgments by improving their coherence with experimental manipulations and methods that leverage the incoherence of judgments to increase accuracy during aggregation. Our two studies show that concurrent judgments of related probabilities produce more accurate equal-weight averages but have less incoherence on which our coherence weighting operates. However, independent judgments of related probabilities produce less accurate linear averages but have more incoherence on which our coherence weighting can capitalize.
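One minimal way to realize such coherence weighting, assuming each judge estimates P(A), P(B), and P(A or B) for mutually exclusive events A and B; the authors' elicitation sets and weighting formula may differ from this sketch:

    import numpy as np

    # Rows: one judge's [P(A), P(B), P(A or B)]. For mutually exclusive A, B,
    # coherence requires p3 = p1 + p2. Estimates are illustrative.
    judges = np.array([[0.30, 0.20, 0.60],
                       [0.25, 0.25, 0.50],
                       [0.40, 0.10, 0.45]])

    # Project each judgment onto the coherent plane p1 + p2 - p3 = 0;
    # the residual distance along the normal measures incoherence.
    normal = np.array([1.0, 1.0, -1.0])
    resid = judges @ normal / (normal @ normal)
    coherent = judges - np.outer(resid, normal)  # nearest coherent judgments
    incoherence = np.abs(resid)

    weights = 1.0 / (incoherence + 1e-3)  # small constant avoids divide-by-zero
    weights /= weights.sum()

    print("simple average of P(A):  ", judges[:, 0].mean())
    print("coherence-weighted P(A): ", coherent[:, 0] @ weights)

Nearly coherent judges dominate the weighted aggregate, which is the leverage, and the trade-off, that the abstract describes.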

M3-I.3 Orosz, M*; Salazar, D; Chatterjee, S; Wei, D; Zhao, Y; University of Southern California; [email protected] Using PortSec for policy-making and risk-benefit balancing Seaports, airports, and other transportation nodal points face many challenges, including maximizing operational efficiency, minimizing risk from terrorism or other man-made and natural disaster events, and minimizing impacts to the environment. These challenges are often at odds with one another: improving on one often comes at the expense of the others. This is particularly true in our Nation's seaport infrastructures, where there is a need to secure the ports, but not at the expense of maintaining port operations. The University of Southern California's National Center for Risk and Economic Analysis of Terrorism Events (CREATE) is developing PortSec, the Port Security Risk Analysis and Resource Allocation System. Under funding from DHS S&T and in collaboration with the Ports of Los Angeles and Long Beach (POLA/LB) and the USCG, USC is currently extending a previously developed proof-of-concept PortSec prototype used for tactical security decision-making into a tool that allows decision-makers to examine the trade-offs of implementing security measures to protect the port while supporting the continued movement of shipped goods.


P.106 Orozco, G; Universidad del Norte; [email protected] Risk Management in Colombia: The Challenge of Development This article addresses both the vulnerability of biodiversity in the Colombian territory and the government's priority of promoting mining activities in favor of economic growth. The main purpose is to explain why current risk management is not adequate to address the major environmental problems in Colombia.

T4-K.3 Renn, O; State University of Stuttgart; [email protected] Socioeconomic Dimensions of Geo-engineering and Carbon Sequestration: Requirements for Sustainable Risk Governance In recent times, attempts to improve adaptation to climate change by means of large-scale CO2 absorption, sequestration, and storage options have been discussed in scientific and political communities. These options range from increasing the absorption capacity of the oceans, through changing weather patterns by inserting chemicals into the atmosphere, to separating and storing carbon from fossil power stations. All these options can only be effective in global terms when they are applied either in large quantities or over vast areas. So far the discussion has focused on technical feasibility, effectiveness, efficiency, and, to a lesser degree, environmental implications. However, projects of such size will trigger major social, cultural, and psychological impacts that need to be anticipated and taken into account before any far-reaching decisions are made. The model of risk governance provides a suitable framework for including such impacts in the assessment, evaluation, and management of the anticipated benefits and risks of these projects. The paper will address the risk governance framework and explore methodologies for including the socioeconomic dimensions in an integrated effort to balance the pros and cons of large-scale geo-engineering. Based on empirical studies of public perception and evaluation of carbon sequestration in Germany, the emphasis will be on public acceptability, environmental justice issues, and social costs.

M4-D.4 Oryang, D*; Fanaselle, F; Anyamba, A; Small, J; Food and Drug Administration, Center for Food Safety and Applied Nutrition; NASA Goddard Space Flight Center; [email protected] Using geospatial risk assessment to forecast produce contamination potential New responsibilities and challenges require FDA to develop innovative tools and approaches to protect food safety and public health. Risk assessment is increasingly used to support the scientific basis of, and inform, decision making. In a novel approach, geographic information systems (GIS) and remote sensing are being used by FDA, in collaboration with other agencies, to enhance the capability of risk assessment to take spatial and temporal dimensions into account, and thereby forecast where and when produce contamination events are more likely to occur. The GIS model requires an assessment of: hazards found in fresh produce; contamination modes; production practices (such as soil amendment and irrigation water sources and systems); factors that affect the growth and spread of pathogens; environmental impacts of urbanization and industrialization; livestock and wildlife populations and their proximity to crops; topography and soil types; the microbial status of surface water, wells, and reservoirs proximate to crops and used for irrigation; and the impacts of temperature, rainfall, irradiation, fog, and extreme events on contamination likelihood and amounts. These factors vary spatially and temporally, and acquiring geospatial and time-series data on them is crucial to the development of this novel system. This paper presents a synopsis of the effort by FDA, NASA, USDA-ARS, and USDA-APHIS to compile the data necessary to develop this geospatial risk assessment forecasting tool, which will provide early warning to industry and government about potential future locations, modes, and dates of produce contamination. This approach will enable industry and government to take pre-emptive measures to prevent contaminated produce from entering the food supply and causing illness or death.

M3-D.3 O'Rawe, J; Ferson, S*; Sugeno, M; Shoemaker, K; Balch, M; Goode, J; Applied Biomathematics; [email protected] Specifying input distributions: No method solves all problems A fundamental task in probabilistic risk analysis is selecting an appropriate distribution or other characterization with which to model each input variable within the risk calculation. Currently, many different and often incompatible approaches for selecting input distributions are commonly used, including the method of matching moments and similar distribution-fitting strategies, maximum likelihood estimation, Bayesian methods, and the maximum entropy criterion, among others. We compare and contrast six traditional methods and six recently proposed methods for their usefulness in risk analysis in specifying the marginal inputs to be used in probabilistic assessments. We apply each method to a series of challenge problems involving synthetic data, taking care to compare only analogous outputs from each method. We contrast the use of constraint analysis and conditionalization as alternative techniques to account for relevant information, and we compare criteria based on either optimization or performance to interpret empirical evidence in selecting input distributions. Despite the wide variety of available approaches for addressing this problem, none of the methods seems to suffice to handle all four kinds of uncertainty that risk analysts must routinely face: sampling uncertainty, arising because the entire relevant population cannot be measured; mensurational uncertainty, arising from the inability to measure quantities with infinite precision; demographic uncertainty, arising when continuous parameters must be estimated from discrete data; and model structure uncertainty, arising from doubt about the prior or the underlying data-generating process.
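As a small self-contained illustration of how two of the traditional methods named above can disagree on the same data, here is a synthetic example (not one of the paper's challenge problems), fitting a gamma input by matching moments and by maximum likelihood:

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    data = rng.gamma(shape=2.0, scale=3.0, size=25)  # small synthetic sample

    # Method of matching moments for a gamma distribution:
    m, v = data.mean(), data.var(ddof=1)
    shape_mm, scale_mm = m**2 / v, v / m

    # Maximum likelihood estimate (location fixed at zero):
    shape_ml, _, scale_ml = stats.gamma.fit(data, floc=0)

    # The two fitted inputs imply different upper tails for the risk model:
    for label, (a, s) in {"moments": (shape_mm, scale_mm),
                          "MLE    ": (shape_ml, scale_ml)}.items():
        print(label, "95th percentile:", stats.gamma.ppf(0.95, a, scale=s))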


W4-F.4 Palmer, MJ; UC Berkeley, Stanford; [email protected] Globally Networked Risks and the Decentralization of Biomanufacturing Advances in bioengineering technologies are catalyzing an expansion of research and development models. Horizontal technology platforms are emerging alongside more traditional vertically integrated activities (e.g., in medicine, industrial processing, agriculture, etc.). These horizontal technology platforms, sometimes referred to as bio-manufacturing capacities, include tools such as lower-cost DNA synthesis, sequencing, and assembly. These capacities have also expanded across a new set of actors, leveraging information-sharing tools and employing novel models to organize communities (e.g., the internet, social networking, crowd-sourced funding, DIY bio labs). This potential “decentralization of capacity” presents key challenges for the governance of biotechnology. These developments have been heralded as fueling an industrial revolution in the life sciences with significant economic potential. Yet biotechnology can both pose and mitigate key safety and security concerns (e.g., bioweapons development versus deterrence and preparedness; environmental release). This presentation will discuss organizational and institutional challenges in building bio-manufacturing capacities internationally. It will examine strategies in the US and abroad to foster distributed or centralized technology deployment and propose key factors essential to building resilient governance strategies.

P.67 Pang, H*; Buchanan, RL; Pradhan, AK; University of Maryland, College Park, MD 20742; Schaffner, DW; Rutgers University, New Brunswick, NJ 08901; [email protected] Quantitative Risk Assessment for Escherichia coli O157:H7 in Fresh-cut Lettuce Leafy green vegetables, including lettuce, are of serious food safety concern, as they are recognized vehicles for foodborne pathogens such as Escherichia coli O157:H7 that can cause human illness. Development and application of quantitative risk assessment models are recognized as strong tools to identify and minimize potential risks associated with foodborne pathogens. This study aimed to develop a quantitative microbial risk assessment (QMRA) model for E. coli O157:H7 in fresh-cut lettuce and to evaluate the effects of intervention strategies on public health risks. The supply chain of fresh-cut lettuce was modeled from in-field production until consumption at home. Using @RISK software, a simulation model was developed for exposure and health outcome assessment. The developed model was simulated using Latin Hypercube Sampling for 100,000 iterations to estimate the number of illnesses due to consumption of fresh-cut lettuce in the U.S. With a prevalence of 1% of lettuce coming into the processing plant contaminated, the baseline model (with no intervention strategies) predicted 8921 cases per year in the U.S. Each of the intervention strategies evaluated reduced the public health risk, with bacteriophage treatment the most effective. Sensitivity analysis indicated that irrigation water quality is the most important factor affecting the predicted number of cases. The developed risk model can be used to estimate the public health risk of E. coli O157:H7 from fresh-cut lettuce and to evaluate potential intervention strategies to mitigate that risk.
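A minimal Monte Carlo sketch of the exposure-to-illness chain in a QMRA of this kind; the distributions, log reduction, serving size, and beta-Poisson dose-response parameters are illustrative assumptions rather than this model's calibrated inputs (which were also sampled by Latin Hypercube rather than simple random draws):

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000                                 # iterations

    prevalence = 0.01                           # fraction of servings contaminated
    contaminated = rng.random(n) < prevalence
    log10_conc = rng.normal(-1.0, 1.0, n)       # log10 CFU/g on contaminated lettuce
    log10_reduction = rng.uniform(1.0, 2.0, n)  # washing/processing step
    serving_g = 85.0

    dose = np.where(contaminated,
                    10.0 ** (log10_conc - log10_reduction) * serving_g, 0.0)

    # Beta-Poisson dose-response (alpha, beta illustrative):
    alpha, beta = 0.16, 1000.0
    p_ill = 1.0 - (1.0 + dose / beta) ** (-alpha)

    print("mean illness risk per serving:", p_ill.mean())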

T1-H.3 Panjwani, S; THANE Inc; [email protected] Making risk-informed decisions using the next generation algorithms for cyber-security and Cyber-Physical Systems (CPS) risk assessment Cyber-security has become a game that is almost impossible to win (1). According to the Global State of Security survey (1), “The rules have changed, and opponents−old and new−are armed with expert technology skills, and the risks are greater than ever.” Despite this, more organizations are investing in security to achieve due diligence rather than to reduce risk. The inconsistencies and inability of current expert-driven risk assessment methods to improve the state of security are often cited as reasons to use a due-diligence-driven security management approach. A Cyber-Physical System (CPS) is a hybrid network of physical and engineered systems. CPS will play an integral role in developing the next generation of critical infrastructure, and faces a new type of risk profile, including cyber-security risks. Little research has been done to develop a scientific foundation for managing CPS risks, and better risk assessment methods are needed to improve decisions for managing complex and dynamic cyber systems. Currently, artificial intelligence algorithms are used to generate system reliability and cyber-security risk scenarios. These algorithms function by assuming that there are no unknowns and that everything is known a priori. This assumption is not valid for cyber-security and CPS risk assessment; making it produces counter-intuitive results and provides a false sense of security. However, eliminating this assumption precludes using the majority of current algorithms for generating risk scenarios. Using lessons learned from conducting a risk assessment for a critical-infrastructure CPS, the author developed a new type of algorithm for automatically generating risk scenarios for cyber-security and CPS. This new framework captures dynamic information and assumes that there are unknowns and that new knowledge becomes available frequently. The framework allows making risk-informed decisions in an unknown world. (1) PWC. Key findings from The Global State of Information Security® Survey 2013.

M3-A.2 Panjwani, S; THANE Inc; [email protected] Cyber-security Risk Management Current cyber-security risk management is driven by two complementary philosophies. The first strategy is called “penetrate-and-patch,” which focuses on identifying and patching vulnerabilities; often a team of security experts is used to identify exploitable vulnerabilities. The second strategy is called “secure-design,” which aims to prevent vulnerabilities by designing more secure systems and developing more secure software; this approach identifies secure coding practices by studying known vulnerabilities. To support these risk management strategies, current risk assessment methods focus on identifying vulnerabilities. The challenge with current vulnerability-centric risk management strategies is that, over the last decade, the overall number of reported vulnerabilities has increased. More importantly, despite efforts to make software and cyber-infrastructure more secure, all types of vulnerabilities that existed at the beginning of the decade still existed at the end. Current risk management methods also assume that there are no unknowns in the cyber-security domain and that all information is available a priori. This assumption produces counterintuitive results. Some experts have suggested replacing the risk-based approach with a due-diligence-based approach, citing the inconsistencies and inability of current expert-driven risk management methods to improve the state of security. Current cyber-security risk management and assessment methods need to be improved. However, the lack of improvement in the state of cyber-security stems not only from the limitations of current methods but also from a failure to understand the unique characteristics of the cyber-security domain. The author developed a new risk assessment framework that captures these unique requirements of the cyber-security domain, along with a new risk management philosophy that uses attacker behavior to lead the attacker away from the target.


M2-I.1 Pant, R*; Thacker, S; Hall, JW; Barr, S; Alderson, D; University of Oxford, University of Oxford, University of Oxford, Newcastle University, Newcastle University; [email protected] Building an integrated assessment methodology for national infrastructure risk assessment due to climate hazards For modern cities, addressing critical infrastructure risk has become highly relevant from a safety and security point of view. As seen in recent times, among the different shocks to which large-scale infrastructures are exposed, climate hazards have the potential to cause the most widespread disruptions. It is therefore important to understand the risks due to extreme climate events if we are to plan for future infrastructure protection and sustainability. In this research we have developed a national infrastructure risk assessment tool that improves our understanding of interdependent infrastructure failures and provides answers to important questions about damage propagation across infrastructures. The risk assessment tool estimates key aspects of spatial infrastructure risk, including: (1) the scenarios and probabilities of national infrastructure failures; (2) the locations of key vulnerabilities in national infrastructure networks; (3) the implications of interdependent failure propagation; (4) the consequences of national infrastructure failures; and (5) the sensitivity of the national infrastructures to multiple climate loading conditions and system states. We demonstrate the risk assessment tool through an initial risk analysis of interdependent national-scale energy and transport networks. The outcomes of such analysis provide important insights into critical infrastructure protection and risk management.

W3-H.1 Park, J*; Son, M; Park, C; Richardson, H; State University of New York at Buffalo; [email protected] Hurricane Sandy and Lost Four Days in the U.S. Economy Measuring the economic impacts of natural disasters is of increasingly common interest in the U.S. Ever since the U.S. economy experienced severe losses from the two hurricanes that consecutively hit the Gulf of Mexico coast in 2005, similar damage has occurred every year. The recent Hurricane Sandy is one of the greatest storms ever to hit the U.S. The physical disruptions and environmental damage caused by Hurricane Sandy demonstrate the fragility of NYC and Long Island in terms of built and natural environmental systems, and have prompted discussion of constructing seawalls and other coastal barriers around the shorelines of the NYC and Long Island area to minimize the risk of destructive consequences from another such event in the future. Unfortunately, the majority of studies of this type have depended upon governmental reports, focusing on the magnitude of direct building losses or on speculation about future impacts on a damaged area. These economic impact readings have not accounted for indirect effects via economic and trade linkages, even though most direct economic losses lead to further losses via inter-industrial and inter-regional economic relations. This study connects coastal hazards to economic impacts by applying jobs lost during the first four days affected by Sandy, available from a Census data source, to the NIEMO model. While New Jersey and Connecticut are two other seriously affected states, we analyzed the economic impacts of short-term job losses associated with Sandy, tracing Sandy's path from Florida to New Hampshire. As with Hurricanes Katrina and Rita, we found that Sandy brought another tragedy, mainly to the NYC and Long Island areas, with losses reaching $2.8 billion over the four days and 99% of the loss occurring on the last day. Furthermore, national impacts reached $10 billion in losses, as estimated by the NIEMO inter-industrial and inter-regional economic model.

P.32 Parker, AL*; Nance, PM; Maier, A; Toxicology Excellence for Risk Assessment; [email protected] Workplace Environmental Exposure Level (WEEL) Methodology with Octamethylcyclotetrasiloxane (D4) as a Case Study Workplace Environmental Exposure Levels (WEELs) are health-based guide values for chemical stressors developed by a volunteer group of professional experts known as the WEEL Committee. The WEEL Committee is a collaborative initiative with the goal of promoting worker health protection through increased access to high-quality occupational exposure limits, enhancements in methods for establishing worker-health exposure guidelines, and education and training in occupational risk assessment methods. A WEEL is intended to be an airborne chemical concentration to which nearly all workers may be repeatedly exposed, for a working lifetime, without experiencing adverse health effects. WEELs are derived using scientifically sound, state-of-the-art risk assessment procedures and a multi-tiered review process. An extensive review of all available relevant information of sufficient quality is used in the development of a WEEL. The Committee only considers developing WEELs where no valid guidance on chemical exposure exists. Candidate chemicals are identified from various sources, including USEPA High Production Volume (HPV) lists and chemicals solicited from stakeholders. A new stakeholder process allows interest groups or companies to request the development of a WEEL for a specific chemical of interest through the Occupational Alliance for Risk Science (OARS) initiative. The first stakeholder-sponsored WEEL, for octamethylcyclotetrasiloxane (D4), is in the final steps of this new process. The new WEEL and all other WEELs developed through OARS will be provided at no cost on the OARS website.

P.121 Parra, LM*; Munoz, F; Universidad de los Andes; [email protected] Quantitative Approach to Risk on Fuel Transportation Pipelines Hazardous materials transportation by pipeline is a widespread practice in industry all over the world. It is socially accepted as favorable; however, like any other industrial practice, it is dangerous and poses risks to society, the environment, and infrastructure. Since production sites are often far from consumption centers, increased demand in Colombia and the world has led to increased transportation needs, forcing the growth of the pipeline network. Performing risk analysis before incidents occur can provide engineering tools to support decision making regarding the compatibility of activities within a territory. This type of analysis examines information related to the consequences of a critical event, such as thermal radiation or overpressure effects that can affect communities in the vicinity of industrial facilities. This work develops a methodology for assessing the societal and individual risk associated with accidental events on fuel pipelines. It includes a study of past events in Colombia and Latin America, as well as an analysis of the causes and types of events. Because risk values vary along a pipeline due to differences in environmental and societal conditions, the analysis is performed for segments of the pipeline; to obtain manageable segments, a dynamic segmentation method is proposed. Individual and societal risk are estimated for each segment, identifying those that require priority actions. Societal risk values are presented as curves of accident frequency versus expected number of fatalities. The final part of this work proposes risk criteria to support decision making in Colombia with respect to land-use planning, generating tools to determine whether risk values can be reduced.
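The frequency-versus-fatalities curves mentioned above are conventionally F-N curves: for each fatality count N, the cumulative annual frequency of all scenarios causing at least N fatalities. A minimal sketch over an entirely hypothetical per-segment scenario list:

    # Hypothetical scenario list for one pipeline segment:
    # (frequency per year, expected fatalities).
    scenarios = [(1e-4, 1), (5e-5, 3), (2e-5, 10), (5e-6, 30), (1e-6, 100)]

    def fn_curve(scenarios):
        """Cumulative frequency F(N) of scenarios with at least N fatalities."""
        levels = sorted({n for _, n in scenarios})
        return [(N, sum(f for f, n in scenarios if n >= N)) for N in levels]

    for N, F in fn_curve(scenarios):
        print(f"N >= {N:4d}: F = {F:.1e} per year")

Plotting F against N on log-log axes and comparing the curve to a criterion line is the usual way such results support land-use decisions.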


M4-C.2 Pate-Cornell, ME; Stanford University; [email protected] On black swans and perfect storms I will discuss the origins of the terms "black swans" and "perfect storms", their relations to different types of uncertainties, and the way they have sometimes been used as excuses for bad decisions. I will then address some risk management strategies to deal with different classes of situations. These include monitoring (e.g., of signals of epidemics for the CDC) and gathering of information about the marginal and conditional probabilities (e.g., weather data) of events that, in rare combinations, can have disastrous effects. In all cases, the use of systems analysis and probabilities (mostly Bayesian) is key to decision support.

W4-A.4 Patel, M*; Owens, EO; Kirrane, E; Ross, M; National Center for Environmental Assessment, U.S. Environmental Protection Agency; [email protected] Transparently Implementing the Causal Framework in the EPA NAAQS Review For Integrated Science Assessments (ISAs), EPA assesses the body of relevant literature to draw conclusions on the causal relationships between air pollutant exposures and the health or environmental effects relevant to the review of the National Ambient Air Quality Standards (NAAQS). Causal determinations are made by applying EPA's causal framework, which describes consideration of the consistency of evidence from various scientific disciplines (e.g., epidemiologic, controlled human exposure, and animal toxicological studies), as well as evidence for plausible modes of action, for each of five causal determinations. A challenge in the ISAs is communicating the consistent application of the framework across the various evaluated outcomes, for which the evidence may vary in quantity, consistency, and the relative contributions of the various scientific disciplines. In order to better communicate how EPA considers the supporting evidence, uncertainties, and coherence across disciplines in drawing causal determinations, EPA developed summary-of-evidence tables for use in the ISAs. With these tables, EPA concisely summarizes the available evidence across scientific disciplines and demonstrates how this evidence relates to the attributes described in the causal framework. To describe the nature of the available evidence, the tables summarize the types of study designs available, potential biases, the control for potential confounding factors, the consistency of evidence, the relative contributions of the various scientific disciplines, and the exposure or biomarker levels associated with outcomes. This session will describe how the causal framework used in the NAAQS review is implemented and transparently applied, using the ISA for lead as an example. Disclaimer: The views expressed are those of the authors and do not necessarily reflect the views or policies of the US EPA.

W2-F.4 Patterson, J*; Nance, P; Dourson, M; Toxicology Excellence for Risk Assessment; [email protected] Best Practices for Independent Peer Reviews High-quality peer review is very valuable to ensure that the science used to support regulatory and public health decisions is sound. Peer reviewers evaluate the adequacy of the scientific data to support the conclusions. They consider whether key data were identified and interpreted correctly, appropriate methodologies were used, analyses of the data are appropriate, uncertainties have been clearly identified along with their attending implications, and information and conclusions are clearly communicated. Increasing attention has been given to the selection of peer reviewers and to determining whether particular individuals may have conflicts of interest. Ensuring the independence of the experts is an essential principle of high-quality peer review. Other key principles and practices are also important: using a robust scientific approach to focus the experts on the key scientific issues and questions; selecting experts with appropriate disciplinary knowledge and contextual experience; and ensuring transparency in the process and communication of results so as to best inform public health decisions. Additional practices and activities have been suggested, such as instituting an independent evaluation of whether document authors addressed the recommendations of the peer reviewers in finalizing their documents. As peer review becomes more widely used by government agencies and others, the perspectives and thoughts of the experts themselves must also be considered.

P.33 Patterson, J*; Becker, R; Borghoff, S; Casey, W; Dourson, M; Fowle, J; Hartung, T; Holsapple, M; Jones, B; Juberg, D; Kroner, O; Lamb, J; Marty, S; Mihaich, E; Rinckel, L; Van Der Kraak, G; Wade, M; Willett, C; 1,5,11 Toxicology Excellence for Risk Assessment (TERA); 2 American Chemistry Council; 3,9,15 Integrated Laboratory Systems (ILS); 4 National Institute of Environmental Health Sciences; 6 independent consultant; 7 Center for Alternatives to Animal Testing, Johns Hopkins University; 8 Battelle; 10 Dow AgroSciences; 12 Exponent, Inc.; 13 The Dow Chemical Company; 14 ER2; 16 University of Guelph; 17 Health Canada; 18 Humane Society of the United States; [email protected] Workshop on lessons learned, challenges, and opportunities: The U.S. Endocrine Disruptor Screening Program Fifty-two chemicals were recently screened using 11 Endocrine Disruptor Screening Program (EDSP) Tier 1 assays, and the data were submitted to the EPA for review. Over 240 scientists participated in a workshop on the EDSP in April 2013 to share scientific learnings and experiences with the EDSP and to identify opportunities to inform ongoing and future efforts to evaluate the endocrine disruption potential of chemicals. The first session focused on the conduct and performance of the 11 Tier 1 assays. Speakers and workshop participants highlighted challenges in conducting the assays and solutions developed by the laboratories, as well as issues relevant to data interpretation. The second session focused on how to apply relevant information from the current Tier 1 battery to identify potential modes of action, and on the value of a weight-of-evidence (WoE) assessment for evaluating potential interactions with endocrine pathways. Presentations and discussions explored the development of critical systematic evaluation of existing data prior to implementation of Tier 2 testing, and the application of alternative data to replace Tier 1 assays. The third session provided perspectives on the future of endocrine screening and the promise of in vitro high-throughput analyses, toxicity pathways, and prediction models. A number of common themes and suggestions emerged from the extensive discussions, including that a critical review and update of the current Tier 1 testing guidelines is needed, the use of biomonitoring data for exposure-based prioritization, reducing the number of animals used in testing, and use of a robust WoE approach to align available Tier 1 data with potency and exposure information to better inform decisions on Tier 2 testing.

W2-F.2 Paulson, G; Brennan, T*; US Environmental Protection Agency; [email protected] Developments in Scientific Peer Review at EPA The EPA has a strong tradition of scientific peer review. In fact, the agency’s Peer Review Policy notes that “Peer review of all scientific and technical information that is intended to inform or support agency decisions is encouraged and expected.” This presentation will cover the tradition of peer review at the agency as well as a few of the recent developments on peer review activities. First, the historical context for the importance and practice of peer review at the agency as summarized in the EPA Peer Review Handbook will be presented. The Agency is currently in the process of revising the 2006 third edition of the Handbook. One area of focus for this revision is options to increase agency oversight of implementation of the peer review policy. In addition, in May of 2013, EPA’s Acting Administrator announced a new process for peer reviews conducted by panels of experts selected and managed by EPA contractors. Much of the new process focuses on opportunities for public engagement on the peer review process and conflict of interest procedures for these contractor-led reviews. Finally, the Science Advisory Board’s (SAB) process of peer review will be summarized, from panel selection, to access to meeting materials, to new efforts by the SAB Staff Office to enhance public transparency of the work of the SAB.

P.51 Pawlisz, AV; Conestoga-Rovers & Associates; [email protected] Hydraulic Fracturing Failure Rates – Key to Understanding Actual Risks Extraction of natural gas deposits via hydraulic fracturing (fracking) has grown at an unprecedented rate in the United States and worldwide. For various reasons, this method of natural resource retrieval has met considerable opposition from the regulatory community and the public. One of the sensitive issues is the potential for adverse impacts to the environment and human health, particularly relative to groundwater extraction, drinking water pollution, deep chemical injection, well failures, blowouts, on-site spills, air emissions, transport accidents, and noise. This presentation compiles the most recent data on incident/accident/spill/release rates published by industry, government agencies, and the open literature. Failure data are used to conduct a predictive risk assessment in which the calculated odds ratios are compared to those for conventional hydrocarbon extraction methods. The overall objective is to provide insight into how fracking compares to other drilling and oil/gas operations in terms of the potential for overall environmental impacts.

M4-A.5 Perez, V*; Garry, MR; Alexander, DD; Tsuji, JS; Exponent; [email protected] Noncancer risk assessment of epidemiological studies of arsenic and cardiovascular disease The U.S. Environmental Protection Agency is currently revising its 1988 noncancer and cancer risk assessments for inorganic arsenic. Cardiovascular disease (CVD) is an endpoint for which considerable recent study data support derivation of a noncancer reference dose (RfD) for arsenic. We conducted a systematic review of the epidemiologic literature through March 2013 for scientific evidence that may support an RfD specific to CVD. Eleven cohort and case-control studies (Taiwan, Bangladesh, and China) assessing the effect of water arsenic at levels <100 parts per billion (ppb) were the main focus for the estimation of a dose-response relationship in the region of lowest- and no-observed-adverse-effect levels (LOAELs and NOAELs, respectively). Although the six U.S. studies were limited to ecologic or cross-sectional study designs, they were included because of their relevant target population. Consistent dose-response relationships were not evident at exposure levels <100 ppb (arsenic water concentration). A prospective cohort study (Chen, 2011) from Bangladesh provided the strongest evidence for a candidate RfD based on mortality from ischemic heart disease and other heart diseases combined. The point of departure (POD) for a NOAEL based on arsenic in water was 100 ppb based on all subjects (smoking-adjusted). PODs ranged from 0.0085-0.0094 mg/kg-day for an inorganic arsenic dose from water and diet. Application of an uncertainty factor resulted in an RfD for CVD of 0.003 mg/kg-day. Caution should be exercised in extrapolating from populations with different lifestyles and risk factors to the United States. However, the Bangladesh population is likely more susceptible to arsenic-related health effects than the general U.S. population, thereby justifying a relatively low interindividual uncertainty factor. Similar evaluations for other potential noncancer endpoints can be conducted as part of the overall RfD development.
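
For readers tracking the arithmetic, the derivation reduces to dividing the point of departure by the uncertainty factor. The factor of 3 shown below is an inference from the stated numbers, since the abstract says only that a relatively low interindividual factor was justified:

$$\mathrm{RfD} = \frac{\mathrm{POD}}{\mathrm{UF}} = \frac{0.0085~\mathrm{mg/kg\text{-}day}}{3} \approx 0.003~\mathrm{mg/kg\text{-}day}$$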

T2-B.3 Philbert, MA (University of Michigan School of Public Health); Cory-Slechta, DA* (University of Rochester School of Medicine); [email protected] The Power of Scientific Peer Review and IRIS Peer review of scientific data, grant applications and programs is by no means perfect. It is typically expensive and time-consuming, is subject to individual biases, may be abused, and has not been effective in detecting even the most blatant plagiarism, fraud or other forms of misconduct. Nevertheless, peer review is at the heart of a scientific system that, with limited resources, attempts to make the best possible decisions with respect to funding, decision-making, setting policy and so on. Several editorials and research articles published in the recent past extol the virtues of blind, double-blinded, anonymous, open, and other forms of peer review. Each has its own set of advantages and disadvantages; however, it is generally recognized that peer review in the context of a FACA panel allows for impartial review in the presence of peers without undue influence by the author/applicant, minimizes overt reviewer bias and provides the authoritative stamp of the scientific community on those issues judged to be of merit by the panel. The Institute of Medicine admits that everyone has bias and that it is important to openly acknowledge influences that may sway one’s opinion in one direction or another. However, if properly composed, the peer review panel may take advantage of these biases and explore all relevant sides of an issue to arrive at a consensus on the science. It is the collective scientific judgment of the panel that adds to the validity of the scientific finding and, at its best, highlights areas of ambiguity or gaps in the available data. In order to maintain its own validity, scientific peer review must be well documented and transparent, and must always maintain a bright line of separation between evaluation of the data and engagement in policy.

P.127 Phillips, JK*; Anderson, JK; TRC Solutions; US Air Force; [email protected] Challenges Associated with Practical Environmental Restoration Risk Assessment and Management Decisions for Perfluoroalkyl Substances (PFASs) Perfluoroalkyl substances (PFASs) are emerging environmental contaminants with widespread applications in industry. PFASs do not have federal cleanup standards; however, some PFASs are environmentally persistent, bioaccumulate in living organisms, and have demonstrated toxicity in laboratory animals. Thus, despite the lack of federal regulations, it may be prudent to assess and potentially mitigate human and/or environmental exposures. A risk management decision process for the management of emerging contaminants such as PFASs at restoration sites is outlined. The identification of PFASs can significantly impact site objectives, schedule, cost and ongoing remedial activities, particularly without clear regulatory criteria. PFASs present unique challenges, including identifying potential sources related to PFAS release and characterizing PFAS-contaminated groundwater and/or soil. EPA’s Office of Water is conducting a reanalysis of PFAS toxicity information to revise its 2009 subchronic Provisional Health Advisories (PHAs). PHAs are non-enforceable guidelines that may or may not be utilized within state-led regulatory environmental cleanup decisions, leading to inconsistent national application. Within the US, several states have PFAS guidance levels; however, only Minnesota has promulgated standards. This poster presentation will introduce PFASs, their sources, and the available screening levels for data comparison. It will also highlight the management challenges and current technical options available for groundwater contaminated with PFASs. Until consistent and defensible toxicity values are developed and practical remedial technologies are available, it remains challenging to execute consistent risk management practices to protect human health and the environment from PFAS exposures.

T3-A.2 Pinto, A; [email protected] A Qualitative Safety Risk Assessment Method for the Construction Industry incorporating uncertainties through the use of fuzzy sets A new fuzzy Qualitative occupational safety Risk Assessment Model (QRAM) was developed to mitigate occupational injuries on construction sites by improving the quality of occupational safety risk assessment. The innovative aspects of the QRAM model are that it embodies safety climate and safety barrier effectiveness as assessment dimensions and uses fuzzy sets theory to make better use of imprecise information. The QRAM model groups safety risk factors into four dimensions: Safety Climate, Severity, Possibility and Safety Barriers, used to estimate the risk of the 8 accident modes that encompass 97% of the work accidents occurring on construction sites. The importance of Safety Climate lies in supporting the management of safety risks; i.e., safety climate factors are not direct agents in the occurrence of work accidents but create the conditions for accidents to happen. Safety Climate is estimated from a set of predictors. Severity is assessed qualitatively by fuzzy functions modeled from predictors related to the amount of energy dissipated/absorbed that can be evaluated in situ (heights, speeds, weights, morphology of objects, etc.), using the biomechanical limits of the human body reported in several studies. Accident possibility (AP) is the possibility of a work accident occurring; each accident mode may be caused by a specific set of factors that determine the greater or lesser possibility of a work accident. For each accident mode, safety barriers are divided into 4 types (physical, functional, symbolic and incorporeal). Real tests showed that QRAM is user friendly and that its results are more accurate than those obtained with other methodologies.
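
To make the fuzzy-set machinery concrete, here is a minimal Python sketch of a triangular membership function applied to one severity predictor (fall height). It is illustrative only, not the authors' QRAM implementation, and the membership breakpoints are invented:

    # Illustrative only: triangular fuzzy membership for a severity
    # predictor (working height, in metres). Breakpoints are hypothetical.

    def triangular(x, a, b, c):
        """Membership of x in a triangular fuzzy set with feet a, c and peak b."""
        if x <= a or x >= c:
            return 0.0
        if x <= b:
            return (x - a) / (b - a)
        return (c - x) / (c - b)

    def severity_memberships(height_m):
        """Degree of membership of a fall height in three severity grades."""
        return {
            "moderate": triangular(height_m, 0.0, 1.0, 3.0),
            "serious":  triangular(height_m, 2.0, 4.0, 6.0),
            "critical": triangular(height_m, 5.0, 8.0, 11.0),
        }

    print(severity_memberships(3.5))  # partial membership in 'serious' only

A measurement such as 3.5 m thus carries graded, overlapping severity information rather than a single crisp category, which is what lets QRAM-style methods work with imprecise field observations.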

M3-A.7 Pluess, DN*; Groso, A; Meyer, T; Swiss Federal Institute of Technology Lausanne; [email protected] Analyzing and managing risks in research labs: How it is done Available risk analysis techniques are well adapted to industry, since they were developed for its purposes. For the academic research environment, most of these techniques are of limited use because of several differences compared to the industrial environment. Due to the nature of scientific research, accurate statistical data for processes or equipment are hardly available. However, most of the existing techniques depend on these data, e.g., studies on reliability for risk quantification. Another difficulty is to take into account the special conditions present in research laboratories when using available methodologies. A majority of these techniques are designed for analyzing clearly defined processes. In academic research settings, most of the process variables are not well defined or are continuously evolving. Additionally, different hazards present in the same laboratory may influence each other and can therefore be amplified. Different solutions for the challenge of adapting an existing method to research laboratories are available in the literature. However, most recommendations focus on a specific field of scientific research, such as chemistry. In order to tackle this problem, we developed a holistic risk analysis technique for the research and development environment. This newly developed method features an enhancement of the risk estimation (using probability, severity and detectability) with a new risk dimension, called worsening factors. Additionally, a semi-quantitative calculation method based on Bayesian networks is used to improve the risk estimation. This new risk analysis technique, specific to the research environment, is intuitive, easily performable by non-experts (web interface), less resource-demanding than other techniques and more accurate. Linked with an adapted safety policy, it becomes a comprehensive risk management tool. We will illustrate the application of this new method through several real research laboratories’ risk assessments.

M3-F.1 Pollard, SJT*; Mauelshagen, C; Prpich, G; Lickorish, F; Delgado, JC; Jude, S; Cranfield University; [email protected] Risk analysis for better policies – environmental risk governance for the green economy Straitened financial times are forcing a reappraisal of public risk governance in the UK. A tension exists between the necessity to share risk and cost with other actors, and a smaller public administration managing more risk, for the risk it retains, as Government becomes fleet of foot. Coincident with this, environmental policy is expected to support economic growth; this shift highlights themes such as the effective appraisal of distant environmental threats, the apportioning of shared accountabilities for public risk, and the development of risk management maturity in Government. Taken in concert, these changes are improving environmental risk governance practice and providing rich opportunities for risk analysts. We summarise this new policy landscape, illustrating it with practical examples to show trends in environmental risk governance practice. Examples include the application of risk characterisation tools for appraising strategic policy risk and environmental futures, an examination of infrastructure risk under climate change, and the systems approach to animal disease threats. What emerges is a reappraisal of some research themes familiar to the risk analysis community, but set in a new landscape of devolved accountability and networked risk. These are discussed by reference to the new opportunities they provide for policy and risk analysts alike.

W4-H.4 Poortvliet, P.M.*; Lokhorst, A.M.; Wageningen University; [email protected] Risk-related uncertainty and its relationship with citizens’ demand for regulation and institutional trust This research shows that psychological uncertainty about risks may moderate both individuals’ demand for government regulation of risks and their trust in risk-managing government institutions. To test these ideas, a survey (N = 1029) was first conducted. Respondents were presented with descriptions of specific risks from different risk domains (e.g., food hazards, medical risks) and asked about risk-specific uncertainty, risk perception, and demand for government regulation of the risk involved. Participants’ level of risk-specific knowledge was also assessed. Results indicated that under uncertainty about risks, a positive relationship emerged between risk perceptions and demand for government regulation, whereas no such relationship appeared when individuals did not experience uncertainty about the risk. Conversely, when people experienced little risk-related uncertainty, having more knowledge about the risk topic led to a weaker demand for government regulation; for uncertain people, this relationship between knowledge and demand for regulation did not emerge. Second, in an experiment (N = 120), psychological uncertainty was manipulated to investigate effects on ability, benevolence and integrity types of trust. People who received open versus non-open government communication about a zoonosis reported higher levels of trust in the government agency in the uncertainty condition compared to the control condition. Altogether, the findings of the survey and the experiment suggest that successfully informing people about risks may preclude them from demanding government action, provided they are not uncertain about the risk. It is also concluded that strong risk perceptions lead to a demand for government regulation, and prevent trust from being lowered, only when individuals feel uncertain about those risks.

M3-D.2 Powell, MR; U.S. Dept. of Agriculture; [email protected] Risk-Based Sampling: I Don’t Want to Weight In Vain Recently, there has been increased interest in developing scientific schemes for risk-based sampling of food, animals, and plants for effective enforcement of regulatory standards and efficient allocation of surveillance resources. The problem of risk-based sampling allocation presents a challenge similar to financial portfolio analysis. Markowitz (1952) laid the foundation for modern portfolio theory based on mean-variance optimization. However, a persistent challenge in implementing portfolio optimization is the problem of overfitting the model based on limited data, leading to false “optimal” portfolios and unstable asset weights (churn). In some cases, portfolio diversification based on simple heuristics (e.g., equal allocation) has better out-of-sample performance than complex portfolio optimization methods due to estimation uncertainty. Even for portfolios with a modest number of assets, the estimation window required for true optimization may imply an implausibly long stationary period. The implications for risk-based sampling are illustrated by a simple simulation model of lot inspection for a small, heterogeneous group of producers. Under constrained optimization, the annual frequency of lot inspection for each producer is defined to be at least one and otherwise proportional to the product of known production volume and estimated prevalence of contaminated lots. Under a simpler decision rule, frequency is proportional to volume. Assuming stationarity, the “risk-based” sampling frequencies assigned to producers by constrained optimization remain highly unstable after 20 years. In the presence of infrequent transients (e.g., outbreaks or extreme contamination events), the relative performance of the decision rules converges in terms of the number of contaminated lots detected as the intensity of transients increases, and simply sampling proportional to volume is more likely to detect the occurrence of transients than the complex optimization decision rule.
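
A stylized sketch of the comparison the abstract describes, under assumed parameters (producer volumes, true prevalences, and the sampling history are all invented; this is not the author's USDA model). It contrasts allocating inspections proportional to volume alone against allocating proportional to volume times a noisily estimated prevalence:

    import numpy as np

    rng = np.random.default_rng(7)

    n_prod = 10
    volume = rng.integers(50, 500, n_prod)        # lots produced per year
    true_prev = rng.uniform(0.005, 0.05, n_prod)  # true contamination rates
    budget = 200                                  # inspections per year

    # Noisy prevalence estimates from a short history of 50 sampled lots each
    est_prev = rng.binomial(50, true_prev) / 50

    def allocate(weights, budget, floor=1):
        """At least `floor` inspections each; remainder proportional to weights."""
        w = np.maximum(weights, 1e-9)
        return floor + (budget - floor * len(w)) * w / w.sum()

    simple = allocate(volume.astype(float), budget)      # volume only
    riskbased = allocate(volume * est_prev, budget)      # volume x est. prevalence

    # Expected contaminated lots detected under each decision rule
    print("volume-only :", (simple * true_prev).sum())
    print("risk-based  :", (riskbased * true_prev).sum())

Because the prevalence estimates are noisy, the "risk-based" weights churn from year to year as the history is updated, which is the instability the abstract highlights.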

W4-D.1 Powers, CM*; Lehmann, G; Gift, J; Grieger, K; Money, M; Hendren, CO; Beaudrie, C; Davis, JM; U.S. EPA; [email protected] Comprehensive Environmental Assessment: Strategically linking Research, Assessment and Risk Management – Applied to Multiwalled Carbon Nanotubes The Comprehensive Environmental Assessment (CEA) approach is being developed by the U.S. Environmental Protection Agency (EPA) to identify and prioritize research gaps and risk-related trade-offs in order to inform decisions that will protect human health and the environment. The most recent application of the CEA approach, to multiwalled carbon nanotubes (MWCNTs), successfully engaged expert stakeholders representing diverse technical (e.g., toxicology, material science, ecology) and sector (e.g., industry, academia, government) perspectives. Stakeholders identified specific research priorities that could inform future risk assessment and management efforts for these materials. Identified MWCNT research priorities generally related to 1) releases during the product life cycle; 2) environmental behavior in air, waste water and sediment; 3) exposure and dose in human occupational and consumer populations; and 4) impacts to human health and aquatic populations, as well as economic, societal and environmental resources. This presentation introduces these research priorities with a focus on initial work to describe how they might inform future risk assessment and policy decisions for MWCNTs in a particular application: flame-retardant coatings applied to upholstery textiles. This introduction will help provide the foundation for a panel discussion focused on approaches to and benefits of creating better linkages along the information chain from research to policy. Participation in this discussion will inform researchers, risk assessors, and risk managers about (1) specific research initiatives that could inform future assessments and policy efforts for MWCNTs, and (2) the potential utility of applying CEA to fortify linkages between research planning, risk assessment and risk management for other emerging chemicals, materials or technologies. The views expressed in this abstract are those of the author and do not necessarily represent the views or policies of the U.S. EPA.

M4-D.3 Pradhan, AK; University of Maryland, College Park; [email protected] Application of quantitative microbial risk assessments to address critical and emerging food safety issues Many foodborne pathogens, including zoonotic ones, continue to cause significant disease burden worldwide. These pathogens cause considerable public health impact and are a major concern to food industries and regulatory agencies. In a recent report by the Centers for Disease Control and Prevention (CDC), it was estimated that three pathogens, Salmonella spp., Listeria monocytogenes, and the parasite Toxoplasma gondii, together account for more than 70% of all estimated deaths in the U.S. per year attributed to foodborne pathogens. Salmonella spp. and Toxoplasma gondii, with 28% and 24% of total deaths, respectively, were ranked first and second in terms of estimated annual deaths. Given the current emphasis on risk-based approaches to evaluate food safety issues, both in industry and regulatory agencies, and to develop risk-informed policies and strategies, it is critical to provide science-based information that will aid in better understanding and managing food safety risks arising from different pathogens. Recent advances in microbial risk assessments provide tools for modeling food systems in a systematic and objective way for making better-informed food safety decisions and for evaluating potential intervention strategies. In this presentation, the importance of information and data collection, and applications of quantitative microbial risk assessments to address critical food safety issues, will be discussed with respect to case studies such as Salmonella in dry pet foods and Toxoplasma gondii in organic or free-range meats. The recent outbreaks of salmonellosis associated with dry pet foods and treats have considerably emphasized the role of dry pet foods as a vehicle of pathogen exposure for both pets and their owners. In the CDC report, it was estimated that 50% of all human exposures to Toxoplasma gondii are foodborne, making this parasite a serious food safety concern.

M4-E.4 Prasad, B*; Sunger, N; Lennon, E; Drexel University; [email protected] A risk model for inhaled toxins and spores associated with Stachybotrys chartarum Stachybotrys chartarum, a black mold frequently present in damp indoor environments, has been well characterized due to its ability to produce human toxins such as trichothecene. Inhalation exposure to S. chartarum has been implicated in cases of acute idiopathic pulmonary hemorrhage (AIPH) in infants. The goals of this study were (1) to determine the risk of death in human infants using the maximum reported concentration of toxins and spores in an average water-damaged residential unit and (2) to determine the levels at which S. chartarum might pose an acceptable risk. Studies of experimental administration of S. chartarum spores or trichothecene toxin to young animals were selected to model the dose-response relationship between exposure and probability of death. The best-fit dose-response models for the threshold data were the log-logistic for S. chartarum spores as the exposure indicator in infant rats, and the Weibull for the trichothecene toxin in young mice. Stochastic risk estimates were developed using the proposed dose-response models to predict the risk of death in infants exposed to spores or toxins via inhalation. The analysis showed a significant daily risk for acute 24-hour exposure to toxins, ranging from 1 in 10,000 to 3 in a million. For a 10% daily risk based on inhalation instillation, the acceptable level of the human equivalent dose in environmental samples was 413 spores/m3 or 3.94E-05 mg-toxin/m3. Sensitivity analysis was conducted using Monte Carlo simulations to identify factors that are highly correlated with risk estimates. This study indicates that S. chartarum toxin exposure via inhalation in indoor environments may pose a risk of AIPH, but a conclusive epidemiology study is needed to validate the risk estimates. However, the conclusion of insignificant health risk to infants in moldy indoor environments is based on spore quantification and may under-report the disease burden associated with this fungus.
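
As a sketch of the kind of stochastic risk calculation described, the following Python fragment pushes an uncertain inhaled dose through a Weibull dose-response function. The parameter values and dose distribution are placeholders, not the study's fitted values:

    import numpy as np

    rng = np.random.default_rng(0)

    def weibull_response(dose, q, k):
        """Weibull dose-response: probability of response at a given dose."""
        return 1.0 - np.exp(-q * dose**k)

    # Hypothetical parameters, not the coefficients fitted in the study
    q, k = 0.02, 1.3

    # Uncertain 24-h inhaled toxin dose (mg/kg); lognormal for illustration
    doses = rng.lognormal(mean=np.log(0.05), sigma=0.8, size=100_000)

    risk = weibull_response(doses, q, k)
    print("mean risk:", risk.mean())
    print("95th percentile:", np.quantile(risk, 0.95))

The Monte Carlo loop over dose uncertainty is what turns a single dose-response curve into the distribution of risk estimates the abstract refers to.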

W3-J.3 Price, JC*; Strellec, K; Industrial Economics, Inc.; Bureau of Ocean Energy Management; [email protected] The Development and Use of the Bureau of Ocean Energy Management’s Offshore Environmental Cost Model (OECM) to Evaluate the Environmental Risks of Offshore Energy Development Under the Outer Continental Shelf (OCS) Lands Act, the Bureau of Ocean Energy Management (BOEM) is required to prepare forward-looking five-year schedules of proposed OCS lease sales that define the size, timing, and location of proposed oil and gas leasing activity. Policy decisions regarding the inclusion or exclusion of areas from these Five Year Programs require careful consideration of the environmental risks associated with exploration and development, as well as the risks that might be avoided when OCS development displaces production from other sources. BOEM requires the capacity to assess these risks in a timely manner, to allow for comparisons across multiple exploration and development scenarios and to inform refinements to policy options under consideration. To address these competing needs, BOEM developed the Offshore Environmental Cost Model (OECM), an MS Access-based model that quantifies and (where possible) monetizes the net environmental and social impacts of each OCS exploration and development scenario. We present an overview of the model and illustrate how it is used to inform development of BOEM’s Five Year Program.

P.1 Qian, H*; Zaleski, R; Money, C; ExxonMobil Biomedical Sciences, Inc.; [email protected] Approach for developing Specific Consumer Exposure Determinants (SCEDs) for fuel and lubricant scenarios The ECETOC TRA tool, a preferred lower tier exposure tool under REACH, provides conservative (intentionally high) estimates of consumer exposure. Under REACH, if a predicted exposure exceeds a substance hazard benchmark (Derived No Effect Level or DNEL) using lower tier tools, the assessment is either refined via higher tier analysis or Risk Management Measures are implemented to reduce the exposure predictions to values < DNEL. Much effort has recently been directed to identifying a rigorous and transparent approach for refining the TRA defaults so that initial exposure estimates are closer to reality, limiting the need to perform higher tier analysis, which requires more data. In 2012, ECETOC introduced the concept of Specific Consumer Exposure Determinants (SCEDs), a template that helps provide a basis for delivering a more realistic estimate of consumer exposure. We populated this template and developed 9 SCEDs to cover a range of consumer fuel use scenarios and 4 SCEDs to better define consumer lubricant use scenarios based on public data. SCED development required data mining, data assessment, data documentation, and evaluation of the final scenario as a whole. We describe the steps taken in SCED development, and provide examples of a completed product. This approach has general utility for documentation of consumer exposure determinants and development of improved consumer exposure scenarios. It is currently being implemented by other industry sectors as well.
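
The screening logic that motivates SCED refinement can be reduced to the REACH risk characterization ratio (predicted exposure divided by DNEL). A minimal sketch with hypothetical numbers, not taken from any actual assessment:

    def risk_characterization_ratio(exposure, dnel):
        """REACH-style screening: RCR < 1 means adequately controlled."""
        return exposure / dnel

    # Hypothetical values (mg/kg-day): Tier 1 default vs. SCED-refined estimate
    dnel = 1.0
    for label, exposure in [("TRA default", 1.8), ("SCED-refined", 0.4)]:
        rcr = risk_characterization_ratio(exposure, dnel)
        verdict = "OK" if rcr < 1 else "refine or add RMMs"
        print(f"{label}: RCR = {rcr:.2f} -> {verdict}")

In this toy example, the conservative default fails the screen while the refined determinants pass it, which is exactly the outcome SCEDs are intended to enable without a full higher tier analysis.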

W3-F.1 Race, MS; SETI Institute; [email protected] Risk Communication and Information Needs for Anticipated Catastrophic Threats by NEOs Research on Near Earth Objects (NEOs) has intensified in recent decades following the recognition that certain categories of objects have the potential to cause extensive or even global catastrophic impacts on life, environments, infrastructure, and perhaps even civilization as we know it. A small sector of the space science community has been systematically surveying and tracking asteroids in an attempt to detect potentially hazardous objects (PHOs) in advance, determine their probability of Earth impact, and subsequently send missions to deflect those that might be catastrophic. At a recent international meeting, deliberations after a conference-wide simulation exercise focused on the many scientific, technological, risk assessment, geopolitical, infrastructural and communication issues that will be involved in planning and developing an international Planetary Defense system. This presentation dissects the anticipated process for developing international planning and response preparedness for PHO threats and compares it with patterns of familiar disaster cycles and threat levels. Just as with preparedness and response to extreme threats (e.g., bioterrorism, WMDs, massive tsunamis or pandemics), there is a need to incorporate fast-evolving science information with new technologies and traditional disaster management infrastructures that involve coordination with officials and organizations at international, national, state/regional and local levels. In addition to addressing the risk communication needs of experts, there are also many unusual issues and information needs of the diverse publics at risk. The operational and implementation challenges are unprecedented, and the associated risk communication and information needs are likewise complex for both expert and public audiences. In addition to assessing the full spectrum of communication needs for potentially hazardous NEOs, the paper identifies a number of possible complications that will need special research and technological attention in the coming years.

P.120 Rak, A*; Vogel, CM; Bass, N; Noblis Inc.; US Army Public Health Command; [email protected] Phase I Impact Assessment Results for 1-bromopropane and 3-nitro-1,2,4-triazol-5-one (NTO) The Department of Defense’s (DoD’s) Emerging Contaminants (EC) Program has a well-established three-tiered process for over-the-horizon scanning for ECs, conducting qualitative and quantitative impact assessments in critical functional areas, and developing sound risk management options. This “Scan-Watch-Action” process was used to examine potential risks from 1-bromopropane and the insensitive high explosive NTO. Subject matter experts (SMEs) from throughout the DoD used the Emerging Contaminants Assessment System (ECAS) tool to evaluate the potential risks to DoD associated with these two mission-critical chemicals. Members of the EC Program team used the Impact Assessment Criteria Assessment Tool (ICAT) to analyze SME input. Together, these two groups developed a set of initial risk management options (RMOs) within the DoD. The risks identified by the SMEs and the potential RMOs for each chemical are presented for each of five different functional areas. The uncertainties in the SMEs’ risk estimates are also discussed, and recommendations for further analysis are presented. These assessments conclude that 1-bromopropane requires significant risk management actions to mitigate possible risks from occupational exposure, while NTO requires that additional toxicity and environmental fate data be collected.

P.110 Rao, V*; Francis, R; The George Washington University; [email protected] The role of statistical models in drinking water distribution system asset management A robust asset management plan needs to be in place for water utilities to effectively manage their distribution systems. Of concern to utilities are broken pipes, which can allow bacteria to enter the water system and cause illness in consumers. Typically, water utilities allocate a portion of funds every year for renewal of pipes and valves. However, pipe renewal is largely based on replacing currently broken pipes, and long-term asset management planning to replace pipes is not a priority for water utilities. Water utilities are beginning to use probabilistic break models and other statistical tools to predict pipe failures. These models incorporate variables such as pipe length, diameter, age, and material. Although incorporation of these models is emerging in the water industry, their direct impact on long-term asset planning remains to be seen. In addition, the effectiveness of these models is open to question, as little research has evaluated their ability to assist in asset management planning. This paper discusses the role of probabilistic pipe break models in structuring the long-term asset management decisions and tradeoffs made by drinking water utility companies.
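
As an illustration of the class of model described, the sketch below fits a Poisson regression of annual break counts on pipe age and diameter, with pipe length as an exposure offset. The data are synthetic and the coefficients invented; only the variable choices follow the abstract:

    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    n = 500

    age = rng.uniform(5, 80, n)            # years
    diameter = rng.uniform(100, 600, n)    # mm
    length = rng.uniform(50, 400, n)       # metres

    # Synthetic truth: break rate rises with age, falls with diameter
    rate = np.exp(-7 + 0.03 * age - 0.002 * diameter)  # breaks per metre-year
    breaks = rng.poisson(rate * length)

    X = sm.add_constant(np.column_stack([age, diameter]))
    model = sm.GLM(breaks, X, family=sm.families.Poisson(),
                   offset=np.log(length))
    result = model.fit()
    print(result.params)  # recovered coefficients: intercept, age, diameter

A utility could rank pipes by the fitted break rate and feed that ranking into a long-term renewal schedule, which is the asset management use the paper examines.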

P.50 Reid, R; Loftis, B; Dwyer, S*; Kleinfelder, Inc.; [email protected] Constraint analysis for siting solar energy projects A risk analysis methodology (constraints analysis) was developed to evaluate conditions affecting site selection for ground-mounted solar photovoltaic (PV) systems. Utility companies active in the solar market have applied this methodology in their site selection efforts to evaluate environmental, engineering, and regulatory constraints that could render a site economically or physically infeasible for development. The constraints analysis addresses up to 16 characteristics for a given site, including flooding, presence of jurisdictional waters, threatened and endangered species, sensitive habitats, regulatory environment, topography, land ownership, zoning, site access, geotechnical conditions, and distance to electrical transmission infrastructure. The primary goals of the constraints analysis are to optimize the allocation of capital and to minimize capital at risk. Presently, the constraints analysis tool is largely qualitative and relies on subjective judgments regarding each site characteristic. Approaches to advancing the constraints analysis through the use of advanced analytical tools, such as multi-criteria decision analysis and GIS, will be discussed.
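
One simple way to move such a tool from qualitative judgments toward the multi-criteria decision analysis the abstract mentions is a weighted-sum score over the site characteristics. The weights, scores, and site names below are purely illustrative:

    # Illustrative weighted-sum MCDA over a subset of the 16 site constraints.
    # Scores: 0 (fatal flaw) to 5 (no constraint); weights are hypothetical.

    weights = {"flooding": 0.25, "habitat": 0.20, "zoning": 0.15,
               "topography": 0.15, "grid_distance": 0.25}

    sites = {
        "Site A": {"flooding": 4, "habitat": 3, "zoning": 5,
                   "topography": 4, "grid_distance": 2},
        "Site B": {"flooding": 2, "habitat": 5, "zoning": 3,
                   "topography": 5, "grid_distance": 4},
    }

    for name, scores in sites.items():
        total = sum(weights[c] * scores[c] for c in weights)
        print(f"{name}: {total:.2f}")

A full treatment would also screen out any site scoring zero on a fatal-flaw criterion before weighting, so that a good score elsewhere cannot mask an infeasible condition.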

M4-I.5 Reilly, AC*; Guikema, SD; Johns Hopkins University; [email protected] Bayesian Multiscale Modeling of Spatial Infrastructure Performance Predictions A number of models have been developed to estimate the spatial distribution of the likelihood of infrastructure impact during a natural hazard event. For example, statistical approaches have been developed to estimate the percentage of customers without power due to a hurricane, with the estimates made at the census tract level. However, such statistical models for predicting outage rates do not fully account for the spatial structure of outage patterns, leading to predictions in which adjacent regions are dissimilar. In this paper, we develop a tree-based statistical mass-balance multiscale model to smooth the outage predictions at granular levels by allowing spatially similar areas to inform one another. Granular observations are then aggregated based upon their intrinsic hierarchical spatial structure, leading to coarser, region-wide predictions. We use a generalized density-based clustering algorithm to extract the hierarchical spatial structure. The “noise” regions (i.e., those regions located in sparse areas) are then aggregated using a distance-based clustering approach. We demonstrate this approach using outage predictions from Hurricane Irene and develop outage prediction maps at various levels of granularity.
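
A toy version of the smoothing idea, not the authors' Bayesian tree model: shrink each census tract's raw outage prediction toward the mean of its parent spatial cluster, so that similar neighbors inform one another. In a full Bayesian treatment the shrinkage weight would come from the relative precision of the tract- and region-level estimates rather than being fixed:

    import numpy as np

    # Raw tract-level outage-fraction predictions, grouped into two
    # hypothetical spatial clusters found by a clustering step.
    clusters = {0: [0.30, 0.55, 0.28], 1: [0.05, 0.22, 0.10, 0.08]}

    def shrink(preds, weight=0.5):
        """Shrink each prediction toward its cluster mean by `weight`."""
        preds = np.asarray(preds, dtype=float)
        return (1 - weight) * preds + weight * preds.mean()

    for cid, preds in clusters.items():
        print(cid, np.round(shrink(preds), 3))

Averaging the smoothed tract values within each cluster then yields the coarser, region-wide predictions the abstract describes, with the cluster totals preserved in the mass-balance sense.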

M2-F.4 Reinhardt, JC*; Chen, X; Liu, W; Manchev, P; Paté-Cornell, ME; Stanford University; [email protected] Project Fox: Taming Asteroid Risks Understanding and mitigating the risks posed by asteroids is important to the scientific and policy-making communities, as near earth objects threaten human lives and may threaten the very existence of human civilization. Qualitative and quantitative studies have been done to assess such risks, and some reasonable point estimates have been proposed. However, due to the low-probability/high-consequence nature of asteroid risks, these measures provide limited actionable insights and may even lead to false confidence when interpreted inappropriately. Project Fox aims to provide a probabilistic model that evaluates the collective threat from asteroid events. We implement a modularized simulation architecture that takes probabilistic inputs to construct the probability distribution of rare events efficiently. The model suggests that although the point estimates of annual death rates due to asteroid events have small numerical values, the probability of incurring a catastrophic loss of human lives and property is significant. Furthermore, the probability of a cataclysm scenario (an impact event with globally reaching consequences) due to asteroid impacts is far from negligible, and the current understanding of impactors that could trigger global consequences underestimates the actual risk by leaving a large fraction of cataclysmic risk scenarios unaccounted for. Specifically, we find that the majority of global-consequence risk is attributable to asteroids between 300 and 1000 meters in diameter, signaling the importance of missions to observe and track asteroids in this range. Finally, “civil-defense” counter-measures mitigate the risk of human casualties from asteroid impacts to some small degree, but they are certainly not a panacea. Because of the potentially poor performance of civil defense methods, further analysis of the effectiveness and feasibility of space missions to interdict asteroids is warranted.
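
A compressed sketch of such a simulation architecture (all rates, distributions and damage scalings below are placeholders, not Project Fox's calibrated inputs): sample impact events year by year from a power-law size-frequency model and accumulate the distribution of annual fatalities, whose heavy tail is the point of interest:

    import numpy as np

    rng = np.random.default_rng(42)
    years = 1_000_000

    # Placeholder size-frequency model: expected impacts/year above 10 m,
    # with a Pareto-distributed diameter above that threshold.
    rate = 0.2
    fatalities = np.zeros(years)

    n_events = rng.poisson(rate, years)
    for y in np.nonzero(n_events)[0]:
        for _ in range(n_events[y]):
            d = 10 * (1 + rng.pareto(1.6))      # diameter, metres
            if rng.random() < 0.29:             # lands near population (placeholder)
                fatalities[y] += 1e-3 * d**3    # toy damage scaling with size

    print("mean annual deaths:", fatalities.mean())
    print("P(>1M deaths in a year):", (fatalities > 1e6).mean())

Even in this toy setup, the mean annual death rate is modest while the tail probability of a catastrophic year is non-negligible, which mirrors the abstract's warning about over-interpreting point estimates.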

W2-F.3 Reiss, R; Exponent; [email protected] What can we learn and apply from journal peer review? Scientific peer review is a critical component of the regulatory process. The peer review model was used in the evaluation of scientific papers for publication long before peer review was used for regulatory programs. Therefore, it is reasonable to ask whether there are lessons from the journal peer review process that can be applied to regulatory peer review. Some of the issues that will be discussed include the selection of peer reviewers, the achievement of a balance of perspectives in peer review, and the requirements to respond to and modify analyses based on peer review. Another area for improvement is the process of revision that occurs after a peer review; the journal model of accepting, rejecting, or sending papers back for more work may provide a possible model for regulatory peer reviews. Another topic that will be discussed is a possible model for enhanced review of select regulations that are of high value, such as sending a select number of assessments automatically to the National Academy of Sciences each year. All of these ideas have to be balanced against the need for timely assessments.

W4-G.1 Reitman, F*; Sun, T-J; Beatty, P; LeHuray, AP; Hammon, TL; Juba, MH; Palermo, C; Lewis, RJ; White, RD; 1 Shell; 2 Chevron; 3 American Petroleum Institute; 4 Naphthalene Council; 5 ConocoPhillips; 6 Koppers, Inc.; 7, 8 ExxonMobil Biomedical Sciences, Inc.; 9 American Petroleum Institute; [email protected] Naphthalene Rodent Inhalation Bioassays and Assessment of Risk to Exposed Humans: Problem Formulation The National Toxicology Program (NTP) conducted 2-year naphthalene inhalation cancer bioassays in B6C3F1 mice (1992) and in F-344 rats (2000) and observed cancers in the mouse lung and the rat nose. Naphthalene exposure was not previously thought to pose a cancer risk; however, based on these two NTP studies, EPA’s Integrated Risk Information System (IRIS) released a draft cancer risk assessment in 2004 which proposed naphthalene as “likely to be carcinogenic in humans”, with a calculated cancer potency 10- to 40-fold greater than that of benzene. However, the available human and mechanistic toxicity data did not support the association between naphthalene exposure and cancer, particularly nasal cancer. A 2006 Naphthalene State-of-Science Symposium (NS3), independently organized and funded by EPA, Naphthalene Research Committee (NRC) members and others, provided a rigorous review of the state of the science for cancer risk assessment. The NS3 raised concerns about extrapolating these rodent bioassay results to human cancer risks, including: (1) the NTP bioassays were conducted at high vapor concentrations above the Maximum Tolerated Dose, complicating low-dose extrapolation; (2) naphthalene likely does not represent a classical genotoxic carcinogen; (3) cytotoxicity and cellular proliferation appear to have played a role in the carcinogenic responses; (4) tumor formation rates at environmental, noncytotoxic exposure levels cannot be meaningfully predicted by simple linear extrapolation from the tumor rates observed in the rodent bioassays; and (5) there are important differences in naphthalene metabolism and toxicity between rodents and humans. Based on the study recommendations from NS3, industry associations and individual companies formed the NRC to co-sponsor research that strives to improve naphthalene risk assessments. The NRC’s objective has been to collect data to properly evaluate the carcinogenic potential of naphthalene.

T1-F.1 Renn, O; University of Stuttgart; [email protected] Emerging Risks: Concepts and Approaches Emerging risks denote future threats where the potential losses as well as the probability distribution of their occurrence are either unknown or contested. Emerging risks may be further broken down into three distinct but overlapping categories, based on their main characteristic at a certain stage: (i) emerging technologies with an emerging risk profile, based on high uncertainty and lack of knowledge about potential impacts and interactions with the affected risk-absorbing systems; (ii) emerging technological systems with emerging interactions and systemic dependencies, where the main issue is not the risk of the technologies themselves (which may be known or well estimated) but the interactions of these risks (and benefits) with other types of risks or activities, which could lead to non-linear impacts and surprises; and (iii) established technologies in a new, emerging context or environment, where the main problem is that familiar technologies are operated in a new context or in different organizational settings that may change both the probability and the magnitude of potential impacts. One could also include here risks driven by complacency and overconfidence in one’s own ability to cope with sudden crises. Conventional approaches to projecting loss size, relative frequencies or probability distributions over time or severity of consequences are usually ineffective if applied to emerging risks. Furthermore, attempts to assess emerging risks with technical or scientific instruments may prove futile, as scientific understanding of emerging risks can change rapidly. Therefore, adaptability and flexibility are vital for managing emerging risks in terms of individual, social, and economic impacts. This paper aims at developing a conceptual orientation for risk managers to better address emerging risks and be better prepared for the challenges of the future.

T2-F.2 Renn, O.; Jovanovic, A.; Schroeter, R.*; University of Stuttgart; [email protected] Social Unrest as Systemic Risk In this paper we develop a framework of social unrest within a complex understanding of systemic risk. The term ‘systemic’ describes the extent to which any risk is embedded in the larger contexts of social and cultural aspects that shape our understanding of risk, influence our attention to causal relationships, and trigger our activities for handling these risks. Social unrest can be grouped into this framework of systemic risks: it can be a cause of risk to others, it can be a consequence of experiencing risk or the manifestation of such a risk, or it can be a promoter of a risk chain that is located in other functional systems of society. Since social unrest is more a process of escalation than a finite state of the world, we have conceptualized the term in the form of a step-by-step escalation scheme, with each step making social unrest more severe. We assume that people who engage themselves publicly on any subject are dissatisfied with their situation or perceive a problem that they would like to address. Yet even if people are dissatisfied, nothing will happen unless that dissatisfaction is displayed in some kind of public arena: unsatisfied people have to become active. If public expression of dissatisfaction and the organization of protest do not help to improve the situation, the probability of further social mobilization increases. Social mobilization goes beyond expressing dissatisfaction; it comprises all activities that require an organizational effort to concentrate forces, to develop and enact a strategy for gaining public attention, and to put pressure on those who are targeted to make changes. In the course of this process, activities may become more and more radical, in particular if these collective protest actions are ignored or even suppressed. The continuum then enters its final step: violent outbreak, which can ultimately lead to civil war.

T4-B.4 Rhomberg, LR; Gradient; [email protected] Using existing study data or methodologies from epidemiology and toxicology to evaluate diverse stressors Much discussion of cumulative risk assessment (CRA) has focused on exploring in principle whether multiple chemical exposures and variation in levels of non-chemical stressors can lead to risks different from those estimated on an agent-by-agent basis. This perspective talk examines practical issues in conducting and applying CRA in real cases, in which one seeks to support risk management actions with empirical evidence on combinations and interactions, and where the tangle of conceivable causes and interactions needs to be resolved into a set that are important and can be characterized. Experimental data limit the scope of varying influences and are hard to apply to complex real populations, while with epidemiological data it is often hard to sort out identifiable roles for individual influences. Examining patterns of correlation of exposures, stressors, and outcomes in real populations offers some insights. Distinguishing confounding from joint action, and distinguishing the correlation of independent influences from their modification of one another's actions, are key questions. Rigorous problem formulation can help focus efforts on finding ways to support the distinctions needed for decision-making.

M4-B.3 Rhomberg, LR; Gradient; [email protected] Challenges and approaches for evidence integration regarding endocrine disruption, exemplified by the case of bisphenol A At least in part, the often rancorous debates about scientific support regarding endocrine disruption in general – and more particularly about nonmonotonic dose-response curves and the application of hormonal mode-of-action data not tied to clear adverse effects – may be ascribed to the lack of clearly agreed-upon approaches for integration of diverse and unfamiliar lines of evidence. The debate on the possible low-dose effects of bisphenol A exemplifies many of these issues. This perspective talk examines the issues and suggests approaches to weight of evidence and integration among lines of evidence. Endocrine disruption is a mode of action, not an endpoint, and so traditional endpoint-oriented toxicity testing may need different kinds of examination. Endocrine activity is inherently about modulation of physiological states through small changes in low concentrations of causative agents, so low-dose issues, shapes of dose-response curves, and effects of natural background processes need special consideration. This puts a premium on consistency of effects and their dose dependence across studies, and on a plausible hormonally mediated mechanism in assessing effects that also applies with a consistent rationale across endpoints and studies. The application of these principles to evaluation of bisphenol A low-dose toxicity is discussed.

M2-A.1 Rhomberg, LR; Bailey, EA*; Gradient; [email protected] Hypothesis-based weight of evidence: an approach to assessing causation and its application to regulatory toxicology Regulators are charged with examining existing scientific information and coming to judgments about the state of knowledge regarding the toxicological properties of agents. The process needs to be seen as sound and objective. The challenge is that information is often far from definitive, containing gaps and outright contradictions. The particular results of studies must be generalized and extrapolated to apply to the target populations of the risk assessment. Existing weight-of-evidence approaches have been criticized as either too formulaic, ignoring the complexity and case-specificity of scientific interpretation, or too vague, simply calling for professional judgment that is hard to trace to its scientific basis. To meet these challenges, I discuss an approach – termed Hypothesis-Based Weight of Evidence (HBWoE) – that emphasizes articulation of the hypothesized generalizations, their basis and span of applicability, that make data constitute evidence for a toxicologic concern in the target population. The common processes should be expected to act elsewhere as well – in different species or different tissues – and so outcomes that ought to be affected become part of the basis for evaluating success and defining the limits of applicability. A compelling hypothesis is one that not only provides a common unified explanation for various results, but also has its apparent exceptions and failures to account for some data plausibly explained. Ad hoc additions to the explanations, introduced to “save” hypotheses from apparent contradictions, need to be recognized. In the end we need an “account” of all the results at hand, specifying what is ascribed to hypothesized common causal processes and what to special exceptions, chance, or other factors. Evidence is weighed by considering whether an account including a proposed causal hypothesis is more plausible than an alternative that explains all of the results at hand in different ways.

W4-G.2 Rhomberg, LR*; Bailey, LA; Nascarella, MA; Gradient; [email protected] Hypothesis-based weight-of-evidence and dose-response evaluation for naphthalene carcinogenicity Human health risk assessment consists of bringing to bear a large body of in vitro, animal, and epidemiologic studies on the question of whether environmental exposures to a substance are a potential risk to humans. The body of scientific information, however, is typically less than definitive and often contains apparent contradictions. We applied our Hypothesis-Based Weight-of-Evidence (HBWoE) method to systematically and comparatively evaluate the large body of data and hypothesized modes of action for inhaled naphthalene carcinogenesis, expressing the relative degrees of credence that should be placed in alternative possible interpretations of the data, considering consistencies, inconsistencies, and contradictions within the data set. Guided by the outcome of our WoE evaluation, we conducted a dose-response evaluation of naphthalene exposure and neoplastic and non-neoplastic lesions, with the ultimate goal of deriving naphthalene toxicity values applicable to human health risk assessment. Our approach was to consider the applicability of the rat nasal tumors to serve as a basis for estimation of potential human respiratory-tract cancer risk. We did this by considering the mode of action (MoA) underlying the animal tumors seen in bioassays, including the metabolic activation and detoxification of inhaled naphthalene as they depend on air concentration, as well as the nature, tissue locations, and dependence on tissue-dose of key precursor responses. Species differences in tissue dosimetry were used to evaluate whether parallel tissues in humans, or other tissues in the respiratory tract, could be subject to tissue doses that could prompt the key events of the apparent MoA. The points of departure derived from rodent dose-response evaluations were extrapolated to human equivalent concentrations through application of a rat/human PBPK model that describes cross-species dosimetry of the upper respiratory tract, lung, and liver.

T4-B.2 Rice, GE*; Teuschler, LK; National Center for Environmental Assessment/ORD/US EPA; [email protected] Grouping of diverse stressors for cumulative risk analysis (CRA) by media, time and toxicity CRAs may address multiple chemical, physical, biological or psychosocial stressors. Approaches for grouping diverse stressors prior to risk analysis can simplify some complexities associated with CRAs. For CRAs involving chemical mixtures, this entails developing CRA exposure groups based on the media and timing of specific exposure combinations, including duration and intermittency of exposures/effects. CRA toxicity groups are developed using toxicological and epidemiological information to ascertain whether the chemicals in question share a common toxic mode of action or cause the same primary or secondary health outcome(s). Integrated CRA groups are then developed by merging the CRA exposure and toxicity groups. The chemicals in an integrated CRA group can plausibly exhibit exposures that overlap sufficiently in terms of pharmacokinetics or pharmacodynamics (e.g., persistent tissue damage); and exhibit interactions or cause the same health outcome(s). We extend this simple grouping concept to include buffers, factors that decrease the risk associated with a stressor. Finally, we extend this approach to non-chemical stressors, proposing a set of preliminary considerations for identifying non-chemical stressors that could be grouped with chemicals. Key considerations include: 1) ongoing or previous health conditions and disease states causing effects similar to those associated with a chemical’s effect; and 2) lifestages, lifestyle/occupational factors, and physical/biological stressors that may contribute to or buffer against a chemical’s effect. (The views expressed in this abstract are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA.)

T4-E.2 Rico, CM*; Barrios, AC; Hong, J; Morales, MI; McCreary, R; Lee, WY; Peralta-Videa, JR; Gardea-Torresdey, JL; The University of Texas at El Paso, University of California Center for Environmental Implications of Nanotechnology; [email protected] The interaction of CeO2 nanoparticles with rice: Impacts on productivity and nutritional value Cerium oxide nanoparticles (nCeO2) have significant interactions in plants; however, there are no reports yet on their impacts in rice (Oryza sativa). Rice is an important crop that supports the economic activity and the nutritional and health status of more than four billion people around the globe. This study was performed to determine the influence of nCeO2 on rice at the seedling and grain production stages by employing biochemical assays, in vivo imaging, and ICP, FTIR and synchrotron spectroscopy techniques. For the seedling stage study, rice seeds were grown for 10 days in nCeO2 suspensions (0, 62.5, 125, 250 and 500 mg l-1). For the grain study, rice was grown in soil at 0 and 500 mg nCeO2 kg-1. The data showed that Ce in root seedlings increased as the external nCeO2 increased, without visible signs of toxicity. The H2O2 generation increased at 500 mg nCeO2 l-1, while membrane damage was enhanced at 125 mg nCeO2 l-1. Fatty acid and lignin contents were markedly reduced at 500 mg nCeO2 l-1. FTIR spectromicroscopy showed modifications in the biomolecular components of root xylem, while synchrotron µ-XRF analysis revealed the presence of Ce in the vascular tissues of the roots. Results from the grain study showed that the starch, sugar, and antioxidant contents were not affected by the treatment. However, nCeO2 caused a dramatic reduction in globulin, prolamin, and glutelin protein contents, and in lauric, valeric, palmitic and oleic acids. In addition, the concentrations of K, Na and Fe increased while S was markedly reduced. Rice grain also showed a significantly higher amount of Ce in the nCeO2 treatment. Although no phenotypical changes were observed, the results illustrate that nCeO2 causes toxicity to rice. This study sheds light on the impacts of nCeO2 on the production and nutritive value of rice.

T2-H.4 Risz, Y; Reich-Weiser, C; Cammarata, C*; Enviance Inc.; [email protected] Identification of Hidden Risks and Associated Costs using Integrated Life Cycle Impact and Cost Assessment (ILCICA) Organizations are exposed to risks and associated hidden costs that are embedded in their supply chain and operations (e.g., health care, resource scarcity, remediation, reporting, fines, training, and safety equipment). Before developing risk mitigation strategies, organizations must first identify, quantify and prioritize these risks and resulting costs. This presentation will introduce an innovative methodology that blends life cycle assessment (LCA) with financial valuation techniques to more quickly and accurately assess environmental and health impacts and associated costs. Known as Integrated Life Cycle Impact and Cost Assessment (ILCICA), this best-in-class method can assess cradle-to-grave impacts at any level of the organization and then monetize those impacts to clearly quantify risk to the bottom line. The discussion will be informed by lessons learned from two pilot studies conducted for the Department of Defense and in collaboration with leading Aerospace & Defense companies.

W2-G.1 Rivers, III, L.*; Arvai, J.L.; North Carolina State University; [email protected] Roundtable: Effective Risk Communication: Launch of a New Book from EarthScan The field of risk communication is at a crossroads. Interest in risk communication across multiple fields is considerable, and research and practice focused on it continue to unfold at a rapid pace. However, there is still little agreement among scholars and practitioners about what constitutes effective risk communication. The goal for this roundtable discussion, spurred by the release of Effective Risk Communication, a new book from EarthScan, is to begin a critical examination of the current state of risk communication. We will explore the past and future of risk communication, focusing on what we have learned from past work and what is needed to push the field forward. The roundtable will take a broad view of risk communication, presenting perspectives from multiple disciplines (psychology, communications, risk sciences, decision sciences, etc.), a diversity of practitioners, and a range of contexts. The roundtable will feature contributors to the book, each offering a unique perspective toward the study and practice of risk communication. The roundtable will also provide a forum for dialogue between the roundtable participants and the audience, moderated by the editors of the book. The following speakers will participate: Joe Arvai, Roger Kasperson, Robyn Wilson, Cindy Jardine, Lauren Fleishman, Frederic Bouder, Julie Downs, Ragnar Lofstedt, Adam Zwickle and others.

M4-J.1 Robinson, LA*; Hammitt, JK; Zeckhauser, R; Linhart, M; Harvard University; [email protected] Barriers to assessing the distribution of regulatory impacts Before they promulgate major environmental, health, and safety regulations, U.S. government agencies must assess each regulation’s aggregate economic impact and are also expected to assess how the impacts are distributed. We find, however, that agencies focus on estimating national benefits and costs and provide very little information on their distribution. To the extent that the distribution is mentioned, the discussion is often limited to noting that the examined regulation will not impose disproportionate adverse health effects on children, minorities, or low-income groups. We explore several reasons for this approach. First, it may reflect philosophical framing: regulators may believe that they should choose the approach that maximizes net benefits as long as groups of concern are not harmed. Second, it may reflect political concerns: regulators may be worried that considering the distribution of costs and benefits will raise issues that they lack the legal authority to address. Third, it may reflect unstated and unexamined assumptions: regulators may believe that the distribution is insignificant or inconsequential. Fourth, it may reflect analytic challenges: regulators may need more technical guidance, data gaps may be substantial, and time and resource constraints may be severe. We conclude that each of these factors contributes to the lack of attention to the distribution of costs and benefits. However, to understand whether this inattention is problematic, we first need to better understand how costs and benefits are likely to be distributed and whether these impacts are significant. Decision makers can then determine whether more analysis is desirable.

W3-B.1 Rodricks, JV*; Kaden, DA; ENVIRON International Corp; [email protected] Integration of the science necessary for assessing potential carcinogenicity of formaldehyde: Introduction Recent risk assessments have been conducted, with much scientific debate surrounding the potential for formaldehyde to cause both nasopharyngeal cancer and lymphohematopoietic malignancies, including myeloid leukemia. Conducting a risk assessment for formaldehyde presents many challenges. This is largely due to the significant database for this compound, as well as the increasing amount of scientific research being conducted to address the carcinogenic potential of the compound. There are also significant challenges in characterizing acceptable exposures of humans to formaldehyde in the low concentration range due to its endogenous presence from normal biological processes. Highly sensitive analytical methods have been developed to characterize differences in biomarkers of exposure resulting from endogenous versus exogenous formaldehyde. These results, combined with the epidemiological, pharmacokinetic and mode of action data that are available for formaldehyde, provide the data needed to draw conclusions regarding the potential for formaldehyde to cause different types of cancer following exposure at low concentrations (< 1 ppm). Furthermore, this issue allows for the development of new methodological approaches for assessing the risk of chemicals with both endogenous and exogenous exposures. The integration of these data is critical not only for understanding the potential for formaldehyde carcinogenicity, but also for providing high quality information to inform regulatory decisions for compounds, such as formaldehyde, that present complex challenges. This session will focus on the available scientific data for formaldehyde relevant for assessing its leukemogenic potential and demonstrate how these data can be integrated to draw conclusions critical for regulatory decision making.

T2-G.4 Roh, S*; Schuldt, JP; Cornell University; [email protected] Where there’s a will: Can highlighting future youth-targeted marketing build support for health policy initiatives? Amid concern about high rates of obesity and related diseases, the marketing of nutritionally poor foods to young people by the food industry has come under heavy criticism by public health advocates, who cite decades of youth-targeted marketing in arguing for policy reforms. In light of recent evidence that the same event evokes stronger emotions when it occurs in the future versus the past, highlighting youth-targeted marketing that has yet to occur may evoke stronger reactions to such practices, and perhaps, greater support for related health policy initiatives. Web participants (N=285) read that a major soda company had already launched (past condition) or was planning to launch (future condition) an advertising campaign focusing on children. Measures included support for a soda tax and affective responses to the company’s actions. Greater support for the soda tax was observed in the future condition compared to the past condition, an effect that was fully mediated by heightened negative emotions reported toward the soda company in the future condition. The same action undertaken by the food industry (here, marketing soda to children) may evoke stronger negative emotions and greater support for a health policy initiative when it is framed prospectively rather than retrospectively.

T2-G.3 Roh, S; McComas, K*; Decker, D; Rickard, L; Cornell University; [email protected] Perils and Promises of One Health Risk Messages about Lyme Disease The next several decades are predicted to witness increasing prevalence of zoonotic diseases; however, not everyone will experience these risks equally. One such disease is Lyme disease, which is a growing risk in many but not all U.S. regions. Using a randomized experiment based on a nationally representative sample of U.S. adults (N = 460), we examine how respondents react to messages about Lyme disease depending on its prevalence in their state. Additionally, in support of our ongoing research program that examines ways to communicate about the interconnectedness of human, environmental, and wildlife health (e.g., One Health), we investigate how the attribution of responsibility for Lyme disease and its temporal proximity influences perceptions of disease risk. Respondents received one of four messages varying the temporal proximity (today vs. in the next ten years) and responsibility (wildlife vs. human and environmental factors) for Lyme disease. We used CDC data to categorize respondents as living in high, medium, or low prevalence states. The multi-level modeling showed temporal frame × responsibility frame × prevalence (fixed effects) interactions, suggesting some perils and promises of One Health risk messages. Among them, respondents living in low prevalence states tended to see the risks of Lyme disease as decreasing when they read the message blaming wildlife behavior and using a temporally distal frame. Conversely, the same message resulted in increased risk perceptions among respondents living in mid-prevalence states. Respondents living in high prevalence states who read the One Health message with a temporally proximal frame, which we might consider the most scientifically accurate message, tended to see the risks as decreasing. We consider these results in light of efforts seeking to enhance risk communication about Lyme disease, as well as the implications of One Health risk messages. We also discuss plausible mechanisms of the observed message effects.

M4-I.2 Rose, SM*; Handschy, MA; Apt, J; Carnegie Mellon University, Enduring Energy LLC; [email protected] Estimating the probability of extreme low-wind periods in the central United States We estimate the probabilities of extreme low-wind-power events across regions ranging in size from tens to a few thousand kilometers. We first estimate the distributions of aggregate wind power for a range of aggregation area sizes using historical wind speed data and semi-empirical wind speed data from meteorological reanalysis projects. We then derive similar distributions theoretically by aggregating the output of individual wind farms modeled by correlated probability distributions. In both approaches, we attempt to characterize how the tail of the distribution of aggregate power changes as a function of the number of wind farms and the separation between them. These estimates will aid electrical grid operators in determining the quantity of conventional power that must be available over the long term as the penetration of wind power increases, and will aid in understanding debt-financing risks for wind project development.
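
As an illustration of the aggregation step, the sketch below estimates the tail probability of a low-output event for a set of correlated wind farms using a Gaussian copula with a distance-decaying correlation. The correlation length, Weibull wind-speed marginals, and power-curve parameters are invented for illustration and are not the authors' fitted values.

```python
# Toy sketch: probability that aggregate wind power falls below a threshold
# for N farms whose outputs are correlated, with correlation decaying with
# distance. All parameter values here are illustrative assumptions.
import numpy as np
from scipy.stats import norm, weibull_min

rng = np.random.default_rng(0)
n_farms, n_draws = 20, 100_000
positions = np.linspace(0, 1500, n_farms)                 # farm locations, km
corr = np.exp(-np.abs(positions[:, None] - positions[None, :]) / 400.0)

# Gaussian copula: correlated normals mapped to Weibull wind-speed marginals
z = np.linalg.cholesky(corr) @ rng.standard_normal((n_farms, n_draws))
speed = weibull_min.ppf(norm.cdf(z), c=2.0, scale=8.0)    # m/s

# Simple power curve: cubic between cut-in (3 m/s) and rated (12 m/s) speed
power = np.clip((speed - 3.0) / (12.0 - 3.0), 0, 1) ** 3  # per-unit output
aggregate = power.mean(axis=0)                            # fleet capacity factor

# Tail probability of an extreme low-wind event (fleet below 5% of rated)
print("P(aggregate < 0.05) =", (aggregate < 0.05).mean())
```

Repeating the calculation while varying `n_farms` and the correlation length traces out how the tail thins with spatial diversification, which is the quantity of interest for reserve planning.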

T2-J.1 Rosenstein, AB*; Mori, CS; Collier, TK; Ruder, E; Risk Assessment Consultant, Industrial Economics Incorporated (IEc); [email protected] Review of Marine Mammal Inhalation Toxicity for Petroleum-Related Compounds: Potential Applications to Risk Assessment After a major oil spill, chemical exposures may have health impacts for marine mammals, both individually and at population levels. In the sparse literature reporting health effects of petroleum exposure in marine mammals, exposure levels are not usually quantified and exposure routes are often unspecified or consist of concurrent inhalation, ingestion, and dermal exposures. It has been hypothesized that, following oil spills, inhalation of petroleum-associated compounds may be of most concern for causing health impacts in marine mammals, due to the time spent at the air-water interface where the greatest concentrations of volatile hydrocarbons are found. Determination of inhalation toxicity is complicated because different sources of oil consist of different mixtures of volatile compounds that are emitted in varying concentrations to the air overlying a spill. In order to help assess marine mammal inhalation toxicity, published inhalation toxic effect levels (TELs) for individual petroleum-related compounds from laboratory studies have been compiled. We review inhalation TELs for individual petroleum-associated compounds, discuss the potential application of these TELs to marine mammal risk assessments following oil spills, and describe possible approaches for adjusting laboratory animal TELs to marine mammals.

M4-H.4 Rosoff, H*; John, R; Cui, T; University of Southern California, CREATE; [email protected] To transact online or not to transact online: Dilemmas involving privacy, security and identity theft As internet transactions continue to become more pervasive, the intersection between trust, privacy, convenience, and security also becomes more important. For instance, individuals regularly are drawn to the convenience of conducting financial transactions online, whether for e-commerce, online banking, or bill payments. However, does the lack of protection (e.g. limited identity authentication) or too much protection (e.g. too rigorous authentication) alter an individual’s preference for convenience? Similarly, when engaging in online transactions there is a certain amount of trust that the transaction occurs as intended. On one level this involves individuals trusting that purchased merchandise will arrive on time or that ATMs will distribute money and debit funds appropriately. It also pertains to trusting that the information exchanged is secure and that privacy is respected (e.g. information is not shared with third parties). The question remains: do individuals at some point lose trust in these systems and cease to engage in online transactions? We present the results of a study that evaluates the tradeoffs between trust, privacy, convenience, and security in varied online financial transactions. We also consider how the implementation of various cyber policies may or may not impact individual decision making relative to these tradeoffs.

M3-B.3 Ross, MA*; Owens, BO; Vandenberg, JM; U.S. Environmental Protection Agency; [email protected] The EPA Causality Framework for Assessment of Air Pollution-Related Health Effects The periodic review of U.S. National Ambient Air Quality Standards (NAAQS) for each of the six criteria air pollutants (ozone, particulate matter, carbon monoxide, nitrogen oxides, sulfur oxides and lead) starts with the synthesis and evaluation of the most policy-relevant science in Integrated Science Assessments (ISAs). EPA has developed an approach for formal characterization of the strength of the scientific evidence and drawing conclusions on causality for exposure-effect relationships. The framework establishes uniform language concerning causality and brings greater consistency and specificity to the ISAs. EPA drew on relevant approaches for similar scientific decision-making processes by EPA and other organizations. Findings from multiple lines of evidence (controlled human exposure, epidemiologic and toxicological studies) are evaluated and integrated to draw conclusions on health effects with regard to factors such as consistency, coherence and biological plausibility. The relative importance of different types of evidence varies by pollutant or assessment, as does the availability of different types of evidence for causality determination. The use of the framework is demonstrated with several examples of determinations for various health outcomes and pollutants, particularly drawing from the recently-completed ISA for Lead (Pb). Disclaimer: The views expressed are those of the authors and do not necessarily reflect the views or policies of the US EPA.

M2-H.3 Rothschild, C; McLay, LA*; Guikema, SD; University of Wisconsin-Madison; [email protected] Adversarial risk analysis with incomplete information: A level-k approach This paper proposes, develops, and illustrates the application of level-k game theory to adversarial risk analysis. Level-k reasoning, which assumes that players play strategically but have bounded rationality, is useful for operationalizing a Bayesian approach to adversarial risk analysis. It can be applied in a broad class of settings, including settings with asynchronous play and partial but incomplete revelation of early moves. Its computational and elicitation requirements are modest. We illustrate the approach with an application to a simple bioterrorism Defend-Attack model in which the defender’s countermeasures are revealed to the attacker with a probability less than one before he decides how or whether to attack.
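
The following toy sketch illustrates the level-k recursion in a Defend-Attack setting with partial revelation of the defender's move. The binary action sets, payoff numbers, and revelation probability are invented for illustration and do not reproduce the paper's model.

```python
# Level-k sketch for a Defend-Attack game with partial revelation (toy numbers).
# The defender picks a countermeasure; the attacker observes it with
# probability P_REVEAL, otherwise best-responds to a level-(k-1) belief.
DEFENSES = ["harden", "none"]
ATTACKS = ["attack", "stand_down"]
P_REVEAL = 0.3  # assumed probability the countermeasure is revealed

def attacker_payoff(d, a):
    return {("harden", "attack"): -2, ("harden", "stand_down"): 0,
            ("none", "attack"): 5, ("none", "stand_down"): 0}[(d, a)]

def defender_payoff(d, a):
    return {("harden", "attack"): -3, ("harden", "stand_down"): -1,
            ("none", "attack"): -10, ("none", "stand_down"): 0}[(d, a)]

def attacker_choice(k, d_observed=None):
    """Level-k attacker: best-responds to the observed defense if revealed;
    otherwise a level-0 attacker attacks naively, and a level-k attacker
    best-responds to his belief about a level-(k-1) defender."""
    if d_observed is not None:
        return max(ATTACKS, key=lambda a: attacker_payoff(d_observed, a))
    if k == 0:
        return "attack"
    d_belief = defender_choice(k - 1)
    return max(ATTACKS, key=lambda a: attacker_payoff(d_belief, a))

def defender_choice(k):
    """Level-k defender: anticipates a level-(k-1) attacker, averaging over
    whether the countermeasure is revealed; level 0 defends naively."""
    if k == 0:
        return "none"
    def expected(d):
        a_seen = attacker_choice(k - 1, d_observed=d)
        a_blind = attacker_choice(k - 1)
        return (P_REVEAL * defender_payoff(d, a_seen)
                + (1 - P_REVEAL) * defender_payoff(d, a_blind))
    return max(DEFENSES, key=expected)

for k in range(4):
    print("level", k, "defender plays:", defender_choice(k))
```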

T2-H.3 Rouse, J.F.; Arete Associates, Joint Staff; [email protected] The Chairman of the Joint Chiefs of Staff: Risk Assessment System The Chairman of the Joint Chiefs of Staff (CJCS) serves as the principal military advisor to the President, the Secretary of Defense, and the National Security Council. He is also required to annually submit to Congress his assessment of the strategic and military risks in executing the missions called for in the National Military Strategy. At the intersection of these two responsibilities lies the Chairman's Risk Assessment System, which incorporates the major tenets of the International Risk Governance Council’s 2003 White Paper, Risk Governance: An Integrative Approach. This framework has now been employed by three different Chairmen, with each focusing efforts on different components, highlighting the highly individual interaction of the senior leader with his supporting risk system. This paper describes the current system and how it has adapted to meet the needs of different Chairmen and support strategic decision-making, with specific examples of national decisions that were impacted. Lessons learned have direct applicability to other government agencies and to efforts to deal with highly qualitative risks arising from the strategic environment.

T4-C.2 Sacks, JD*; Vinikoor-Imler, LC; Ross, M; U.S. Environmental Protection Agency; [email protected] Identifying populations at-risk of air pollution-induced health effects through the use of a novel classification scheme Under the U.S. Clean Air Act, National Ambient Air Quality Standards (NAAQS) are established to provide an adequate margin of safety requisite to protect the health of portions of the population at increased risk of an air pollution-induced health effect. It is, therefore, imperative that the available scientific evidence on factors that may place certain populations at increased risk be fully characterized to inform decisions made in risk and exposure assessments as part of the NAAQS review process. Investigators have often classified these “at-risk” populations as being either susceptible or vulnerable to air pollution. However, this dichotomous approach ignores the complexities involved in identifying these populations. It has been well characterized that the risk of air pollution-induced health effects can be influenced by: (1) intrinsic factors, (2) extrinsic factors, (3) increased dose, and/or (4) increased exposure. The U.S. EPA, in the development of Integrated Science Assessments, characterizes populations that may be at increased risk of an air pollutant-induced health effect to support policy decisions made during the course of the NAAQS review process. The first step of this characterization consists of integrating evidence across scientific disciplines, i.e., epidemiologic, controlled human exposure, toxicological, and exposure sciences studies. To facilitate the identification of factors that contribute to increased risk, we then developed an approach to evaluate the strength of evidence and determine the level of confidence that a specific factor affects the risk of an air pollutant-induced health effect. The classification for a specific factor is based on whether: consistent effects are observed within a discipline; there is evidence for coherence of effects across disciplines; and there is evidence for biological plausibility. We will demonstrate the application of this framework using examples from recent NAAQS reviews.

T3-A.4 Sadeghi, F*; Assadian, MS; Pardazesh Samaneh Farboud Consulting Co.; [email protected] Risk management in international construction joint ventures: Lessons learned from a case study in Iran The construction industry is a challenging and rewarding industry that faces many different kinds of risk. These risks result from a broad variety of issues, such as economic, political, social, and environmental factors. In recent decades, with the continued globalization of the world’s economies, international joint ventures (IJVs) have commonly been used by firms as an international strategy and as a means of competing in the global arena. An IJV provides stimulating advantages, including international market development, technology transfer, enhanced partner capacity, and clarification of obscure and complex local environments. IJVs are often thought to enhance corporate flexibility and thereby shift, share, or reduce risks. However, while they reduce technological, financial, operational, and market risks, they also introduce certain new types of risk. Based on the literature review, these risks can be categorized into internal risks (e.g. policy misalignment, partner financial problems, allocation of work), project-specific risks (e.g. conditions of contracts, demands and variations by clients, incompetence of suppliers), and external risks (e.g. economic fluctuation, policies, laws and regulations, environmental concerns). This implies a vital need for IJVs to implement risk management as a policy-making tool in order to lead them toward success. In this research, we aim to identify the practical key points of risk management implementation. For this purpose, an IJV between Chinese and Iranian companies in an Iranian construction project was chosen as a case study. The case findings indicate, first, the effect of risk management in different stages of the IJV life cycle and, second, the necessity of external risk identification and mitigation in developing countries like Iran. It is hoped that the proposed approach will be applicable for IJVs in determining better policies to speed their movement toward success and sustainability.

P.112 Sager, SL*; Locey, BJ; Schlekat, TH; ARCADIS U.S., Inc.; [email protected] The Balance between Protection of Human Health and Compliance with Regulatory Standards Drinking water standards such as maximum contaminant levels (MCLs) are frequently selected as remedial goals for groundwater whether or not this resource is used as a potable water supply. Some states even go so far as to promulgate groundwater standards based on MCLs. However, when toxicity values change, there can be a lag time between those changes and a revision to the drinking water standard. This lag time has vast implications for industrial sites. As examples, the changes in the toxicity values of 1,1-dichloroethene (1,1-DCE) and tetrachloroethene (PCE) recommended by the United States Environmental Protection Agency (USEPA) and their implications for setting groundwater standards and remedial goals will be discussed. The groundwater standard for 1,1-DCE in the State of North Carolina will be presented as a case study. Implications for the groundwater standard for PCE will also be discussed. North Carolina recently revised the groundwater standard for 1,1-DCE from the MCL of 7 micrograms per liter (µg/L) to a health-based concentration of 350 µg/L. In its economic analysis in support of the revision, the State of North Carolina reported cost savings of over $1,000,000, reflecting reduced sampling and analysis, reporting, and regulatory oversight. However, since compliance with the MCL is still required for public water supplies, these savings are primarily achieved for sites without potable water. A similar scenario is envisioned for PCE, with broader economic impacts. The State of North Carolina groundwater standard is a health-based concentration of 0.7 µg/L based on an old toxicity value. However, a revised standard could be set equal to the current MCL of 5 µg/L, or it could be recalculated to equal 16 µg/L if the latest toxicity information from USEPA is utilized. In addition, this paper will discuss the cost to both industry and the public of the time lag in updating standards based on the latest science.

P.118 Sahmel, J; Devlin, KD; Hsu, EI*; Cardno Chemrisk; [email protected] Measurement of Hand to Mouth Lead Transfer Efficiency - A Simulation Study There are currently no known empirical data in the published literature that characterize hand-to-mouth transfer efficiencies for lead. The purpose of this study was to quantify the potential for the hand-to-mouth transfer of lead in adult volunteers using human saliva on a test surface as a surrogate for the mouth. Commercially available 100% lead fishing weights, confirmed by bulk analysis, were used as the source of dermal lead loading in this study. Volunteers were instructed to collect saliva in a vial prior to the study. A small amount of saliva was poured on to a sheet of wax paper placed on a balance. Volunteers were instructed to handle lead fishing weights with both hands for approximately 20 seconds and then press three fingers from the right hand, ten presses per finger, into the saliva with approximately one pound of pressure. The left hand remained as a control with no saliva contact to compare total dermal loading. Palintest® wipes were used to perform a series of wipes to collect lead from the saliva and skin surfaces. Samples were analyzed by the NIOSH 7300 method, modified for wipes. Quantitative analysis yielded a lead hand-to-mouth transfer efficiency that ranged from 12 to 34% (average 24%). A two-tailed paired t-test determined that the amount of lead loaded onto each hand was not statistically different (p-value: 0.867). These quantitative transfer data for lead from the skin surface to saliva are likely to be useful for the purposes of estimating exposures in exposure assessments, including those involving consumer products, and human health risk assessments.
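
The study's two core calculations, the transfer efficiency and the paired left/right-hand comparison, reduce to a few lines. The sample values below are invented placeholders, not the study's measurements.

```python
# Sketch of the two calculations in the study (with made-up sample data):
# (1) hand-to-mouth transfer efficiency = lead recovered from the saliva
#     surface divided by lead loaded on the contacting hand;
# (2) paired t-test that left- and right-hand loadings do not differ.
import numpy as np
from scipy.stats import ttest_rel

saliva_lead_ug = np.array([8.1, 14.2, 5.9, 10.4])    # recovered from saliva
right_hand_ug = np.array([33.0, 41.5, 49.0, 31.0])   # contacting hand
left_hand_ug = np.array([34.1, 40.2, 47.5, 32.3])    # control hand

efficiency = saliva_lead_ug / right_hand_ug
print("transfer efficiency:", efficiency.round(2),
      "mean =", efficiency.mean().round(2))

t_stat, p_value = ttest_rel(right_hand_ug, left_hand_ug)
print(f"paired t-test p = {p_value:.3f}")
```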

P.103 Sakata, N*; Kuroda, Y; Tsubono, K; Nakagawa, K; The University of Tokyo; [email protected] Public response to information about the risk of cancer after the nuclear disaster in Fukushima The purpose of this study was to assess the response of residents in the affected area of Fukushima to a presentation on cancer risk that compared exposure to radiation and lifestyle choices. In March 2013, residents of Fukushima who had not been evacuated attended a lecture by an expert about cancer risks. After the lecture, a questionnaire about their response to the presentation was completed by 173 residents. The questionnaire concerned the perceived usefulness of or aversion toward the comparison of exposure to radiation and lifestyle choices. Residents responded on a 4-point Likert scale. In addition, the reason for any aversion was requested. Content analysis was performed on the qualitative data. Of the 173 residents (mean age ± SD = 59.53 ± 11.1), 85.5% rated the expert’s information useful or very useful, while 14.5% responded that the discussion was not very useful or not useful. Additionally, 59.3% responded that they did not feel any aversion toward the comparison of exposure to radiation and lifestyle, and 40.7% responded that they had feelings of aversion. Five categories and twelve codes were extracted from the residents’ responses, including “could not understand the methodology,” “did not like the fact that the expert classified the risk of radiation as low,” “it was inappropriate to compare exposure to radiation and lifestyle choices,” “distrust of government and experts,” and “the risk assessment for children was incomplete.” Comparing exposure to radiation and lifestyle choices was considered helpful in understanding the risk of cancer by most residents, but feelings of aversion were also present. Aversion engendered by information about cancer risks should be reduced and addressed in future presentations.

W2-K.3 Salazar, DE*; Chatterjee, S; University of Southern California; [email protected] Managing risk through resilience and recovery in seaport operations Port operations are vital for the welfare of the country, supporting food security and economic activity. Arguably, natural hazards constitute the main threat to port operations, although intentional attacks can inflict significant damage too. Depending on the importance of each port and its exposure to risk, different strategies for risk management are possible. However, in some cases and due to the criticality of certain locations, resilience should be enhanced to secure at least some minimal level of operating capacity in the aftermath of a disaster. Although some frameworks for allocation of resources in critical infrastructure protection have been published, they pay more attention to prevention than to resilience and recovery. In this research we focus on what to do in the aftermath of a disaster. We present a methodology to assess resilience in port operations, using statistical and simulation modeling methods for risk and operability assessment and decision analysis tools for resource allocation.

M2-D.1 Sasso, AF*; Schlosser, PM; US Environmental Protection Agency; [email protected] A harmonized PBPK model of hexavalent chromium in rats and mice Hexavalent chromium (Cr(VI)) is an environmental and occupational contaminant, and is present in both soil and drinking water in the United States. In 2-year drinking water bioassays, the National Toxicology Program found clear evidence of carcinogenic activity in male and female rats and mice. Because reduction of Cr(VI) to Cr(III) is an important detoxifying step that can occur in the gastrointestinal (GI) tract prior to systemic absorption, numerous physiologically-based pharmacokinetic (PBPK) models have been developed over the past two decades to estimate inter-species differences in toxicity. While the currently available models adequately simulate the dietary and drinking water datasets available for Cr(VI) and Cr(III), intravenous and gavage data were typically not evaluated. Due to uncertainties related to the kinetics and absorption of Cr(VI) in the GI tract, data for other routes of exposure provide valuable toxicokinetic information. Furthermore, all previous kinetic models for GI reduction assume a single pathway is responsible for the reduction of Cr(VI) to Cr(III), which does not capture the underlying complexities of GI reduction kinetics. The current work attempts to 1) harmonize assumptions between alternate prior PBPK models, 2) adequately simulate data for different routes of exposure and study designs, and 3) incorporate a revised model for GI reduction kinetics (which assumes multiple parallel reduction reactions). The potential impacts on future human health risk assessments will be discussed. The views expressed are those of the authors, and do not necessarily represent the views or policies of the U.S. EPA.
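
A minimal sketch of the revised reduction-kinetics idea, multiple parallel, saturable reduction reactions competing with gastric emptying, is given below. The pool structure, rate constants, and dose are illustrative assumptions, not the harmonized model's calibrated values.

```python
# Toy model: Cr(VI) in the stomach lumen is reduced to Cr(III) by two
# parallel, saturable reducing pools (e.g., fast/low-capacity and
# slow/high-capacity), competing with gastric emptying. Illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

VMAX = np.array([5.0, 1.0])   # ug/h per reduction pathway (assumed)
KM = np.array([2.0, 20.0])    # ug, lumen amount at half-maximal rate (assumed)
K_EMPTY = 1.5                 # 1/h, gastric emptying rate (assumed)

def rhs(t, y):
    cr6, cr3, downstream = y
    reduction = np.sum(VMAX * cr6 / (KM + cr6))   # parallel M-M reactions
    return [-reduction - K_EMPTY * cr6,           # Cr(VI) in lumen
            reduction,                            # Cr(III) formed (detoxified)
            K_EMPTY * cr6]                        # Cr(VI) passed downstream

sol = solve_ivp(rhs, (0.0, 4.0), [10.0, 0.0, 0.0])  # 10 ug bolus dose
cr6, cr3, downstream = sol.y[:, -1]
print(f"fraction reduced: {cr3 / 10.0:.2f}, "
      f"escaping reduction: {downstream / 10.0:.2f}")
```

Because the two pathways saturate at different lumen amounts, the fraction escaping reduction rises nonlinearly with dose, which a single-pathway model cannot reproduce.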

P.16 Sax, S*; Prueitt, R; Goodman, J; Gradient; [email protected] Weight-of-Evidence Evaluation of Short-term Ozone Exposure and Cardiovascular Effects There is a considerable body of research on the cardiovascular (CV) effects associated with ozone exposure, including epidemiology, toxicology, and controlled human exposure studies. US EPA is considering these data to determine whether to update the ozone National Ambient Air Quality Standards (NAAQS). We conducted a weight-of-evidence (WoE) analysis to determine if there was an association between CV effects and short-term ozone exposures at levels below the current primary ozone NAAQS of 75 parts per billion. Our analysis followed an updated WoE framework based on EPA's NAAQS framework. We found that the epidemiology evidence of CV morbidity and mortality is inconsistent and lacks coherence across specific CV endpoints. Specifically, the lack of epidemiology evidence of morbidity effects is not coherent with reported mortality estimates. Toxicology studies, although somewhat more consistent, are conducted at high exposure levels well above the current NAAQS, and there is limited information on dose-response relationships. Furthermore, there is a lack of coherence between reported results from epidemiology studies (suggesting no effects) and results from animal studies (suggesting small effects at high exposure levels). Similarly, controlled human exposure studies report inconsistent effects after exposure to high ozone levels above the current NAAQS. Overall, our WoE analysis indicates that CV effects are not associated with short-term ozone exposures below the current NAAQS.

W4-D.5 Sayes, CM; RTI International; [email protected] Life cycle considerations for nano-enabled products containing multiwalled carbon nanotubes (MWCNTs): Research to inform future risk analyses and risk management Engineered nanomaterials can bring great advantages to the development of sustainable products in multiple industries (e.g., construction, textile goods, sporting equipment). For example, multiwalled carbon nanotubes (MWCNTs) can increase the resistance of construction materials to aging (UV, mechanical stress, etc.), particularly when used as a coating. Risk analyses can inform the dynamic development of MWCNTs used in this industry by providing information to risk managers on potential MWCNT risks to humans and the environment. Risk assessors can use a product life cycle perspective to better inform risk managers about critical considerations in the development of safe nanomaterials, such as MWCNTs. To carry out these analyses, risk assessors need data from researchers on product development and manufacturing, exposure (consumer and occupational), and the human and environmental impacts of MWCNT-enabled technologies. The objective of this talk is to present the most recent environmental health and safety findings on MWCNTs incorporated into current and emerging products. Specifically, this talk outlines our data on: 1) production of unique nanocomposites for products, 2) characterization methods for nano-enabled products, and 3) exposures and hazards of nanocomposites during the production phase. Information presented here will support an interactive discussion on what additional data in these areas would most facilitate risk analyses that use a product life cycle perspective to inform subsequent risk management decisions about MWCNTs.

T3-H.2 Scanlon, KA*; Yaroschak, PJ; Consultant; [email protected] Identifying and Mitigating Worker Health Risks from Lead Exposure in the Department of Defense This presentation will highlight the progression of proactive risk management actions taken by the Department of Defense (DoD) to mitigate impacts to worker health associated with the evolving science and regulations for lead (Pb). Lead is classified as a DoD emerging contaminant (EC) because the Centers for Disease Control and Prevention lowered the childhood blood lead level (BLL) reference value from 10 micrograms per deciliter (ug/dL) to 5 ug/dL. This change in the BLL reference value may impact lead soil screening concentrations, air quality standards, occupational exposure limits, or medical management guidelines. An EC impact assessment for lead concluded that the changing science and regulatory landscapes presented a high risk of impact to the DoD during use of lead-containing munitions on test and training ranges, maintenance of these assets, and demilitarization and disposal. As a result of these impact assessment findings, risk management options were proposed by the DoD that aimed to assess occupational lead exposures to determine the effectiveness of current occupational health programs for protecting workers, specifically firing range personnel. In turn, the DoD asked the National Research Council to evaluate potential health risks related to recurrent lead exposure of firing range personnel and to determine whether exposure standards for lead adequately protect these personnel. The subsequent report, published at the end of 2012, found that there was overwhelming evidence that the general industry standards for lead exposure, set by the Occupational Safety and Health Administration more than 30 years ago, are inadequate to protect employees at DoD firing ranges. DoD has since begun a comprehensive review of its medical management guidelines and practices for protecting workers from lead exposure. The current status of this review will be provided.

T1-F.3 Scheer, D; University of Stuttgart; [email protected] Knowledge transfer of simulation-based knowledge from science to policy makers Computer simulations have become established as a fundamental accomplishment of information and communication technology. Benefiting from increasing capacities for digital data processing, simulations play an important role in a great variety of applications. They are of major significance in science-based fundamental and application-oriented research on emerging technologies, where detailed technology configurations and impacts are not yet specified. In particular, simulations contribute to analyzing, assessing and predicting technology-based emerging risks. Knowledge gained from computer simulations is not limited to the scientific community itself but impacts other domains of society such as politics, business and industry, and the public at large. The production of simulation-based knowledge and its communication to political decision-makers have become crucial factors within policy-making. Impacting societal domains, simulations serve two principal functions at the science-policy interface: they are a knowledge instrument as well as a communication instrument. Nonetheless, research has so far not considered in depth how the processes and circumstances of simulation-based knowledge transfer work. How are policy-relevant scientific simulation results processed and used in policy-making? The conceptual analysis to identify knowledge and communication functions is based on theories of both knowledge transfer at the science-policy interface and knowledge transfer as a communication process. In addition, a more detailed focus is placed on policy-makers as information receivers, drawing on communication theories that tackle human information processing. The empirical results reveal a great variety of coexisting perception and reception patterns, as well as assessment and use patterns, in how decision-makers process simulation results in the field of carbon capture and storage (CCS) technology development and implementation.

T2-F.3 Schetula, VS; Dialogik non-profit institute for communication and cooperation research; [email protected] Decision-making and participation with a special focus on energy policy and climate change: how to integrate the knowledge of citizens and associations. In the aftermath of the Fukushima nuclear disaster, Germany decided to phase out nuclear energy. The government has announced that it will shut down the last reactors by 2022. In the course of implementing this decision, the government faces more skeptical views and even opposition because of rising electricity prices. The government in Germany is now forced to deal not only with technical issues but also with declining acceptance in society as a whole. In response to this growing dissatisfaction, the state government of Baden-Württemberg, led by a green prime minister, has started an ambitious public participation program. Among the issues addressed in the participation program are risk tolerance and risk management. How concerned are people about energy security, energy prices, and new technologies for storing or converting energy in the framework of the new energy transition? Our group was asked to design and conduct a broad public participation process with the aim of assisting policy makers in identifying these risks and evaluating measures to deal with them. Furthermore, the participatory processes were designed to facilitate compromise between different interest groups. This, in turn, could have a positive effect on public acceptance of the proposed energy transition. This paper explains the participation process in which stakeholders, representatives of the general public, and activists were involved in evaluating the policies and strategies for the energy transition. The paper will introduce the concept of the process, explain the major goals and procedures, and report on the interactions between the three types of constituents included in the process. In addition, the following questions will be addressed: How did the authorities respond to the outcomes of the process? What kind of challenges did we face and how did we deal with difficult situations? What has been the outcome of the project? What lessons have been learned?

M3-A.5 Schimmel, JD; Lovely, RK*; Springfield Water and Sewer, Kleinfelder; [email protected] EPA Promotes Risk Based Asset Management as Deployed in Springfield, Massachusetts The Springfield Water and Sewer Commission (SWSC) is responsible for managing the wastewater system in Springfield, MA. Of special concern are the wastewater interceptors that allow combined sewer overflows (CSOs) into the rivers around the City during wet weather events. After finding water quality issues, the EPA issued several Administrative Orders requiring the Commission to perform over $300M in CSO-related work. Concurrently, the SWSC has seen an increase in failures, including pipe collapses within the wastewater collection system, that threaten the City’s environment and communities. The SWSC did not have the resources to simultaneously address the Administrative Orders and imminent collection system failures. EPA educators promote a Risk Based Asset Management strategy originally developed in Australia and New Zealand. This approach involves stakeholder input to rank the consequences of failure against an established set of service levels that a wastewater utility is committed to provide. The method also requires failure mode assessments from which failure probabilities can be derived. With this information available, a utility can calculate risks from consequences of failure and failure probability. The risk values are used to prioritize where to direct organizational resources. The information can also be used as a basis for other cost-saving tools, including deterioration modeling, life cycle costing, and business case evaluation. Through a series of workshops, the SWSC-Kleinfelder team was able to demonstrate to the EPA how a Risk Based Asset Management approach is the best method for the SWSC to meet all of its obligations. The presentation will walk the audience through the key elements of Risk Based Asset Management and how it has been effectively deployed in Springfield to the benefit of all stakeholders, including the EPA, City communities, and the Commissioners.

P.143 Schlosser, PM*; Isaacs, K; Sasso, AF; Gift, JS; U.S. Environmental Protection Agency; [email protected] A Probabilistic Model of U.S. Intra-Day Tap Water Exposure and Its Application in PBPK Modeling While the previously developed SHEDS model (http://www.epa.gov/heasd/research/sheds.html) provides probabilistic sampling for total daily water consumption, it does not provide information on the intra-day distribution of that ingestion, i.e., the fraction of the total consumed in any given hour of the day. For chemicals such as methanol or chloroform, which are rapidly absorbed, the peak blood concentration (Cmax) depends strongly on this distribution and is a determinant for key toxic effects. We analyzed 2003-2010 NHANES dietary recall data for hourly ingestion, based on total moisture (g) in each food or beverage. (While the recall diaries allowed recording of events in 15-min intervals, the data appeared biased toward on-the-hour records, so consumption was summed into 1-h intervals.) Besides directly consumed tap water, a subset of NHANES foods and beverages assumed to be prepared with tap water was identified, and each was assigned an assumed fraction of total moisture attributed to tap water (e.g., 50% of canned soup prepared with water). Maximum-likelihood log-normal intake distributions were then fit for each 1-h interval. An assumed ingestion rate distribution (geometric mean, GM = 1 L/h) was applied, allowing consumption to occur in less than an hour. 18% of the diaries included “extended consumption” occasions (vs. “Breakfast,” “Lunch,” etc.), assumed to be drinking from a large water bottle, coffee mug, etc., over longer periods; these observations were analyzed separately from the hourly distributions and simulated with appropriate probability. Since the lengths of the extended consumption events were not reported, a slower rate distribution was assumed (GM = 0.2 L/h), with a minimum time of 1 h. With simulated distributions based on the NHANES diaries, most individuals appear to consume tap water less than 3 h each day. Use of these patterns as input to a PBPK model yields mean Cmax predictions 2x or more above those from an idealized pattern of 6 ingestion events per day.
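
A simplified simulation of the intake pattern described above might look like the following. The hourly lognormal intake parameters and per-hour drinking probabilities are invented placeholders, since the fitted NHANES values are not given in the abstract; only the ingestion-rate geometric means (1 L/h for discrete events, 0.2 L/h for extended consumption, with a 1-h minimum) and the 18% extended-event frequency come from the text.

```python
# Sketch: simulate one person-day of tap-water ingestion events as PBPK input.
# P_DRINK, GM_ML, and GSD are invented placeholders; the rate GMs (1 L/h and
# 0.2 L/h), the 1-h minimum for extended events, and the 18% extended-event
# probability are taken from the abstract.
import numpy as np

rng = np.random.default_rng(1)
P_DRINK = np.full(24, 0.10)       # placeholder P(any intake) for each hour
GM_ML, GSD = 200.0, 2.0           # placeholder hourly intake lognormal (mL)
P_EXTENDED = 0.18                 # share of diaries with extended events

def simulate_day():
    events = []                   # (start hour, duration h, volume mL)
    for hour in range(24):
        if rng.random() < P_DRINK[hour]:
            vol = rng.lognormal(np.log(GM_ML), np.log(GSD))
            rate = rng.lognormal(np.log(1000.0), np.log(1.5))  # GM 1 L/h
            events.append((hour, min(vol / rate, 1.0), vol))
    if rng.random() < P_EXTENDED:  # slow sipping over at least 1 h
        vol = rng.lognormal(np.log(GM_ML), np.log(GSD))
        rate = rng.lognormal(np.log(200.0), np.log(1.5))       # GM 0.2 L/h
        events.append((int(rng.integers(6, 20)), max(vol / rate, 1.0), vol))
    return events

for e in simulate_day():
    print("hour %2d  duration %.2f h  volume %5.0f mL" % e)
```

Feeding event lists like this into a PBPK model, instead of spreading the daily total evenly, is what produces the higher Cmax predictions the abstract reports.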

T1-G.5 Schweizer, PJ; University of Stuttgart; [email protected] Requirements for climate change governance A global governance approach to tackle climate change seems now more out of reach than ever. One of the reasons for this deficiency is the sheer complexity of the issue. Climate change governance has to deal with uncertain science, the complexity of international negotiations, and ambiguous ethical issues, such as sustainability and international as well as intergenerational justice. Another reason is the politics of climate change governance. Climate change governance has sparked societal controversies in almost all countries of the world and has become a global issue. Consequently, the role of social trust and the role of civil society and the media have to be seen in the context of international relations and the management of common pool resources. In this multifaceted interplay of governance obstacles, it is a challenge if not an impossible task to propose a harmonized global governance framework of climate change. The intention of this presentation is to show ways in which climate change governance can be improved by a bottom-up rather than a top-down approach. Special attention will be paid to the question of how the “hard” facts of climate change science can be enhanced by “soft” parameters such as social acceptability and ethical as well as moral issues.

T2-I.2 Scott, RP*; Cullen, AC; University of Washington; [email protected] Applying multi-criteria decision analysis and life cycle approaches to direct engineering research regarding the selection of CZTS back-contacts for thin film solar photovoltaics Cu2ZnSnS4 (CZTS) thin film photovoltaics are currently in the research phase as a potential method of producing electricity for $0.50 per watt at terawatt scales of production using earth-abundant materials. Many uncertainties have arisen in terms of scalable manufacturing methods, device materials, and social and economic feasibility. One area of uncertainty in the quest to scale up production results from the current use of molybdenum as a back-contact material, due to concerns about price volatility. However, research on potential substitute materials, including graphite, graphene, and molybdenum on steel, is still in the earliest stages. Since the broader goals of photovoltaics involve producing energy in a manner that protects both the environment and human health, developing the new back-contact material without first considering long-term price feasibility and impacts to health and environment could derail the future commercial viability of the thin film technology. This case provides an empirically based analysis of using decision tools to inform research directions in emerging technology so as to avoid negative consequences. First proposed by Linkov et al. (2007), a combined approach of Multi-Criteria Decision Analysis, Risk Assessment, and Life Cycle Assessment allows for the application of available risk assessment and life cycle data within a larger decision framework. This work assesses the limitations of MCDA-RA-LCA while providing valuable decision-analytic information to CZTS engineers. For the case of CZTS back-contact selection, MCDA is used to assess the various material options with the goal of maximizing future device utility by optimizing across cost, environment, and health metrics. Application of utility theory and the use of Monte Carlo simulation to partition significant sources of uncertainty allow the selection of materials to incorporate social responsibility and risk minimization alongside future technical promise.
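
The MCDA-with-Monte-Carlo step can be sketched as a weighted multi-attribute utility under uncertain criterion scores. The candidate materials, criteria weights, and score distributions below are illustrative assumptions, not the study's elicited values.

```python
# Sketch: Monte Carlo MCDA over candidate back-contact materials.
# Criterion scores (0-1, higher is better) are drawn from triangular
# distributions; all weights and numbers are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
criteria = ["cost", "environment", "health", "performance"]
weights = np.array([0.35, 0.25, 0.20, 0.20])   # assumed criterion weights

# (low, mode, high) triangular parameters per material, one per criterion
scores = {
    "molybdenum":    [(0.2, 0.4, 0.6), (0.4, 0.6, 0.8), (0.5, 0.7, 0.9), (0.7, 0.9, 1.0)],
    "graphite":      [(0.5, 0.7, 0.9), (0.5, 0.7, 0.9), (0.6, 0.8, 0.9), (0.3, 0.5, 0.7)],
    "moly_on_steel": [(0.4, 0.6, 0.8), (0.4, 0.5, 0.7), (0.5, 0.6, 0.8), (0.5, 0.7, 0.9)],
}

n = 50_000
for material, params in scores.items():
    draws = np.column_stack([rng.triangular(lo, mode, hi, n)
                             for lo, mode, hi in params])
    utility = draws @ weights                  # linear-additive utility
    print(f"{material:14s} mean U = {utility.mean():.3f}  "
          f"P(U > 0.6) = {np.mean(utility > 0.6):.2f}")
```

Partitioning the Monte Carlo draws by which criterion drives low-utility outcomes indicates where further engineering research would most reduce decision-relevant uncertainty.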

T2-F.1 Sellke, P*; Amlot, R; Rogers, B; Pearce, J; Rubin, J; Mowbray, F; Dialogik non-profit institute; [email protected] Public Information Responses After Terrorist Events The threat western societies face from terrorist attacks became more apparent than ever through the attacks of 9/11 (New York and Washington, 2001), 11-M (Madrid, March 11, 2004) and 7/7 (London, July 7, 2005). The new quality of those attacks comprised the deliberate attempt to cause as many fatalities as possible and to disrupt economic and social life. Not least, the ruthlessness and sophistication of the attacks carried out made the use of radiological or biological substances for attacks conceivable, if not likely. How the public reacts to biological or radiological terrorism will help to determine how extensive the attack's medical, economic and social impacts are. Yet our understanding of what the public is likely to do in case of a radiological and/or biological attack is limited. Will they spontaneously evacuate affected areas? Are they willing to attend mass treatment centers? Will unaffected people demand treatment and monitoring? Will people avoid affected areas even after clean-up operations have been completed? As yet, we do not know. While emergency plans and simulations dealing with these scenarios assume a relatively compliant public with easily understood behaviors, evidence from previous incidents suggests that the reality may be different. As such, a first step toward preparing better plans to protect the public is to identify the actions they intend to take in the event of one of these scenarios occurring, and to assess how prevalent such intentions are in society. This presentation reports results from a two-year research project addressing the questions outlined above and comparing the answers between Germany and the United Kingdom. The presentation will emphasize whether behavioral intentions of the public can be influenced by tailored emergency communication and the satisfaction of the public’s information needs, and what differences in the response to terrorist attacks exist between Germany and the United Kingdom.

T4-J.4 Sertkaya, A*; Jessup, A; Wong, H; Eastern Research Group; HHS Assistant Secretary for Planning and Evaluation; [email protected] Modeling Incentives for the Development of New Antibacterial Drugs We perform an economic analysis of possible incentives for the development of new antibacterial drugs for 6 different indications: acute bacterial otitis media (ABOM), acute skin and skin structure infections (ABSSSI), community acquired pneumonia (CABP), complicated intra-abdominal infections (CIAI), complicated urinary tract infections (CUTI), and hospital acquired bacterial pneumonia (HABP). To assess the current state of investment in antibacterial research and development (R&D) for areas of unmet medical need, we develop an economic model to calculate the expected net present value (ENPV) of prospective new drugs for each indication under various market conditions. The model is based on a multi-stage decision tree framework with a series of decision nodes; at each node there is an expected probability of success and a marginal expected NPV. The ENPV uses the private opportunity cost of capital (also referred to as the private rate of discount) to account for changes in the value of money over time. Using the model, we then examine different types of incentives designed to stimulate antibacterial drug development. The incentive analysis involves solving, by indication, for the level of each type of incentive needed to meet a set private ENPV target, using the decision-tree model developed.
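
The multi-stage decision-tree ENPV logic can be sketched compactly. The phase durations, costs, success probabilities, revenues, and discount rate below are illustrative placeholders, not the analysis's estimates, and the up-front grant is just one possible incentive type.

```python
# Sketch of an expected-net-present-value (ENPV) calculation for one
# antibacterial indication, as a chain of development phases. Each phase has
# a cost, a duration, and a probability of success; revenues arrive only if
# all phases succeed. All numbers are illustrative placeholders.

RATE = 0.11  # private opportunity cost of capital (assumed)

# (phase, cost $M, duration yrs, P(success)) -- all assumed
PHASES = [("preclinical", 5, 2, 0.35),
          ("phase 1",    15, 1, 0.55),
          ("phase 2",    40, 2, 0.40),
          ("phase 3",   100, 3, 0.60),
          ("approval",    5, 1, 0.90)]

ANNUAL_REVENUE, MARKET_YEARS = 120, 10   # $M/yr after approval (assumed)

def enpv(phases=PHASES, grant=0.0):
    """ENPV at the initial decision node; `grant` models a hypothetical
    push incentive paid up front."""
    t, p_reach, value = 0.0, 1.0, grant
    for _, cost, dur, p_success in phases:
        value -= p_reach * cost / (1 + RATE) ** t  # cost paid if phase reached
        t += dur
        p_reach *= p_success
    for yr in range(MARKET_YEARS):                 # revenues if all succeed
        value += p_reach * ANNUAL_REVENUE / (1 + RATE) ** (t + yr)
    return value

print(f"baseline ENPV: {enpv():.1f} $M")
# Since an up-front grant adds linearly, the grant lifting ENPV to zero is:
print(f"grant to break even: {-enpv():.1f} $M")
```

With these toy numbers the baseline ENPV is negative, which is the structural situation such incentive analyses are designed to quantify: the gap between the private ENPV and the target is the size of the incentive needed.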

W4-H.1 Severtson, DJ; University of Wisconsin-Madison; [email protected] How do maps influence perceived accuracy and validity and how do these perceptions influence risk beliefs? It is important to convey the uncertainty of modeled risk estimates depicted on maps. Findings from an earlier study indicated how 3 map features selected to depict the uncertainty of estimated cancer risk from air pollution influenced beliefs about risk that are predictive of decisions, and the ambiguity of these beliefs. Viewers’ perceptions of the validity and accuracy of the mapped information may have a role in explaining how mapped risk influences risk beliefs and ambiguity. This study used previously unanalyzed data to assess (a) how map features influenced the perceived accuracy and validity of the mapped information and (b) how these perceptions influenced risk beliefs and the ambiguity of risk beliefs. The full factorial 2 x 2 x 2 x 4 study used 32 maps that varied by the 3 features at 4 risk levels. Map features (uncertain vs certain) were: number of colors (1 vs 3), appearance of map contours (unfocused vs focused), and how risk was expressed in the legend (verbal and relative with no safety benchmark vs numeric natural frequencies with a safety benchmark). Each study map depicted an assigned “You live here” location within 1 of the 4 risk level areas. Maps were grouped into 8 blocks. Undergraduate participants (n=826), randomly assigned to a block of 4 maps, answered survey items assessing beliefs and ambiguity for each map and perceived accuracy and validity for each of the first two maps. Structural equation modeling was used to assess the proposed relationships, controlling for prior risk beliefs and ambiguity, perceived numeracy, and sex. Findings indicate perceptions of accuracy and validity are differentially influenced by map features, and these perceptions differentially influence the outcomes of beliefs and ambiguity. Numeracy moderated some of these relationships. Some perceptions mediated the influence of map features on outcomes, suggesting perceived accuracy and validity may have important roles in explaining how maps influence decisions.

T1-E.1 Shao, K*; Gift, JS; NCEA, USEPA; [email protected] The importance of within dose-group variance in BMD analyses for continuous response data Continuous data (e.g., body weight, relative liver weight) have been widely used for benchmark dose analysis (Crump 1984) in health risk assessments. The BMD estimation method for continuous data published in the literature (Crump 1995, Slob 2002) and used by the US EPA's BMD software (USEPA BMDS 2013) essentially models continuous responses at each dose level as following a specified distribution (e.g., normal or log-normal). However, the default method employed by regulatory agencies (USEPA BMDS 2013, European Food Safety Authority 2009) for BMD calculation defines the benchmark response (BMR) as a specified change relative to the mean response at the background dose level. By this definition, the BMD calculation is based solely on the central tendency, which captures only part of the information in the response distribution. As EPA and other organizations move toward unifying the approaches for cancer and non-cancer dose-response assessment and defining risk-specific reference doses as suggested by the NAS (NRC 2009), full consideration of the response distribution rather than just the mean value will play an increasingly important role in dose-response analysis. This can be done by accounting for within dose-group variance using methods such as the "hybrid" approach (Crump 1995). This study demonstrates the importance of considering within dose-group variance in BMD estimation for continuous data through a number of examples.
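
A minimal sketch of the "hybrid" idea, assuming a linear dose-response for the mean, constant variance, and an invented adversity cutoff; this is not the BMDS implementation, just the shape of the calculation:

    # Hybrid approach sketch: BMR defined as extra risk of falling below a
    # cutoff, so the within-group variance enters directly. Values invented.
    from scipy.stats import norm

    bg_mean, slope, sd = 100.0, -2.0, 8.0   # e.g., body weight declining with dose
    cutoff = bg_mean - 2 * sd               # "adverse" tail cutoff (assumption)
    p0 = norm.cdf(cutoff, bg_mean, sd)      # background tail probability
    BMR = 0.10                              # 10% extra risk

    def extra_risk(dose):
        p = norm.cdf(cutoff, bg_mean + slope * dose, sd)
        return (p - p0) / (1 - p0)

    # Bisection for the BMD: the dose where extra risk reaches the BMR.
    lo, hi = 0.0, 100.0
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if extra_risk(mid) < BMR else (lo, mid)
    print("hybrid BMD ~", round(0.5 * (lo + hi), 2))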

T1-J.1 Shapiro, S*; Carrigan, C; Carrigan: George Washington University; Shapiro: Rutgers University; [email protected] What's Wrong with the Back of the Envelope? A Call for Simple (and Timely) Cost-Benefit Analysis Cost-benefit analysis has been a part of the regulatory process for more than three decades. Over this period, it has been the subject of criticism across the ideological spectrum. Solutions to the perceived ineffectiveness of cost-benefit analysis tend toward one of two extremes: opponents of analysis, not surprisingly, want to see it eliminated, while supporters often call for "deeper and wider cost-benefit analysis." We argue that cost-benefit analysis has evolved into a complex tool that does little to inform decisions on regulatory policy. Analyses either omit consideration of meaningful alternatives or are so detailed that they become practically indecipherable, and in either case they are often completed after a policy alternative has been selected. Adding complexity or judicial review may eliminate the naive studies but will also increase incentives for agencies to make analyses even more opaque. Yet eliminating analysis abandons all hope that an analytical perspective can inform critical policy decisions. We believe that a noticeably simpler analysis, conducted much earlier in the regulatory process, can play a critical role in regulatory decisions. Such an analysis would have to be completed well before a proposed rule and be subject to public comment. To ensure that it does not cripple the regulatory process, the examination could eschew the monetization and complex quantification that bedevil most current regulatory impact analyses. This more timely and modest analysis would be required to detail several policy alternatives being considered by the agency. The agency would list the likely benefits of each alternative in terms of gains to public health or welfare and would be required to do the same for the probable costs. Public and political overseers would then be in a position to provide meaningful feedback before the agency has decided on its course of action.

T4-D.4 Shatkin, JA; Vireo Advisors; [email protected] Life Cycle Risk Assessment of Nanocellulosic Materials As novel materials, cellulosic nanomaterials require proactive attention to ensure that their environmental health and safety (EHS) performance is addressed and can be demonstrated. For novel nanoscale materials, limitations in the availability and reliability of data create challenges for measuring and assessing potential risks to health and the environment in real-world applications, particularly because of the dynamic behavior of these materials in the environment. To date, studies of cellulosic nanomaterials have shown very little toxicity in the assays conducted. However, the emerging science on the behavior of nanoscale materials in the environment suggests that a more thorough evaluation of safety is warranted, addressing occupational, consumer, and environmental endpoints. In this talk, I describe an analysis that broadly considers health and safety across the value chain for several anticipated product applications of cellulosic nanomaterials and outlines priorities for establishing safe management practices for the commercial use of nanocelluloses, including the status of standards. First, an assessment of the current state of knowledge and practice for managing EHS risks throughout the product life cycle (research and development, manufacturing, use in products, and environmental/end-of-life stages) characterizes what is known and what gaps exist regarding nanocellulose effects on health and the environment, what standards and guidelines already exist, and what information is needed to address the breadth of stakeholder concerns. The analysis informs a roadmap that prioritizes the development of the protocols most critical to advancing R&D, manufacturing, and market development. Finally, the current status of EHS-related standards development for cellulosic nanomaterials is discussed.

W2-C.3 Shereif, M*; Monshi, M; Alharbi, B; King Saud University; [email protected] Analysis and Monitoring of Criteria Air Pollutants in Selected Areas of Riyadh City, Saudi Arabia The present investigation was aimed at characterizing levels of the criteria air pollutants nitrogen dioxide (NO2), sulfur dioxide (SO2), ozone (O3), carbon monoxide (CO), particulate matter (PM10), and lead (Pb) in Riyadh city in selected areas representing industrial (Al-Manakh), residential (Al-Malaz), recreational (Al-Rawda), and urban/road locations (Al-Bat'ha). The average concentration levels of NO2 in the industrial and urban/road areas, and SO2 in the industrial area, during the summer months of June through September of 2009 and 2010 exceeded the Saudi Arabian Air Quality Standards (SAAAQS) and US National Ambient Air Quality Standards (NAAQS). The average CO concentrations in the four areas during 2009 were below the SAAAQS and US NAAQS. However, the monthly averages of CO in the urban/road area exceeded both the SAAAQS and the 8-h US NAAQS for CO (10,000 µg/m3) through the summer months of June, July, August, and September of 2010, which could be attributed to emissions from excessive traffic and motor vehicles in this area. Ozone and lead concentrations in the four selected areas were below the SAAAQS and NAAQS. With the exception of the industrial area, the measured values of inhalable particulate matter (PM10) were found to meet the SAAAQS.

M4-J.2 Sheriff, G; Maguire, K*; US Environmental Protection Agency; [email protected] Ranking Distributions of Environmental Outcomes Across Population Groups This paper examines the use of inequality indices for evaluating the distributional impacts of alternative environmental policies across population groups defined by demographic variables such as race, ethnicity, or income. The rich literature devoted to the use of inequality indices for analyzing income distributions within and across countries provides a natural methodological toolbox for examining the distributional effects of environmental outcomes. We show that the most commonly used inequality indices, such as the Atkinson index, have theoretical properties that make them inconvenient for analyzing bads, like pollution, as opposed to goods, like income. We develop a transformation of the Atkinson index that is suitable for analyzing bad outcomes. In addition, we show how the rarely used Kolm-Pollak index is particularly well suited to ranking distributions of adverse health and environmental outcomes. We provide an illustration of its potential use in the context of emissions standards affecting indoor air quality.
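
For readers unfamiliar with the index, a small sketch of the standard Kolm-Pollak equally-distributed-equivalent calculation; the data, the aversion parameter kappa, and the sign convention for bads shown here are illustrative assumptions, not the authors' derivation:

    import numpy as np

    def kolm_pollak_ede(x, kappa):
        # Equally-distributed equivalent (EDE) of outcome vector x.
        # Pass kappa < 0 for a bad, so that more unequally distributed
        # harm yields a worse (higher) EDE than the plain mean.
        x = np.asarray(x, dtype=float)
        return -np.log(np.mean(np.exp(-kappa * x))) / kappa

    def kolm_pollak_index(x, kappa):
        # Inequality index: gap between the EDE and the mean.
        return kolm_pollak_ede(x, kappa) - np.mean(x)

    exposure = [4.0, 5.0, 14.0]   # hypothetical pollution levels by group
    print(kolm_pollak_ede(exposure, -0.5), kolm_pollak_index(exposure, -0.5))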

M2-B.4 Shirley, SH*; Grant, RL; Honeycutt, M; Texas Commission on Environmental Quality; [email protected] Integrating evidence: The importance of exposure and framing the question In 2009, the National Academies (NAS) emphasized the importance of problem formulation and exposure assessment, among other best practices. Robust problem formulation is helpful when incorporating biological relevance and chemical-specific knowledge to meaningfully guide risk management options. As a first step, it is important to clarify the potential for exposure to chemicals of concern (COC) in order to develop the most reasonable and useful risk assessments. Once a potential COC has been identified, it is crucial to characterize sources, pathways, receptors, and temporal patterns of exposure. For evaluation of inhalation exposures, ambient air monitoring data become increasingly important, although uncertainty often remains about how air monitoring data correspond to personal exposures and specific health endpoints. Initial stages of an exposure assessment may identify discrepancies between the concentrations of COCs described in epidemiological or toxicological studies and the concentrations to which the general population may be exposed. For example, design values from centrally located monitors may be used in observational epidemiology studies to identify statistical correlations between concentrations and health endpoints of interest. However, when exposures are characterized for populations of interest, it may become apparent that individuals were unlikely to have been routinely exposed to these levels, and therefore it is improbable that a particular health effect is due to the concentrations described in the study (i.e., an association is shown, but not causation). In conclusion, regulatory agencies often utilize problem formulation and exposure characterization along with weight-of-evidence criteria (such as the Hill criteria) to derive useful and robust risk assessments that can inform risk management determinations. Exposure assessments and problem formulation play major roles in these decisions.

M3-D.4 Shoemaker, K*; Siegrist, J; Ferson, S; Stony Brook University, Applied Biomathematics; [email protected] Mixing good data with bad Data sets have different qualities. Some data are collected with careful attention to proper protocols and careful measurement using highly precise instruments. In contrast, some data are hastily collected by sloppy or unmotivated people with bad instruments or shoddy protocols under uncontrolled conditions. Statistical methods make it possible to formally combine these two kinds of data in a single analysis. But is it always a good idea to do so? Interval statistics is one convenient method that accounts for the differing quality of data in an analysis: high-quality data have tighter intervals, poor-quality data have wider intervals, and the two can be legitimately pooled using interval statistics. Yet it appears that it is not always advisable for an analyst to combine good data with bad. We describe examples showing that, under some circumstances, including more data without regard to quality unnecessarily increases the amount of uncertainty in the final output of an analysis. Ordinarily, statistical judgment would frown on throwing away any data, but as these examples demonstrate, it sometimes seems clearly advantageous to ignore this judgment. More data do not always lead to more statistical power, and increasing the precision of measurements sometimes provides a decidedly more efficient return on research effort. This result is highly intuitive even though the examples imply a notion of negative information, which traditional Bayesian analyses do not allow.
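
The effect is easy to reproduce with a toy calculation; the interval bounds below are invented:

    # Pooling interval-censored measurements of one quantity. The mean of
    # n intervals [lo_i, hi_i] is the interval of all possible means,
    # [mean(lo), mean(hi)].
    good = [(9.8, 10.2), (9.9, 10.3), (10.0, 10.4)]   # precise instrument
    bad  = [(6.0, 14.0), (7.0, 15.0)]                  # sloppy protocol

    def interval_mean(data):
        los, his = zip(*data)
        return (sum(los) / len(los), sum(his) / len(his))

    width = lambda iv: iv[1] - iv[0]
    m_good, m_all = interval_mean(good), interval_mean(good + bad)
    print(m_good, width(m_good))   # narrow interval: good data alone
    print(m_all, width(m_all))     # wider interval: adding bad data inflated uncertainty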

T2-E.3 Shortridge, JE*; Guikema, SD; The Johns Hopkins University; [email protected] Measuring health impacts from breaks in water distribution systems using internet search data The aging condition of drinking water distribution infrastructure has been identified as a factor in waterborne disease outbreaks and a priority area for research. Pipe breaks pose a particular risk, as they produce low or negative pressure that can allow contamination of drinking water from adjacent soils. Measuring this phenomenon is challenging, however, because the most likely health impact is mild gastrointestinal (GI) illness, which is unlikely to be reported to doctors or hospitals even though it can carry significant social costs. Here we present a novel method that uses data mining techniques to assess the correlation between pipe breaks and internet search volume related to symptoms of GI illness in two major U.S. cities. Weekly search volume for the term "diarrhea" was regressed against the number of pipe breaks in each city, and additional covariates were used to control for seasonal patterns, search volume persistence, and other sources of GI illness. The fit and predictive accuracy of multiple regression and data mining techniques were compared, and a random forest model yielded significantly lower predictive errors in both cities. Pipe breaks were found to be an important and positively correlated predictor of internet search volume in both cities, as were seasonal fluctuations and search volume persistence. This correlation indicates that breaks in the water distribution system could present health risks that are unlikely to be considered when estimating the benefits of infrastructure repair and investment.
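
A hedged sketch of this style of analysis on synthetic data; the coefficients and covariates are invented, whereas the authors' models were fit to real search and break records:

    import numpy as np
    from sklearn.ensemble import RandomForestRegressor

    rng = np.random.default_rng(0)
    weeks = np.arange(260)
    breaks = rng.poisson(5, weeks.size)                     # weekly pipe breaks
    season = np.sin(2 * np.pi * weeks / 52)                 # seasonal GI pattern
    search = 50 + 10 * season + 1.5 * breaks + rng.normal(0, 3, weeks.size)
    lagged = np.roll(search, 1); lagged[0] = search.mean()  # persistence term

    X = np.column_stack([breaks, season, lagged])
    model = RandomForestRegressor(n_estimators=300, random_state=0).fit(X, search)
    # Feature importances stand in for the "important predictor" finding.
    print(dict(zip(["breaks", "season", "lag"], model.feature_importances_)))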

P.144 Smith, MN; Port, JA; Cullen, AC; Wallace, JC; Faustman, EM*; University of Washington; [email protected] A tool to facilitate the incorporation of metagenomic data into environmental microbial decision-making and risk analysis Advances in microbial genomics have opened up new opportunities for translation to public health risk research. Traditionally, methods for studying changes in the population dynamics of microbial communities have required cell culture and have focused on single organisms. Some microbes cannot be cultured with current methods or are poorly characterized, leading to incomplete assessment of microbial environmental health. The development of metagenomics, in which DNA is extracted directly from environmental samples and sequenced, now allows characterization of the taxonomic and functional potential of microbial communities and provides an expanded tool for microbial environmental health monitoring. However, new bioinformatics and analytical challenges have arisen in interpreting and translating metagenomic data for decision-making and risk management. We provide a tool for translating metagenomic data into environmental health monitoring information relevant to public health decision-making and risk assessment. This framework allows functional data from Clusters of Orthologous Groups of proteins (COGs) to be interpreted within the context of public health. Using metagenomic data from 107 published biomes, we performed a functional analysis to identify COGs that are reflective of potential human impacts. Biomes with higher known potential human impact, such as wastewater treatment plants (WWTP), had a greater relative abundance of public-health-relevant COGs. Overall, we demonstrate that this is a valuable tool for distinguishing between environments with differing levels of human impact in the public health context. This project is supported by the NOAA-funded Pacific Northwest Consortium for Pre- and Post-doctoral Traineeships in Oceans and Human Health and the UW Pacific Northwest Center for Human Health and Ocean Studies (NIEHS: P50 ESO12762 and NSF: OCE-0434087), NOAA (UCAR S08-67883), and the Center for Ecogenetics and Environmental Health (5 P30 ES007033).
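
The core bookkeeping reduces to a relative-abundance calculation; in the sketch below, the COG identifiers and the "public-health-relevant" set are placeholders, not the authors' curated list:

    # Relative abundance of public-health-relevant COGs in one biome.
    counts = {"COG0642": 120, "COG2202": 45, "COG1131": 300, "COG0745": 80}
    relevant = {"COG0642", "COG0745"}   # hypothetical relevant set
    share = sum(v for k, v in counts.items() if k in relevant) / sum(counts.values())
    print(f"public-health-relevant COG share: {share:.1%}")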

P.74 Smith, D.W.; Conestoga-Rovers & Associates; [email protected] Atrazine effects on amphibians: Is it safe to go back into the water? In the mid-2000s, scientific debate about atrazine's potential to cause endocrine disruption in amphibians played out in the bright lights and high stakes of EPA re-registration. As part of that process, EPA had its Science Advisory Board (SAB) evaluate the available science on endocrine disruption. Depending on which conventional wisdom on the internet one reads, the SAB review either found serious errors in the science showing significant negative effects on amphibian development at low atrazine concentrations, or found the complete opposite: some sources claim, for example, that the SAB "found each and every one of the studies [showing no significant effects] to be fundamentally or methodologically flawed, some containing defects as egregious as allowing control and test subjects to intermix." Since industry funded much of the latter research, this debate devolved into the all-too-familiar acrimonious discussion of conflicts of interest, and even produced a published paper purporting to show a relationship between funding source and results. From afar (I luckily had no involvement at all in any of this), the debate seems to exemplify several issues bedeviling science in general and risk assessment in particular. First, why are scientists in general, and the SAB in this specific case, so poor at communicating with the public? Second, what exactly are "conflicts of interest," who has them, and what effects do they likely have on science? Third, what constitutes professional ethics in the midst of debate about potentially important environmental risks? My talk will review this incident, give my best scientific reading of what exactly the SAB said, and review current discussions of scientific ethics.

P.38 Song, JW*; Small, MJ; Carnegie Mellon University; [email protected] Determining detection rates of environmental DNA sampling for monitoring the risk of invasive fish species Early detection of invasive species is critical to effective aquatic ecosystem risk management. One newly developed detection method is environmental DNA (eDNA) sampling, in which water samples are analyzed for species-specific DNA fragments released into the water, and detection is used to infer the presence of the species. This technique promises improved detection sensitivity and specificity and reduced monitoring costs compared to traditional techniques. However, the use of eDNA sampling in decision-making frameworks is challenging due to the many uncertainties associated with the DNA technology and the sampling methodology. These uncertainties have received particular attention in the use of eDNA sampling for detection of invasive Asian carp species in the Great Lakes region, where many costly and ineffective risk management efforts have been undertaken on the basis of eDNA evidence. In this paper, the uncertainties in the relationship between fish presence and eDNA presence in a river system are explored. A one-dimensional advective-reactive-dispersive transport model is integrated with a fish dispersal model to determine the concentration profile of eDNA, spatially and temporally, in a specified river system. The model can then evaluate the relationship between fish density and eDNA concentration and the potential detection rates at each section of the river. The results suggest that under high flow conditions, such as in major river channels, there is a high likelihood of false negatives due to the washout of eDNA. The potential for false positives is higher under low flow conditions, such as in slower-moving backwater areas, where the persistence of eDNA can influence the results. A stronger understanding of the detection rates of eDNA sampling will help inform improved sampling methodologies and better integration with decision-making frameworks.
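
A minimal steady-state sketch of the kind of transport model described, with first-order eDNA decay and invented parameters (the authors' model also couples fish dispersal and is time-dependent):

    # Steady-state eDNA concentration downstream of a continuous point
    # source in a 1-D advective-dispersive river with first-order decay.
    # U, D, k, and M are illustrative placeholders.
    import numpy as np

    U, D, k = 0.5, 5.0, 1e-4   # velocity m/s, dispersion m^2/s, decay 1/s
    M = 1.0                    # source strength (arbitrary units)

    def conc(x):
        # Downstream (x >= 0) solution of D*C'' - U*C' - k*C = 0.
        r = (U - np.sqrt(U**2 + 4 * k * D)) / (2 * D)   # negative root
        return M / np.sqrt(U**2 + 4 * k * D) * np.exp(r * x)

    for x in [0.0, 1e3, 1e4, 5e4]:
        print(f"x = {x:>7.0f} m  C = {conc(x):.3e}")
    # Higher U stretches the downstream plume (washout); higher k shortens it.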

P.131 Song, H*; Underhill, JC; Schuldt, JP; Song and Schuldt: Cornell University, Underhill: Johns Hopkins University; [email protected] Communicating conservation with labels: Experiment on the effectiveness of using IUCN categories for advocacy The Red List published by the International Union for Conservation of Nature and Natural Resources (IUCN) uses a categorical system with labels such as “Critically Endangered” or “Vulnerable” to communicate the level of threat faced by each species. This study examined whether messages using such categorization information would be as effective as messages using statistical information in communicating risk. In an online experiment, 169 participants were randomly assigned to read four descriptions about threatened species written with either categorization information (verbal group) or statistical information (statistical group). Readability measured by the Flesch-Kincaid Grade Level score was controlled for across conditions (e.g., “According to the IUCN, the Bigeye Tuna is classified as a Vulnerable (VU) species” vs. “According to the IUCN, the Bigeye Tuna population declined by 42% around the globe over the past 15 years”). Although there were no significant differences in perceived message clarity or behavioral intention, perceived risk of extinction was higher among the statistical group than the verbal group. Thus, professionals communicating with lay audiences about threatened species may wish to cite relevant statistics instead of, or along with, the Red List categories. A follow-up study featuring a more diverse participant sample and varying levels of statistical complexity is currently underway.

T4-A.1 Spada, M*; Burgherr, P; Laboratory for Energy Systems Analysis, Paul Scherrer Institute, 5232 Villigen PSI, Switzerland; [email protected] Quantitative Risk Analysis of Severe Accidents in Fossil Energy Chains Using Bayesian Hierarchical Models Risk assessment of severe accidents in the energy sector contributes to improving the safety performance of technologies and is also essential in the broader context of energy security and policy formulation by decision makers. A comprehensive approach is needed because accidents can occur at all stages of an energy chain. The classical approach to assessing the risk of severe accidents in fossil energy chains uses aggregated risk indicators focusing on human health impacts, i.e., fatality rates, and/or frequency-consequence curves. However, the analysis of extreme accidents, which contribute disproportionately to the total number of fatalities in an energy chain, is often impeded by the scarcity of historical observations. Furthermore, the commonly high uncertainties, particularly for the risk of extreme events, cannot be fully addressed with this standard approach. To assess the risk, including that of extreme accidents, we apply a Bayesian hierarchical model, which yields analytical functions for frequency and severity distributions as well as frequency trends. Bayesian data analysis permits the pooling of information from different data sets and inherently delivers a measure of uncertainty. The current analysis covers severe (≥5 fatalities) accidents in the coal, oil, and natural gas chains for the years 1970-2008, as contained in PSI's Energy-related Severe Accident Database (ENSAD). First, analytical functions for frequency and severity distributions common to all fossil energy chains were established. Second, these distributions served as input to the Bayesian hierarchical model. Risks are quantified separately for OECD, EU 27, and non-OECD countries. The proposed approach provides a unified framework that comprehensively covers accident risks in fossil energy chains and allows specific risk indicators to be calculated for use in a comprehensive sustainability and energy security evaluation of energy technologies.
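
To convey the flavor of pooling with built-in uncertainty quantification, here is a much-simplified conjugate gamma-Poisson sketch with invented accident counts; the authors' hierarchical model is considerably richer:

    # Partial pooling of accident rates across chains: a shared gamma prior
    # is fit loosely to the pooled data, then updated per chain. Counts and
    # exposure years are illustrative, not ENSAD values.
    import numpy as np

    data = {"coal": (150, 39), "oil": (180, 39), "gas": (60, 39)}  # (n, years)

    rates = np.array([n / t for n, t in data.values()])
    mean, var = rates.mean(), rates.var() + 1e-9
    alpha0, beta0 = mean**2 / var, mean / var   # method-of-moments hyperprior

    rng = np.random.default_rng(1)
    for chain, (n, t) in data.items():
        a, b = alpha0 + n, beta0 + t            # conjugate posterior
        draws = rng.gamma(a, 1 / b, 10_000)
        lo, hi = np.percentile(draws, [2.5, 97.5])
        print(f"{chain}: rate ~ {a / b:.2f}/yr (95% interval {lo:.2f}-{hi:.2f})")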

T3-I.1 Staid, A*; Guikema, SD; Nateghi, R; Quiring, SM; Gao, MZ; Yang, Z; Johns Hopkins University; [email protected] Long-term hurricane impact on U.S. power systems Hurricanes have been the cause of extensive damage to infrastructure, massive financial losses, and displaced communities in many regions of the United States throughout history. From an infrastructure standpoint, the electric power distribution system is particularly vulnerable; power outages and related damages have to be repaired quickly, forcing utility companies to spend significant amounts of time and resources during and after each storm. It is expected that climate change will have significant impacts on weather patterns, but there is much uncertainty regarding the nature of those impacts. In order to characterize some of this uncertainty, we simulate the long-term impacts of hurricanes on U.S. power systems and evaluate this impact looking out into the future under different scenarios. We evaluate future power system impacts while varying hurricane intensity, annual hurricane frequency, and landfall location. This allows us to understand the sensitivity of hurricane impacts to some of the possible scenarios expected under climate change. Using the historical data record as a baseline, hurricane intensity and annual frequency are independently varied both positively and negatively. Changes in landfall location are achieved by adjusting the probability distribution for landfall. The results of these simulations show the areas along the Atlantic Coast that will be hit hardest by hurricanes under possible changes in hurricane hazard scenarios. Areas that are heavily impacted under multiple scenarios, or under those scenarios expected to be the most likely to occur, can use this information to make more informed decisions about possible investments in power grid resilience or robustness. Adaptations to climate change need to be considered as soon as possible in order to maximize the benefits of these investments, even though considerable uncertainty remains about the potential impacts of climate change on hurricane hazards to the power system.

SRA 2013 Annual Meeting Abstracts T1-I.3 Stavrou, DI*; Ventikos, NP; School of Naval Architecture and Marine Engineer in Technical University of Athens; [email protected] Submarine Power Cables (SPCs): The laying procedure, the fleet and reliability analysis of Medium Voltage Network. The use of SPCs over the last decades plays a significant role in the transfer of energy in worldwide scale. The aim of this study is to determine the aspects of the issue that concerns the laying procedure and develop a model of reliability analysis so to evaluate potential cable routes. Vessels with special characteristics are used to accomplish the laying procedure. A preliminary analysis is necessary so as to determine the factors that affect both the laying procedure. A reliability analysis of effectiveness is applied and presented using the raw data of the Hellenic SPCs Mediun Voltage Network; the model is based on the respective model that has been represented in the study “Reliability Analysis Of Submarine Power Cables And Determination Of External Mechanical Protection” by M. Nakamura at al. in 1992. The parameters that have been considered in the context of the presented model comprise: The depth of the cable route/depth at the point of failure of the cable; The length of the cable route/minimum distance of the point of failure from the coast; The seabed characteristics of the cable route at the area of failure; The level of protection of the cable along the cable route at the area of failure. The application of this reliability model can be used in a two-fold manner: To determine critical and safe areas along a certain cable route. During the phase of preliminary design to choose the optimum route for laying the cable. In particular the failure rate at any (Ei, Lj, Dk) of the cable is: Ri=F(Ei)·F(Lj)·P(Dk|E1,L0) For the entire length of the cable the reliability is: Rtotal =ΣRi The mean time between failures refers to the safe operating time for a cable route. MTBF=1/Rtotal

P.111 Stedge, J*; Brad, F; Abt Associates; [email protected] SafeWater CBX: Incorporating Uncertainty and Variability in Benefits Analysis Incorporating variability and uncertainty into benefits assessments for public health regulation is critical to fully understanding potential impacts; however, doing so can be data-intensive and computationally complex. To support the development of national primary drinking water standards, we developed the SafeWater CBX model, which is designed to estimate the health benefits associated with alternative maximum contaminant levels (MCLs) in drinking water. SafeWater CBX is the first model to fully incorporate both variability and uncertainty in drinking water benefits assessment. The model first estimates the exposed population at each public water system (PWS) entry point to the distribution system (EP), categorized by age and gender. Based on EP-level distributions of contaminant occurrence (which vary by source water type and region and are uncertain), drinking water consumption (which varies by age), and dose-response functions (which vary by age and gender and are also uncertain), the model then estimates the expected cases of illness each year (over a 50-year period of analysis) at any number of alternative MCLs. SafeWater CBX then values both the expected illnesses avoided and the deaths avoided, using cost-of-illness estimates and the value of statistical life (both of which are uncertain). The health benefits can be displayed by source water type (ground or surface water), age group, sex, PWS size, and region. In addition to incorporating variability and uncertainty into the benefits analysis, SafeWater CBX gives users the option to run in "mean mode," in which all uncertain inputs are treated as certain (variability is still modeled). This capability allows users to conduct sensitivity analyses quickly (about 15 minutes per run), making it possible to incorporate information on benefits into the regulatory option selection process.
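
A two-dimensional Monte Carlo sketch of the variability/uncertainty separation described: the outer loop samples uncertain inputs, the inner loop samples population variability. Every distribution and parameter below is an invented placeholder, not a SafeWater CBX input:

    import numpy as np
    rng = np.random.default_rng(42)

    POP, N_UNC, N_VAR = 1_000_000, 200, 2_000

    benefits = []
    for _ in range(N_UNC):                               # uncertainty loop
        occurrence = rng.lognormal(np.log(3.0), 0.5)     # ug/L at entry point
        slope = max(rng.normal(1e-4, 2e-5), 0.0)         # cases per (ug/day)
        voi = rng.triangular(2e3, 5e3, 1e4)              # $ per case avoided
        intake = rng.lognormal(np.log(1.0), 0.4, N_VAR)  # L/day, variability loop
        cases = POP * np.mean(slope * occurrence * intake)
        benefits.append(cases * voi)  # simplified: values all baseline-attributable cases

    print(f"mean benefit ${np.mean(benefits):,.0f}; 90% interval "
          f"${np.percentile(benefits, 5):,.0f} to ${np.percentile(benefits, 95):,.0f}")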

W4-H.2 Steinhardt, JS*; Shapiro, MA; Cornell University; [email protected] The impact of narrative messages on prospect theory framing effects. Several previous studies have found that gain/loss framing can influence the effectiveness of non-narrative health and risk communication messages. Narrative messages are increasingly popular in health campaigns, however, and the effect of narrative form on prospect theory framing effects is not fully understood. Three experiments examined the role of narrative messages in framing effects. Experiment 1 found that framing did not influence judgments about characters and decisions in a story derived from a prospect theory context. Experiment 2 found a shift in preferences when a decision about gambling was presented in the form of a narrative compared with traditional wording from prospect theory studies (Tversky & Kahneman, 1986). This shift was present regardless of whether the zero was deleted from the wording, a change previous research suggested might remove framing effects (Reyna, 2012). Using a different story/decision context based on a choice between radiation therapy and surgery (McNeil et al., 1982), Experiment 3 found that a narrative presentation amplified the preference for surgery compared to a non-narrative presentation. Together, the three experiments suggest that health campaigns cannot assume framing effects will be the same in narrative and non-narrative messages. Potential reasons for these differences and suggestions for future research are discussed.

P.98 Steinhardt, JS*; Niederdeppe, J; Lee, T; Cornell University; [email protected] Numeracy and Beliefs About the Preventability of Cancer Fatalistic beliefs about cancer and uncertainty about information in news stories about cancer are barriers to cancer preventing behaviors. This research explores the relationship between numeracy, the ability to reason mathematically and interpret basic statistical and probabilistic information, and both fatalistic beliefs and uncertainty about the information present in news stories about cancer. Numeracy is measured using a 7-item subjective numeracy scale. A sample of 601 adults aged 18 and older were asked to read news stories about cancer in one of 15 randomized conditions and then asked questions related to cancer prevention and subjective numeracy. Higher levels of numeracy were associated with less fatalistic beliefs about cancer and less uncertainty about cancer risk and prevention. Interactions between numeracy and cancer news story exposure on fatalistic and uncertain beliefs about cancer risks and preventions, however, were not statistically significant. We conclude with a discussion of implications of these findings for future research on numeracy and health communication about complex issues like cancer.

T4-K.1 Stern, PC; National Research Council; [email protected] Design principles for governing risks from emerging technologies Technological innovations are developed and promoted for the benefits they are expected to bring. The benefits tend to be readily apparent, with data that are ostensible and repeatable; the costs, particularly those that are delayed or borne by non-adopters, are usually less so, at least at first. Consequently, initial judgments about the relative benefits and costs of innovations must often be revised in light of experience, in processes that can be contentious. This paper reports on a project that began in the National Research Council's Committee on the Human Dimensions of Global Change (now the Board on Environmental Change and Society), which sought to distill general lessons on the governance of emerging technologies. It relied on two sources: (1) induction from experience with emerging technologies, mainly technologies now in place (nuclear power, radioactive waste management, and DNA manipulation) but also some that were emerging at the time the project began in 2008 (nanotechnology, biotechnology, and information science and technology); and (2) inference from the results of social scientific research on potentially relevant social processes, such as risk perception and management, management of common-pool resources, the workings of international institutions and networks, and science communication and utilization. The paper reports on this project, proposes a set of design principles for the governance of emerging technological risks, and briefly discusses how those principles might be applied to the risks associated with the extraction of natural gas and oil from shale formations by horizontal drilling and hydraulic fracturing.

T2-E.1 Stewart, RN*; Bright, EA; Rose, AN; McGinn, CW; Bhaduri, BL; Oak Ridge National Laboratory; [email protected] Enriching Environmental Risk Based Decision Support Models with Large Scale, High Resolution Population Data For nearly two decades, the Spatial Analysis and Decision Assistance (SADA) freeware package has enabled environmental risk assessors to situate risk analytics within a spatial context. By integrating the risk models articulated by EPA Superfund with advanced geospatial methods, SADA supports risk-informed sample designs, remedial designs, cost analysis, and uncertainty analysis within a free and open modeling environment. Currently, these models focus on a hypothetical receptor exposed to contamination under current or future land-use scenarios and do not consider the existing population distribution or sociocultural processes near the impacted areas. In the next expansion of SADA, we integrate Oak Ridge National Laboratory's LandScan population dataset with the current collection of analytics, significantly extending the ability to assess exposures and risks to impacted populations and demographic sub-populations. Explicitly modeling and visualizing the spatial context within which contamination, population dynamics, and exposure occur can substantially inform policy formulation and evaluation. This paper discusses the integration of LandScan with SADA and the adjustment and expansion of existing analytics, and presents a case study.

P.52 Stewart, D*; Glass-Mattie, D; Dorman, D; McConnell, E; Adeshina, F; University of Tennessee and Oak Ridge National Laboratory; [email protected] Provisional Advisory Level (PAL) Development for Superwarfarins (Brodifacoum and Bromadiolone) PAL values developed for hazardous materials by the US EPA represent general-public emergency exposure limits for oral and inhalation exposures, corresponding to three severity levels (1, 2, and 3) for 24-hr, 30-d, 90-d, and 2-yr durations. PAL 1 represents the threshold for mild effects; PAL 2 represents the threshold for serious, irreversible, or escape-impairing effects; PAL 3 represents the threshold for lethal effects. PALs have not been promulgated, nor have they been formally issued as regulatory guidance. They are intended to be used at the discretion of risk managers in emergency situations when site-specific risk assessments are not available. The mention of trade names does not imply EPA endorsement. PAL values were developed based on SOP and QAPP requirements. Brodifacoum (CAS No. 56073-10-0) and bromadiolone (CAS No. 28772-56-7) are both members of the newer generations of anticoagulant rodenticides collectively named superwarfarins. Anticoagulant rodenticides have historically been used as an effective means to control populations of mice, rats, and other rodents in urban and agricultural settings. All anticoagulants act by interfering with the coagulation cascade, which can cause free bleeding that can be fatal. Superwarfarins work by inhibiting vitamin K1 epoxide reductase, which leads to a depletion of vitamin K1 and impairment of blood-clotting ability. In humans and animals, vitamin K1 can be administered after ingestion as an antidote to prevent free bleeding. Brodifacoum and bromadiolone are absorbed quickly, distributed primarily to the liver, and excreted in the feces, mostly unchanged. Oral PAL values were developed using human case reports and animal data. Oral PAL 1, 2, and 3 values are 0.017, 0.16, and 1.8 mg/L for 24 hours and NR (not recommended), 0.0025, and 0.0075 mg/L for 30 days. Values are not recommended for 90-day and 2-year oral exposures, or for any inhalation duration, due to insufficient data.

M2-F.5 Stillman, M; Consultant; [email protected] Risk-based Need Assessments to Enhance Enterprise Program Management Offices In many industries, team leaders and organizations are under constant pressure to deliver a portfolio of projects on time and within budget. Project Management Offices (PMOs) are set up to address this challenge through the development of standard project management processes, and they act as a central repository for project management knowledge, best practices, and lessons learned. The best return on the PMO investment is realized by performing Need Assessments, which result in project management plans that decrease workflow complexity and increase project savings. Need Assessments are customarily conducted with key functional groups within an organization to improve processes by identifying gaps between existing and desired conditions. Their fundamental value derives from gaining deeper insight into project risks from multiple perspectives, leading to improved clarity on the priority and sequence of task implementation. The key stages of the Need Assessment process to be described are: (1) goal setting with the PMO Director; (2) data gathering to document existing conditions; (3) open-ended interviews that identify gaps and encourage "out of the box" brainstorming of solutions; (4) risk ranking of issues; (5) an initial de-brief with organizational leaders; and (6) a findings report documenting risk findings and recommendations. The consistency of this process with the risk management framework in ISO 31000 will be highlighted. The collaborative risk assessment stage will be described in detail, as it provides the core technical basis for sequencing response strategies.

T4-C.3 Suarez, Maria*; Muñoz, Felipe; Universidad de los Andes; [email protected] Flexible framework for the study of dispersion scenarios caused by accidental events in the transportation of hazardous materials Although pipelines have become a safe and efficient means of transportation for hazardous materials, several accidents have occurred in the past. The total volume of hazardous materials spilled seems to be decreasing as improvements in control and pipeline integrity have been implemented. Accidents often begin with a loss of containment, and because the released materials can be flammable, explosive, or toxic, a release can escalate into a major accident. A-priori risk analysis is therefore used to take precautionary measures against potential accidents, and scenario building is implemented because it has proved to be a powerful and effective tool for exploring and clarifying present actions and their subsequent consequences. Accidental scenarios depend on the site's characteristics and conditions: topography, weather, moisture, soil type, and proximity to sensitive resources (e.g., water sources), among others, generating scenarios on the right-of-way or in the far field. This paper analyzes spill path scenarios, drawing on worldwide and Latin American accidents to define pipeline failure types and conditions. However, the limited historical data concerning pipeline accidents in Latin America generate uncertainty and a lack of the information needed to establish frequencies and severities. Determining the simulation parameters that can affect pool formation supports the construction of a guide that facilitates the analysis of pipeline accidental scenarios with special characteristics, such as Colombian topography (i.e., pronounced slopes), which leads to far-field scenarios. As a consequence, predicting arrival times, considering flow directions and the amounts of infiltrated and evaporated material, and using water quality models help to establish faster response times. The inclusion of atypical scenarios arising under Colombian topography (i.e., steep slopes in the Andean region) ensures that scenario generation is not limited to the right-of-way.

T2-D.2 Suppes, L*; Canales, R; Gerba, C; Reynolds, K; The University of Wisconsin - Eau Claire and The University of Arizona; [email protected] Risk of Cryptosporidium Infection to Recreational Swimmers in Swimming Pools Infection risk estimates for swimming in treated recreational water venues are lacking and are needed to identify vulnerable populations and water quality improvement needs. Currently, state and local health departments lack standardized pool safety regulations, since there is no U.S. federal pool code. Infection risk differs among populations because ingestion, visit frequency, and swim duration depend on activity and age. The objectives of this study were to estimate per-swim and annual Cryptosporidium infection risks in adults (older than 18), children (18 or younger), lap swimmers, and leisure swimmers (splashing, playing, diving, wading, standing, and sitting). Risks were estimated using oocyst concentration data from the literature and new experimental data collected in this study on swimmer ingestion, activity, and pool-use frequency. The average estimated per-swim risk of Cryptosporidium infection was 3.7 x 10^-4 infections/swim event. We estimated that 3.3 x 10^-2 infections occur each year from swimming in treated recreational water venues, which exceeds the fresh and marine water swimming risk limits set by the United States Environmental Protection Agency (8 x 10^-3 and 19 x 10^-3 infections/year, respectively). Leisure swimmers had the highest annual risk estimate, at 2.6 x 10^-1 infections/year. The results suggest that standardized pool water quality monitoring for Cryptosporidium, development of interventions that reduce intentional ingestion, and improvement of oocyst removal from pool water are needed. Leisure swimmers were the most vulnerable sub-population and should be targeted in healthy swimming education campaigns.
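
The underlying per-event and annual risk arithmetic, sketched with an exponential dose-response model; the concentration, ingestion volume, dose-response parameter, and event frequency below are assumed placeholders, not the study's estimates:

    import math

    c_oocysts = 0.01    # oocysts per L in pool water (assumed)
    ingestion = 0.032   # L ingested per swim event (assumed)
    r = 0.09            # exponential dose-response parameter (assumed)

    dose = c_oocysts * ingestion
    p_swim = 1 - math.exp(-r * dose)          # per-event infection risk
    n_events = 20                             # swims per year (assumed)
    p_annual = 1 - (1 - p_swim) ** n_events   # annual risk across independent events
    print(f"per-swim risk {p_swim:.2e}; annual risk {p_annual:.2e}")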

W3-B.3 Swenberg, J*; Moeller, M; Lu, K; Yu, R; Andrews Kingon, G; Lai, Y; Edrissi, B; Dedon, P; University of North Carolina at Chapel Hill, Massachusetts Institute of Technology; [email protected] Mode of action studies on inhaled formaldehyde causing leukemia IARC, NTP, and EPA have concluded that formaldehyde causes human leukemia based on epidemiology studies. In contrast, nasal squamous cell carcinoma, but not leukemia, is found in animal bioassays. Formaldehyde is formed in all cells by natural metabolism. Inhalation studies in rats and nonhuman primates (NHP) using [13CD2]-formaldehyde allow endogenous and exogenous formaldehyde DNA adducts to be accurately measured using ultrasensitive nanoUPLC-MS/MS. Endogenous N2-OH-methyl-dG adducts are found in all tissues examined, but [13CD2]-N2-OH-methyl-dG adducts are found only in nasal respiratory DNA. In NHP exposed to 2 ppm formaldehyde for up to 28 days, no [13CD2]-N2-OH-methyl-dG adducts are found (sensitivity to quantify 1 adduct in 10^10 dG). Likewise, no [13CD2]-N2-OH-methyl-dG adducts are found (sensitivity to measure 1 adduct in 13 billion dG) in mononuclear white blood cells. Only endogenous N-terminal valine globin adducts and WBC DNA adducts of formaldehyde are present in rats and NHP exposed to [13CD2]-formaldehyde, showing that inhaled formaldehyde does not reach circulating blood in an active form. Finally, N6-formyllysine was measured with LC-MS/MS in tissues of rats and NHP. All tissues examined had formyllysine, but only nasal tissue also had [13CD2]-formyllysine. Collectively, the total lack of [13CD2]-labeled biomarkers reaching distant sites, including blood and bone marrow, does not support inhaled formaldehyde causing leukemia. In contrast, an expanding literature demonstrates that endogenous aldehydes damage hematopoietic stem cells if either Fanconi Anemia genes or ALDH2 genes are knocked out. Under such conditions, mice spontaneously develop leukemia with no external chemical exposure. Furthermore, one-third of myelogenous leukemia patients have deficiencies in aldehyde dehydrogenase expression, particularly ALDH1A1 and ALDH3A1. It is imperative that the role of aldehyde dehydrogenases be examined in epidemiology studies of human leukemia, in view of the lack of exposure of hematopoietic cells, to better explain the causes of leukemia.

W3-I.3 Tabibzadeh, M*; Meshkati, N; University of Southern California; [email protected] A Risk Analysis Study to Systematically Address the Critical Role of Human and Organizational Factors in the Negative Pressure Test for the Offshore Drilling Industry According to the Presidential National Commission report on the BP Deepwater Horizon (DWH) blowout, there is a need to "integrate more sophisticated risk assessment and risk management practices" in the oil industry. In addition, the National Academy of Engineering and National Research Council report on the same accident recommends that "the United States should fully implement a hybrid regulatory system that incorporates a limited number of prescriptive elements into a proactive, goal-oriented risk management system for health, safety, and the environment." A review of the offshore drilling literature indicates that most developed risk analysis methodologies do not fully and systematically address the contribution of Human and Organizational Factors (HOFs) to accident causation, even though a long-term study (1988-2005) of more than 600 well-documented major failures in offshore structures shows that approximately 80% of those failures were due to HOFs. This paper introduces both a conceptual risk analysis framework and a Bayesian belief network to address the critical role of HOFs in conducting and interpreting the Negative Pressure Test (NPT), which, according to experts, is a critical step in ensuring well integrity during offshore drilling. Although the introduced conceptual framework was developed from the analyses and lessons learned of the BP Deepwater Horizon accident and the NPT conducted there, its application is limited neither to the NPT nor to the DWH case; in fact, it can be generalized for risk analysis of future oil and gas drilling as well. In summary, the significance and contribution of this paper rest on three main factors: introducing both qualitative and quantitative risk assessment frameworks, analyzing HOFs as a main contributing cause of offshore drilling accidents, and concentrating on NPT misinterpretation as a primary factor in the loss of well control and the subsequent blowout on the DWH.

T2-D.3 Taft, SC*; Hines, SA; Chappie, DJ; Janke, RJ; Lindquist, HA; Ernst, HS; U.S. Environmental Protection Agency; Battelle Memorial Institute; [email protected] Assessment of relative potential for Legionella species inhalation exposure from common water uses The intentional or accidental introduction of microbial contaminants such as Legionella into a drinking water system could pose a health hazard to water customers. Legionella species have been identified as important waterborne pathogens in terms of disease morbidity and mortality. A preliminary exposure assessment of Legionella spp. drinking water contamination was conducted to better understand potential inhalation exposure pathways, develop a means of prioritizing exposure pathways for further assessment, estimate potential inhalation doses, and identify critical knowledge gaps for further study. Potentially complete exposure pathways were compiled, and a screening-level exposure assessment was conducted for pathways where inhalation doses could be quantitatively estimated. Considerable variability in the calculated exposure doses was identified across the exposure pathways, with doses differing by over five orders of magnitude in each of the evaluated exposure scenarios. The exposure pathways that have been epidemiologically associated with legionellosis transmission (ultrasonic and cool-mist humidifiers) were assessed to have higher estimated inhalation doses than pathways where epidemiological evidence of transmission is weaker (faucet and shower) or absent (toilets and therapy pool). This presentation will describe the assessment design and methodology to demonstrate the applicability of these assessments to a wide range of microbial contaminants. While ingestion exposures to contaminated drinking water have historically been examined, major gaps remain in the understanding of inhalation and dermal exposures to aerosolized pathogens during common uses of water.
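
A screening-level sketch of how such an inhalation dose estimate is typically assembled; every factor below (water concentration, aerosol partitioning, breathing rate, duration, deposition) is an invented placeholder, not a value from this assessment:

    # Screening inhaled dose from an aerosol-generating water use (shower).
    c_water = 1e4       # CFU/L Legionella in bulk water (assumed)
    aerosol = 1e-5      # L of respirable water aerosolized per m^3 of air (assumed)
    breath = 0.012      # m^3/min inhalation rate (assumed)
    minutes = 8         # exposure duration per event (assumed)
    deposition = 0.5    # fraction of inhaled CFU deposited in the lung (assumed)

    dose = c_water * aerosol * breath * minutes * deposition
    print(f"estimated inhaled dose ~ {dose:.3f} CFU per event")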

P.56 Takeshita, J*; Gamo, M; National Institute of Advanced Industrial Science and Technology (AIST); [email protected] Proposing a framework of QAAR approaches for predicting the toxicity of chemical substances: A case study on predicting and extrapolating missing NOEL values We propose a Quantitative Activity-Activity Relationship (QAAR) model to predict unknown toxicity values of chemical substances from existing animal testing data, namely acute oral toxicity and 28-day repeated-dose toxicity studies. In view of the global movement to reduce animal testing in chemical risk assessment and management, the OECD has encouraged the use of statistical methods for predicting the toxicity of chemical substances. The most popular such method is the Quantitative Structure-Activity Relationship (QSAR). The Quantitative Activity-Activity Relationship (QAAR), by contrast, estimates unknown toxicity values from relationships between different toxicity endpoints. For example, suppose in vivo data exist for some, but not all, endpoints of a target substance; QAAR enables prediction of the toxicity of the remaining endpoints from the existing in vivo data. Existing substances often present exactly this situation, since substantial literature information is available for them. In this study, we first develop a QAAR using covariance structure analysis. Our model is based on correlations among the organ-specific NOEL values included in the training set. A major advantage of the model is that it provides estimates with confidence intervals. Second, we predict the missing NOEL values of substances for which 28-day repeated-dose NOEL data are available for some organs but not all. Finally, we extrapolate all NOEL values for substances that have only acute oral toxicity studies, starting from their LD50 values.

T4-I.2 Talabi, S; Carnegie Mellon University; [email protected] Improving Nuclear Power Plant Construction Cost Learning Curves by Implementing Organizational Learning Tools for Risk Identification and Risk Assessment The nuclear industry has historically been plagued by considerable technology deployment risks, with project cost and schedule overruns presenting a significant risk to nuclear plant investors. Although several risk management practices have been put in place, considerable cost and schedule excursions have continued to occur in the construction of recent nuclear power plant projects, and an analysis of cost trends for nuclear power plant construction shows no evidence of learning. This research challenges the absence of a demonstrated learning curve in nuclear power plant construction cost and proposes that such a learning curve can be established through the implementation of organizational learning tools for risk identification and risk assessment. An analogy is drawn between the nuclear industry's development of a learning curve in response to the safety challenges brought to light by the Three Mile Island plant accident and the potential for a similar learning curve to address construction cost challenges. A method is developed that uses documented risk occurrence trends on a sample of nuclear steam generator replacement projects to develop a potential learning curve. We define a learning coefficient based on risk management performance and show that the various functional groups supporting the deployment of steam generator replacement projects have lower rates of cost overruns as their learning coefficients increase. This trend demonstrates a potential for learning.

T4-H.1 Tambe, M*; Shieh, E; Univ of Southern California; [email protected] Stackelberg Games in Security Domains: Evaluating Effectiveness of Real-World Deployments A Stackelberg game is a game-theoretic model that assumes two players, a leader (defender) and a follower (attacker). The leader commits to a strategy first (which the follower is able to observe), and the follower subsequently decides his own strategy based on the leader's strategy. Stackelberg games have been in active use in resource deployment scheduling systems by law enforcement agencies around the US. Sites include Los Angeles International Airport (LAX), assisting the LAX airport police in scheduling airport entrance checkpoints and canine patrols of the terminals; the Federal Air Marshals Service (FAMS), scheduling marshals on international flights; the United States Coast Guard (USCG), scheduling patrols around the Ports of Boston, NY/NJ, and LA/LB; the Los Angeles Sheriff's Department (LASD), patrolling Metro trains; and others. Measuring the effectiveness of Stackelberg game-based applications is a difficult problem, since we cannot rely on adversaries to cooperate in evaluating the models and results, and there is (thankfully) very little data available on the deterrence of real-world terrorist attacks. The best available evidence of these applications' effectiveness in providing optimum security at minimum cost includes: (1) computer simulations of checkpoints and canine patrols; (2) tests against human subjects, including USC students, an Israeli intelligence unit, and participants on Amazon Mechanical Turk; (3) comparative analysis of the predictability of schedules and methodologies before and after implementation of a Stackelberg strategy; (4) Red Team exercises; (5) capture rates of guns, drugs, outstanding arrest warrants, and fare evaders; and (6) user testimonials. Even as we continue evaluations in additional security domains, this body of answers to "how well do Stackelberg game-based applications work?" enables policy makers to have confidence in allocating and scheduling security resources using applications based on the Stackelberg game model.
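
A minimal sketch of a two-target Stackelberg security game, solved by sweeping the defender's coverage probability and letting the attacker best-respond; the payoff values are invented:

    import numpy as np

    # per-target payoffs: (attacker uncovered, attacker covered,
    #                      defender uncovered, defender covered) -- assumed
    targets = {
        "terminal":   (5.0, -1.0, -10.0, 2.0),
        "checkpoint": (3.0, -1.0,  -4.0, 1.0),
    }

    best = None
    for c in np.linspace(0, 1, 1001):       # one resource split across targets
        cov = {"terminal": c, "checkpoint": 1 - c}
        # attacker observes coverage and picks the target maximizing his payoff
        att = max(targets, key=lambda t: cov[t] * targets[t][1]
                                         + (1 - cov[t]) * targets[t][0])
        u_def = cov[att] * targets[att][3] + (1 - cov[att]) * targets[att][2]
        if best is None or u_def > best[0]:
            best = (u_def, c, att)
    print(f"defender utility {best[0]:.2f} with coverage {best[1]:.2f} on terminal "
          f"(attacker hits {best[2]})")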

T4-B.3 Teuschler, LK*; Rice, GE; Mumtaz, M; Hertzberg, RC; U.S. Environmental Protection Agency; [email protected] Adapting chemical mixture risk assessment methods to assess chemical and non-chemical stressor combinations Chemical mixture risk assessment methods are routinely used to estimate joint toxicity of co-occurring chemicals. The impact of nonchemical stressors on mixture risk has rarely been addressed. To assess combinations of chemical and nonchemical stressors, along with protective buffers, component-based approaches may be applicable, depending on the toxic action among diverse stressors. Such methods are feasible but use simple depictions of complex biological processes. Proposed approaches include: response addition (RA) and effects addition (EA), assuming toxicological independence; dose addition in the form of a hazard index (HI), assuming similar toxicity; and the qualitative Weight of Evidence (WOE) method to evaluate joint toxicity. Under RA, the incremental probabilistic risks (e.g., of heart attack) developed for each individual stressor are summed; under EA, the incremental changes in a measured effect (e.g., diastolic blood pressure) are summed to estimate the total change in that effect, where both approaches are constrained to small incremental changes. Using the HI, hazard quotients are summed or subtracted, as appropriate, for the same effect (e.g., hypertension from particulate matter, lead, omega 3 fatty acids, stress). The WOE method indicates the impact of each nonchemical stressor/buffer on the toxicity of each chemical and the strength of available evidence (e.g., impacts of diabetes/omega 3 fatty acids on heart disease from particulate matter); criteria are needed to make these evaluations. The uncertainties in combined stressor assessments should be articulated, and the valid exposure ranges for the assessment should be described, along with the influence of each stressor on the analysis. Although these methods need to be more fully developed, they show promise for application in cumulative risk assessments. (The views expressed in this abstract are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA.)
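
A minimal worked example of the three component-based calculations named above, with invented numbers (the stressors, doses, and reference values are placeholders, not values from the assessment):

```python
# Hedged sketch of response addition (RA), effects addition (EA), and a
# hazard index (HI) for a small set of stressors.
risks = [0.01, 0.005, 0.002]           # incremental probabilistic risks (RA)
ra_total = sum(risks)                  # valid only for small increments

effects = [2.0, 1.5, -0.5]             # mmHg changes in diastolic BP (EA);
ea_total = sum(effects)                # a protective buffer enters negatively

exposures = [0.3, 0.8, 0.1]            # exposure estimates for the same effect (HI)
ref_values = [1.0, 2.0, 0.5]           # corresponding reference values
hi_total = sum(e / r for e, r in zip(exposures, ref_values))

print(ra_total, ea_total, hi_total)    # 0.017, 3.0 mmHg, 0.9
```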

T2-I.4 Thekdi, SA; University of Richmond; [email protected] Risk-based investment for prison infrastructure systems By 2018, the Bureau of Prisons network is expected to operate at 45% above inmate capacity (GAO 2012). Limited research has evaluated the management of prison infrastructure systems from a risk-based perspective. Overcrowding may result in avoidable government expenditures, degraded security, sanitary concerns for inmates, and inadequate rehabilitation programming. Traditionally, demand planning and management for prison infrastructure have been addressed through prison sentencing in court systems. However, this work addresses the risks associated with prison management through the evaluation of rehabilitation programs, investments in capacities, and other program analysis methods. A multicriteria analysis is used to prioritize investments to reduce vulnerabilities of prison systems.
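
The abstract does not specify the multicriteria method. A minimal weighted-sum sketch, with hypothetical criteria, weights, and scores, illustrates the kind of investment prioritization described:

```python
# Hedged sketch of weighted-sum multicriteria prioritization (all values invented).
import numpy as np

criteria = ["cost_avoided", "security", "sanitation", "rehabilitation"]
weights = np.array([0.3, 0.3, 0.2, 0.2])   # hypothetical stakeholder-elicited weights

# rows = candidate investments, columns = criteria scores on a 0-1 scale
scores = np.array([
    [0.8, 0.4, 0.5, 0.2],   # expand capacity
    [0.5, 0.6, 0.3, 0.9],   # rehabilitation programming
    [0.3, 0.9, 0.6, 0.1],   # security upgrades
])
priority = scores @ weights
for name, p in zip(["capacity", "rehab", "security"], priority):
    print(f"{name}: {p:.2f}")
```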

M4-A.2 Thomas, D; US EPA; [email protected] Metabolism and the toxicity of arsenic Chronic exposure to inorganic arsenic in environmental and occupational settings has been associated with increased cancer risk. Exposure to inorganic arsenic is also strongly implicated as a causative factor for a number of non-cancer health effects. Ingested or inhaled arsenic undergoes extensive biotransformation. In humans and many other species, inorganic arsenic and its mono- and di-methylated metabolites are excreted in urine after exposure to inorganic arsenic. Enzymatically catalyzed formation of methylated metabolites of arsenic produces an array of metabolites that contain arsenic in either the trivalent or pentavalent oxidation state. Production of methylated metabolites containing trivalent arsenic is problematic because these species are highly reactive. These metabolites are more potent cytotoxins and genotoxins than inorganic arsenic. Capacity to activate arsenic by methylation is influenced by a variety of biological and behavioral factors. For example, interindividual variation in genotypes for arsenic (+3 oxidation state) methyltransferase affects the arsenic methylation phenotype and disease susceptibility phenotypes. Although most studies have focused on characterization of exposure in populations that consume drinking water containing inorganic arsenic, in some cases food can be a significant source of exposure to this metalloid. In foods, arsenic can be present in inorganic or methylated forms or in complex organic forms. Linkages among the metabolism of these arsenicals have not been elucidated, although they may be important to assessing aggregate exposure to this toxin. (This abstract does not reflect US EPA policy.)

P.37 Timofeev, A.A.*; Sterin, A.M.; RIHMI-WDC; [email protected] Identifying regional features of temperature variability using cluster analysis and quantile regression applied to the daily surface level observations It is very important to assess extreme climate variability, which may cause unwanted weather anomalies. With appropriate usage, this information effectively allows reducing possible losses caused by extreme weather events. As we have shown in previous presentations, quantile regression does not have the drawbacks of traditional approaches, and it provides comprehensive data describing changes in the statistical distribution across the full range of quantiles. However, the more detailed the information we obtain, the more difficult it becomes to analyze and interpret. In earlier research we moved from linear process diagrams to colored cross sections to visualize quantile trend values obtained at one location or distributed vertically along radiosonde soundings made from the same weather station. Both methods are very effective for separate examination of a limited number of weather stations. Here we introduce a new approach: using vectors of quantile trends as input for cluster analysis, and then plotting the resulting clusters on a map as simple colored marks. Thus we can see whether there are evidently similar points (weather stations) and how their similarity in changes of variability matches geographical location. We can examine any part of the distribution simply by making an appropriate selection of quantiles; for extreme event analysis, for example, the top section of the distribution is of interest. Using this approach, we obtained results with highly pronounced regional similarities of distribution changes in certain parts of the statistical distribution of surface temperature. Thus quantile regression, combined with cluster analysis of its results, provides comprehensive information about changes in climate variability projected onto a geographical map.
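
A minimal sketch of the described pipeline, assuming a hypothetical per-station daily temperature table (columns `station`, `date`, `temp`); statsmodels and scikit-learn stand in for whatever tooling the authors used:

```python
# Hedged sketch: per-station quantile-trend vectors, then k-means clustering.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.cluster import KMeans

QUANTILES = [0.05, 0.25, 0.5, 0.75, 0.95]

def quantile_trend_vector(station_df):
    """Slope of temperature vs. time at each quantile (degrees per decade)."""
    # assumes station_df["date"] is already parsed as datetime64
    d = station_df.assign(t=(station_df["date"] - station_df["date"].min())
                            .dt.days / 3652.5)  # elapsed decades
    model = smf.quantreg("temp ~ t", d)
    return [model.fit(q=q).params["t"] for q in QUANTILES]

def cluster_stations(df, n_clusters=5):
    vectors = df.groupby("station").apply(quantile_trend_vector)
    X = np.vstack(vectors.to_numpy())                 # stations x quantiles
    labels = KMeans(n_clusters=n_clusters, n_init=10).fit_predict(X)
    return pd.Series(labels, index=vectors.index)     # station -> cluster id
```

The resulting station-to-cluster mapping is what gets plotted on the map as colored marks.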

W2-E.2 Toccalino, PL*; Gilliom, RJ; Lindsey, BD; Rupert, MG; U.S. Geological Survey; [email protected] Pesticides in groundwater of the United States: Occurrence and decadal-scale changes This study by the U.S. Geological Survey’s National Water-Quality Assessment Program evaluated (1) the occurrence of pesticides in groundwater and (2) decadal-scale changes in pesticide concentrations over a 20-year period. Untreated groundwater samples were collected from 1,309 wells located in 60 nationally distributed well networks and analyzed for as many as 83 pesticide compounds. Each well network was sampled once during 1993–2001 (Cycle 1) and once during 2002–2011 (Cycle 2). Shallow (mostly monitoring) wells were sampled in agricultural and urban areas. Deeper (mostly domestic and public) wells, which tap major aquifers used for water supply, were sampled in mixed land-use areas. Pesticides were frequently detected—about two-thirds of the samples from agricultural areas and about half of the samples from urban areas contained one or more pesticides. More than one-third of samples from major aquifers contained one or more pesticides, but concentrations seldom (about 1% of samples) exceeded human-health benchmarks. The five most frequently detected pesticide compounds—atrazine, deethylatrazine, simazine, metolachlor, and prometon—each were detected in 9% to 36% of all samples, and each had statistically significant changes in concentrations between Cycle 1 and Cycle 2 in one or more land uses. For all agricultural land-use networks combined, concentrations of atrazine, metolachlor, and prometon decreased from Cycle 1 to Cycle 2; prometon concentrations also decreased in urban areas. Conversely, for all major aquifers combined, concentrations of atrazine, deethylatrazine, and simazine increased; deethylatrazine concentrations also increased in urban areas. The magnitude of these concentration changes from decade to decade was small, and ranged from 0.001 to 0.09 µg/L (35- to 230,000-fold less than benchmarks). More than half of the individual well networks showed statistically significant changes in one or more pesticide concentrations between Cycles 1 and 2.

P.13 Tokai, A*; Nakazawa, K; Nakakubo, T; Yamaguchi, H; Kojima, N; Sakagami, M; Higuchi, Y; Nagata, Y; Ishimaru, T; Osaka University; [email protected] Development of practical risk evaluation method with the example of traffic relevant environmental measures Given the multiple risk situations in our daily life, it is necessary to understand the actual risk conditions that lie hidden beneath the revealed risks. To grasp them, we examined methods that build on the results of traditional health risk assessment. For this purpose, we identified three objectives in this research project: first, to develop a risk durability evaluation method; second, to apply this method to actual environmental measures, particularly transportation-related technologies; and third, to examine the possibility of voluntary action for risk reduction by citizens through a questionnaire survey of residents of Osaka prefecture. The main findings were as follows. For the first task, we developed a risk durability evaluation method based on the concepts of value of information and trade-off analysis, using the example of flame retardants in plastics and the software Analytica. This software enabled us to build a user-friendly model whose graphical user interface supports insight into the risks stemming from this chemical. To analyze the uncertainty included in the risk assessment of the flame retardant, we estimated the value of information of specific parameters required for risk estimation. For the second task, we carried out risk analysis and life-cycle analysis for three types of environmental measures for the automobile industry: material substitution, fuel substitution, and product replacement. We clarified the risk trade-offs for these three types of measures and evaluated their relative superiority from the viewpoints of health risk and greenhouse gas emissions. For the third task, we obtained results from 232 valid respondents aged 20 or over living in Osaka prefecture.
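
A hedged sketch of the kind of value-of-information calculation the first task describes (the decision payoffs and the uncertain-parameter distribution below are invented; the authors worked in Analytica, not in code like this):

```python
# Expected value of perfect information (EVPI) for one uncertain parameter.
import numpy as np

rng = np.random.default_rng(1)
toxicity = rng.lognormal(mean=0.0, sigma=0.5, size=10_000)  # uncertain parameter

def net_benefit(action, tox):
    """Hypothetical payoffs: 1 = restrict the flame retardant, 0 = keep using it."""
    return -2.0 * np.ones_like(tox) if action == 1 else -1.0 * tox

# Best action under current uncertainty (maximize expected net benefit) ...
ev_current = max(net_benefit(a, toxicity).mean() for a in (0, 1))
# ... versus picking the best action for each realization (perfect information).
ev_perfect = np.maximum(net_benefit(0, toxicity), net_benefit(1, toxicity)).mean()
print("expected value of perfect information:", ev_perfect - ev_current)
```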

W2-K.2 Tonn, GL*; Guikema, SD; Johns Hopkins University; [email protected] Power outage analysis for Hurricane Isaac In August 2012, Hurricane Isaac, a Category 2 hurricane, caused extensive power outages in Louisiana. The storm brought high winds, storm surge and flooding to Louisiana, and power outages were extensive and prolonged. Hourly power outage data for the state of Louisiana was collected during the storm and analyzed. This analysis included correlation of hourly power outage figures by zip code with wind, rainfall, and storm surge. Results were analyzed to understand how drivers for power outages differed geographically within the state. Hurricane Isaac differed from many hurricanes due to the significant storm surge and flooding. This analysis provided insight on how rainfall and storm surge, along with wind, contribute to risk of power outages in hurricanes. The results of this analysis can be used to better understand hurricane power outage risk and better prepare for future storms. It will also be used to improve the accuracy and robustness of a power outage forecasting model developed at Johns Hopkins University.
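
A minimal sketch of the kind of per-zip-code driver correlation described (the file and column names are hypothetical, not the actual Louisiana dataset):

```python
# Hedged sketch: correlate hourly outages by zip code with storm covariates.
import pandas as pd

# columns assumed: zip, hour, customers_out, wind_mps, rain_mm, surge_m
df = pd.read_csv("isaac_outages.csv")
drivers = ["wind_mps", "rain_mm", "surge_m"]
corr_by_zip = (
    df.groupby("zip")[["customers_out"] + drivers]
      .corr(method="spearman")                 # rank correlation, robust to skew
      .xs("customers_out", level=1)[drivers]   # keep outage-vs-driver rows only
)
print(corr_by_zip.describe())  # how driver importance varies across the state
```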

W4-F.3 Tonn, BE*; Stiefel, D; Feldman, D; University of Tennessee-Knoxville; [email protected] Past the threshold for existential risks: Balancing existential risk uncertainty and governance Concerns about the potential extinction of the human race are growing. This paper addresses those concerns, and builds on previous research in that area, by presenting research into the conditions under which society ought to implement actions to reduce existential risk. This paper answers the question, “How do types of governance help or hinder society’s response to existential risks?” Specifically, this paper explores the balance between uncertainty; actions to reduce existential risk; philosophical perspectives about existential risk; and types of governance, with special attention to the role of complex adaptive systems. This paper presents two frameworks for addressing those concerns. The first, the Framework for Implementing Actions to Reduce Existential Risk, answers the question, “Under what conditions ought society implement actions to reduce existential risk?” The second, the Framework for Governance Responses to Reduce Existential Risk, is based on types of governance and extends the first framework from the perspectives of fragile, rigid, robust, and flexible governance structures, with special attention to the role of complex adaptive systems. Specifically, the second framework provides the foundational perspectives from which to identify the ways in which society’s actions might be helped or hindered by actual or perceived complications of governance. Support for different categories of actions to reduce existential risk differs across these governance types, as do the rates at which they help or hinder the societal response. The paper concludes with an assessment of the overall challenges and opportunities revealed by these frameworks. The paper also provides recommendations for reducing existential risk given governance types and the role of complex adaptive systems.

W2-E.3 Triantafyllidou, S; Le, TH; Gallagher, DL*; Edwards, MA; Virginia Tech; [email protected] Evaluating public health benefits from reductions in drinking water lead levels at US Schools After high drinking water lead concentrations were found in elementary schools in Seattle and Los Angeles, both school districts undertook steps to reduce children’s exposure. This research used reported water lead concentration distributions before and after remediation steps as inputs to the US EPA Integrated Exposure Uptake Biokinetic (IEUBK) model to estimate the resulting distribution of blood lead levels in exposed children. Sixty-three elementary schools in Seattle and 601 elementary schools in Los Angeles were evaluated. Seattle undertook active remediation measures by installing filters and replacing lead plumbing. The mean first-draw water lead levels across all Seattle schools decreased from 24.4 to 2.1 ppb, and mean 30-second flushed water lead levels decreased from 3.4 to 0.7 ppb. This reduced the estimated percentage of students exceeding a blood lead level of 5 ug/dL from 11.2% to 4.8%, with the post-remediation value primarily attributable to lead exposure from sources other than drinking water. While pre-remediation water lead levels and percent blood lead level exceedances varied widely from school to school, post-remediation risks showed much lower variability. The Los Angeles school district used a flushing program to reduce exposure, with limited post-remediation testing. First draw and 30-second samples were 11.0 and 4.0 ppb respectively. Assuming flushing eliminated first-draw exposures, the percent blood lead level exceedances dropped from 8.6% to 6.0% across the 601 schools. School water lead remediation efforts can therefore be significant in reducing health risks to US elementary students.
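
The blood lead distributions in the study come from the IEUBK model itself. As a hedged illustration of only the final exceedance step (the lognormal parameters below are invented, not Seattle's), one can simulate:

```python
# Given a fitted blood-lead distribution for a school, estimate the share of
# children above the 5 ug/dL reference value.
import numpy as np

rng = np.random.default_rng(0)
gm, gsd = 2.0, 1.6  # hypothetical geometric mean (ug/dL) and geometric SD
samples = rng.lognormal(mean=np.log(gm), sigma=np.log(gsd), size=100_000)
print("share above 5 ug/dL:", (samples > 5.0).mean())
```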

M3-G.3 Trumbo, CW*; Peek, L; Meyer, MA; Marlatt, H; McNoldy, B; Gruntfest, E; Schubert, W; COLORADO STATE UNIVERSITY (1-4,7); UNIVERSITY OF MIAMI (5); UNIVERSITY OF COLORADO COLORADO SPRINGS (6); [email protected] Modeling Hurricane Preparedness and Evacuation Intention Our research team has just completed data collection for a project under support from the National Science Foundation and the National Oceanic and Atmospheric Administration (Dynamics of Hurricane Risk Perception, NSF CMMI-0968273). In this project we have used mail survey methods to study a panel of individuals located in the coastal area of the U.S. extending from Wilmington, North Carolina, to Brownsville, Texas. Study participants were sampled in a spatially random manner within an approximately 10-mile buffer along the coast. The same individuals were surveyed three times at one-year intervals. The initial response rate was 56%, with panel continuation rates of 75% and 85%, yielding a sample size ranging from approximately 650 to 400 depending on configuration. In our analysis, the level of hurricane preparedness and behavioral intention for evacuation are modeled by examining factors including hurricane risk perception, optimistic bias, individual and household vulnerability characteristics, evacuation barriers, and community resilience indicators. This presentation will offer a broad overview of the study and its preliminary results. Findings to date indicate that risk perception can be seen as both an affective and a cognitive orientation of the individual, and we have developed a reliable item inventory for its measurement. We also examine optimistic bias for hurricane preparedness and evacuation and find that it is related to but independent of risk perception. Among the results we also find that disability in the household, being female, and having less confidence in community resilience are associated with greater levels of hurricane risk perception. Disability in the household, less hurricane experience, and fewer evacuation barriers (e.g., work or family related, transportation) are associated with a greater intention to evacuate from a major storm. Preparedness is moderately predicted by a number of variables, including risk perception.

P.99 Trumbo, CW*; Peek, L; Laituri, M; COLORADO STATE UNIVERSITY; [email protected] Alternating Hydrologic Extremes: Risk Communication and Weather Whiplash Our focus in this work is on the cascading effects of alternating hydrologic extremes. Possibly as a consequence of climate change, there is an increasing likelihood that areas of the world will undergo a rapid “weather whiplash” between drought and flood. These alternating extremes pose extraordinary risk to agricultural systems and economies, both rural and urban infrastructures, human population patterns and migration, and the natural ecosystems that we all ultimately depend on. Using the Spatial Hazard Events and Losses Database for the United States (SHELDUS) we accessed some 65,000 county-level records for financial losses from natural hazards over the period 1960-2010. The data were parsed to isolate floods and droughts, and a summary metric was computed to identify the cases in the top 80th and 95th percentiles for total losses (both crop and property damage, in 2000 dollars). By identifying cases that fell in the top 80th percentile for the union of flooding and drought we identified 99 counties that have had the highest cumulative losses from the combined hazard. This focused the data reduction on approximately 4,700 cases. By then parsing the data by geographic area we were able to sort cases by date to identify specific circumstances in which losses from floods and droughts occurred in spatial-temporal proximity. To conclude this phase of the project we will examine historical records such as news sources to gain insight into the consequences of the combined hazard events and how risk was communicated. This approach has identified the most acute cases; we anticipate that continuing analysis will identify broader and more nuanced patterns that will generate additional historical information. We will then seek additional collaborators with expertise in GIS and Atmospheric Science to use climatological data to identify areas where this combined natural hazard may increase under climate change scenarios.
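
One plausible reading of the percentile screen, sketched with a hypothetical SHELDUS extract (the authors' exact union/combination logic may differ from the joint-threshold rule shown):

```python
# Hedged sketch: counties in the top 80th percentile of both cumulative flood
# and cumulative drought losses, 1960-2010.
import pandas as pd

# columns assumed: county_fips, hazard ("Flood"/"Drought"), loss_2000usd
df = pd.read_csv("sheldus_1960_2010.csv")
losses = (
    df[df["hazard"].isin(["Flood", "Drought"])]
      .groupby(["county_fips", "hazard"])["loss_2000usd"].sum()
      .unstack(fill_value=0.0)
)
flood_cut = losses["Flood"].quantile(0.80)
drought_cut = losses["Drought"].quantile(0.80)
acute = losses[(losses["Flood"] >= flood_cut) & (losses["Drought"] >= drought_cut)]
print(len(acute), "counties with high combined flood and drought losses")
```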

P.60 Turley, AT*; Overton, AJ; Marin, K; Henning, CC; ICF International; [email protected] Implementing Systematic Review for Chemicals with Large Databases Systematic review allows risk assessors to use transparent logic to search for, categorize, and select data for use in chemical risk assessments for deriving toxicity reference values. The need for transparency is particularly important to federal agencies that are placing increased emphasis on stakeholder engagement. In addition, chemicals that may have tens of thousands of references require data management strategies beyond existing tools (e.g., EndNote) that allow additional sorting and querying of the results. The DRAGON Screen tool allows for the agile categorization and characterization of studies so that literature can be sorted and prioritized for inclusion in weight of evidence and data analyses. The tool allows the assessment manager to construct multiple and evolving rounds of review. The categories in each round are tailored to the particular chemical and assessment priorities, and new rounds can be added as needed throughout the literature screening process. Two data managers independently categorize each study based on title and abstract screening, without knowledge of each other’s selections. Then, a third expert reviews the two sets of categories for a study and resolves any discrepancies. Studies can then pass into the data entry phases. For large-database chemicals, the risk assessment team may choose to pass only a subset of categories through to the data entry phase. The logic of how this decision was made, as well as the general literature flow, is stored in the tool and can be provided to the stakeholders for review at any time.

T3-H.3 Underwood, PM*; Rak, A; Office of the Deputy Under Secretary of Defense (I&E); Noblis Inc.; [email protected] Case Study on New Chemicals and Materials: Incorporating Environmental, Health and Safety Information into the Defense Acquisition Process The potential environmental, health, and safety (EHS) risks of new chemicals and materials in weapons systems must be examined from a life cycle perspective before they are used. Whether substituting a new chemical for an existing one or incorporating new chemicals never used before to achieve new mission requirements, there is a basic set of EHS data that should be known. The requirements include the physical, chemical, and toxicological data sets that can be used to estimate hazard, exposure, and ultimately risk to both uniformed service members and civilians. This basic set of EHS criteria can be developed in concert within the Department of Defense’s systems acquisition program. Collecting the EHS data early will help avoid substitution regret, where a replacement is worse than the original, and avoid fielding equipment presenting inherent risks to the user community or the environment. This case study presents a framework for the collection of EHS data during the systems acquisition life cycle with a focus on nanomaterials and munitions. The case study demonstrates the benefits of tying the data requirements to the progression of technology readiness levels. The framework is being used to inform the development of new department policy on EHS data set development for chemicals and materials as they are developed from early research, development, testing, and evaluation through production and operations, ending in disposal and demilitarization. The use of the framework and the new policy will improve the department’s risk management decisions during the systems development process, thereby avoiding development delays and helping programs remain within budget.

P.135 Valentini, M; Curra, C; DEMICHELIS, SO*; DEMICHELIS, SANDRA; ENVIRONMENT LABORATORY DDPY - UNLA; [email protected] Diminishing risks of soil pollution in public spaces: a proposal for remediation With the purpose of developing a set of proactive guidelines, this paper carries out an environmental urban diagnosis in order to remediate an area along the trace of the Roca Railway in the town of Remedios de Escalada, Lanús. The main objective is to intervene in the territory in order to diminish risks produced by urban soil contamination with oil, heavy metals and hydrocarbons, which result from the deposition and accumulation of cars abandoned in an underutilized area, a consequence of the breakdown of the space. The problem is compounded by the combination of an absence of planning and strategic management and neglect of the cadastral situation in the area, among other factors, and the consequent reduction in both soil and environmental quality of the affected and nearby areas. The overall purpose is to promote the relationship of individuals to the territory where they live and to contribute to the development, presentation and subsequent conversion of the space, thereby promoting a better quality of life for its inhabitants. A series of strategic guidelines has been projected, designed and ordered according to the needs and relevant problems of the site, proposing: removal and disposal of the abandoned cars, tire recycling, recycling/reuse of scrap metal and metals in general, soil remediation at the contaminated points, a groundwater monitoring program, and the creation of public spaces once the vehicles and waste are removed and remediation has taken place. We understand that transforming conflict areas and restoring the importance that the environment represents is a way to achieve balance, together with the implementation of best practices that contribute to improving quality of life.

T3-D.2 Van Doren, JM*; Kleinmeier, D; Ziobro, GC; Parish, M; Hammack, TS; Gill, V; Nsofor, O; Westerman, A; U.S. Food and Drug Administration; [email protected] FDA Risk Profile on Pathogens and Filth in Spices In response to renewed concerns regarding the effectiveness of current control measures to reduce or prevent illness from consumption of spices in the United States, the FDA began development of a risk profile on pathogens and filth in spices. The objectives of the risk profile were to (1) describe the nature and extent of the public health risk posed by the consumption of spices in the United States by identifying the most commonly occurring microbial hazards and filth in spice; (2) describe and evaluate current mitigation and control options designed to reduce the public health risk posed by consumption of contaminated spices in the United States; (3) identify potential additional mitigation and control options; and (4) identify critical data gaps and research needs. Research for the report included a comprehensive review of the refereed scientific literature and available government/agency reports, and analyses of relevant FDA and CDC data. Data and information from stakeholders were formally requested with a Federal Register Notice and provided critical information on spice sampling and testing by the spice industry. Site visits to spice farms and spice processing and packing facilities, facilitated by spice industry trade organizations and foreign governments, provided the Risk Profile Team with first-hand knowledge of current practices. To fill some critical data gaps, FDA field assignments and laboratory research were undertaken. This presentation will present key findings of the risk profile, including identifying pathogens and filth detected in spices, describing foodborne outbreaks associated with spice consumption, and characterizing the prevalence and level of Salmonella and filth found in spices at different points in the farm-to-table supply chain. The presentation will conclude by identifying research needs to better understand and thereby reduce foodborne illnesses from the consumption of spices.

W4-A.1 Vandenberg, J*; Cogliano, V; Owens, EO; Cooper, G; Ross, M; National Center for Environmental Assessment, U.S. Environmental Protection Agency, Research Triangle Park, NC (1, 3, 5) and Washington, DC (2, 4); [email protected] Determining Causality in Environmental Assessments Within the U.S. EPA, the National Center for Environmental Assessment (NCEA) draws judgments regarding causality for health effects related to environmental pollutant exposure. Formal frameworks and criteria are in place to characterize the strength of the scientific evidence and draw conclusions on causality for exposure-effect relationships. The framework used in the Integrated Science Assessments (ISAs) that support the periodic review of the U.S. National Ambient Air Quality Standards (NAAQS) establishes uniform language concerning causality and brings greater consistency and specificity to the ISAs. The framework relies on the evaluation and integration of multiple lines of evidence to draw conclusions with regard to factors such as consistency, coherence, and biological plausibility of health and ecological effects evidence. To develop this framework, EPA drew on relevant approaches for similar scientific decision-making processes by EPA (e.g., EPA Cancer Guidelines) and other organizations (e.g., Institute of Medicine). This session will present the causal framework in use in the ISAs and discuss how frameworks such as these may be applied to hazard assessments of other chemical exposures that may have different types of evidence bases, such as may be encountered in the assessments conducted under the Integrated Risk Information System (IRIS). Disclaimer: The views expressed are those of the authors and do not necessarily reflect the views or policies of the US EPA.

T4-C.4 Vatanpour, S*; Hrudey, SE; Dinu, I; University of Alberta; [email protected] Blood transfusion public health risk to explore limitations of the common risk matrix A risk matrix is a popular semi-quantitative tool for assessing risks and setting priority in risk management to address major hazards in diverse contexts. The matrix depicts risk magnitude as a function of harm likelihood and severity descriptors and has been used for diverse risks ranging from terrorism to drinking water safety. While the method can be informative in distinguishing high and low risks, previous work described theoretical circumstances, where hazard likelihood and its consequence are negatively correlated, under which the insight provided by the risk matrix would be worse than useless. Because the risk matrix is so widely used, we have evaluated the theoretical concern using a public health risk scenario: tainted blood transfusion risk. We analyze the situation in which the hazard likelihood is negatively correlated with the hazard severity, and investigate its implications, including the scenario of making a decision worse than random, a practice that could result in a tainted blood tragedy. Evidence about the frequency of blood infectious diseases in blood donors and the population of Canada is provided by reports of the Public Health Agency of Canada. The severity was rated according to a scale provided by the U.K. National Health Service. The risk matrix was constructed based on the available risk evidence. Although the negative correlation between hazard likelihood and its consequence complies with our data, the risk matrix constructed for blood transfusion risk data discriminates reasonably among risks. This suggests that such a negative correlation, which often occurs, does not always invalidate the risk matrix. Previous work has raised an important concern about the validity of risk matrices, but this concern has only been illustrated in theory. Authentic scenarios to show the invalidity of risk matrices need to be evaluated. We recommend careful consideration of how risk matrices are used in risk management and we offer some explicit advice to this end.
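
A minimal sketch of the matrix scoring being probed: rank risks by the product of ordinal likelihood and severity categories. The category scores below are invented placeholders, not the Canadian surveillance data, but they illustrate the negative likelihood-severity correlation at issue:

```python
# Hedged sketch of a risk-matrix ranking with negatively correlated inputs.
likelihood = {"pathogen A": 1, "pathogen B": 2, "pathogen C": 3}  # 1 = rare ... 5 = frequent
severity   = {"pathogen A": 5, "pathogen B": 4, "pathogen C": 3}  # 1 = minor ... 5 = fatal
matrix_rank = {k: likelihood[k] * severity[k] for k in likelihood}
print(sorted(matrix_rank.items(), key=lambda kv: -kv[1]))
```

When likelihood falls exactly as severity rises, the products can collapse toward a constant, which is the degeneracy the cited theoretical work warns about.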

T3-A.1 Veeramany, A*; Mangalam, S; Technical Standards and Safety Authority; [email protected] Risk-informed regulatory compliance enforcement model for technical infrastructure in public domain Inspections are a critical instrument for regulatory organizations to enforce timely compliance with regulations and standards of practice in a variety of areas of public interest such as environment, health and public safety. The amount of time provided to a regulated entity to comply with the regulations when a deficiency is identified on the site is the subject matter of this study. A typical regulatory model is to empower the inspectors to assess the threat to public safety and provide a practical time duration to comply with the regulations. An alternative approach being proposed is a semi-quantitative risk analysis to govern the time to compliance using an established cause-effect reasoning methodology. The objective would be to assess and characterize the risk associated with possible scenarios that could result in the presence of an unresolved non-compliance and establish a risk-informed time to compliance. This paper shares the knowledge and experience gained from implementing such a model at the Technical Standards and Safety Authority, Ontario, Canada, a regulatory organization with the responsibility of managing public safety associated with technical devices and infrastructure such as elevating devices, boilers and pressure vessels, power plants, pipelines, etc. The technique to determine time to compliance is a three-step process. In the first step, a deficiency observable through an inspection process is assessed for likelihood, severity and exposure using cause-effect reasoning techniques such as failure modes and effects analysis. Secondly, an acceptable risk threshold is chosen in terms of Disability-Adjusted Life-Years (DALY) per cumulative effect in agreement with all the stakeholders. Thirdly, the time to compliance is determined based on the risk, context-sensitive attributes and the acceptable limit. The inspection report shall include the nature of the contravention, steps to address it and the time to comply.
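
A hedged sketch of the three-step logic (the scoring scale, DALY mapping, and inverse-proportionality rule below are all assumptions for illustration, not the TSSA's actual model):

```python
# Derive a time-to-compliance from a semi-quantitative risk score and a
# DALY-based acceptability threshold.
def time_to_comply_days(likelihood, severity, exposure,
                        daly_per_unit_risk=1e-4, daly_threshold=1e-3):
    """Higher risk -> shorter allowed time; floors at 1 day."""
    risk = likelihood * severity * exposure      # step 1: score the deficiency
    daly_rate = risk * daly_per_unit_risk        # step 2: express in DALY terms
    # step 3: shrink the allowed window as risk exceeds the agreed threshold
    return max(1, int(365 * min(1.0, daly_threshold / daly_rate)))

print(time_to_comply_days(likelihood=4, severity=3, exposure=2))  # ~152 days
```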

P.14 Vianna, NA*; Saldiva, PHN; University of Sao Paulo; [email protected] Air pollution in Salvador, BA, Brazil: An experience of risk analysis A scientific basis for understanding the effects of air pollution on human health is needed worldwide, because local data are important to support decision making and air pollution control. Developing countries have had difficulty detecting pollutants and measuring air pollution, which is a challenge for the implementation of air quality standards. In Brazil, a program of the Environmental Health Agency called VIGIAR conducts surveillance of air quality effects on human health, but technology for measuring pollutants is not available in all Brazilian cities. Implementing this program in a metropolitan area of a Brazilian megacity required involving stakeholders. The aim of this study was to build a framework between the academic and public sectors for the application of risk analysis and air quality management. The pollutants were characterized in terms of chemical composition. A receptor model was used to detect particulate matter 2.5 (PM2.5) over two years in the city of Salvador. The composition of the particulate matter was studied to understand local emissions. Alternative tools such as biomonitoring were used, including morphological analysis of particles. Studies of effects on human health used data on respiratory and cardiovascular diseases. Strategies for risk communication are still needed. After validation of the methodology, these results will be used to support decision making in Salvador, as well as to help shape policy for air pollution control and to protect human health.

W3-J.4 Viauroux, C.*; Gungor, A.; University of Maryland, Baltimore County (Viauroux); U.S. Coast Guard (Gungor); [email protected] Econometric Model Estimating the Effectiveness of Life Jacket Wear in Recreational Boating Using Data from Coast Guard’s Marine Information Safety and Law Enforcement (MISLE) Database This paper presents the results of our analysis designed to estimate the effectiveness of life jacket wear in recreational boating activities. This effectiveness rating, which is typically measured by the number or percent of fatalities that could have been avoided, is important in assessing the potential benefits of USCG regulation, policy, and programs aimed at reducing the frequency and severity of recreational boating accidents. In 1993, the National Transportation Safety Board conducted a study (Safety Study: Recreational Boating Safety, PB93-917001, NTSB/SS-93/01) that estimated an effectiveness of 85 percent from examination of 281 drowning cases from recreational boating accidents in which persons were not wearing life jackets (an estimated 238 of them would have survived with a life jacket). We present a regression model to estimate the effectiveness of life jacket wear using the USCG Boating Accident Report Database (BARD), which contains detailed casualty data for boating accidents reported to the USCG. Focusing on the 2008-2011 period, this model suggests that life jacket wear could have prevented a substantial number of fatalities during this period.
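
The abstract does not give the econometric specification. A hedged logistic-regression sketch with hypothetical BARD-style columns shows the general shape of such an effectiveness estimate:

```python
# Hedged sketch: fatality risk vs. life jacket wear with accident covariates.
# Column names (fatal as 0/1, wore_life_jacket as 0/1, etc.) are assumptions.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("bard_2008_2011.csv")
model = smf.logit("fatal ~ wore_life_jacket + water_temp + boat_length + alcohol",
                  data=df).fit()
print(model.summary())

# Crude counterfactual: predicted fatalities had everyone worn a life jacket.
p_all_wear = model.predict(df.assign(wore_life_jacket=1))
print("observed fatalities:", df["fatal"].sum())
print("predicted fatalities if all wore jackets:", p_all_wear.sum())
```

The gap between observed and counterfactual fatalities is one way to express the "fatalities that could have been avoided" framing used in the effectiveness literature.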

P.100 Victory, K*; Cabrera, N; Larson, D; Reynolds, K; Latura, J; Beamer, P; University of Arizona and Mariposa Community Health Center; [email protected] Risk perception of drinking water quality in a US-Mexico Border community The United States and Mexico are the largest consumers of bottled water worldwide, but it is unclear what causes this increased consumption. We previously demonstrated, in a cross-sectional study, that in the border town of Nogales, Arizona approximately 85% of low-income residents primarily drink bottled water and 50% cook with it. In the current study, we interviewed ninety low-income Latinos in Nogales, AZ to assess differences in perceived risks of drinking municipal tap water and bottled water and to understand why these families use bottled water as their primary drinking water source. Respondents viewed drinking tap water as a significantly more risky activity than consuming bottled and other purchased sources of water (p<0.001). Additionally, 98% of respondents feared that drinking local municipal tap water could result in adverse health effects such as cancer, lupus or gastrointestinal illnesses, and did not associate drinking bottled water with any health outcomes. The majority of respondents (79%) stated that their primary reason for not drinking their tap water was fear of chemical and microbial contamination, compared to only 17% who preferred the taste of bottled water. Furthermore, respondents had significantly higher perceived risk (p<0.001) of drinking tap water in Mexico when compared to the U.S., but no differences among U.S. cities. Parents who are thirty-five years and older had significantly higher perceived risk (p<0.001) regarding the safety of their tap water compared to younger parents. We found no significant differences in perceived risks by gender, education, income or immigration status. Based on these results, future studies are needed to assess whether these findings are localized to Nogales or persist in other parts of the state or border region.

M2-A.2 von Stackelberg, K*; Guzy, E; Claus-Henn, B; Harvard School of Public Health; [email protected] Metals, Mixtures, Pathways: Systematic Review to Support Risk Assessment Synthesizing exposure, toxicological, epidemiologic and biological pathway information to determine the potential for exposure to environmentally-relevant concentrations of mixtures of contaminants in conjunction with genetic influences to lead to specific health outcomes requires a critical evaluation of the intersection of environmental exposures (what are the exposure concentrations in the environment and how do those relate to biologically-effective doses), the evidence for particular effects from toxicological and epidemiological data, and what is known about cellular events at the subclinical scale in terms of disease etiology. This allows an evaluation of biological plausibility with respect to a hypothesized mode-of-action based on the best available understanding of molecular events required for disease progression, evaluated in the context of what is known about how these compounds exert their biological influence, and exposure conditions necessary to achieve absorbed doses relevant to the pathways of interest. We explore these emerging issues in risk assessment in the context of exposures to mixtures of metals, including lead, arsenic, and manganese, and neurodevelopmental health outcomes in children. Included is a discussion of emerging methods for characterizing potential risks from exposure to mixtures together with potential gene-environment interactions (Liu et al. 2012; EC 2011; Sargiannis and Hansen 2012; Backhaus and Faust 2012). We provide a review of the literature on gene-environment and chemical mixture interactions, particularly in the context of neurodevelopmental health outcomes in children. We discuss examples of approaches for synthesizing exposure, effect, and disease etiology with particular emphasis on our own research on exposure to mixtures of metals in sediment and neurodevelopmental effects at the center of the Superfund Research Program at the Harvard School of Public Health (Claus Henn et al. 2011; 2012; Bellinger 2012).

T1-F.4 Wachinger, G*; Renn, O; Wuthe, J; Wiehe, F; University of Stuttgart, ZIRIUS; [email protected] Emerging health risks: Early Participation in Hospital Restructuring Conflicts Although health-related risks are still in decline in Germany, the development is becoming unstable: due to demographic changes and the high level of technology used in medical care, health care is becoming ever more expensive. Activities have therefore been launched to restructure the present health care system in terms of prevention and preparedness, but at the same time safety and the ability of all people to reach medical facilities must be ensured. This is a prerequisite even if, as a result of restructuring, some of the hospitals will be closed. Planning processes for hospitals are usually triggered by occupational and economic pressure, and often by a combination of both. These aspects are in natural conflict with socio-cultural services, including a strong demand from local communities for a local hospital, in spite of the fact that many members of these communities commute to outside facilities because they believe they will get better care there. The state government of Baden-Württemberg has initiated the “Health Dialogue” as a tool to involve multiple stakeholders, including the public, in an early stage of the planning process to prevent future health-related risks. We have designed, implemented and tested this early participation process on three levels (state, regional and local). First results have been obtained from a concrete conflict (a closedown and rebuilding of hospitals in Baden-Württemberg). Patient organisations, people working in the affected hospitals and the general public are invited to work on recommendations for the decision makers and to take part in the political decision about redesigning the hospital structure in their region. This tool turned out to be effective and could be applied to other conflicts where sociocultural topics are directly interlinked with certain risks. It is our belief that by addressing underlying conflicts through an early empowerment of highly concerned people, a better social capacity for upcoming risks can be achieved.

T2-J.4 Wald-Hopkins, P*; Ryti, RT; Neptune and Company, Inc.; [email protected] Updates to Ecological Preliminary Remediation Goals for Soils at the Los Alamos National Laboratory Approximately 10 years ago Los Alamos National Laboratory (LANL) developed an approach for calculating ecological soil preliminary remediation goals (ECOPRGs) for wildlife. This initial approach was based on LANL’s protocol for screening level ecological risk assessment, except that impacts on plants or soil invertebrates were not assessed quantitatively with these original ECOPRGs. Aspects of the ecological screening assessment were modified so that more representative toxicity and exposure parameters replaced the protective assumptions used in calculating screening levels. For example, exposure estimates for wildlife receptors were modified based on representative area-use factors and site-specific data. Updates to these ECOPRGs are possible because new information regarding toxicity and bioavailability to wildlife, plants, and soil invertebrates is available. For instance, the European Commission has published adjustments to toxicity to account for soil properties, using the terrestrial biotic ligand model (tBLM) or equivalent. Such studies are based on toxicity tests with standard soil biota, which may or may not be representative of species commonly found in arid and semiarid environments. This presentation focuses on refinements to the LANL ECOPRGs such that they are directly protective of impacts on plants and soil invertebrates. These updated ECOPRGs reflect ecotoxicity literature that is relevant to the LANL environment. Refinements include separating studies based on the relevance of the test species to arid environments or matching/adjusting key soil properties like organic matter content to better reflect LANL soils.

T1-D.1 Walderhaug, MO*; Menis, M; Anderson, SA; U.S. FDA CBER; [email protected] Use of an Administrative Database to Characterize Babesiosis Occurrence in the United States, 2006-2008 Babesiosis is a zoonotic disease caused by several species of protozoan parasite of the genus Babesia. This illness is typically spread to humans by tick bite. The symptoms of babesiosis range in severity from asymptomatic to high fever with hemolytic anemia. Symptoms are more likely to be severe for cases where the infected person is very young, very old, or immunocompromised. Blood units containing Babesia that were collected from asymptomatic blood donors represent a threat to blood recipients, and there is no approved screening test at present. To assess the risk of transfusion-transmitted babesiosis, we undertook an analysis to estimate the number of potential carriers of babesiosis that could be donating. Babesiosis became a nationally notifiable disease in 2011; before then, the illness was sporadically reported to the CDC. We performed a nationwide study of the diagnosis of babesiosis in the inpatient, outpatient, skilled nursing facility, and carrier standard analytical Medicare enrollment files for calendar years 2006 to 2008. Our findings show that estimates of state-specific rates from our study are up to ten times higher than nationally reported rates at the time. Babesiosis rates were highest in the Northeastern states of the U.S. The rates of babesiosis in males over 65 years in age were significantly higher than rates in females, which may reflect higher exposures to ticks in Babesia-endemic areas by males partaking in outdoor activities. Accurate estimates of the rates of asymptomatic cases of babesiosis in blood donors are an important factor in estimating the benefits and risks of screening blood donors for carriers of babesiosis. The use of these results in a general population risk assessment is limited in that the database used provides data on the elderly but not on the general population. Large administrative databases may have some limitations, but they may also be useful sources of health information for risk assessments.

T1-E.3 Wang, B*; Gray, GM; George Washington University; [email protected] Predicting long-term Benchmark Dose from short-term studies in National Toxicology Program toxicity tests Animal toxicity studies play a major role in risk assessment by providing regulators with dose-response data to estimate health risks in humans. However, there is a push to find more effective, efficient and less animal-intensive methods to assess the risk and yield reliable health risk values (like the Reference Dose (RfD)). This study compared short-term (3 months) and long-term (2 years) toxicity data of 41 chemicals in National Toxicology Program (NTP) to evaluate whether well-conducted short-term studies may yield data as reliable as long-term studies in identifying lowest doses associated with non-cancer adverse effects. Lower confidence limits of Benchmark Doses (BMDLs) were computed for non-neoplastic lesions, final mean body weight and mean organ weight. Linear regression was performed to predict long-term BMDLs and RfDs from short-term data. Concordance of affected organs relevant to the lowest short-term and long-term BMDLs was assessed. In addition, similar analysis was performed by species-sex groups. Interestingly, 34 of 41 chemicals (83%) had a less than 10-fold difference between the BMDLs of short-term and long-term studies. The linear regression showed statistically significant positive association between short-term and long-term BMDLs and RfDs. However, only nine of 41 chemicals (22%) had matching affected organs between short-term and long-term studies. By-group analysis showed a similar strong quantitative but weak qualitative association between short-term and long-term studies. The findings suggest the risk assessed in short-term animal studies provided a reasonably quantitative estimate of that based on long-term studies. However, the limited concordance of adverse effects within rodent species should be considered in interspecies extrapolation from animals to human in risk assessment process.
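
A minimal sketch of the two headline calculations, using hypothetical file and column names (the NTP data layout is an assumption):

```python
# Hedged sketch: regress long-term BMDLs on short-term BMDLs across chemicals
# and count chemicals within a 10-fold difference.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("ntp_bmdls.csv")  # columns assumed: chemical, bmdl_short, bmdl_long
df["log_short"] = np.log10(df["bmdl_short"])
df["log_long"] = np.log10(df["bmdl_long"])

fit = smf.ols("log_long ~ log_short", data=df).fit()
print(fit.params, "p =", fit.pvalues["log_short"])

# |log10 ratio| < 1 means the two BMDLs differ by less than a factor of 10.
within_10x = (np.abs(df["log_long"] - df["log_short"]) < 1).mean()
print(f"{within_10x:.0%} of chemicals within 10-fold")
```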

P.123 Wang, Y; Georgia Institute of Technology; [email protected] Nanoscale risk assessment and uncertainty quantification in atomistic simulations Uncertainties in atomistic simulations imply the associated risks in simulation-based materials and drug development. Lack of data, conflicting information, numerical and measurement errors are the major sources of epistemic uncertainty in simulation. In particular, the sources of model form uncertainty for molecular dynamics (MD) include imprecise interatomic potential functions and parameters, inaccurate boundary conditions, cut-off distance for simplification, approximations used for simulation acceleration, calibration bias caused by measurement errors, and other systematic errors during mathematical and numerical treatment. The sources for kinetic Monte Carlo (kMC) simulation include unknown stable and transition states, and imprecise transition rates. In this work, we illustrate the sensitivity and effect of model form uncertainty in MD and kMC simulations on physical and chemical property predictions. A generalized interval probability formalism is applied to quantify both aleatory and epistemic uncertainties. New reliable MD and kMC simulation mechanisms are proposed, where the robustness of simulation predictions can be improved without the traditional second-order Monte Carlo style sensitivity analysis. Examples of engineering materials and biochemical processes are used to demonstrate the new nanoscale risk assessment approach.
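
A minimal sketch of interval propagation, the simplest ingredient of the generalized interval probability formalism mentioned (the toy model and numbers are invented):

```python
# Hedged sketch: propagate an interval-valued parameter through a simple
# property prediction, bounding the effect of epistemic uncertainty.
from dataclasses import dataclass

@dataclass
class Interval:
    lo: float
    hi: float
    def __mul__(self, other):
        # Interval product: take the min/max over all endpoint combinations.
        products = [self.lo * other.lo, self.lo * other.hi,
                    self.hi * other.lo, self.hi * other.hi]
        return Interval(min(products), max(products))

# Hypothetical: a potential well depth known only to an interval (epistemic),
# with a predicted property scaling linearly with it in this toy model.
epsilon = Interval(0.95, 1.08)   # normalized well depth
slope = Interval(2.0, 2.0)       # exact coefficient
print(slope * epsilon)           # bounds on the predicted property
```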

P.69 Wang, M*; Lambertini, E; Micallef, SA; Pradhan, AK; University of Maryland, College Park, MD; [email protected] Risk Assessments for Listeria monocytogenes and Salmonella spp. in Melons In the past decade, with the increasing public preference for fresh produce, the risk of illnesses associated with consuming raw and minimally processed fruits and vegetables has drawn increased scrutiny from various stakeholders including consumers, industry, government, and academia. Annual consumption of non-citrus fresh fruits, including melons, increased 45.5% from 1976 to 2009. Melons are highly popular due to their high nutritional value and their attractive natural flavor. However, melons are vulnerable to pathogen contamination because they are grown on the ground, minimally processed, and eaten raw. Therefore, melons are considered the second-highest fresh produce commodity of concern for microbial risk. Salmonella spp. and Listeria monocytogenes, two of the most deadly foodborne pathogens, have been associated with melon contamination, recalls, and, most importantly, two recent large-scale outbreaks in 2011 and 2012. While government guidelines on Good Agricultural Practices and post-harvest Best Practices have been published for cantaloupes, no quantitative estimates of risk and mitigation effectiveness are available for any melon variety. In support of such quantitative risk assessment efforts, the goal of this study was to systematically review existing data on the risk of contamination from Salmonella spp. and Listeria monocytogenes and their ecology in the melon production chain. Specific objectives were to review: (i) production and consumption of common melon varieties (cantaloupe, honeydew, and watermelon), (ii) potential contamination sources in the farm-to-fork supply chain, (iii) prevalence and survival of pathogens associated with melons, and (iv) potential intervention strategies for risk reduction in the melon industry. This systematic review synthesizes critical information needed for conducting farm-to-fork quantitative microbial risk assessment (QMRA) models for L. monocytogenes and Salmonella spp. on melons.

P.154 Wang, A*; Filer, D; Shah, I; Kleinstreuer, N; Berg, E; Mosher, S; Rotroff, D; Marinakos, S; El-Badawy, A; Houck, K; AW, DF, IS, DM, KH: US EPA. NK: ILS. EB: BioSeek Inc. DR: NC State Univ. SM: Duke Univ.; [email protected] Comparing bioactivity profiles of diverse nanomaterials based on high-throughput screening (HTS) in ToxCast™ Most of the over 2800 nanomaterials (NMs) in commerce lack hazard data. Efficient NM testing requires suitable toxicity tests for prioritization of NMs to be tested. The EPA’s ToxCast program is evaluating HTS assays to prioritize NMs for targeted testing. Au, Ag, CeO2, Cu(O2), TiO2, SiO2, and ZnO nanoparticles, their ion and micro counterparts, carbon nanotubes (CNTs), asbestos, and pesticides containing nano-Cu(O) – 62 samples in total – were screened at 6-10 concentrations each for 262 bioactivity/toxicity endpoints in cells and zebrafish embryos. Cellular stress and immune response pathways were primarily affected. NMs’ core chemical composition was more important than size for bioactivity. NMs had similar profiles to their ion counterparts, suggesting ion shedding was a key factor in the mechanism of action. Ag, Cu, and Zn (nano, ion) were more cytotoxic and active in more assays than others. While 3 asbestos samples had similar immune response profiles, 6 CNTs had profiles distinct from asbestos. Potential bioactivity targets that were not directly measured were suggested by reference profiles similar to our data, e.g. similar profiles of a microtubule stabilizer interfering with mitosis and our nano-TiO2. Dividing endpoints into cytotoxicity and various function domains, we developed a ToxPi-based ranking approach for in vitro bioactivity. Samples active in more domains at lower concentrations were ranked higher than samples active in fewer domains and/or at higher concentrations. Ag, Cu, and Zn samples were ranked as high in vitro bioactivity; asbestos, Au, CeO2, some CNTs, and some TiO2 samples were ranked as low. Recognizing that our assays using submerged cells may have limited sensitivity to inhalation effects, we are exploring prioritization approaches for various purposes. We demonstrated that HTS assays can identify affected cellular pathways, predict targets, and may be useful for ranking NMs for specific purposes. This abstract does not reflect EPA policy.
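
A minimal sketch of a ToxPi-style ranking of the kind described, in which samples active in more domains at lower concentrations score higher (all values invented; the actual ToxPi software and scoring differ):

```python
# Hedged sketch: domain-wise potency scores, inactive domains contribute zero.
import numpy as np
import pandas as pd

# rows = samples; columns = assay domains; values = lowest active
# concentration (uM), NaN if inactive in that domain (invented numbers).
ac50 = pd.DataFrame(
    {"cytotox": [0.5, np.nan, 10.0], "immune": [1.0, 50.0, np.nan]},
    index=["nano-Ag", "nano-TiO2", "nano-Cu"],
)
potency = -np.log10(ac50 / 1e6)            # uM -> molar; higher = more potent
score = potency.fillna(0.0).mean(axis=1)   # average across domains
print(score.sort_values(ascending=False))  # nano-Ag ranks highest here
```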

T4-F.5 Way, D.H.P.*; Bouder, F.; King's College London; Maastricht University; [email protected] Transparency and risk communication in the European pharmaceutical sector In recent years, European pharmaceutical regulators have increasingly committed to a range of ‘transparency’ initiatives, including releasing safety-related documents and disclosing committee-meeting minutes. Yet the regulators – including the European Medicines Agency (EMA) – continue to be criticised for lacking ‘transparency’. The debate has now greatly intensified, with many calling for further measures (e.g. journal editors, politicians, NGOs). In response, the regulators have overwhelmingly focused on disclosure and access to raw data, or what has been coined “fishbowl transparency”, with proposals to proactively release clinical trial reports in 2014. However, transparency is not problem-free and can have both unintended and counterintuitive outcomes. Some have argued that “fishbowl” rather than reasoned transparency can also raise public expectations to unrealistic levels. In conjunction, we have expressed concerns that the regulators have not tested their strategies for trust and risk communication, leading to a dearth of evidence to inform the debate. In seeking to inject much-needed evidence into this highly understudied area, this paper presents the results of a large European survey (N = 5,648) that compares six European countries. Specifically, the paper conveys results on how European citizens and patients are likely to react to the regulators’ “fishbowl” transparency measures, including clear (and surprising) national-level variations.

P.81 Webler, TW*; Tuler, SP; Social and Environmental Research Institute; [email protected] Progress in new tools for participatory vulnerability analysis to climate stressors City officials want to better understand how their communities are vulnerable to climate change. Drawing on the social science of hazard management and deliberative learning, we developed a method for participatory vulnerability assessment that organizes expert and local knowledge about climate hazards. Facilitated deliberation promotes learning and is favored by participants as a “natural” way of carrying out self-governance. We report here on the results of employing this method in the City of Boston.

M2-B.3 Weed, DL; DLW Consulting Services, LLC; [email protected] On the utility of criteria-based methods of causal inference Criteria-based methods have been discussed in the epidemiologic literature since the late 1950s and continue to be used today, with recent extension into the assessment of toxicologic evidence. This paper will discuss the theoretical and practical utility of criteria-based methods of causal inference, including but not limited to Hill’s criteria (i.e., strength, consistency, dose-response, specificity, temporality, biological plausibility, experimentation, coherence, and analogy). Included will be a discussion of how these methods fit within a broader methodology for assessing causation. Assessing the utility of criteria-based methods involves: (1) a historical assessment of the use of these criteria, (2) the use of these criteria in organizational settings and in toxicology, and (3) the relationship between these criteria and the scientific method. Criteria for causation have been in continuous use in scientific practice for 50 years. These criteria represent key concerns of the scientific method. Examples follow. The criterion of consistency reflects the scientific principles of replicability and testability. The criterion of strength (of association) reflects the basic scientific concept of critically testing alternative explanations. Experimentation reflects the need to test and control for alternative hypotheses. Temporality is a key feature of any causal hypothesis. Specificity reflects the need to test the hypothesis of interest and not some different hypothesis. Biological plausibility connects explanations at the level of human populations with biological explanations by examining the extent to which the basic causal hypothesis has been tested in cellular systems and in animal models. Dose-response reflects a basic toxicological principle: the greater the exposure to a causal agent, the greater the effect. The criteria and the general scientific method are not only compatible but inseparable. Challenges to the use of the criteria will also be discussed.

W4-F.5 Wiener, JB; Duke University; [email protected] Global Risks, Catastrophes, Crises, Regulation and Liability Global catastrophic risks pose several challenges for governance. This presentation takes a comparative institutional approach to different legal/policy measures for such risks, notably the classic alternatives of regulation and liability. Ex ante regulation may be precautionary, but it faces challenges including priority-setting, benefit-cost judgments, risk-risk tradeoffs, and the political psychology of mobilizing regulatory action to address a future catastrophic risk. As to mobilizing action, collective action problems pose serious hurdles for global regulation; and although crisis events can spur demand for regulation, global catastrophes may lack antecedent crises as a basis for institutional learning, and not all types of crises spur effective types of regulatory responses. Ex post liability may be compensatory and deterrent, but it faces challenges including foreseeability, proving causation, multiple plaintiffs and defendants, sovereign immunity, defendants not subject to national or international courts' jurisdiction, damages valuation, damages exceeding tortfeasors' assets, and the catastrophe itself disabling the institutions of liability. These alternatives and their challenges are illustrated with examples such as "back contamination" from outer space, climate change, and geoengineering. The presentation draws on the author's involvement in research projects on "The Tragedy of the Uncommons" and on "Recalibrating Risk: Crises, Perceptions and Regulatory Responses."

P.141 Wilkie, A*; Datko-Williams, L; Richmond-Bryant, J; ORISE; U.S. EPA; [email protected] Analysis of U.S. soil lead (Pb) studies from 1970-2012 Although lead (Pb) emissions to the air have substantially decreased in the United States since the phase-out of leaded gasoline was completed in 1995, amounts of Pb in some soils remain elevated. Lead concentrations in residential and recreational soils are of concern because health effects have been associated with Pb exposure. Elevated soil Pb is especially harmful to young children due to their higher likelihood of soil ingestion. In this study, U.S. soil Pb data published from 1970 through 2012 were compiled and analyzed to reveal spatial and/or temporal soil Pb trends in the U.S. over the past 40 years. A total of 84 soil Pb studies across 62 U.S. cities were evaluated. Median soil Pb values from the studies were analyzed with respect to year of sampling, residential location type (e.g., urban, suburban), and population density. In aggregate, there was no statistically significant correlation between year and median soil Pb; however, within single cities, soil Pb generally declined over time. Our analysis shows that soil Pb levels were generally highest in city centers and declined toward the suburbs and exurbs. In addition, there was a statistically significant positive relationship between median soil Pb and population density. In general, the trends examined here align with previously reported conclusions that soil Pb levels are higher in larger urban areas and that Pb tends to remain in soil for long periods of time. The views expressed in this abstract are those of the authors and do not necessarily represent the views or policies of the U.S. Environmental Protection Agency.
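
The two trend tests described, median soil Pb against sampling year and against population density, are straightforward correlation analyses. The abstract does not say which correlation statistic was used, so Spearman's rank correlation is assumed in this sketch, and the data points are hypothetical placeholders for the 84-study dataset.

```python
# Sketch of the two correlation tests described in the abstract.
from scipy import stats

year    = [1972, 1980, 1987, 1995, 2001, 2008, 2012]          # sampling year
med_pb  = [420, 510, 300, 380, 275, 350, 290]                 # mg/kg, hypothetical
density = [9200, 11000, 4800, 7600, 3900, 6400, 4100]         # persons/sq mi

r_year, p_year = stats.spearmanr(year, med_pb)
r_dens, p_dens = stats.spearmanr(density, med_pb)
print(f"year vs Pb:    rho={r_year:.2f}, p={p_year:.3f}")
print(f"density vs Pb: rho={r_dens:.2f}, p={p_dens:.3f}")
```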

W2-A.2 Williams, RA; Mercatus Center at George Mason University; [email protected] Too Many Rules, Too Much Risk Risk assessment is performed to aid in deciding where interventions in markets can act to reduce risks. Risk assessments typically address individual hazards and either one or multiple sources of exposure. These risk assessments inform federal regulations, which act in conjunction with state, local, and even private rules. Even prior to the widespread introduction of risk analysis, however, governments were establishing rules in the attempt to reduce risk. Some rules are over 100 years old. These rules are rarely changed and almost never removed, despite a rapidly changing society. At the federal level, this has resulted in 170,000 pages of rules, with mandatory requirements (to do or refrain from doing something) numbering well over 1 million. Twenty-five years ago, Elizabeth Nichols and Aaron Wildavsky reported that multiple rules and safety systems were responsible for nuclear power plants being less safe as rules and systems began to interfere with one another, a "major contributing factor" in the Chernobyl accident. With more research, we are beginning to uncover other reasons that the idea that more rules equal more safety, the "linearity" assumption, may be wrong. For example, recent research by industrial psychologists shows that excessive numbers of prescriptive rules turn plant employees into automatons who follow the rules rather than staying aware of new threats and solving problems. Even when employees try to follow rules, equal weighting of rules of widely differing effectiveness causes less attention to be paid to the serious ones.

P.35 Williams, BH*; Pierce, JS; Glynn, ME; Johns, LE; Adhikari, R; Finley, BL; Cardno ChemRisk; [email protected] Residential and Occupational Exposure to Wood Treating Operations and Risk of Non-Hodgkin Lymphoma: A Meta-Analysis There are hundreds of former and currently active wood treating facilities in the United States, and over time concerns have been raised regarding the potential chronic health effects associated with wood treating-related exposures. It has been suggested in the peer-reviewed literature that exposure to chemicals related to historical wood treating operations (in particular, pentachlorophenol [PCP]) is associated with an increased risk of non-Hodgkin lymphoma (NHL). To test the merits of this assertion, we conducted a systematic review of all published and unpublished analyses that report risk estimates for NHL in (1) residents of communities surrounding wood treating operations, (2) wood treating workers, and (3) non-wood treating workers who were exposed to chemicals associated with wood treating operations (creosote, coal tar and associated polycyclic aromatic hydrocarbons [PAHs], and PCP). A total of 12 studies, including independent cohort, record-linkage, and case-control studies, were included in the meta-analysis. Using a random effects model, meta-relative risks (meta-RRs) were calculated for each exposure group. The summary relative risk (meta-RR) for NHL overall was 1.31 (95% confidence interval [CI]: 0.93, 1.85). No statistically significant meta-RRs were observed among residents of communities in the vicinity of wood treating facilities (meta-RR=0.75; 95% CI: 0.37, 1.51); wood treating workers (meta-RR=1.89; 95% CI: 0.69, 4.12); workers exposed to coal tar, creosote, and associated PAHs (meta-RR=1.37; 95% CI: 0.80, 2.34); or workers exposed to PCP (meta-RR=1.61; 95% CI: 0.99, 2.62). Notably, many of the occupational studies, in particular those conducted among manufacturing workers, were limited by the inability to distinguish the potential confounding effects of contaminants, particularly polychlorinated dibenzo-p-dioxins (PCDDs), within chlorophenols. Nevertheless, there is no evidence in the studies reviewed that residential or occupational exposures related to wood treating operations increase the risk of NHL.
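
A random-effects meta-RR of the kind reported is conventionally computed with the DerSimonian-Laird estimator. The sketch below assumes that method and uses hypothetical per-study RRs and confidence intervals; the abstract does not reproduce the individual study estimates.

```python
# DerSimonian-Laird random-effects pooling of relative risks.
# Study inputs are hypothetical, not the 12 reviewed studies.
import math

studies = [  # (RR, lower 95% CI, upper 95% CI)
    (1.4, 0.8, 2.4),
    (0.9, 0.5, 1.6),
    (1.8, 1.0, 3.2),
    (1.2, 0.7, 2.1),
]

y  = [math.log(rr) for rr, lo, hi in studies]            # log RR
se = [(math.log(hi) - math.log(lo)) / (2 * 1.96)         # SE implied by the CI
      for rr, lo, hi in studies]

w = [1 / s**2 for s in se]                               # fixed-effect weights
ybar = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
Q = sum(wi * (yi - ybar)**2 for wi, yi in zip(w, y))     # heterogeneity statistic
k = len(studies)
tau2 = max(0.0, (Q - (k - 1)) /
           (sum(w) - sum(wi**2 for wi in w) / sum(w)))   # between-study variance

wr = [1 / (s**2 + tau2) for s in se]                     # random-effects weights
mu = sum(wi * yi for wi, yi in zip(wr, y)) / sum(wr)
se_mu = 1 / math.sqrt(sum(wr))
print(f"meta-RR = {math.exp(mu):.2f} "
      f"(95% CI: {math.exp(mu - 1.96 * se_mu):.2f}, "
      f"{math.exp(mu + 1.96 * se_mu):.2f})")
```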

P.15 Willis, AM*; Maier, A; Reichard, J; Haber, L; Patterson, J; Toxicology Excellence for Risk Assessment (TERA); [email protected] Practice makes perfect: Lessons and outcomes based on mode of action/human relevance framework application to case studies. A public workshop, organized by a Steering Committee of scientists from government, industry, university, and research organizations, was held at the National Institute of Environmental Health Sciences (NIEHS) in September 2010. The workshop explored the development of dose-response approaches for receptor-mediated liver cancer within a Mode of Action (MOA)/Human Relevance Framework (HRF) (WHO/IPCS). Case studies addressed activation of the aryl hydrocarbon receptor (AHR), the constitutive androstane receptor (CAR), and the peroxisome proliferator-activated receptor alpha (PPARα). The workshop case studies provided a valuable exercise in applying the MOA/HRF, and a number of insights and lessons emerged that may be useful for future applications. Inclusion of associative events and modulating factors into the framework was useful for the consideration of mechanistic data to inform dose-response. In particular, associative events and modulating factors could be a useful approach for integrating molecular and biomarker data to inform risk assessment. Further guidance in the MOA/HRF would be useful to clarify the number of key events needed to define a MOA, to address the absence of human studies needed to inform critical interspecies differences, and to assess activation of multiple MOA cascades and interactive molecular signaling pathways. In addition, variation in human population susceptibility needs to be addressed when assessing human relevance, particularly the possibility that populations exhibit widely ranging individual thresholds. Also useful would be a qualitative “value of information” approach for assessing whether further research would better inform dose-response, along with a detailed and systematic uncertainty analysis for use in the framework.

M4-G.3 Wilson, RS*; McCaffrey, S; The Ohio State University, USDA Forest Service; [email protected] Do I stay or do I go? Risk attitudes and evacuation decisions during a wildfire event Most socio-psychological wildfire research focuses on risk mitigation actions taken before a fire event occurs, with less attention paid to homeowner actions during a fire. However, the increasing incidence of homeowners refusing to evacuate, or leaving at the last minute, during wildfires and other natural disasters has led to growing research interest in evacuation decision making. We randomly selected homeowners from three counties in the states of Washington, Texas, and South Carolina and conducted a mailed survey in the spring of 2013 to assess their evacuation decision making process. These three counties were chosen for their high wildfire risk, but they varied in past experience with wildfire and mandatory evacuation orders. Drawing on Protection Motivation Theory and the Extended Parallel Process Model, we assessed individual homeowners' threat and coping appraisals related to wildfire, as well as individual risk attitudes. We expected that individuals with risk-averse attitudes would have higher threat appraisals and lower coping appraisals than those with risk-tolerant attitudes. As a result, we expected risk-averse individuals to be more likely to evacuate early, whereas the risk tolerant would vary in their evacuation response based on their respective appraisals. In effect, the expected pattern is that evacuation decisions are made heuristically, whereas decisions to stay and defend, or to wait and see, are more systematic and driven by individual differences in the perceived threat of and ability to cope with wildfire. To ensure maximum public safety, crisis communication efforts could target the risk tolerant, focusing on correcting the misperceptions or critical beliefs that tend to drive individuals to ignore evacuation orders and place their lives in danger.

T3-G.1 Wilson, RS; The Ohio State University; [email protected] Information processing, risk and uncertainty: A roundtable discussion There is a breadth of research that informs our understanding of how individuals process information about risk and uncertainty, with much of this research focusing on the heuristic processing of information through a variety of mechanisms. As a result of decades of research, many useful theoretical models have evolved to explain how this type of processing influences judgments (e.g., hazard acceptance models, the risk information seeking and processing model, etc.). Ideally, these theoretical perspectives should influence the way we as practitioners communicate and structure decisions, yet the breadth of perspectives can make it difficult to know which is "best" or "right" for a particular context. This roundtable brings together an interdisciplinary panel of accomplished risk and decision scientists who will discuss how their research informs our understanding of information processing and what their findings suggest for improving risk communication and decision making in the context of real-world problems. Each panelist will have an opportunity to address these two key points before the session opens to general discussion and questions from the audience. Invited roundtable panelists include: Paul Slovic (Decision Research), Joe Arvai (University of Calgary), Katherine McComas (Cornell University), Michael Siegrist (ETH Zurich), Janet Yang (State University of New York at Buffalo), and Nathan Dieckmann (Oregon Health and Science University).

W4-F.2 Wilson, GS; Global Catastrophic Risk Institute; [email protected] Minimizing Global Catastrophic and Existential Risks from Emerging Technologies Through International Law Mankind is rapidly developing “emerging technologies” in the fields of bioengineering, nanotechnology, and artificial intelligence that have the potential to solve humanity’s biggest problems, such as by curing all disease, extending human life, or mitigating massive environmental problems like climate change. However, if these emerging technologies are misused or have an unintended negative effect, the consequences could be enormous, potentially resulting in serious, global damage to humans (known as “global catastrophic harm”) or severe, permanent damage to the Earth—including, possibly, human extinction (known as “existential harm”). The chances of a global catastrophic risk or existential risk actually materializing are relatively low, but mankind should be careful when a losing gamble means massive human death and irreversible harm to our planet. While international law has become an important source of global regulation for other global risks like climate change and biodiversity loss, emerging technologies do not fall neatly within existing international regimes, and thus any country is more or less free to develop these potentially dangerous technologies without practical safeguards that would curtail the risk of a catastrophic event. In light of these problems, this presentation discusses the risks associated with bioengineering, nanotechnology, and artificial intelligence; reviews the potential of existing international law to regulate these emerging technologies; and proposes an international regulatory regime that would put the international community in charge of ensuring that low-probability, high-consequence catastrophes never materialize.

P.82 Wilson, P*; Kubatko, A; Hawkins, B; Cox, J; Gooding, R; Whitmire, M; Battelle Memorial Institute and the Department of Homeland Security Chemical Security Analysis Center; [email protected] Challenges Associated with Communicating Multidimensional Risk Data to a Diverse Set of Stakeholders The Chemical Terrorism Risk Assessment (CTRA) is a DHS CSAC-funded program that estimates the risk among chemical terrorism attack scenarios and assists in prioritizing mitigation strategies. Presenting multidimensional results, specifically frequency, consequence, and risk results for a wide variety of attack scenarios, in a manner that is easily digestible to stakeholders from diverse backgrounds is a perpetual challenge on the CTRA. Graphical formats are commonly more comprehensible and meaningful than vast numeric tables; however, visually capturing multiple dimensions poses a difficult challenge. Experience has shown that pie and bar charts are the most aesthetically appealing and easily understood formats, yet such formats generally present only one dimension of the data and do not capture the uncertainty inherent in the results. Whisker and box plots portray the uncertainty associated with a single dimension of the data, but are generally not well understood by, and thus not appealing to, stakeholders. Risk square plots that mimic traditional risk assessment matrices have proven useful for communicating results, but struggle to depict the vast number of attack scenarios comprising the CTRA and the wide range of scenario aggregates of interest to the various types of stakeholders. CTRA stakeholders often desire drastically different aggregations to meet the specific needs of their missions. To better meet the needs of this wide array of stakeholders, notional data will be used to illustrate examples of risk visualizations for potential use in communicating results. Interactive data visualization concepts that allow stakeholders to customize scenario aggregation to meet specific needs will also be discussed.

P.129 Winkel, DJ*; Hawkins, BE; Roszell, LE; Battelle Memorial Institute, US Army Public Health Command; [email protected] Development of exposure guidelines for chronic health effects following acute exposures to TICs Joint Chemical, Biological, Radiological, and Nuclear doctrine (JP 3-11) requires military commanders to minimize total risk in operational planning and execution. Incorporation of Military Exposure Guidelines (MEGs) into risk estimates provides a mechanism to consider short- and long-term chemical exposure risks. However, current MEGs (and civilian guidelines) do not address chronic non-cancer health effects resulting from a single acute exposure. This gap is a source of concern for planners in the medical community, as these effects may have implications for long-term protection of exposed military or civilian populations. Challenges in establishing this type of guideline include small sample sizes, difficulties and inconsistencies in identifying long-term effects, and uncertainty in exposure concentration and duration. Given these challenges, this investigation describes an approach to developing such guidelines, using chlorine as an exemplar chemical. Chlorine was selected due to its use in attacks with improvised explosive devices, its presence in industry, and the prevalence of cases in the literature. Reports of chlorine exposures were reviewed, and data on exposure concentration, duration, and long-term health outcomes were compiled. Health outcomes ranged from the presence of physical symptoms (e.g., shortness of breath) to abnormal pulmonary function test results. Binomial distributions were used to address the small sample population; uniform distributions were used to address incomplete exposure terms. The approach was applied to the data identified to date, and a probit analysis was used to generate a curve capturing the dose-response relationship for long-term health effects following acute chlorine exposure. The curve compares favorably to existing guidelines (both military and civilian) in that only severe exposures have the potential to cause chronic health effects. This approach is believed to be novel and may be applicable to other TICs with limited data sets.
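
A probit fit of chronic-effect incidence against an acute dose metric can be sketched as follows. The abstract does not specify the dose metric or publish the compiled exposure records, so a simple concentration-time product and toy case reports are assumed here.

```python
# Probit dose-response sketch: chronic-effect probability vs. log toxic load.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

# (concentration ppm, duration min, chronic effect observed 0/1); toy data
records = [
    (3, 30, 0), (10, 15, 0), (25, 10, 0), (5, 60, 0), (35, 20, 0),
    (45, 30, 0), (20, 30, 1), (60, 20, 1), (90, 15, 1), (50, 40, 1),
]

ct = np.array([c * t for c, t, _ in records])    # assumed toxic load, ppm*min
outcome = np.array([y for _, _, y in records])

X = sm.add_constant(np.log10(ct))
fit = sm.GLM(outcome, X,
             family=sm.families.Binomial(link=sm.families.links.Probit())).fit()

# Invert the fit for the load giving a 10% response: probit(0.10) = -1.2816
b0, b1 = fit.params
ct10 = 10 ** ((norm.ppf(0.10) - b0) / b1)
print(f"estimated CT10 ~ {ct10:.0f} ppm*min")
```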

T2-B.2 Wise, K*; Beck, N; Fischer, D; Pottenger, LH; Beatty, P; Cruzan, G; Becker, RA; 1) American Chemistry Council 2) American Chemistry Council 3) American Chemistry Council 4) The Dow Chemical Company 5) American Petroleum Institute 6) ToxWorks 7) American Chemistry Council ; [email protected] Getting the Science Right on Mode of Action: An Essential Element for IRIS Improvement All stakeholders have an expectation that EPA’s Integrated Risk Information System (IRIS) assessments will rely on the best available scientific information regarding hazard and exposure, and employ consistent, objective methods for establishing cause and effect. Given the importance of IRIS assessments in regulatory decisions, EPA must employ consistent, science-based frameworks, which utilize transparent approaches for evaluating study quality, integrating the most relevant scientific data, and clearly communicating uncertainty. Regulatory decisions have enormous public health and economic consequences, thus the scientific basis for IRIS assessments should include the full use of knowledge on modes of action (MOAs). In many ways, however, IRIS relegates MOA to a more limited role as a modifying feature, typically added only towards the end of an assessment, an approach which undervalues and undermines the vast extent of mechanistic research conducted over the last 30 years and current understanding of biology. One key litmus test for an improved IRIS will be adoption and use of a weight of evidence framework that incorporates all of the relevant and reliable data and knowledge of hypothesized MOAs, so that there is a clear and objective presentation of the extent to which existing data and knowledge do, or do not, support each hypothesis, including the default. The discussion will focus on best practices for considering scientific information on MOA and why applying them consistently is essential for IRIS improvement, not only to improve the objectivity and scientific basis of assessments, but also to help improve risk communication.

P.34 Wu, TT*; Chen, LH; Ho, WC; Lin, MH; Pan, SC; Fan, KC; Chen, PC; Wu, TN; Sung, FC; Lin, RS; China Medical University; [email protected] Air Pollution Patterns May Modify the Effect of Weight Gain on Lung Function among Adolescents Lung function is an important index of respiratory health. Weight gain and air pollution can both adversely affect lung function. The objective of this study was to assess the modifying effect of air pollution patterns on the relationship between weight gain and reduced lung function. The study design was a retrospective birth cohort built by linking birth registry records (birth weight and gestational age) with a nationwide junior high school student respiratory health survey database in central Taiwan. The study subjects were a 10% random sample; for robust exposure assessment, subjects who had moved during follow-up were excluded from the analysis. Air pollution data, including SO2, CO, O3, NO2, and PM10, were collected from the high-density Taiwan Environmental Protection Administration monitoring stations. Multiple regression models were used, adjusting for sex, age, height, weight, parental education level, family smoking, incense burning, exercise, and temperature. Obesity was related to reduced lung function, and low birth weight had a similar effect. Obese adolescents who were born with low birth weight showed the most adverse effect on lung function. Furthermore, air pollution patterns may modify these effects. It is necessary to protect the public from the adverse effects of weight gain, especially considering the potential interaction with air pollution patterns.
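
The modifying effect described is typically tested with an interaction term in the regression model. A minimal sketch follows; the variable names and the toy data frame are hypothetical stand-ins for the linked cohort database.

```python
# Effect modification sketch: does pollution change the weight effect on
# lung function? Tested via a bmi:pm10 interaction term. Toy data only.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.DataFrame({
    "fev1":      [2.9, 3.1, 2.5, 2.7, 3.3, 2.4, 3.0, 2.6],  # L, hypothetical
    "bmi":       [19, 22, 27, 29, 20, 31, 21, 26],
    "pm10":      [35, 35, 35, 35, 70, 70, 70, 70],           # ug/m3
    "height_cm": [160, 165, 158, 170, 162, 168, 159, 166],
    "male":      [0, 1, 0, 1, 0, 1, 0, 1],
})

# bmi * pm10 expands to bmi + pm10 + bmi:pm10; the interaction coefficient
# is the modification effect of interest.
fit = smf.ols("fev1 ~ bmi * pm10 + height_cm + male", data=df).fit()
print(fit.params)
```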

M2-E.4 Wu, F*; Liu, Y; Michigan State University; [email protected] Aflatoxin and cyanide: Global burden of disease Aflatoxin is a toxin produced by certain Aspergillus fungi that infect food crops, particularly corn, peanuts, and tree nuts (pistachios, hazelnuts, almonds, walnuts, etc.). It is the most potent naturally occurring human liver carcinogen known, and it is a particular danger in the parts of the world where corn and peanuts are dietary staples. We present our estimates of the global burden of human liver cancer caused by aflatoxin and discuss the potential role of aflatoxin exposure in causing childhood stunting. Finally, we briefly discuss cyanide in cassava, its adverse effects, and the populations most at risk worldwide.

P.73 Wu, CH*; Huang, YF; Wu, KY; National Taiwan University; [email protected] Assessing the Health Risks of Dimethylformamide in an Occupational Setting Dimethylformamide (DMF) is an organic compound able to induce adverse health effects in humans. It is used in the production of pharmaceutical products, adhesives, and synthetic leathers. Employees working in these industries are likely to be at risk, particularly through inhalation and dermal absorption. Exposure assessment of DMF was conducted on 106 employees from three synthetic leather plants in Taiwan. The employees' urinary samples were collected and analyzed for N-methylformamide (NMF), a DMF metabolite used as the biomarker of the body burden of DMF exposure. NMF was undetectable in most urinary samples collected prior to a work shift; however, urinary NMF concentrations were significantly higher post-shift than pre-shift. Of the 318 urinary samples collected, the NMF concentrations in 59 samples exceeded the American Conference of Governmental Industrial Hygienists (ACGIH) recommended Biological Exposure Index (BEI) of 15 mg/L. To assess the health risks of DMF, a reference concentration (RfC) was derived using benchmark dose software. The BMDL10, based on existing animal data, was calculated to be 3.98 ppm (11.94 mg/m3); from this point of departure, the RfC for DMF in these synthetic leather plants was derived as 0.04 ppm (0.12 mg/m3). Hazard indices (HIs) were also calculated for all 106 employees, and 89.6% of them had an HI greater than one. Our results demonstrate that the majority of employees exposed to DMF are subject to noncarcinogenic adverse health effects even though their DMF exposure does not exceed the current permissible exposure limit (PEL) of 10 ppm. Further investigation focusing on exposure to multiple compounds in these occupational settings is warranted. A review of the DMF risk management protocol should also be considered, since compliance with current regulations appears inadequate to safeguard health.
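
The reported RfC follows from the stated BMDL10 by division with a composite uncertainty factor. The abstract does not state that factor, but the reported values imply a UF of about 100, which the sketch below treats as an assumption.

```python
# Worked arithmetic implied by the abstract: RfC = BMDL10 / UF.
bmdl10_ppm  = 3.98    # BMDL10 from animal data, ppm
bmdl10_mgm3 = 11.94   # same point of departure in mg/m3
UF          = 100     # composite uncertainty factor implied by the
                      # reported values (assumption; not stated)

print(f"RfC = {bmdl10_ppm / UF:.2f} ppm")     # ~0.04 ppm, as reported
print(f"RfC = {bmdl10_mgm3 / UF:.2f} mg/m3")  # ~0.12 mg/m3, as reported
```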

P.70 Wu, KY*; Chung, YC; Chen, CC; Hsiao, CH; National Taiwan University; [email protected] Probabilistic Risk Assessment with Bayesian Statistics Markov Chain Monte Carlo Simulation The US EPA has already adopted probabilistic risk assessment (PRA) for decision making. Previously, PRA was conducted mainly using Monte Carlo (MC) simulation, which frequently requires empirical or parametric probability distributions of parameters to simulate the distribution of lifetime daily dose. The simulation results are valid only if the input parameters, data, and assumptions are valid. In practice, risk assessors frequently suffer from insufficient data to fit distributions for some parameters, especially concentrations and intake rates; even worse, spotty data can hinder completion of an assessment, for example when a large proportion of residue data fall below the detection limit. To reduce the uncertainty due to insufficient data, Bayesian statistics Markov chain Monte Carlo (MCMC) simulation was applied to perform PRA. The limited data available were used as prior information, and MCMC simulation was performed with WinBUGS to obtain the posterior distributions of parameters and health risk. Four examples will be presented at this meeting: assessment of lifetime cancer risk for N-nitrosodimethylamine (NDMA) in drinking water (only one sample with a detectable NDMA level out of 50 samples collected); assessment of lifetime cancer risk for aflatoxin B1 in food (only a few measurements above regulatory limits were available); assessment of health risk for medical staff exposed to cisplatin, using urinary platinum as a biomarker to reconstruct exposures; and assessment of lifetime cancer risk for acrylamide in high-temperature processed foods, where the residue and intake rate data carry high uncertainty. With limited data, the posterior distributions of parameters and health risk converge in theory to representative distributions for the study population, so the quality of a risk assessment may be improved without additional investment of resources in data collection.
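
The central idea, letting non-detects inform the posterior as left-censored observations rather than discarding or substituting them, can be sketched with a standalone random-walk Metropolis sampler. The study used WinBUGS; the sampler, priors, and toy data below (a few detects and many non-detects, loosely echoing the NDMA example) are illustrative assumptions.

```python
# Metropolis sketch for a lognormal concentration with left-censored data:
# detects enter via the density, non-detects via the CDF at the LOD.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
detects = np.log([0.12, 0.31, 0.08, 0.22])  # detected concentrations (ug/L)
log_lod = np.log(0.05)                      # detection limit
n_censored = 46                             # samples below the LOD

def log_post(mu, log_sig):
    sig = np.exp(log_sig)
    ll = norm.logpdf(detects, mu, sig).sum()          # detected points
    ll += n_censored * norm.logcdf(log_lod, mu, sig)  # non-detect mass
    prior = norm.logpdf(mu, 0, 10) + norm.logpdf(log_sig, 0, 2)  # weak priors
    return ll + prior

mu, ls = 0.0, 0.0
cur = log_post(mu, ls)
draws = []
for i in range(20000):
    prop_mu = mu + rng.normal(0, 0.3)
    prop_ls = ls + rng.normal(0, 0.3)
    cand = log_post(prop_mu, prop_ls)
    if np.log(rng.uniform()) < cand - cur:            # Metropolis accept step
        mu, ls, cur = prop_mu, prop_ls, cand
    if i >= 5000:                                     # discard burn-in
        draws.append(np.exp(mu + np.exp(ls) ** 2 / 2))  # lognormal mean

print(f"posterior mean concentration: {np.mean(draws):.3f} ug/L")
```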

P.119 Wu, CY*; Chang, CH; Chung, YC; Chen, CC; Wu, KY; National Taiwan University; [email protected] Probabilistic Assessment of Lifetime Cancer Risk for Acrylamide through Daily Consumption of High-Temperature Processed Foods in Taiwan with Bayesian Statistics Markov Chain Monte Carlo Simulation Acrylamide (AA), a probable carcinogen, is present in foods, especially carbohydrate-rich foods processed at high temperature. The potential health risk is of great concern because daily AA intake can vary geographically with dietary habits and cooking processes. To assess the lifetime cancer risk, a variety of high-temperature processed foods were collected from several counties in Taiwan, and the AA contents of 300 samples in total were analyzed with liquid chromatography tandem mass spectrometry. Questionnaires were used to collect intakes of these foods from 132 study subjects. Given the limited data on AA contents and food intake rates, Bayesian statistics Markov chain Monte Carlo simulation was used to perform a probabilistic assessment of lifetime cancer risk for AA through consumption of high-temperature foods in Taiwan, estimating the representative distribution of daily AA intake. In this study, the mean intake dose was 1.655 µg/kg-day. Using the EPA (2010) cancer slope factor of 0.51 (mg/kg-day)^-1, the mean cancer risk for the Taiwanese population is 8.44×10^-4. The risk in our study is higher than those reported in other studies; this could be attributed to overestimated food intake rates, as the surveys drew on a young population. Nevertheless, it may be advisable for the young study subjects to reduce their consumption of fried foods.
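
The reported mean risk can be reproduced from the stated inputs with the standard risk = dose × slope factor calculation:

```python
# Lifetime cancer risk from the values quoted in the abstract.
intake_ug = 1.655              # mean intake dose, ug/kg-day
csf = 0.51                     # cancer slope factor, (mg/kg-day)^-1

risk = (intake_ug / 1000) * csf  # convert ug -> mg before applying the CSF
print(f"lifetime cancer risk = {risk:.2e}")  # ~8.44e-04, as reported
```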

W2-H.2 Xu, J; Song, C; Zhuang, J*; University at Buffalo, SUNY; [email protected] Robust Screening Policy: Balancing Congestion and Security in the Presence of Strategic Applicants with Private Information Concerns about security and congestion arise in security screening, which is used to identify and deter potential threats (e.g., attackers, terrorists, smugglers, spies) among normal applicants wishing to enter an organization, location, or facility. Generally, in-depth screening reduces the risk of being attacked but creates delays that may deter normal applicants and thus decrease the welfare of the approver (authority, manager, screener). In this research, we develop a model to determine the optimal screening policy that maximizes the reward from admitting normal applicants net of the penalty from admitting bad applicants. We use an M/M/n queueing system to capture the impact of security screening policies on system congestion and use game theory to model strategic behavior, in which potential applicants with private information decide whether to apply based on the observed screening policy of the approver and the submission behavior of other potential applicants. We provide analytical solutions for the optimal non-discriminatory screening policy and numerical illustrations for both the discriminatory and non-discriminatory policies. In addition, we discuss more complex scenarios including robust screening, imperfect screening, abandonment behavior, and complex server networks.
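
The congestion side of the model can be illustrated with the expected waiting time in an M/M/n queue (the Erlang C formula), the quantity that deeper, and therefore slower, screening trades off against detection. The arrival and service rates below are hypothetical, and the game-theoretic equilibrium itself is not reproduced.

```python
# Erlang C expected wait for an M/M/n screening queue.
import math

def erlang_c_wait(lam, mu, n):
    """Mean queueing delay for arrival rate lam, service rate mu, n servers."""
    a = lam / mu                       # offered load
    rho = a / n
    if rho >= 1:
        return math.inf                # unstable: queue grows without bound
    top = a**n / math.factorial(n)
    bottom = (1 - rho) * sum(a**k / math.factorial(k) for k in range(n)) + top
    p_wait = top / bottom              # probability an applicant must queue
    return p_wait / (n * mu - lam)     # mean time spent in queue

# Deeper screening means a lower service rate mu; compare congestion costs.
for mu in (6.0, 4.0, 2.5):             # applicants/hour per screener
    print(f"mu={mu}: wait = {erlang_c_wait(lam=10, mu=mu, n=3):.2f} h")
```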

T2-I.1 Xu, J*; Lambert, J.H.; University of Virginia; [email protected] Addressing Uncertainties of Avoided Crash Risk, Travel Time Savings, and Lifecycle Costs in Transportation Access Management Access management in transportation planning can save travel time, reduce crashes, and increase route capacities. The literature suggests a need for performance metrics and a decision-aiding framework to guide access management programs across large corridor networks and diverse time horizons. This research discusses a quantitative framework to support decision making in access management programs, applying multicriteria analysis and cost-benefit analysis under data and parameter uncertainties. The metrics used to assess relative needs at existing access points include crash intensity, crash exposure, travel time delay factor, access point density, traffic exposure, value of time, and the costs of typical access management activities. Uncertain data and parameters that influence the estimates of potential benefits and costs are identified and treated via a numerical interval analysis. The framework is demonstrated at several geographic scales and locations, including six thousand miles of highways in a geographic region and several of its sub-regions. The results help decision makers identify, from various perspectives, which route segments should be addressed sooner by collecting additional data, reserving right of way, closing access points, creating new alignments, negotiating development proffers, etc. The methods, combining uncertainty analysis, multicriteria analysis, and cost-benefit analysis, should be of interest to engineers, policymakers, and stakeholders engaged in the strategic planning of access management and other multiscale systems, and they are transferable to other topics involving cost-benefit analysis under uncertainty, evidence-based decision aids, and strategic resource allocation.
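
The numerical interval analysis described propagates [low, high] bounds on uncertain quantities through the benefit-cost calculation. A minimal sketch with hypothetical dollar figures:

```python
# Interval propagation through a benefit-cost ratio.
def interval_div(b, c):
    """Interval quotient for positive intervals b = (lo, hi), c = (lo, hi)."""
    return (b[0] / c[1], b[1] / c[0])

crash_benefit  = (1.2e6, 3.4e6)   # avoided-crash benefit, $ (hypothetical)
time_benefit   = (0.8e6, 2.1e6)   # travel-time savings, $ (hypothetical)
lifecycle_cost = (1.5e6, 2.5e6)   # cost of access management actions, $

total_benefit = (crash_benefit[0] + time_benefit[0],
                 crash_benefit[1] + time_benefit[1])
lo, hi = interval_div(total_benefit, lifecycle_cost)
# Only a lower bound above 1 makes the segment robustly worthwhile.
print(f"BCR in [{lo:.2f}, {hi:.2f}]")
```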

T1-E.2 Yan, Z*; Zhao, Q; ORISE; [email protected] Toxicity Review of Technical Grade Dinitrotoluene and Identification of its Critical Effects Technical grade dinitrotoluene (tgDNT) is a mixture of dinitrotoluene isomers, composed of 76% 2,4-DNT and 19% 2,6-DNT. The toxicological effects of tgDNT were reviewed and the critical effects identified by evaluating the available human occupational and animal studies, mainly via the inhalation and oral exposure routes. In occupational studies of workers from DNT manufacturing plants, tgDNT exposure (predominantly via the inhalation route) was associated with clinical symptoms, adverse reproductive effects, adverse effects on the cardiovascular system, and increased carcinogenic risk. However, these occupational studies are limited by co-exposure to other known and/or unknown chemicals and a lack of useful exposure information. Oral studies in F344 rats showed that tgDNT has a broad toxicity profile in both males and females, including significantly increased reticulocytes and Heinz bodies, increased relative liver and kidney weights, spleen hemosiderin and extramedullary hematopoiesis, increased incidence of chronic interstitial nephritis, and hepatotoxicity characterized by increased incidences of hepatocyte necrosis, hyperbasophilic hepatocytes, and hepatocyte megalocytosis. The incidence of testicular degeneration in male rats was also significantly increased by tgDNT. To identify the most sensitive non-cancer effect(s), all toxicological endpoints from subchronic, chronic, and reproductive/developmental studies were evaluated, and dose-response data were modeled using EPA's Benchmark Dose Software. A comparison of the points of departure for all potential critical effects suggests that increased hepatocyte necrosis is the most sensitive effect and is thus considered the critical effect following subchronic and chronic oral exposure. In summary, tgDNT elicits a broad spectrum of non-cancer effects, among which hepatocyte necrosis was identified as the most sensitive. The views expressed in this abstract are those of the authors and do not necessarily reflect the views or policies of the U.S. EPA.

T4-G.2 Yang, ZJ; Rickard, L.*; Seo, M.; Harrison, T.; University at Buffalo, SUNY College of Environmental Science and Forestry, University at Albany; [email protected] The “I” in climate: The role of individual responsibility in systematic processing of climate change in