The goal of this project was to answer the following questions concerning response to a future anthrax release (or suspected release) in a building:

- Based on past experience, what rules of thumb can be determined concerning: (a) the amount of sampling that may be needed to determine the extent of contamination within a given building; (b) what portions of a building should be sampled; (c) the cost per square foot to decontaminate a given type of building using a given method; (d) the time required to prepare for, and perform, decontamination; (e) the effectiveness of a given decontamination method in a given type of building?
- Based on past experience, what resources will be spent on evaluating the extent of contamination, performing decontamination, and assessing the effectiveness of the decontamination in a building of a given type and size?
- What are the trade-offs between cost, time, and effectiveness for the various sampling plans, sampling methods, and decontamination methods that have been used in the past?

We analyzed more than 70,000 air leakage measurements in houses across the United States to relate leakage area—the effective size of all penetrations of the building shell—to readily available building characteristics such as building size, year built, geographic region, and various construction characteristics. After adjusting for the lack of statistical representativeness of the data, we found that the distribution of leakage area normalized by floor area is approximately lognormal. Based on a classification tree analysis, year built and floor area are the two most significant predictors of leakage area: older and smaller houses tend to have higher normalized leakage areas than newer and larger ones. Multivariate regressions of normalized leakage are presented with respect to these two factors for three house classifications: low-income households, energy program houses, and conventional houses. We demonstrate a method of applying the regression model to housing characteristics from the American Housing Survey to derive a leakage-area distribution for all single-family houses in the US. The air exchange rates implied by these estimates agree reasonably well with published measurements.

Keywords: air leakage; blower door; fan pressurization measurements; infiltration
Authors: Chan, Wanyu R.; Nazaroff, William W.; Price, Phillip N.; Sohn, Michael D.; Gadgil, Ashok J.
Published as: "Analyzing a Database of Residential Air Leakage in the United States," vol. 39, pp. 3445-3455, 06/2005 (ISSN 1352-2310)
https://indoor.lbl.gov/publications/analyzing-database-residential-air
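As a rough illustration of the statistical approach described in the abstract above (a lognormal fit to normalized leakage area, plus a multivariate regression on year built and floor area), here is a minimal sketch on synthetic data. The data, coefficient values, and variable names are all hypothetical, invented for the example; they are not taken from the study.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sample of normalized leakage (leakage area / floor area).
# Real values come from blower-door measurements; these are synthetic.
n = 1000
year_built = rng.integers(1900, 2000, size=n)
floor_area = rng.uniform(80, 300, size=n)           # m^2, assumed range
log_nl = (40.0 - 0.02 * year_built - 0.003 * floor_area
          + rng.normal(0.0, 0.4, size=n))           # assumed linear model
norm_leakage = np.exp(log_nl)

# If normalized leakage is lognormal, its log is normal: fit mu and sigma.
mu, sigma = np.log(norm_leakage).mean(), np.log(norm_leakage).std()

# Multivariate regression of log(normalized leakage) on year built and
# floor area, via ordinary least squares.
X = np.column_stack([np.ones(n), year_built, floor_area])
coef, *_ = np.linalg.lstsq(X, np.log(norm_leakage), rcond=None)
print(f"lognormal fit: mu={mu:.2f}, sigma={sigma:.2f}")
print(f"OLS coefficients (intercept, year, area): {coef}")
```

Because the synthetic data were generated with negative year and area coefficients, the fit recovers the study's qualitative finding: older and smaller houses have higher normalized leakage.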

Assessing uncertainties in the relationship between inhaled particle concentration, internal deposition and health effects, Chapter 9 (2005)
CRC Press, Boca Raton, FL, pp. 157-188

The question that ultimately motivates most aerosol inhalation research is: for a given inhaled atmosphere, what health effects will result in a specified population? To attempt to address this question, quantitative research on inhaled aerosols has been performed for at least fifty years (Landahl et al, 1951). The physical factors that determine particle deposition have been determined, lung morphology has been quantified (particularly for adults), models of total particle deposition have been created and validated, and a large variety of inhalation experiments have been performed. However, many basic questions remain, some of which are identified by the U.S. Committee on Research Priorities for Airborne Particulate Matter (NRC 1998a) as high-priority research areas. Among these are: What are the quantitative relationships between outdoor concentrations measured at stationary monitoring stations and actual personal exposures? What are the exposures to biologically important constituents of particulate matter that cause responses in potentially susceptible subpopulations and the general population? What is the role of physicochemical characteristics of particulate matter in causing adverse health effects?
As these questions show, in spite of significant progress in all areas of aerosol research, many of the most important practical questions remain unanswered or inadequately answered. In this chapter, we discuss the sources and magnitudes of error that hinder the ability to answer basic questions concerning the health effects of inhaled aerosols. We first consider the phenomena that affect the epidemiological studies, starting with studies of residential radon and moving on to fine particle air pollution. Next we discuss the major uncertainties in physical and physiological modeling of the causal chain that leads from inhaled aerosol concentration, to deposition in the airway, to time-dependent dose (that is, the concentration of particles at a given point in the lungs as a function of time), to physiological effects, and finally to health effects.

Authors: Price, Phillip N.; Ruzer, Lev S.; Harley, Naomi H.
https://indoor.lbl.gov/publications/assessing-uncertainties-relationship

Advice for first responders to a building during a chemical or biological attack (2002)
No abstract available.

Authors: Price, Phillip N.; Delp, William W.; Sohn, Michael D.; Thatcher, Tracy L.; Lorenzetti, David M.; Sextro, Richard G.; Gadgil, Ashok J.; Derby, Elisabeth A.; Jarvis, Sondra A.
https://indoor.lbl.gov/publications/advice-first-responders-building

An algorithm for real-time tomography of gas concentrations, using prior information about spatial derivatives (06/2001)
Vol. 35, pp. 2827-2835 (ISSN 1352-2310)

We present a new computed tomography method, the low third derivative (LTD) method, that is particularly suited for reconstructing the spatial distribution of gas concentrations from path-integral data for a small number of optical paths. The method finds a spatial distribution of gas concentrations that (1) has path integrals that agree with measured path integrals, and (2) has a low third spatial derivative in each direction, at every point. The trade-off between (1) and (2) is controlled by an adjustable parameter, which can be set based on analysis of the path-integral data. The method produces a set of linear equations, which can be solved with a single matrix multiplication if the constraint that all concentrations must be positive is ignored; the method is therefore extremely rapid. Analysis of experimental data from thousands of concentration distributions shows that the method works nearly as well as smooth basis function minimization (the best method previously available), yet is about 100 times faster.

Keywords: air flow; computed tomography; concentration mapping; optical remote sensing; pollutant dispersion
Authors: Price, Phillip N.; Fischer, Marc L.; Gadgil, Ashok J.; Sextro, Richard G.
https://indoor.lbl.gov/publications/algorithm-real-time-tomography-gas

Algorithm for rapid tomography of gas concentrations (2000)
Vol. 35, pp. 2827-2835

We present a new computed tomography method, the low third derivative (LTD) method, that is particularly suited for reconstructing the spatial distribution of gas concentrations from path-integral data for a small number of optical paths. The method finds a spatial distribution of gas concentrations that (1) has path integrals that agree with measured path integrals, and (2) has a low third spatial derivative in each direction, at every point. The trade-off between (1) and (2) is controlled by an adjustable parameter, which can be set based on analysis of the path-integral data. The method produces a set of linear equations, which can be solved with a single matrix multiplication if the constraint that all concentrations must be positive is ignored; the method is therefore extremely rapid. Analysis of experimental data from thousands of concentration distributions shows that the method works nearly as well as Smooth Basis Function Minimization (the best method previously available), yet is 100 times faster.

Authors: Price, Phillip N.; Fischer, Marc L.; Gadgil, Ashok J.; Sextro, Richard G.
https://indoor.lbl.gov/publications/algorithm-rapid-tomography-gas
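The LTD reconstruction described in the two tomography abstracts above amounts to a regularized least-squares problem: find concentrations x whose path integrals A·x match the measurements b while keeping the discrete third derivative D3·x small, with the trade-off set by an adjustable parameter lam. A minimal 1-D sketch follows; the papers work in 2-D, and the grid size, path geometry, and parameter value here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

m_grid = 40                         # number of grid cells (hypothetical)
n_paths = 8                         # far fewer measurements than unknowns

# Path matrix A: each row integrates concentration over a random
# contiguous segment of the grid (a stand-in for an optical path).
A = np.zeros((n_paths, m_grid))
for i in range(n_paths):
    lo, hi = sorted(rng.choice(m_grid, size=2, replace=False))
    A[i, lo:hi + 1] = 1.0

# Third-difference operator D3: penalizes the discrete third derivative.
D3 = np.diff(np.eye(m_grid), n=3, axis=0)

# True concentration field (a smooth bump) and its path integrals.
x_true = np.exp(-0.5 * ((np.arange(m_grid) - 20) / 5.0) ** 2)
b = A @ x_true

# Solve min ||A x - b||^2 + lam * ||D3 x||^2 in closed form (the
# positivity constraint is ignored, as the abstract notes). Precomputing
# R once makes each new reconstruction a single matrix multiplication,
# which is what makes the method fast.
lam = 1e-2
R = np.linalg.solve(A.T @ A + lam * D3.T @ D3, A.T)
x_hat = R @ b
print("max abs path-integral mismatch:", np.abs(A @ x_hat - b).max())
```

For a fixed path geometry, R depends only on A and lam, so a stream of measurement vectors b can be reconstructed in real time by repeated products R @ b, matching the "extremely rapid" claim in the abstract.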