H Data - Tools for Exposure Assessment and Epidemiology


Wednesday, May 25, 2016, 1:30 PM - 4:10 PM


Epidemiology of Hearing Impairment and Injuries in the U.S. Military

D. Gimeno, J. Betancourt, and K. Whitworth, The University of Texas School of Public Health, San Antonio, TX; D. Tucker and N. Gorrell, The Geneva Foundation, San Antonio, TX; T. Hammill and M. Packer, Department of Defense Hearing Center of Excellence, San Antonio, TX; A. Senchak, Walter Reed National Military Medical Center, Bethesda, MD

Objective: To provide an epidemiologic analysis of hearing impairment and injury among U.S. military personnel.

Methods: For the main epidemiological analysis, we included Active Duty Armed Forces members aged 15-64 who served during fiscal years (FY) 2008 to 2012 and had at least one clinical encounter with a select hearing-related ICD-9 diagnosis. Cases were identified from records from direct care (CAPER: Comprehensive Ambulatory/Professional Encounter Record, and SIDR: Standard Inpatient Data Record) and paid provider (TED-I: TRICARE Encounter Data-Institutional and TED-NI: TRICARE Encounter Data-Noninstitutional) data sources. We also identified cases using the Defense Occupational and Environmental Health Readiness System-Hearing Conservation database (DOEHRS-HC), which maintains hearing conservation and audiometric data across the DoD. The Defense Manpower Data Center provided data on at-risk denominators by demographic and job-related characteristics.

Results: Demographics of the sample appeared stable over the study period, with an average annual population of approximately 1.4 million Service Members. The unadjusted overall incidence rate increased from 27.0 cases per 1000 persons to 27.6 cases per 1000 persons over the study period. The tinnitus incidence rate increased from 9.3 cases per 1000 persons in FY 2008 to 12.3 cases per 1000 persons in FY 2012. The rate of sensorineural hearing loss decreased slightly, from 11.4 cases per 1000 persons to 10.7 cases per 1000 persons at the end of the study period. In general, rates of injury were highest in males, in the Army, in senior officers, and in older Service Members. Rates will be presented for all analyzed ICD codes.

Conclusions: This study will lay the groundwork for comprehensive epidemiologic studies of hearing outcomes among U.S. military personnel. Results from this project will draw much needed attention to this critical issue, encouraging initiatives to improve the auditory health and wellbeing of Service Members and aiding in the development and implementation of prevention measures.



Outpatient Costs of Hearing Loss in the U.S. Military: Direct Care and Paid Provider Care

H. Alamgir, J. Betancourt, and C. Turner, The University of Texas School of Public Health, San Antonio, TX; D. Tucker and N. Gorrell, The Geneva Foundation, San Antonio, TX; T. Hammill and M. Packer, Department of Defense Hearing Center of Excellence, San Antonio, TX; A. Senchak, Walter Reed National Military Medical Center, Bethesda, MD

Objective: The goal of this research is to comprehensively determine the economic impact of hearing impairment and noise-induced hearing injury (HINIHI) among active duty U.S. Service Members from the perspective of the DoD.

Methods: This study is a retrospective review of data collected on active duty military Service Members (SMs) from January 1, 2007 to December 31, 2012. Clinical data on SMs diagnosed with one or more of the identified ICD-9 codes (associated with HINIHI) were retrieved from the Medical Data Repository (MDR) through the Military Health System Management Analysis and Reporting Tool (M2) for our analysis. This research reports findings from two M2/MDR clinical data sets: TRICARE Encounter Data-Noninstitutional (TED-NI), which provides data on care provided by civilian paid providers outside a military treatment facility (MTF), and the Comprehensive Ambulatory/Professional Encounter Record (CAPER), which provides data on direct care provided by military providers in an MTF.

Results: We obtained 8,251,109 encounter records from CAPER, representing 1,865,676 distinct patients, and 1,865,965 encounters from TED-NI, representing 243,349 distinct patients. Tympanic membrane disorders, males, and medical centers had relatively higher mean costs for HINIHI in both TED-NI and CAPER. In both databases, fiscal year, diagnosis code, and age were significantly related to RVU dollars. In both CAPER and TED-NI, patients with tympanic membrane disorders had higher costs than those with other ear diagnoses. Patients ≥65 years old in TED-NI cost more than the other age groups. Cost differences between males and females were not significant in CAPER, but in TED-NI females cost significantly less than males, by an average of 57.34 RVU dollars (p-value = 0.0078). Pay grade and facility size were significant only in CAPER (p-values < 0.0001).

Conclusions: Our estimates are a valuable decision-making tool for DoD policymakers. These cost estimates may identify high-burden groups, enable proactive measures for concerted education and training, identify best practices, and support the development of return-to-duty programs following HINIHI, all of which may contribute to the retention of skilled, experienced, and mission-ready military personnel.



Assessment of Dermatitis Among Chair Sanders

N. Burton and L. Tapp, CDC/NIOSH, Cincinnati, OH

Situation/Problem: NIOSH received a Health Hazard Evaluation (HHE) request from management of a chair manufacturing facility. They were concerned about skin rashes among sanders in the clean-up (or sanding) department. Two employees had severe skin reactions at work and could no longer work in the department. The employees performed repairs using epoxy resins, sanded rough areas on wooden frames, and cleaned the frames using different chemicals including acetone. The work was done on downdraft benches, raised platforms, or the floor, depending upon the size of the piece. The screws used in the chair frames had been changed to a larger size that required angled holes to provide additional strength. The holes had to be filled with epoxy resin which was mixed by hand to match the color of the frame. The company required employees to wear either nitrile or vinyl gloves. This area of the facility did not have general ventilation.

Resolution: Several recommendations were made: installing local exhaust ventilation for the hand sanders; adding a vacuum system instead of compressed air to remove dust; using an epoxy gun to apply epoxy; using polyvinyl gloves when working with epoxies; using goggles or safety glasses when working with epoxies and other chemicals; reporting skin rashes to management when they occur; and referring employees with persistent rashes to a dermatologist with occupational medicine experience.

Results: Wipe sampling found dust from epoxy resins throughout the cleaning department. Air sampling for volatile organic compounds showed no over-exposures. The downdraft tables were not effective in controlling dust levels due to the size of some of the chair frames. The amount of epoxy used in the cleaning department increased greatly after the change in frame design. Confidential medical interviews with employees showed that 8 of 18 employees reported current or recent skin irritation. Review of medical records confirmed that one employee showed an allergic skin reaction to epoxy resins when tested by skin patch testing.

Lessons learned: Epoxy resins were suspected to be the cause of the skin rashes. Engineering controls, improved work practices, and the use of personal protective equipment were needed to reduce the exposure to epoxy resins.



Total Exposure Health

R. Hartman, Planned Systems International, Arlington, VA; K. Phillips, Dept. of Defense, Alexandria, VA

Situation/Problem: Through “total health” and Total Worker Health™, both private and public sector organizations have made strides toward achieving the President’s Precision Medicine Initiative. This initiative is an approach to disease prevention and treatment that takes into account an individual’s unique genes, environment, and lifestyle to provide personalized healthcare. However, these organizations have overlooked one key factor that influences individual health risks: exposure, which drives both protective and clinical interventions.

Resolution: As industrial hygienists, we are positioned to contribute effectively to this fundamental change in our health care system as exposure scientists, linking our expertise not only to the occupational health of individuals but also to their overall well-being. Understanding that the impact of exposures is strongly related to sociocultural and economic status, occupational and environmental factors, and lifestyle choices, we created the Total Exposure Health (TEH) concept, which integrates workplace, environmental, and lifestyle exposures and provides a path to precision medicine.

Results: We will discuss various operational models and examples to show how TEH takes our existing knowledge of exposures and connects it to the individual’s organ systems, cellular function, and DNA, along with classic industrial hygiene modeling (toxicokinetic modeling, cell toxicity, organ damage, etc.). As a new healthcare infrastructure is defined and built, we will also show how TEH will position industrial hygienists as exposure scientists to improve the worker experience with a focus on individual exposures (unique and targeted).

Lessons learned: Exposure means different things to different people, so we packaged TEH into a simple, convenient brand. We presented TEH as a catalyst to move exposure health away from animal data and population models toward individualized effects of exposure. We also found that TEH fosters innovations in research and technology development through programmatic support and can promote economic development, particularly in science, technology, engineering, and mathematics (STEM) career fields. Lastly, we found TEH to be a system integrator between programs, policies, and disciplines that creates conversations and collaborations through a unified, understandable goal: improving employee healthcare in the United States.



Development of an Explosive Dust Screening App

M. Rollins, Isosceles Group, Brooklyn, CT

Situation/Problem: Dust explosions, despite extensive research, recommendations, and regulations, continue to plague industry. Often, the hazard goes unnoticed, and uncontrolled, because its explosion potential is not recognized. One challenge is finding a way to rapidly assess and rank potential hazards, ranging from basic housekeeping to dust Kst values to the material(s) processed.

Resolution: Creation of an expert system allowed for rapid screening of processes that could pose a dust explosion hazard. FileMaker was used to develop a custom iOS app for this purpose. Over 50 questions (e.g., depth of dust on surfaces) across 9 rubrics (materials, housekeeping, etc.) were developed. Questions are based on recommended screening from NFPA, OSHA, EU Directives, FM Global, and others. Each question is scored based on the multiple-choice answer from a drop-down menu, so that higher-risk answers result in a higher overall risk profile score. The score for each rubric is also tallied. The report, generated in situ on an iOS device, is a multi-page PDF document listing each question, with specific recommendations for any deficiencies noted.
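The scoring logic described above can be illustrated with a short Python sketch. The actual app was built in FileMaker, so the question text, rubric names, and point values below are hypothetical stand-ins, not the app's real question bank:

```python
# Hypothetical rubrics, questions, and point values for illustration only;
# the actual app's question bank lives in FileMaker.
from collections import defaultdict

QUESTIONS = {
    "housekeeping": {
        "Depth of dust on surfaces": {"<1/32 in": 0, "1/32-1/8 in": 2, ">1/8 in": 5},
        "Cleaning frequency": {"daily": 0, "weekly": 2, "rarely": 5},
    },
    "materials": {
        "Dust Kst class": {"St 0": 0, "St 1": 2, "St 2": 4, "St 3": 5},
    },
}

def score_screening(answers):
    """Tally per-rubric scores and an overall risk profile score.

    Higher-risk multiple-choice answers carry more points, so a
    higher total indicates a higher-priority hazard.
    """
    rubric_scores = defaultdict(int)
    for rubric, questions in QUESTIONS.items():
        for question, choices in questions.items():
            rubric_scores[rubric] += choices[answers[question]]
    return dict(rubric_scores), sum(rubric_scores.values())

# Example screening of one process area
rubrics, total = score_screening({
    "Depth of dust on surfaces": ">1/8 in",
    "Cleaning frequency": "weekly",
    "Dust Kst class": "St 2",
})
```

Ranking areas by `total` (and inspecting the per-rubric breakdown) is what lets higher-hazard processes be addressed by priority.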

Results: Screenings of processes were completed, and higher-hazard areas were identified. These could then be addressed by priority in a more efficient manner.

Lessons learned: The use of a properly designed, formatted, and tested expert system app allowed dust hazard screening to be accomplished in hours as opposed to days.



Developing a System to Track OSH Issues with Major Construction Projects

D. Harman and T. Carraway, US Dept. of State, Aberdeen, MD

Situation/Problem: The Department of State's Office of Safety, Health, and Environmental Management is involved in the planning and construction of new facilities (Embassies and Consulates), as well as major rehabs of existing buildings. One part of this process is to have an OSH professional conduct a site visit at approximately 85% completion. This site visit is to ensure that the completed project will meet applicable codes, DOS specifications, and the design drawings. In the past, the professional would produce a Word or Excel document listing the issues and suggested resolutions and leave it with the project management team for implementation. However, there was no process for follow-up to ensure that the issues were addressed.

Resolution: DOS SHEM already has in place a software system to collect, track, and report on the results of visits to US Embassies and Consulates, ensuring that any OSH hazards identified are addressed in a timely manner. That framework could be modified to perform similar collecting, tracking, and reporting of issues with construction projects. A project was therefore started with the same programming team that had recently rewritten the other system.

Results: The software team used a new development approach. Instead of writing a Requirements Document first, the programmers started programming; regular meetings were held to review progress and provide feedback, and the Requirements Document was produced on the fly, at the same pace as the software development. This approach caused a lot of moving one step forward and then one or two steps back: programmers would start down a path and then find they had gone in the wrong direction. The system was eventually completed and is functional. Since it was not replacing an existing system, it could go online in a less polished form. Also, there is a much smaller number of users, so the rollout is more like a gamma (beyond beta) test, and minor glitches are acceptable.

Lessons learned: The big lesson learned is that the traditional method of writing a Requirements Document makes for a smoother software development process. It is helpful to both sides, as it serves as a contract between the users and the programmers about what is expected at the end, and it reduces the false paths and blind alleys of the more free-flowing approach. But in the end, with close work between the user side and good programmers, you end up with a good system that does what you need it to do.



Improving the Estimation of the Interzone Air Exchange Rate in the Near Field-Far Field Model by Computational Fluid Dynamic Simulations

W. Chouchen and S. Halle, Ecole de technologie superieure, Montreal, QC, Canada; M. Debia and C. Castro Ruiz, University of Montreal, Montreal, QC, Canada; A. Bahloul, IRSST, Montreal, QC, Canada

Objective: The two-zone model, also called the Near Field-Far Field (NF-FF) model, is commonly used by industrial hygienists for predicting occupational exposures to solvent vapors. Input parameters in the NF-FF model include the near field volume, the room supply air flow rate, and the interzone air exchange rate between the NF and FF (β). The β-parameter is determined from the average air velocity at the zone interface and the available free surface area of the NF volume. However, the air velocity at the zone interface is generally unknown, and important assumptions have to be made. The objective of this communication is to improve the determination of the β-parameter by using computational fluid dynamics (CFD) simulations.
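The role of β in the two-zone model can be sketched in a few lines of Python. This uses the standard steady-state NF-FF relations (C_FF = G/Q, C_NF = C_FF + G/β, with β = ½·s·FSA, where s is the average interface air speed and FSA the free surface area); the numerical inputs (generation rate, interface speed) are assumed for illustration, not taken from the study:

```python
import math

def interzone_flow_beta(air_speed_m_s: float, fsa_m2: float) -> float:
    """Interzone air exchange rate beta (m^3/s) = 0.5 * s * FSA,
    where s is the average air speed at the NF/FF interface."""
    return 0.5 * air_speed_m_s * fsa_m2

def two_zone_steady_state(G_mg_s: float, Q_m3_s: float, beta_m3_s: float):
    """Steady-state NF and FF concentrations (mg/m^3) of the
    standard two-zone (NF-FF) model with generation rate G and
    room supply flow Q."""
    c_ff = G_mg_s / Q_m3_s
    c_nf = c_ff + G_mg_s / beta_m3_s
    return c_nf, c_ff

# Illustrative NF geometry: 1.8 m tall cylinder of radius 1.0 m around the
# source; free surface area = lateral surface + top of the cylinder.
r, h = 1.0, 1.8
fsa = 2 * math.pi * r * h + math.pi * r ** 2

beta = interzone_flow_beta(0.10, fsa)   # assume 0.10 m/s interface speed
G = 1.0                                 # assumed acetone generation, mg/s
Q = 54.5 * 4 / 3600                     # 54.5 m^3 room at 4 ACH, in m^3/s

c_nf, c_ff = two_zone_steady_state(G, Q, beta)
```

Because β scales directly with the interface air speed, replacing the assumed 0.10 m/s with a CFD-derived velocity is exactly where the simulations improve the NF estimate.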

Methods: A simulated 54.5-cubic-meter room with a constant acetone generation rate was investigated by applying the NF-FF model (IHmod, V0.209). The NF was defined as a 1.8 m tall cylinder with a radius ranging from 0.5 m to 1.5 m around the emission source. The average air velocity at the zone interface was determined by CFD (FDS, V6.2) for two ventilation scenarios: air supplied from a ceiling square diffuser and from a sill wall grid. In each scenario, four room air supply rates were used: 2, 4, 6, and 8 air changes per hour (ACH).

Results: For a given air supply rate, the average air velocity is nearly constant for both ventilation scenarios regardless of the distance from the contaminant source, so β depends only on the free surface area and hence on the radial distance from the source. The average contaminant removal efficiencies for the first and second ventilation scenarios are 0.95 and 0.71, respectively. A linear relationship (R2 = 0.94) was found between the air changes per hour and the average air velocity in the NF zone. For a small free surface area, the steady-state concentration in the NF zone is up to 20 times higher than the concentration obtained from the CFD simulations. However, the NF concentration is only 2 to 3 times higher than the CFD results when the NF zone is defined as a 1.5 m radius cylinder around the source. These results allowed us to determine an “optimal” free surface area for each ACH.

Conclusions: Estimating the β-parameter from CFD simulations can significantly improve NF concentration estimates in the two-zone model.



Heat Stress and Monte Carlo Simulation—A Statistical Approach That Considers Uncertainty in Calculating Work/Rest Regimen

P. Dessureault and D. Drolet, University of Quebec, Ste-Genevieve-de-Batiscan, QC, Canada

Situation/Problem: The Wet Bulb Globe Temperature (WBGT) has been the most frequently used heat stress index for decades. Its application is quite simple: one must make sure that the meeting point between the observed WBGT value (WBGTobs) and the estimated level of work metabolism (M) does not exceed the limit value (WBGTlim) curve on a graph, most often originating from the ACGIH® TLV®. In cases of overexposure, it is common practice to establish an hourly work/rest regimen that brings the time-weighted average parameters (WBGTave and Mave) to an acceptable point. Of course, WBGT and work metabolism values have an inherent error that makes this practice quite risky whenever the meeting point nears the limit curve. Assessing the risk of overexposure is then clearly desirable. At this time, no technique has been proposed to do so.

Resolution: A probabilistic distribution of the work/rest regimen that considers each parameter’s uncertainty would enlighten decision-making and make risk assessment possible. Monte Carlo Simulation (MCS) is a technique that allows a user to account for risk in quantitative decision making. Applied to heat stress assessment, it allows the user to define a probability distribution for the WBGTs and M. The simulation process runs the model a chosen number of times, calculating the longest work period that meets the TLV® limit. This iterative procedure produces an outcome defined by a probability distribution instead of a single point (WBGTave, Mave). An MCS procedure was programmed into a regular worksheet using Visual Basic in Microsoft Excel.
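The iterative procedure described above can also be sketched outside Excel. The Python sketch below assumes normal distributions for all four inputs and an illustrative ISO 7243-style limit curve (WBGT_lim = 56.7 − 11.5·log10(M), M in watts) rather than the exact ACGIH® TLV® table; all numerical inputs are invented for the example:

```python
import math
import random

random.seed(1)

def wbgt_limit(m_watts: float) -> float:
    """Illustrative limit curve (ISO 7243-style), standing in for the
    ACGIH TLV curve: WBGT_lim = 56.7 - 11.5 * log10(M), M in watts."""
    return 56.7 - 11.5 * math.log10(m_watts)

def longest_work_minutes(wbgt_w, m_w, wbgt_r, m_r):
    """Longest work period t (min/h) whose time-weighted averages
    (WBGTave, Mave) still meet the limit curve."""
    best = 0
    for t in range(0, 61):
        wbgt_ave = (t * wbgt_w + (60 - t) * wbgt_r) / 60
        m_ave = (t * m_w + (60 - t) * m_r) / 60
        if wbgt_ave <= wbgt_limit(m_ave):
            best = t
    return best

# Monte Carlo: sample each parameter from a normal distribution whose
# mean and sd stand in for the user's reading and its uncertainty.
N = 5000
results = [
    longest_work_minutes(
        random.gauss(29.0, 0.5),    # WBGT at work, degC (assumed)
        random.gauss(300.0, 30.0),  # work metabolism, W (assumed)
        random.gauss(26.0, 0.5),    # WBGT at rest, degC (assumed)
        random.gauss(120.0, 15.0),  # rest metabolism, W (assumed)
    )
    for _ in range(N)
]

# Fraction of iterations in which a full hour of work meets the limit
p_full_hour = sum(r == 60 for r in results) / N
```

Binning `results` into a histogram gives the probability that the limit is met at each number of working minutes per hour, which is exactly the distribution the tool displays instead of a single-point answer.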

Results: The Excel file includes a data entry form where the user enters the WBGT and M values at work and at rest, along with their uncertainty expressed as a probability distribution (normal, uniform, triangular, or log-normal). After running the chosen number of iterations, the result is displayed as a histogram of the probability that the limit is met at each number of working minutes per hour.

Lessons learned: The histogram of work minutes per hour allows the user to determine the work/rest regimen with a given level of confidence, considering the uncertainty associated with the readings and estimates. This tool not only assists the risk manager in decision making but also helps identify the best measures to decrease risk levels and ensure safe exposure. In this presentation, a real-time demonstration will be given.