Cui, Haiyang; Pietrzak, Julie; Stelling, Guus; Androsov, Alexey; Harig, Sven
The Indian Ocean Tsunami of December 26, 2004 was one of the largest tsunamis in recent times and led to widespread devastation and loss of life. One of the worst hit regions was Banda Aceh, the capital of the Aceh province, located in the northern part of Sumatra, 150 km from the source of the earthquake. A German-Indonesian Tsunami Early Warning System (GITEWS) (www.gitews.de) is currently under active development. The work presented here is carried out within the GITEWS framework. One of the aims of this project is the development of accurate models with which to simulate the propagation, flooding and drying, and run-up of a tsunami. In this context, TsunAWI has been developed by the Alfred Wegener Institute; it is an explicit finite element model. However, the accurate numerical simulation of flooding and drying requires the conservation of mass and momentum. This is not possible in the current version of TsunAWI. The P1NC – P1 element guarantees mass conservation in a global sense, yet as we show here it is important to guarantee mass conservation at the local level, that is, within each individual cell. Here an unstructured grid, finite volume ocean model is presented. It is derived from the P1NC – P1 element, and is shown to be mass and momentum conserving. A number of simulations are then presented, including dam break problems with flooding over both a wet and a dry bed. Excellent agreement is found. Finally, we present simulations for Banda Aceh and compare the results to on-site survey data, as well as to results from the original TsunAWI code.
Mori, J.; Mooney, W.D.; Afnimar; Kurniawan, S.; Anaya, A.I.; Widiyantoro, S.
A tsunami earthquake (Mw = 7.7) occurred south of Java on 17 July 2006. The event produced relatively low levels of high-frequency radiation, and local felt reports indicated only weak shaking in Java. There was no ground motion damage from the earthquake, but there was extensive damage and loss of life from the tsunami along 250 km of the southern coasts of West Java and Central Java. An inspection of the area a few days after the earthquake showed extensive damage to wooden and unreinforced masonry buildings that were located within several hundred meters of the coast. Since there was no tsunami warning system in place, efforts to escape the large waves depended on how people reacted to the earthquake shaking, which was only weakly felt in the coastal areas. This experience emphasizes the need for adequate tsunami warning systems for the Indian Ocean region.
Xie, Y.; Meng, L.
Extreme scenarios of M 7.5+ earthquakes on the Red Mountain and Pitas Point faults can potentially generate significant local tsunamis in southern California. The maximum water elevation could be as large as 10 m in the nearshore region of Oxnard and Santa Barbara. Recent development in seismic array processing enables rapid tsunami prediction and early warning based on the back-projection approach (BP). The idea is to estimate the rupture size by back-tracing the seismic body waves recorded by stations at local and regional distances. A simplified source model of uniform slip is constructed and used as an input for tsunami simulations that predict the tsunami wave height and arrival time. We demonstrate the feasibility of this approach in southern California by implementing it in a simulated real-time environment and applying to a hypothetical M 7.7 Dip-slip earthquake scenario on the Pitas Point fault. Synthetic seismograms are produced using the SCEC broadband platform based on the 3D SoCal community velocity model. We use S-wave instead of P-wave to avoid S-minus-P travel times shorter than rupture duration. Two clusters of strong-motion stations near Bakersfield and Palmdale are selected to determine the back-azimuth of the strongest high-frequency radiations (0.5-1 Hz). The back-azimuths of the two clusters are then intersected to locate the source positions. The rupture area is then approximated by enclosing these BP radiators with an ellipse or a polygon. Our preliminary results show that the extent of 1294 square kilometers rupture area and magnitude of 7.6 obtained by this approach is reasonably close to the 1849 square kilometers and 7.7 of the input model. The average slip of 7.3 m is then estimated according to the scaling relation between slip and rupture area, which is close to the actual average dislocation amount, 8.3 m. Finally, a tsunami simulation is conducted to assess the wave height and arrival time. The errors of -3 to +9 s in arrival time
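The slip estimate in the scenario above follows from the standard moment-magnitude and moment-area relations. A minimal sketch, assuming a crustal rigidity of 33 GPa (the abstract does not state the value used):

```python
import math

def average_slip(mw, area_km2, mu=33e9):
    """Estimate average fault slip (m) from moment magnitude and rupture area.

    Uses the standard definitions M0 = 10**(1.5*Mw + 9.1) [N·m] and
    M0 = mu * A * D, so D = M0 / (mu * A). The rigidity mu is an assumption.
    """
    m0 = 10 ** (1.5 * mw + 9.1)   # seismic moment in N·m
    area_m2 = area_km2 * 1e6      # km² → m²
    return m0 / (mu * area_m2)

# BP-derived estimate (numbers from the abstract): Mw 7.6, 1294 km²
slip_bp = average_slip(7.6, 1294.0)
print(round(slip_bp, 1))  # → 7.4
```

With this rigidity the result (≈7.4 m) is comparable to the 7.3 m reported; the exact value depends on the assumed rigidity and scaling relation.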
Rabinovich, Alexander B.; Borrero, Jose C.; Fritz, Hermann M.
With this volume of the Pure and Applied Geophysics (PAGEOPH) topical issue “Tsunamis in the Pacific Ocean: 2011-2012”, we are pleased to present 21 new papers discussing tsunami events occurring in this two-year span. Owing to the profound impact resulting from the unique crossover of a natural and nuclear disaster, research into the 11 March 2011 Tohoku, Japan earthquake and tsunami continues; here we present 12 papers related to this event. Three papers report on detailed field survey results and updated analyses of the wave dynamics based on these surveys. Two papers explore the effects of the Tohoku tsunami on the coast of Russia. Three papers discuss the tsunami source mechanism, and four papers deal with tsunami hydrodynamics in the far field or over the wider Pacific basin. In addition, a series of five papers presents studies of four new tsunami and earthquake events occurring over this time period. This includes tsunamis in El Salvador, the Philippines, Japan and the west coast of British Columbia, Canada. Finally, we present four new papers on tsunami science, including discussions on tsunami event duration, tsunami wave amplitude, tsunami energy and tsunami recurrence.
Vilibic, I.; Monserrat, S.; Amores, A.; Dadic, V.; Fine, I.; Horvath, K.; Ivankovic, D.; Marcos, M.; Mihanovic, H.; Pasquet, S.; Rabinovich, A. B.; Sepic, J.; Strelec Mahovic, N.; Whitmore, P.
Meteotsunamis, or meteorological tsunamis, are atmospherically induced ocean waves in the tsunami frequency band that are found to affect coasts in a destructive way in a number of places in the World Ocean, including the U.S. coastline. Boothbay Harbor, Maine, in October 2008 and Daytona Beach, Florida, in July 1992 were hit by waves several meters high appearing from “nowhere”, and a preliminary assessment pointed to the atmosphere as a possible source for the events. As a need for in-depth analyses and proper qualification of these and other events emerged, the National Oceanic and Atmospheric Administration (NOAA) decided to fund the research, currently carried out within the TMEWS project (Towards a MEteotsunami Warning System along the U.S. coastline). The project structure, planned research activities and first results will be presented here. The first objective of the project is the creation of a list of potential meteotsunami events, from catalogues, news and high-resolution sea level data, and their proper assessment with regard to the source, generation and dynamics. The assessment will be based on research into the various types of ocean (tide gauges, buoys), atmospheric (ground stations, buoys, vertical soundings, reanalyses) and remote sensing (satellite) data and products, supported by atmospheric and ocean modelling efforts. Based on the knowledge gained, the basis for a meteotsunami warning system, i.e. the observational systems and communication needs for early detection of a meteotsunami, will be defined. Finally, meteotsunami warning protocols, procedures and a decision matrix will be developed and tested on historical meteotsunami events. These deliverables are also expected to boost meteotsunami research in other parts of the World Ocean, and to contribute to the creation of efficient meteotsunami warning systems in different regions of interest, such as the Mediterranean Sea, western Japan, Western Australia and elsewhere.
Mattioli, Glen; Mencin, David; Hodgkinson, Kathleen; Meertens, Charles; Phillips, David; Blume, Fredrick; Berglund, Henry; Fox, Otina; Feaux, Karl
The NSF-funded GAGE Facility, managed by UNAVCO, operates approximately 1300 GNSS stations distributed across North and Central America and in the circum-Caribbean. Following community input starting in 2011 from several workshops and associated reports, UNAVCO has been exploring ways to increase the capability and utility of the geodetic resources under its management to improve our understanding in diverse areas of geophysics, including properties of seismic, volcanic, magmatic and tsunami deformation sources. Networks operated by UNAVCO for the NSF have the potential to profoundly transform our ability to rapidly characterize events, provide warning, and improve hazard mitigation and response. Specific applications currently under development include earthquake early warning, tsunami early warning, and tropospheric modeling with university, commercial, non-profit and government partners on national and international scales. In the case of tsunami early warning, for example, an RT-GNSS network can provide multiple inputs to an operational system, starting with rapid assessment of earthquake sources and associated deformation, which leads to the initial model of ocean forcing and tsunami generation. In addition, terrestrial GNSS can provide direct measurements of a tsunami through the associated traveling ionospheric disturbance from several hundreds of kilometers away as it approaches the shoreline, which can be used to refine tsunami inundation models. Any operational system like this has multiple communities that rely on a pan-Pacific real-time open data set. Other scientific and operational applications for high-rate GPS include glacier and ice sheet motions, tropospheric modeling, and better constraints on the dynamics of space weather. Combining existing data sets and user communities, for example seismic data and tide gauge observations, with GNSS and Met data products has proven complicated because of issues related to metadata
Airburst – In the simulations explored, energy from the airburst couples very weakly with the water, making the tsunami dangerous over a shorter distance than the blast for asteroid sizes up to the maximum expected size that will still airburst (approx. 250 MT). Future areas of investigation:
– Low entry angle airbursts create more cylindrical blasts and might couple more efficiently
– Bursts very close to the ground will increase coupling
– Inclusion of the thermosphere (>80 km altitude) may show some plume collapse effects over a large area, although with much less pressure
Ocean Impact – The asteroid creates a large cavity in the ocean. The cavity backfills, creating a central jet. Oscillation between the cavity and jet sends out a tsunami wave packet.
– For a deep ocean impact, the waves are deep water waves (phase speed = 2× group speed)
– If the tsunami propagation and inundation calculations are correct for the small ocean basins, the resulting tsunami is not a significant hazard unless particularly close to vulnerable communities.
Future work:
– Shallow ocean impact
– Effect of continental shelf and beach profiles
– Tsunami vs. blast damage radii for impacts close to populated areas
– Larger asteroids below the presumed threshold of global effects (Ø 200–800 m)
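The deep-water relation quoted above (phase speed = 2× group speed) follows from the deep-water dispersion relation ω = √(gk); a quick numerical check, using an illustrative 1 km wavelength:

```python
import math

g = 9.81                    # gravitational acceleration, m/s²
k = 2 * math.pi / 1000.0    # wavenumber for an illustrative 1 km wavelength

# Deep-water dispersion: omega = sqrt(g*k), so c_p = omega/k = sqrt(g/k)
phase_speed = math.sqrt(g / k)

# Group speed c_g = d(omega)/dk, via a centered finite difference
dk = 1e-9
group_speed = (math.sqrt(g * (k + dk)) - math.sqrt(g * (k - dk))) / (2 * dk)

print(round(phase_speed / group_speed, 3))  # → 2.0
```

The 2:1 ratio holds for any wavelength in the deep-water limit, which is why an impact-generated wave packet disperses as it propagates.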
Song, Y. Tony
Different from the conventional approach to tsunami warnings that relies on earthquake magnitude estimates, we have found that coastal GPS stations are able to detect continental slope displacements of faulting due to big earthquakes, and that the detected seafloor displacements are able to determine tsunami source energy and scales instantaneously. This method has successfully replicated several historical tsunamis caused by the 2004 Sumatra earthquake, the 2005 Nias earthquake, the 2010 Chilean earthquake, and the 2011 Tohoku-Oki earthquake, and has compared favorably with the conventional seismic solutions that usually take hours or days to obtain by inverting seismograms (references listed below). Because many coastal GPS stations are already in operation for measuring ground motions in real time as often as once every few seconds, this study suggests a practical way of identifying tsunamigenic earthquakes for early warnings and reducing false alarms.
References:
Song, Y. T., 2007: Detecting tsunami genesis and scales directly from coastal GPS stations, Geophys. Res. Lett., 34, L19602, doi:10.1029/2007GL031681.
Song, Y. T., L.-L. Fu, V. Zlotnicki, C. Ji, V. Hjorleifsdottir, C. K. Shum, and Y. Yi, 2008: The role of horizontal impulses of the faulting continental slope in generating the 26 December 2004 tsunami, Ocean Modelling, doi:10.1016/j.ocemod.2007.10.007.
Song, Y. T. and S. C. Han, 2011: Satellite observations defying the long-held tsunami genesis theory, in D. L. Tang (ed.), Remote Sensing of the Changing Oceans, doi:10.1007/978-3-642-16541-2, Springer-Verlag, Berlin Heidelberg.
Song, Y. T., I. Fukumori, C. K. Shum, and Y. Yi, 2012: Merging tsunamis of the 2011 Tohoku-Oki earthquake detected over the open ocean, Geophys. Res. Lett., doi:10.1029/2011GL050767 (Nature Highlights, March 8, 2012).
Oppenheimer, D.H.; Bittenbinder, A.N.; Bogaert, B.M.; Buland, R.P.; Dietz, L.D.; Hansen, R.A.; Malone, S.D.; McCreery, C.S.; Sokolowski, T.J.; Whitmore, P.M.; Weaver, C.S.
In 1997, the Federal Emergency Management Agency (FEMA), National Oceanic and Atmospheric Administration (NOAA), U.S. Geological Survey (USGS), and the five western States of Alaska, California, Hawaii, Oregon, and Washington joined in a partnership called the National Tsunami Hazard Mitigation Program (NTHMP) to enhance the quality and quantity of seismic data provided to the NOAA tsunami warning centers in Alaska and Hawaii. The NTHMP funded a seismic project that now provides the warning centers with real-time seismic data over dedicated communication links and the Internet from regional seismic networks monitoring earthquakes in the five western states, the U.S. National Seismic Network in Colorado, and from domestic and global seismic stations operated by other agencies. The goal of the project is to reduce the time needed to issue a tsunami warning by providing the warning centers with high-dynamic range, broadband waveforms in near real time. An additional goal is to reduce the likelihood of issuing false tsunami warnings by rapidly providing to the warning centers parametric information on earthquakes that could indicate their tsunamigenic potential, such as hypocenters, magnitudes, moment tensors, and shake distribution maps. New or upgraded field instrumentation was installed over a 5-year period at 53 seismic stations in the five western states. Data from these instruments have been integrated into the seismic network utilizing Earthworm software. This network has significantly reduced the time needed to respond to teleseismic and regional earthquakes. Notably, the West Coast/Alaska Tsunami Warning Center responded to the 28 February 2001 Mw 6.8 Nisqually earthquake beneath Olympia, Washington within 2 minutes, compared to an average response time of over 10 minutes for the previous 18 years. © Springer 2005.
DiLisi, Gregory A.; Rarick, Richard A.
In this paper we develop materials to address student interest in the Indian Ocean tsunami of December 2004. We discuss the physical characteristics of tsunamis and some of the specific data regarding the 2004 event. Finally, we create an easy-to-make tsunami tank to run simulations in the classroom. The simulations exhibit three dramatic signatures of tsunamis, namely, as a tsunami moves into shallow water its amplitude increases, its wavelength and speed decrease, and its leading edge becomes increasingly steep as if to “break” or “crash.” Using our tsunami tank, these realistic features were easy to observe in the classroom and evoked an enthusiastic response from our students.
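The shoaling behavior the tank demonstrates can be quantified with the long-wave speed c = √(gh) and Green's law, which says wave amplitude grows like h^(−1/4) as the depth h decreases. A minimal sketch with illustrative depths (not values from the article):

```python
import math

g = 9.81  # gravitational acceleration, m/s²

def long_wave_speed(depth_m):
    """Shallow-water (long-wave) speed c = sqrt(g*h)."""
    return math.sqrt(g * depth_m)

def shoaled_amplitude(a0, h0, h1):
    """Green's law: amplitude scales as depth**(-1/4) (ignores breaking, friction)."""
    return a0 * (h0 / h1) ** 0.25

# Open ocean (4000 m) vs. near shore (10 m), illustrative values
print(round(long_wave_speed(4000), 1))              # → 198.1 (m/s in the open ocean)
print(round(long_wave_speed(10), 1))                # → 9.9  (m/s near shore)
print(round(shoaled_amplitude(0.5, 4000, 10), 2))   # → 2.24 (a 0.5 m wave grows ~4.5x)
```

This matches the signatures listed above: the wave slows down dramatically in shallow water while its amplitude grows, which is also why the leading edge steepens.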
Gee, L.; Green, D.; McNamara, D.; Whitmore, P.; Weaver, J.; Huang, P.; Benz, H.
Following the catastrophic loss of life from the December 26, 2004, Sumatra-Andaman Islands earthquake and tsunami, the U.S. Government appropriated funds to improve monitoring along a major portion of vulnerable coastal regions in the Caribbean Sea, the Gulf of Mexico, and the Atlantic Ocean. Partners in this project include the United States Geological Survey (USGS), the National Oceanic and Atmospheric Administration (NOAA), the Puerto Rico Seismic Network (PRSN), the Seismic Research Unit of the University of the West Indies, and other collaborating institutions in the Caribbean region. As part of this effort, the USGS is coordinating with Caribbean host nations to design and deploy nine new broadband and strong-motion seismic stations. The instrumentation consists of an STS-2 seismometer, an Episensor accelerometer, and a Q330 high-resolution digitizer. Six stations are currently transmitting data to the USGS National Earthquake Information Center, where the data are redistributed to NOAA’s Tsunami Warning Centers, regional monitoring partners, and the IRIS Data Management Center. Operating stations include: Isla Barro Colorado, Panama; Gun Hill, Barbados; Grenville, Grenada; Guantanamo Bay, Cuba; Sabaneta Dam, Dominican Republic; and Tegucigalpa, Honduras. Three additional stations in Barbuda, Grand Turk, and Jamaica will be completed during the fall of 2007. These nine stations are affiliates of the Global Seismographic Network (GSN) and complement existing GSN stations as well as regional stations. The new seismic stations improve azimuthal coverage, increase network density, and provide on-scale recording throughout the region. Complementary to this network, NOAA has placed Deep-ocean Assessment and Reporting of Tsunamis (DART) stations at sites in regions with a history of generating destructive tsunamis. Recently, NOAA completed deployment of 7 DART stations off the coasts of Montauk Pt, NY; Charleston, SC; Miami, FL; San Juan, Puerto Rico; New
… information following testing of the associated NWS communications systems. The tests are planned annually, in March/April and again in September. Post-test feedback information will be requested from emergency… Collection; Comment Request; Feedback Survey for Annual Tsunami Warning Communications Tests AGENCY: National…
McLean, S. J.; Mungov, G.; Dunbar, P. K.; Price, D. J.; Mccullough, H.
The National Oceanic and Atmospheric Administration (NOAA) National Geophysical Data Center (NGDC) and the collocated World Data Service for Geophysics (WDS) provide long-term archive, data management, and access to national and global tsunami data. Archive responsibilities include the NOAA Global Historical Tsunami event and runup database and damage photos, as well as other related hazards data. Beginning in 2008, NGDC was given the responsibility of archiving, processing and distributing all tsunami and hazards-related water level data collected from NOAA observational networks in a coordinated and consistent manner. These data include the Deep-ocean Assessment and Reporting of Tsunamis (DART) data provided by the National Data Buoy Center (NDBC), coastal tide-gauge data from the National Ocean Service (NOS) network, and tide-gauge data from the regional networks of the two National Weather Service (NWS) Tsunami Warning Centers (TWCs). Taken together, this integrated archive supports the tsunami forecast, warning, research, mitigation and education efforts of NOAA and the Nation. Due to the variety of the water level data, the automatic ingest system was redesigned, along with upgrading the inventory, archive and delivery capabilities based on modern digital data archiving practices. The data processing system was also upgraded and redesigned, focusing on data quality assessment in an operational manner. This poster focuses on data availability, highlighting the automation of all steps of data ingest, archive, processing and distribution. Examples are given from recent events such as Hurricane Sandy in October 2012, the 6 February 2013 Solomon Islands tsunami, and the 13 June 2013 meteotsunami along the U.S. East Coast.
Lavrentyev, Mikhail; Romanenko, Alexey; Marchuk, Andrey
Today, a wide, well-developed system of deep ocean tsunami detectors operates over the Pacific. Direct measurements of tsunami-wave time series are available. However, tsunami-warning systems fail to predict basic parameters of tsunami waves on time; dozens of examples could be provided. In our view, the lack of computational power is the main reason for these failures. At the same time, modern computer technologies, such as GPUs (graphics processing units) and FPGAs (field programmable gate arrays), can dramatically improve data processing performance, which may enable timely tsunami-warning prediction. Thus, it is possible to address the challenge of real-time tsunami forecasting for selected geographic regions. We propose to use three new techniques in the existing tsunami warning systems to achieve real-time calculation of tsunami wave parameters. First, the measurement system (e.g., DART buoy locations) should be optimized, both in terms of wave arrival time and amplitude. The corresponding software application exists today and is ready for use. We consider the example of the coastline of Japan. Numerical tests show that optimal installation of only 4 DART buoys (accounting for the existing sea bed cable) will reduce the tsunami wave detection time to only 10 min after an underwater earthquake. Secondly, as shown by the authors of this paper, the use of GPU/FPGA technologies accelerates the execution of the MOST (method of splitting tsunami) code by a factor of 100. Therefore, tsunami wave propagation over an ocean area of 2000 × 2000 km (wave propagation simulation: time step 10 s, recording every 4th spatial point and every 4th time step) could be calculated in: 3 s with a 4′ mesh, 50 s with a 1′ mesh, and 5 min with a 0.5′ mesh. The algorithm to switch from the coarse mesh to the fine-grained one is also available. Finally, we propose a new algorithm for determining tsunami source parameters by real-time processing of the time series obtained at DART buoys. It is possible to approximate
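The mesh-refinement timings above reflect how explicit shallow-water solvers scale: halving the grid spacing quadruples the number of cells and, through the CFL condition, roughly doubles the number of time steps. A back-of-the-envelope sketch of this idealized scaling (the reported timings do not follow the ideal ratio exactly, since I/O, recording and hardware effects also matter):

```python
def relative_cost(coarse_arcmin, fine_arcmin):
    """Idealized cost ratio for an explicit 2-D solver under CFL scaling.

    Refining the spacing by a factor r multiplies the cell count by r**2
    and the number of time steps by r, so total work grows like r**3.
    """
    r = coarse_arcmin / fine_arcmin
    return r ** 3

print(relative_cost(4.0, 1.0))   # → 64.0: a 1′ run ideally costs ~64x a 4′ run
print(relative_cost(1.0, 0.5))   # → 8.0
```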
Vilibic, I.; Sepic, J.; Denamiel, C. L.; Mihanovic, H.; Muslim, S.; Tudor, M.; Ivankovic, D.; Jelavic, D.; Kovacevic, V.; Masce, T.; Dadic, V.; Gacic, M.; Horvath, K.; Monserrat, S.; Rabinovich, A.; Telisman-Prtenjak, M.
A number of destructive meteotsunamis – atmospherically driven long ocean waves in the tsunami frequency band – have occurred during the last decade throughout the world's oceans. Owing to the significant damage caused by these meteotsunamis, several scientific groups (occasionally in collaboration with public offices) have started developing meteotsunami warning systems. Creation of one such system was initiated in late 2015 within the MESSI (Meteotsunamis, destructive long ocean waves in the tsunami frequency band: from observations and simulations towards a warning system) project. The main goal of this project is to build a prototype of a meteotsunami warning system for the eastern Adriatic coast. The system will be based on real-time measurements, operational atmosphere and ocean modeling, and a real-time decision-making process. The envisioned MESSI meteotsunami warning system consists of three modules: (1) a synoptic warning module, which will use established correlations between forecasted synoptic fields and high-frequency sea level oscillations to provide qualitative meteotsunami forecasts up to a week in advance, (2) a probabilistic premodeling prediction module, which will use an operational WRF-ROMS-ADCIRC modeling system and compare the forecast with an atlas of presimulations to produce a probabilistic meteotsunami forecast up to three days in advance, and (3) a real-time module, based on real-time tracking of the properties of the air pressure disturbance (amplitude, speed, direction, period, ...) and their real-time comparison with the atlas of meteotsunami simulations. The system will be tested on recent meteotsunami events recorded in the MESSI area shortly after the operational meteotsunami network installation. Albeit complex, such a multilevel warning system has the potential to be adapted to most meteotsunami hot spots, simply by tuning the system parameters to the available atmospheric and ocean data.
Abdolali, Ali; Kirby, James T.
In the present paper, we aim to reduce the discrepancies between tsunami arrival times evaluated from tsunami models and real measurements by considering the role of ocean compressibility. We perform qualitative studies to reveal the phase speed reduction rate via a modified version of the Mild Slope Equation for Weakly Compressible fluid (MSEWC) proposed by Sammarco et al. (2013). The model is validated against a 3-D computational model. Physical properties of surface gravity waves are studied and compared with those for waves evaluated from an incompressible flow solver over realistic geometry for the 2011 Tohoku-oki event, revealing a reduction in phase speed. Plain Language Summary: Submarine earthquakes and submarine mass failures (SMFs) can generate long gravitational waves (or tsunamis) that propagate at the free surface. Tsunami waves can travel long distances and are known for their dramatic effects on coastal areas. Nowadays, numerical models are used to reconstruct tsunamigenic events for many scientific and socioeconomic purposes, i.e., tsunami early warning systems, inundation mapping, risk and hazard analysis, etc. A number of typically neglected parameters in these models cause discrepancies between model outputs and observations. Most tsunami models predict tsunami arrival times at distant stations slightly early in comparison to observations. In this study, we show how ocean compressibility affects the tsunami wave propagation speed. In this framework, an efficient two-dimensional model equation for the weakly compressible ocean has been developed, validated and tested for simplified and real cases against three-dimensional and incompressible solvers. Taking into account the effect of compressibility, the phase speed of surface gravity waves is reduced compared to that of an incompressible fluid. Then, we used the model for the case of the devastating Tohoku-Oki 2011 tsunami event, improving the model accuracy. This
Löwe, P.; Hammitzsch, M.; Babeyko, A.; Wächter, J.
The development of new Tsunami Early Warning Systems (TEWS) requires the modelling of the spatio-temporal spreading of tsunami waves, both recorded from past events and for hypothetical future cases. The model results are maintained in digital repositories for use in TEWS command and control units for situation assessment once a real tsunami occurs. Thus the simulation results must be absolutely trustworthy, in the sense that the quality of these datasets is assured. This is a prerequisite, as solid decision making during a crisis event and the dissemination of dependable warning messages to communities at risk will be based on them. This requires data format validity, but even more the integrity and information value of the content, which is a value-added product derived from raw tsunami model output. Quality checking of simulation result products can be done in multiple ways, yet the visual verification of both the temporal and spatial spreading characteristics of each simulation remains important. The eye of the human observer remains an unmatched tool for the detection of irregularities. This requires the availability of convenient, human-accessible mappings of each simulation. The improvement of tsunami models necessitates changes in many variables, including simulation end-parameters. Whenever new, improved iterations of the general models or underlying spatial data are evaluated, hundreds to thousands of tsunami model results must be generated for each model iteration, each one having distinct initial parameter settings. The use of a Compute Cluster Environment (CCE) of sufficient size allows the automated generation of all tsunami results within a model iteration in little time. This is a significant improvement over linear processing on dedicated desktop machines or servers, and allows for accelerated visual quality checking iterations, which in turn can feed back positively into the overall model improvement. An approach to set
Bilve, Augustine; Nogareda, Francisco; Joshua, Cynthia; Ross, Lester; Betcha, Christopher; Durski, Kara; Fleischl, Juliet; Nilles, Eric
On 6 February 2013, a magnitude 8.0 earthquake generated a tsunami that struck the Santa Cruz Islands, Solomon Islands, killing 10 people and displacing over 4700. A post-disaster assessment of the risk of epidemic disease transmission recommended the implementation of an early warning alert and response network (EWARN) to rapidly detect, assess and respond to potential outbreaks in the aftermath of the tsunami. Almost 40% of the Santa Cruz Islands’ population were displaced by the disaster and were living in cramped temporary camps with poor or absent sanitation facilities and insufficient access to clean water. There was no early warning disease surveillance system. By 25 February, an EWARN was operational in five health facilities that served 90% of the displaced population. Eight priority diseases or syndromes were reported weekly; unexpected health events were reported immediately. Between 25 February and 19 May, 1177 target disease or syndrome cases were reported. Seven alerts were investigated. No sustained transmission or epidemics were identified. Reporting compliance was 85%. The EWARN was then transitioned to the routine four-syndrome early warning disease surveillance system. It was necessary to conduct a detailed assessment to evaluate the risk and potential impact of serious infectious disease outbreaks, and to assess whether and how enhanced early warning disease surveillance should be implemented. Local capacities and available resources should be considered in planning EWARN implementation. An EWARN can be an opportunity to establish or strengthen early warning disease surveillance capabilities.
Furuya, Takashi; Koshimura, Shunichi; Hino, Ryota; Ohta, Yusaku; Inoue, Takuya
In recent years, real-time tsunami inundation forecasting has been developed with the advances of dense seismic monitoring, GPS Earth observation, offshore tsunami observation networks, and high-performance computing infrastructure (Koshimura et al., 2014). Several uncertainties are involved in tsunami inundation modeling, and the tsunami generation model is believed to be one of the greatest sources of uncertainty. An uncertain tsunami source model risks underestimating tsunami height, the extent of the inundation zone, and damage. Tsunami source inversion using observed seismic, geodetic and tsunami data is the most effective way to avoid underestimating a tsunami, but it takes time to acquire the observed data, and this limitation makes it difficult to complete real-time tsunami inundation forecasting within sufficient lead time. Rather than waiting for precise tsunami observations, from a disaster management point of view we aim to determine the worst tsunami source scenario, for use in real-time tsunami inundation forecasting and mapping, using the seismic information of the Earthquake Early Warning (EEW) that can be obtained immediately after an event is triggered. After an earthquake occurs, JMA's EEW estimates its magnitude and hypocenter. With the constraints of earthquake magnitude, hypocenter and scaling laws, we determine possible multiple tsunami source scenarios and search for the worst one by the superposition of pre-computed tsunami Green's functions, i.e., time series of tsunami height at offshore points corresponding to 2-dimensional Gaussian unit sources (e.g., Tsushima et al., 2014). The scenario analysis of our method consists of the following two steps. (1) Searching the worst-scenario range by calculating 90 scenarios with various strikes and fault positions. From the maximum tsunami height of the 90 scenarios, we determine a narrower strike range which causes high tsunami height in the area of concern. (2) Calculating 900 scenarios that have different strike, dip, length
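The superposition step described above relies on the linearity of the long-wave equations: the forecast waveform at an offshore point is a slip-weighted sum of precomputed unit-source responses. A minimal sketch, where the function names and the toy sine waveforms are illustrative placeholders (real Green's functions come from pre-run tsunami simulations):

```python
import math

def unit_source_response(source_idx, t):
    """Hypothetical unit-source Green's function G_i(t) at one offshore gauge.

    A placeholder sine wave stands in for a precomputed tsunami time series.
    """
    return math.sin(0.01 * t - source_idx)

def forecast_waveform(slips, times):
    """eta(t) = sum_i slip_i * G_i(t), by linearity of the long-wave equations."""
    return [sum(s * unit_source_response(i, t) for i, s in enumerate(slips))
            for t in times]

times = range(0, 600, 10)            # 10 s sampling over 10 minutes
eta = forecast_waveform([2.0, 1.5, 0.5], times)
peak = max(eta)                      # maximum tsunami height at this gauge
```

Because each candidate scenario only reweights the same precomputed responses, evaluating hundreds of scenarios (90, then 900, as in the abstract) is cheap enough for real-time use.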
Baptista, M. A.; Yalciner, A. C.; Canals, M.
Tsunamis are low frequency but high impact natural disasters. In 2004, the Boxing Day tsunami killed hundreds of thousands of people from many nations along the coastlines of the Indian Ocean. Tsunami run-up exceeded 35 m. Seven years later, and in spite of some of the best warning technologies and levels of preparedness in the world, the Tohoku-Oki tsunami in Japan dramatically showed the limitations of scientific knowledge on tsunami sources, coastal impacts and mitigation measures. The experience from Japan raised serious questions on how to improve the resilience of coastal communities, to upgrade the performance of coastal defenses, to adopt better risk management, and also on the strategies and priorities for the reconstruction of damaged coastal areas. Societal resilience requires the reinforcement of capabilities to manage and reduce risk at national and local scales. ASTARTE (Assessment STrategy And Risk for Tsunami in Europe), a 36-month FP7 project, aims to develop a comprehensive strategy to mitigate tsunami impact in this region. To achieve this goal, an interdisciplinary consortium has been assembled. It includes all CTWPs of NEAM and expert institutions across Europe and worldwide. ASTARTE will improve i) basic knowledge of tsunami generation and recurrence going beyond simple catalogues, with novel empirical data and new statistical analyses for assessing long-term recurrence and hazards of large events in sensitive areas of NEAM, ii) numerical techniques for tsunami simulation, with focus on real-time codes and novel statistical emulation approaches, and iii) methods for assessment of hazard, vulnerability, and risk. ASTARTE will also provide i) guidelines for tsunami Eurocodes, ii) better tools for forecast and warning for CTWPs and NTWCs, and iii) guidelines for decision makers to increase sustainability and resilience of coastal communities. In summary, ASTARTE will develop basic scientific and technical elements allowing for a significant