Assessment of Results Produced by an Algorithm Delivering Earthquake Epi- & Hypocenters Concurrently and in Real-Time

  •  George R. Daglish    
  •  Yurii P. Sizov    


This paper suggests a means by which confidence limits may be placed around the epi- and hypocenter values that are evolved, concurrently and in real-time, by the present algorithm as an increasing number of P-wave first arrivals are detected.

As described in previous papers, this algorithm is table-driven in that it uses an “interpolative tabular scanning process” to deliver its results. For this purpose a set of three main tables is provided: a table of travel times for rays originating from a graduated set of depth points to a given set of colatitudes; a table of take-off angles corresponding to those travel times; and a set of calibrating tables to correct numerical error found in the table-generation process itself. These tables are generated by any of a set of point-to-point (P2P) ray tracers, parameterized by any of a set of radial Earth velocity models {PREM, iasp91, ak135}.
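The core lookup step of such a tabular scheme can be illustrated with a bilinear interpolation over a travel-time grid indexed by source depth and colatitude. This is a minimal sketch under assumed names and a toy grid; the paper's actual tables, graduations, and interpolation order may differ.

```python
import numpy as np

def interp_travel_time(tt_table, depths, colats, depth, colat):
    """Bilinearly interpolate a travel-time table T(depth, colatitude).

    tt_table[i, j] holds the travel time (s) for a ray from depths[i]
    (km) to colats[j] (degrees); depths and colats are sorted 1-D grids.
    (Illustrative only: grid layout and units are assumptions.)
    """
    # locate the grid cell containing (depth, colat), clipped to the table
    i = np.clip(np.searchsorted(depths, depth) - 1, 0, len(depths) - 2)
    j = np.clip(np.searchsorted(colats, colat) - 1, 0, len(colats) - 2)
    # fractional position inside that cell
    u = (depth - depths[i]) / (depths[i + 1] - depths[i])
    v = (colat - colats[j]) / (colats[j + 1] - colats[j])
    # weighted average of the four surrounding table entries
    return ((1 - u) * (1 - v) * tt_table[i, j]
            + u * (1 - v) * tt_table[i + 1, j]
            + (1 - u) * v * tt_table[i, j + 1]
            + u * v * tt_table[i + 1, j + 1])
```

The same cell indices (i, j) would also address the companion take-off-angle table, which is how the two tables stay in correspondence.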

The production of the confidence limits is predicated on treating each value (i.e. a discovered epi- and hypocenter) as stationary and subject to perturbation by error.

In brief, the error is ultimately considered to be normally distributed. Normal theory is therefore applied, as in Chauvenet’s criterion, to screen the production of individual localizations for outlying data inputs. Subsequently it is used to monitor for outliers in the set of localizations themselves.
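Chauvenet’s criterion flags a datum as an outlier when the expected number of equally extreme normal deviates in a sample of size n, i.e. n · P(|Z| ≥ z), falls below 1/2. A minimal sketch (function name and threshold convention as commonly stated, not taken from the paper):

```python
import math
from statistics import mean, stdev

def chauvenet_outliers(values):
    """Return the values rejected by Chauvenet's criterion.

    A value x is rejected when n * P(|Z| >= z) < 0.5, where
    z = |x - mean| / stdev and Z is standard normal.
    """
    n = len(values)
    m, s = mean(values), stdev(values)
    rejected = []
    for x in values:
        z = abs(x - m) / s
        # two-sided normal tail probability via the complementary error function
        p_tail = math.erfc(z / math.sqrt(2.0))
        if n * p_tail < 0.5:
            rejected.append(x)
    return rejected
```

In the present context the screened values would be, for example, P-wave onset residuals feeding an individual localization, with the test re-run as each new onset arrives.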

Use is then made of the t-variate (“Student’s t”) to dynamically establish confidence limits, at varying levels of significance, about regressions on the set of localizations as this set grows in real-time.
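For the simplest case of a single localization coordinate, a t-based interval about the running mean can be sketched as follows. This is illustrative only: the paper applies t-limits to regressions on the localization set, and here the 95% critical values are hardcoded for small degrees of freedom with the normal quantile as a large-sample fallback.

```python
from math import sqrt
from statistics import mean, stdev

# two-sided 95% critical values of Student's t, by degrees of freedom
T95 = {1: 12.706, 2: 4.303, 3: 3.182, 4: 2.776, 5: 2.571,
       6: 2.447, 7: 2.365, 8: 2.306, 9: 2.262, 10: 2.228}

def ci95(samples):
    """95% confidence interval for the mean of the samples seen so far.

    Returns (low, high); the interval tightens as successive
    localizations accumulate in real time.
    """
    n = len(samples)
    m = mean(samples)
    se = stdev(samples) / sqrt(n)      # standard error of the mean
    t = T95.get(n - 1, 1.960)          # normal limit for larger n
    return m - t * se, m + t * se
```

Because the t critical value shrinks as the degrees of freedom grow, the interval both narrows with more data and correctly widens the uncertainty when only a few localizations are available.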

The upshot is that the algorithm can produce successive localizations of the earthquake as data input (P-wave onset times) arrives, while at the same time monitoring the accuracy and integrity of the solutions.

This work is licensed under a Creative Commons Attribution 4.0 License.