
Editorial

Big data and the electrocardiogram

It is my guess that almost every reader of this journal is generally familiar with the remarkable accomplishments that have come from the interaction of the disciplines of mathematics, computer science, statistics, and signal engineering, an interaction that has become known as “big data”. The torrent of data flowing from computers, phones, cameras, and other devices is beyond comprehension, and the pace of the flow is still increasing. From efforts to make sense of this mountain of data have come insights and techniques that are revolutionizing activities from homeland security to medical care. Almost every aspect of our lives has been or will be changed as a result.

It is also my guess that relatively few of this journal’s readers have spent much time thinking about the potential impact of this new hybrid science on electrocardiography. One fact is clear: computers and signal analysis are far better at detecting and sorting electrical signals than the human eye–brain combination, and WILL become the gold standard for detecting and interpreting variations (i.e., “abnormalities”) in the electrocardiogram (ECG). Our challenge as scientists, students, and leaders is to understand this new discipline, to form working alliances with those who understand its language and its approach to data problems, and, together, to start the next phase of our understanding and use of its potential.

The ECG is a fairly straightforward continuous waveform, usually recorded in the context of diagnosis and treatment of an illness or a set of symptoms experienced by a patient. Most ECGs are now recorded by a commercial instrument which analyzes the waveforms using a proprietary analytic program based on the experience of one or more experts. Most of these systems do a good job of analyzing the waveforms and producing a report which agrees with expert opinion.

Isn’t this enough “big data”? No! This is merely the translation of the heuristic wisdom of a few experts into an automated action by the computer within the recording system. Each proprietary system (about nine, worldwide) uses its own standards, usually derived from the opinions of its experts. It should be noted that big data experts have examined the digital information “discarded” after extraction of the usual measurements and generation of a clinical report, and have extracted additional significant information, mainly about future risk, using information that no clinician had heretofore recognized [1,2].


We badly need the big data experts! The need is not that of improving the precision of detection of a currently recognized ECG abnormality, but that of discovering associations not previously recognized and determining the meaning of each for the future lives of patients. Such an association might mark a newly recognized risk, or a newly recognized protected state.

The potential of this new approach would be best demonstrated by applying such techniques to the large population databases which have been established and are now maintained at many sites, as proposed in a recent editorial [3]. These data are recorded and maintained with rigor and high standards, and are available in digital form, minimizing the need for data pre-filtering. The populations are usually well characterized and carefully followed, often for decades, and overall and cardiovascular mortality are usually tracked. These characteristics offer multiple parameters which can add dimensions to the multidimensional studies characteristic of this new field.

Most existing population studies are designed to include coding systems, such as the Minnesota ECG Code, for easy identification of certain subgroups of subjects. The absence of such precoding would ordinarily make such data difficult to use, but newer techniques permit the computer to learn to recognize, and then to search for and identify, examples of new ECG “syndromes” in a population. Thus an ECG database collected for one purpose can be used for an entirely new purpose, without the requirement for human intervention (a simplified sketch of this learn-then-search workflow appears below). Our paper in this issue is a step in this direction: signal science techniques are used to instruct the computer how to identify a clinical ECG feature, the J wave, and then to use this “knowledge” to detect the feature in a previously unselected population [4]. Utilization of a database collected over a period of thirty years could compress the time needed to evaluate the risk of a newly recognized ECG syndrome from decades to a few months. Of even more importance is the possibility of discovering associations which are totally unsuspected, and which cause us to think in a different direction.
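
To make the learn-then-search idea concrete, the sketch below fits an off-the-shelf classifier to expert-labeled beats and then applies it to an unlabeled archive. This is deliberately not the method of reference [4]; the arrays, their dimensions, and the random stand-in data are hypothetical placeholders for a real annotated ECG database.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Hypothetical stand-ins: in practice these would be digitized 12-lead
    # ECG segments, with expert annotations on the small labeled set only.
    labeled_beats = rng.normal(size=(200, 300))   # 200 beats x 300 samples
    labels = rng.integers(0, 2, size=200)         # 1 = feature present, 0 = absent
    archive = rng.normal(size=(10_000, 300))      # unlabeled population archive

    # "Instruct the computer": fit a detector to the expert-labeled examples.
    detector = LogisticRegression(max_iter=1000)
    detector.fit(labeled_beats, labels)

    # Search the entire archive for the learned feature, no human in the loop.
    flags = detector.predict(archive)
    print(f"records flagged: {flags.sum()} of {len(archive)}")

The division of labor is the point: experts label a modest training set once, and the computer then screens an arbitrarily large population, which is what makes repurposing decades-old databases feasible.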

Consider the very likely possibility of testing all existing automated ECG interpretation systems on the same population of 10,000 subjects and comparing their ability to recognize each diagnosis on the list (a simple chance-corrected agreement statistic, sketched below, is one natural yardstick). Consider the development of a validated set of cardiovascular risks based on data from this same population. Consider the addition of new risks derived from combinations of known factors. Consider the possibility of recognizing new associations, leading to new understanding of the physiology underlying recognized ECG abnormalities. Consider the fact that big data techniques could relate ECG data to that generated by other imaging modalities, perhaps leading to new insights and interventions.
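
For the head-to-head comparison just mentioned, pairwise agreement between two systems’ diagnosis calls can be summarized with Cohen’s kappa. Everything here is hypothetical: the two systems and their binary calls are random stand-ins.

    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(1)

    # Hypothetical calls (0 = diagnosis absent, 1 = present) from two
    # automated interpretation systems on the same 10,000 subjects.
    system_a = rng.integers(0, 2, size=10_000)
    system_b = rng.integers(0, 2, size=10_000)

    # Raw agreement overstates concordance; kappa corrects for chance.
    raw_agreement = np.mean(system_a == system_b)
    kappa = cohen_kappa_score(system_a, system_b)
    print(f"raw agreement: {raw_agreement:.3f}, Cohen's kappa: {kappa:.3f}")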

“Big data” is the entity that has made possible the precision and utility of most of these imaging modalities (echo, MRI, etc.). Isn’t it time to apply its insights and skills to our old friend, the ECG?

E. Harvey Estes, MD
Duke University Medical Center, Community and Family Medicine
3542 Hamstead Court, Durham, NC, United States
E-mail address: [email protected]

References

[1] Syed ZH. Computational methods for physiological data. PhD dissertation, MIT Department of Electrical Engineering and Computer Science (advisor: John V. Guttag); 2009. http://hdl.handle.net/1721.1/54671
[2] Syed Z, Stultz CM, Scirica BM, Guttag JV. Computationally generated cardiac biomarkers for risk stratification after acute coronary syndrome. Sci Transl Med 2011;3:102ra95.
[3] Estes EH. A new gold rush? The future in ECG research. J Electrocardiol 2014;47:593–4.
[4] Wang Y, Wu H, Daubechies I, Yahing L, Estes EH, Soliman EZ. Automated J wave detection from digital 12-lead electrocardiogram. J Electrocardiol 2015;48:21–8 (in this issue).
