Abstract

Background

The aim of this study was to measure chest compression decay during simulated advanced life support (ALS) in a cardiac arrest manikin model.

Methods

Nineteen paramedic teams, each consisting of three paramedics, performed ALS for 12 minutes, with the same paramedic providing all chest compressions. The patient was a resuscitation manikin found in ventricular fibrillation (VF). The first shock terminated the VF, and the patient remained in pulseless electrical activity (PEA) throughout the scenario. Average chest compression depth and rate were measured each minute for 12 minutes, and providers were divided into three groups based on chest compression quality: good (compression depth ≥ 40 mm, compression rate 100-120/minute for each minute of CPR), bad (initial compression depth < 40 mm, initial compression rate < 100 or > 120/minute) or decay (a change from good to bad during the 12 minutes). Changes over time in the no-flow ratio (NFR, defined as the time without chest compressions divided by the total time of the ALS scenario) were also measured.
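The per-minute classification and the NFR defined above can be sketched as follows; this is a minimal illustration only, with thresholds taken from the text and all function and variable names being assumptions rather than anything used in the study:

```python
def depth_ok(depth_mm):
    """Guideline depth target from the text: >= 40 mm."""
    return depth_mm >= 40

def rate_ok(rate_per_min):
    """Guideline rate window from the text: 100-120/minute."""
    return 100 <= rate_per_min <= 120

def classify(ok_per_minute):
    """Classify one provider from a list of per-minute booleans
    (depth and rate are classified separately in the study):
    'good'  - target met in every minute of CPR,
    'bad'   - target already missed in the first minute,
    'decay' - target met initially but missed later."""
    if all(ok_per_minute):
        return "good"
    if not ok_per_minute[0]:
        return "bad"
    return "decay"

def no_flow_ratio(no_compression_time, total_time):
    """NFR = time without chest compressions / total scenario time."""
    return no_compression_time / total_time
```

For example, a depth series of 45 mm in minutes 1-6 that falls to 35 mm in minutes 7-12 would be classified as decay.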

Results

Based on compression depth, 5 (26%), 9 (47%) and 5 (26%) providers were good, bad and with decay, respectively. Only one paramedic experienced decay within the first two minutes. Based on compression rate, 6 (32%), 6 (32%) and 7 (37%) providers were good, bad and with decay, respectively. NFR was 22% in both the 1-3 and 4-6 minute periods, but decreased to 14% in the 7-9 minute period (P = 0.002) and to 10% in the 10-12 minute period (P < 0.001).

Conclusions

In this simulated cardiac arrest manikin study, only half of the providers achieved the guideline-recommended compression depth during prolonged ALS. Large inter-individual differences in chest compression quality were present from the initiation of CPR. Chest compression decay, and thereby fatigue, within the first two minutes was rare.

Keywords:

Advanced life support (ALS); cardiac arrest; cardiopulmonary resuscitation (CPR); fatigue; resuscitation; chest compression