Implantable cardioverter-defibrillator (ICD) therapy is clearly an effective therapy for selected patients in definable populations. The benefits and risks of ICD therapy are directly affected by programming and surgical decisions. This flexibility is both a great strength and a weakness, for which there has been no prior official discussion or guidance. It is the consensus of the 4 continental electrophysiology societies that there are 4 important clinical issues for which sufficient ICD clinical and trial data exist to provide evidence-based expert guidance. This document systematically describes the consensus achieved for each recommendation by official balloting (greater than 80% required; range 83%–100%, mean 96%) in regard to the programming of (1) bradycardia mode and rate, (2) tachycardia detection, (3) tachycardia therapy, and (4) the intraprocedural testing of defibrillation efficacy. Representatives nominated by the Heart Rhythm Society (HRS), European Heart Rhythm Association (EHRA), Asia Pacific Heart Rhythm Society (APHRS), and the Sociedad Latinoamericana de Estimulacion Cardiaca y Electrofisiologia (SOLAECE-Latin American Society of Cardiac Pacing and Electrophysiology) participated in the project definition, the literature review, the recommendation development, the writing of the document, and its approval. The 32 recommendations were balloted by the 35 writing committee members and were approved by an average of 96%. The classification of the recommendations and the level of evidence follow the recently updated ACC/AHA standard [1], [2]. Class I is a strong recommendation, denoting a benefit that greatly exceeds risk. Class IIa is a somewhat weaker recommendation, with a benefit that probably exceeds risk, and Class IIb denotes a benefit equivalent to or possibly exceeding risk. Class III is a recommendation against a specific treatment because either there is no net benefit or there is net harm. 
Level of Evidence A denotes the highest level of evidence from more than 1 high-quality randomized clinical trial (RCT), a meta-analysis of high-quality RCTs, or RCTs corroborated by high-quality registry studies. Level of Evidence B indicates moderate-quality evidence from either RCTs with a meta-analysis (B-R) or well-executed nonrandomized trials with a meta-analysis (B-NR). Level of Evidence C indicates randomized or nonrandomized observational or registry studies with limited data (C-LD) or expert opinion (C-EO) based on clinical experience in the absence of credible published evidence. These recommendations were also subject to a 1-month public comment period. Each society then officially reviewed, commented on, edited, and endorsed the final document and recommendations. All author and peer reviewer disclosure information is provided in Appendix A. The care of individual patients must be provided in the context of their specific clinical condition and the data available on that patient. Although the recommendations in this document provide guidance for a strategic approach to ICD programming, as an individual patient's condition changes or progresses and additional clinical considerations become apparent, the programming of their ICDs must reflect those changes. Remote and in-person interrogations of the ICD and clinical monitoring must continue to inform the programming choices made for each patient. The recommendations in this document specifically target adult patients and might not be applicable to pediatric patients, particularly when programming rate criteria. Please consider that each ICD has specific programmable options that might not be specifically addressed by the 32 distinct recommendations in this document. Appendix B, published online (http://www.hrsonline.org/appendix-b), contains the writing committee's translations specific to each manufacturer and is intended to best approximate the recommended behaviors for each available ICD model. 
Because the ICD is primarily indicated for tachycardia therapy, there might be some uncertainty regarding optimal bradycardia management for ICD patients. Data from clinical studies adequately address only the programmed mode rather than the number of leads implanted, the number of chambers stimulated, or how frequently the patients required bradycardia support. It is of note that most information on pacing modes has been collected from pacemaker patients, and these patients are clinically distinct from ICD recipients. Dual-chamber pacing (atrial and ventricular) has been compared with single-chamber pacing (atrial or ventricular) in patients with bradycardia in 5 multicenter, parallel, randomized trials, in 1 meta-analysis of randomized trials, and in 1 systematic review that also included 30 randomized crossover comparisons and 4 economic analyses [3]-[9]. Meta-analyses comparing dual-chamber to single-chamber ICDs did not evaluate pacing modes [10], [11]. Compared with single-chamber pacing, dual-chamber pacing results in small but potentially significant benefits in patients with sinus node disease and/or atrioventricular block. No difference in mortality has been observed between ventricular pacing modes and dual-chamber pacing modes. Dual-chamber pacing was associated with a lower rate of atrial fibrillation (AF) and stroke [12]. The benefit in terms of AF prevention was more marked in trials comprising patients with sinus node disease. Although trends in favor of dual-chamber pacing have been observed in some trials, there was no benefit in terms of heart failure (HF). In patients without symptomatic bradycardia, however, the Dual Chamber and VVI Implantable Defibrillator (DAVID) trial in ICD recipients showed that one specific choice of dual-chamber rate-responsive (DDDR) programming parameters led to poorer outcomes than VVI backup pacing, most likely secondary to unnecessary right ventricular (RV) pacing. 
The fact that RV stimulation was responsible was reinforced in the DAVID II trial, in which AAI pacing was demonstrated to be noninferior to VVI backup pacing [13]. Approximately a quarter of patients with either sinus node disease or atrioventricular block develop “pacemaker syndrome” with VVI pacing, usually associated with retrograde (ventricular to atrial) conduction, which in turn is associated with a reduction in the quality of life [14]. In crossover trials, symptoms of pacemaker syndrome (dyspnea, dizziness, palpitations, pulsations, and chest pain) were reduced by reprogramming to a dual-chamber mode [14]. Dual-chamber pacing is associated with better exercise performance compared with single-chamber VVI pacing without rate adaptation, but produces similar exercise performance when compared with rate-responsive VVIR pacing. Because of the additional lead, dual-chamber devices involve longer implantation times, have a higher risk of complications, and are more expensive. However, because of the additional clinical consequences of pacemaker syndrome and AF (and its sequelae), the overall cost difference between single- and dual-chamber pacing systems is moderated. In patients with persistent sinus bradycardia, atrial-based rather than ventricular pacing is the pacing mode of choice. There is evidence for the superiority of atrial-based pacing over ventricular pacing for patients who require pacing for a significant proportion of the day. The evidence is stronger for patients with sinus node disease, in whom dual-chamber pacing confers a modest reduction in AF and stroke, but not in hospitalization for HF or death, compared with ventricular pacing. In patients with acquired atrioventricular block, large randomized parallel trials were unable to demonstrate the superiority of dual-chamber pacing over ventricular pacing with regard to the hard clinical endpoints of mortality and morbidity [4], [6]-[8]. 
The benefit of dual-chamber over ventricular pacing is primarily due to the avoidance of pacemaker syndrome and to improved exercise capacity [14]. Although it is a softer endpoint, pacemaker syndrome is associated with a reduction in quality of life that justifies the preference for dual-chamber pacing when reasonable; thus, there is strong evidence for the superiority of dual-chamber pacing over ventricular pacing that is limited to symptom improvement. Conversely, there is strong evidence of nonsuperiority with regard to survival and morbidity. The net result is that the indications for programming the dual-chamber modes are weaker, and the choice regarding the pacing mode should be individualized, taking into consideration the increased complication risk and costs of dual-chamber devices. Because ICD patients usually do not require bradycardia support, with the exception of patients who require cardiac resynchronization, programming choices should avoid pacing and, in particular, avoid single-chamber ventricular pacing, if possible [15], [16]. The benefit of rate-response programming has been evaluated in patients with bradycardia in 5 multicenter, randomized trials and in 1 systematic review that also included 7 single-center studies [17]-[22]. Most of these data were obtained from pacemaker studies and must be interpreted in that light. Although there is evidence of the superiority of VVIR pacing compared with VVI pacing in improving quality of life and exercise capacity, improvements in exercise capacity with DDDR compared with DDD have been inconsistent. In 2 small studies of patients with chronotropic incompetence comparing DDD and DDDR pacing, the latter improved quality of life and exercise capacity; however, a larger, multicenter randomized trial (Advanced Elements of Pacing Randomized Controlled Trial [ADEPT]) failed to show a difference in patients with a modestly blunted heart rate response to exercise [17]-[19]. 
In addition, DDDR programming in cardiac resynchronization therapy (CRT) patients has the potential to impair AV synchrony and timing. It should be noted that trials evaluating CRT generally did not use rate-responsive pacing, and many in fact avoided atrial stimulation by using atrial-sensed, ventricular-paced pacing modes with a lower base rate. However, the Pacing Evaluation-Atrial Support Study in Cardiac Resynchronization Therapy (PEGASUS CRT) trial was the exception and did not demonstrate an adverse impact on mortality and HF events [23]. In patients with persistent or intermittent sinus node dysfunction or chronotropic incompetence, the first choice is DDDR with algorithms responding to intermittent atrioventricular conduction. There is sufficient evidence for the superiority of VVIR compared with VVI in improving quality of life and exercise capacity. The evidence is much weaker in dual-chamber pacing (DDDR vs DDD). Although only an issue when there is some concomitant AV block, the upper rate limit should be programmed higher than the fastest spontaneous sinus rate to avoid upper rate limit behavior. To avoid symptomatic bradycardia, the lower rate should be programmed on an individual basis, according to the clinical characteristics and the underlying cardiac substrate of the patient. Patients with permanent AF and either spontaneous or AV junctional ablation-induced high-degree atrioventricular block have little to no chronotropic response to exercise; thus, VVIR pacing is associated with better exercise performance, improved daily activities, improved quality of life, and decreased symptoms of shortness of breath, chest pain, and heart palpitations, compared with VVI [20]-[22], [24]-[26]. Therefore, rate-adaptive pacing is the first choice of pacing mode; fixed-rate VVI pacing should be abandoned in patients with permanent AF and atrioventricular block. It is the experts’ opinion that the minimum rate can be programmed higher (e.g., 70 bpm) than for sinus rhythm patients, in an attempt to compensate for the loss of active atrial filling. In addition, the maximum sensor rate should be programmed restrictively (e.g., 110–120 bpm) to avoid “overpacing” (i.e., pacing at a heart rate faster than necessary), which can be symptomatic, particularly in patients with coronary artery disease. In a small study, however, it was found that rate-responsive pacing could be safe and effective in patients with angina pectoris, without an increase in subjective or objective signs of ischemia [25]. The lower rate should be programmed on an individual basis, according to the clinical characteristics and the underlying cardiac substrate of the patient. The clinical benefit of programming a lower resting rate at night based on internal clocks has not been evaluated in ICD patients. There is some concern that atrioventricular junction ablation and permanent ventricular pacing might predispose the patient to an increased risk of sudden cardiac death (SCD) related to a bradycardia-dependent prolongation of the QT interval. This risk might be overcome by setting the ventricular pacing rate to a minimum of 80 or 90 bpm for the first 1–2 months following the atrioventricular junction ablation, then reducing it to a conventional 60–70 bpm [27], [28]. Not all patients with AF and milder forms of atrioventricular block will require a high percentage of ventricular pacing or have a wide QRS. Physicians should consider the risk of worsening preexisting left ventricular (LV) dysfunction with RV pacing vs improved chronotropic responsiveness and the potential value of CRT. The results of a number of large-scale, prospective randomized trials demonstrated a significant reduction in AF in pacemaker patients with atrial-based pacing (AAI or DDD) compared with patients with ventricular-based pacing [4], [8], [29]. 
In the Mode Selection Trial, which enrolled 2010 patients with sick sinus syndrome, the risk of AF increased linearly with the increasing percentage of RV pacing [30]. At the same time, deleterious effects of RV pacing in patients with LV dysfunction (left ventricular ejection fraction [LVEF] ≤40%) implanted with dual-chamber ICD systems were observed in the Dual Chamber and VVI Implantable Defibrillator (DAVID) trial, which included 506 ICD patients without indications for bradycardia pacing. Patients within the DDDR-70 group (with paced and sensed atrioventricular delays of 170 and 150 ms, respectively, in most of the DDDR group patients) showed a trend toward higher mortality and an increased incidence of HF compared with the patients programmed to ventricular backup pacing (the VVI-40 group). Within the DDDR-70 group, there were more cardiac events when the percentage of ventricular pacing exceeded 40% (P=.09) [31], [32]. However, a more detailed post hoc analysis of the Inhibition of Unnecessary RV Pacing With Atrial-Ventricular Search Hysteresis in ICDs (INTRINSIC RV) trial revealed that the most favorable clinical results were not in the VVI groups with the least percentage of RV pacing but in the subgroup that had DDD pacing with longer atrioventricular delays and 11%–19% ventricular pacing. This parameter selection probably helped patients avoid exceedingly low heart rates while preserving intrinsic atrioventricular conduction most of the time [31], [33]. In the Second Multicenter Automated Defibrillator Implantation Trial (MADIT II), a higher risk of HF was observed in patients who had a greater than 50% burden of RV pacing [34]. In another large observational study of 456 ICD patients without HF at baseline, a high RV pacing burden (RV pacing more than 50% of the time) was associated with an increased risk of HF events and appropriate ICD shocks [35]. 
Optimally, RV stimulation should be avoided, but the precise tradeoff between the percentage of ventricular pacing and atrioventricular timing is unclear in non-CRT patients. The importance of reducing or avoiding RV pacing in ICD patients with LV dysfunction was illustrated in the DAVID trial [31]. The feasibility of algorithms designed to decrease the burden of unnecessary ventricular pacing has been demonstrated in patients with dual-chamber pacemakers [36]-[38]. These algorithms usually provide functional AAI pacing with monitoring of atrioventricular conduction and an automatic mode switch from AAI to DDD during episodes of atrioventricular block. Some studies directly compared various algorithms to decrease ventricular pacing, showing that a “managed ventricular pacing” (MVP) algorithm resulted in a greater reduction in ventricular pacing than an “atrioventricular search” algorithm [39], [40]; however, no randomized studies comparing these two algorithms with respect to important cardiovascular endpoints (e.g., HF, cardiac death) have been performed. The results of the studies on these pacing algorithms are summarized in Table 1. Unnecessary RV pacing should be minimized by using specific algorithms or programming longer atrioventricular delays, and this is more important for patients at higher risk of AF or who already have poorer LV function [49]. Patients with longer baseline PR intervals have a higher risk of AF regardless of the percentage of ventricular pacing or the length of the programmed atrioventricular interval [50]. Use of the AAIR pacing mode with exceedingly long atrioventricular conduction times can lead to “AAIR pacemaker syndrome” and actually increases the risk of AF compared with the DDDR mode, as was shown in the Danish Multicenter Randomized Trial on Single Lead Atrial versus Dual-Chamber Pacing in Sick Sinus Syndrome (DANPACE) [3], [51]. 
Therefore, excessively long atrioventricular delays resulting in nonphysiologic atrioventricular contraction patterns should be avoided. The potential harm of atrial pacing with a prolonged atrioventricular delay was also demonstrated in the MVP trial, in which dual-chamber pacing with the MVP algorithm was not superior to ventricular backup pacing (VVI 40 bpm) with respect to HF events. After a follow-up of 2.4 years, there was an apparent increase in HF events that was limited primarily to patients with a baseline PR interval of >230 ms (mean PR of 255–260 ms) [42]. Long atrioventricular intervals also predispose the patient to repetitive atrioventricular reentrant rhythms, “repetitive nonreentrant VA synchrony,” or “atrioventricular desynchronization arrhythmia,” which manifest as mode switching but can also cause sustained episodes with poor hemodynamics [52]. Thus, based on the available data, it appears that atrial pacing with excessively long atrioventricular delays should be avoided. Algorithms that minimize ventricular pacing sometimes lead to inadvertent bradycardia or spontaneous premature beat-related short-long-short RR interval sequences with proarrhythmic potential [53]-[55]. However, in a study retrospectively analyzing the onset of ventricular tachycardia (VT) in ICD patients, the MVP mode was less frequently associated with the onset of VT compared with the DDD and VVI modes [54]. Atrioventricular decoupling (greater than 40% of atrioventricular intervals exceeding 300 ms) was observed in 14% of the ICD patients in the Marquis ICD MVP study, which might have a negative effect on ventricular filling [56]. In ICD patients with structural heart disease, spontaneous atrioventricular conduction can become prolonged instead of shortening with increased atrial paced heart rates [33]. This outcome frequently leads to a higher percentage of ventricular paced complexes. 
In view of the results of the ADEPT trial, which failed to demonstrate the clinical superiority of combined rate modulation and DDD pacing, the need for and aggressiveness of sensor-driven rate responses should be individualized or eliminated [19]. Rate-dependent shortening of the atrioventricular delay could have the same effect and should usually be avoided. Patients with hypertrophic cardiomyopathy represent a small but intricate subset of the ICD population for whom pacing has not been demonstrated to be a consistently effective treatment for outflow tract obstruction. However, according to the 2011 ACCF/AHA Hypertrophic Cardiomyopathy Guideline, dual-chamber ICDs are reasonable for patients with resting LV outflow tract gradients greater than 50 mm Hg who have indications for ICD implantation to reduce mortality [57]. In these patients, atrioventricular delays should be individually programmed to be short enough to achieve RV preexcitation and decrease the LV outflow tract gradient, but not so short as to impair LV filling; the appropriate value usually lies in the range of 60–150 ms [58], [59]. There are few studies of pacing modes in these patients, and they are limited by small numbers and the failure to quantify important cardiac outcomes. In conclusion, atrioventricular interval programming and the choice between DDDR and MVP or other atrioventricular interval management modes should be made on an individual basis. The goal is to minimize the percentage of RV pacing and to avoid atrial-based pacing with atrioventricular intervals exceeding 250–300 ms, which leads to atrioventricular uncoupling. In patients with prolonged PR intervals and impaired LV function, biventricular pacing can be considered. CRT in combination with a defibrillator device (CRT-D) improves survival and cardiac function in patients with LV systolic dysfunction, prolonged QRS duration, and mild-to-severe HF [60]-[62]. 
The beneficial effect of CRT-D compared with ICD alone is likely derived from biventricular pacing, with a decrease in dyssynchrony and an improvement in cardiac function. The percentage of biventricular capture can be negatively influenced by a number of factors, including atrial tachyarrhythmias, premature ventricular complexes, and programming of atrioventricular delays that give way to the patient's intrinsic conduction, all of which reduce the percentage of biventricular pacing. Some large observational studies have investigated the optimal biventricular pacing percentage and found a higher percentage to be associated with more pronounced CRT benefits. An optimal CRT benefit was observed with a biventricular pacing percentage as close to 100% as possible [63]-[66]. In the analysis of the left bundle branch block population in the MADIT-CRT trial, patients with less than 90% biventricular pacing had rates of HF and death similar to those of patients randomized to no CRT. By contrast, biventricular pacing exceeding 90% was associated with a benefit of CRT-D in terms of HF or death when compared with ICD patients without CRT. Biventricular pacing of 97% and greater was associated with a further reduction in HF or death and a significant reduction in death alone. Consistently, every 1% increase in biventricular pacing percentage was associated with a 6% risk reduction in HF or death, a 10% risk reduction in death alone, and an increase in LV reverse remodeling [67]. Therefore, in ICD patients with biventricular pacing, it can be beneficial to adjust the therapy to produce the highest achievable percentage of ventricular pacing, preferably above 98%, to improve survival and reduce HF hospitalization. Approaches to increasing the percentage of biventricular pacing include programming shorter but hemodynamically appropriate atrioventricular delays and minimizing atrial and ventricular ectopic activity and tachyarrhythmias. 
Optimizing the location of ventricular pacing sites and the timing of the pacing pulses can significantly improve cardiac hemodynamics in CRT patients. Echocardiographic optimization of atrioventricular delays in CRT patients can alleviate HF symptoms and increase exercise capacity compared with nominal programming, particularly in nonresponding populations [68]. However, echocardiographic optimization in the randomized PROSPECT study did not support this approach, and the Frequent Optimization Study Using the QuickOpt Method (FREEDOM) trial failed to provide evidence supporting the benefit of CRT optimization and did not demonstrate superiority of the respective algorithms over nominal or empiric programming [69]-[71]. There are limited data supporting the use of LV-only stimulation in a small subset of patients who fail to respond to biventricular stimulation [72]. Adaptive CRT (aCRT) is an algorithm that periodically measures intrinsic conduction and dynamically adjusts CRT pacing parameters. The algorithm withholds RV pacing when intrinsic electrical conduction to the RV is normal and adjusts CRT pacing parameters based on electrical conduction. A prospective, multicenter, randomized, double-blind clinical trial demonstrated the safety and efficacy of the aCRT algorithm [73]. This algorithm can increase the longevity of the implantable device and replace a manual device optimization process with an automatic ambulatory algorithm, although echocardiographic optimization might still be needed, at least in nonresponders. The Clinical Evaluation on Advanced Resynchronization (CLEAR) study assessed the effects of CRT with automatically optimized atrioventricular and interventricular delays, based on a peak endocardial acceleration (PEA) signal system. 
PEA-based optimization of CRT in patients with HF significantly increased the proportion of patients who improved with therapy during follow-up, mainly through an improved New York Heart Association (NYHA) class [74]. Following significant technological changes in ICDs in recent years, the concept of optimal ICD programming has changed dramatically. From the dawn of this therapy in the early 1980s to the first decade of the 21st century, the rapid detection and treatment of VT and ventricular fibrillation (VF) were stressed. The argument for rapid detection of VT and VF derived from a number of factors: initial skepticism regarding the feasibility of sudden death prevention with ICDs, the fact that early ICD patients had all survived one or more cardiac arrests, concern for undersensing and underdetection (of VF in particular), demonstration of an increasing defibrillation threshold with prolonged VF duration, and the increased energy requirement of monophasic defibrillation all created a culture of programming for rapid tachycardia detection and the shortest possible time to initial therapy [75]-[77]. The initial generations of ICDs did not record and save electrograms (EGMs), leading to a reduced appreciation of the frequency and impact of inappropriate shocks. With the advent and then dominance of primary prevention indications, avoidable shocks assumed a relatively larger proportion of total therapy [78]-[83]. Gradually, publications have increased awareness of the frequency and the diverse range of adverse outcomes associated with avoidable ICD therapy, and have demonstrated that avoidable ICD shocks can be reduced by evidence-based programming of the detection rate, detection duration, antitachycardia pacing (ATP), algorithms that discriminate supraventricular tachycardia (SVT) from VT, and specific programming to minimize the sensing of noise [81]-[92]. 
Until recently, default device programming used short-duration “detection” criteria that varied by manufacturer and tachycardia rate, requiring approximately 2.8 to 5 seconds before either ATP or charging (including detection time plus duration or number of intervals) [82], [93]. With increased awareness of the potential harm from inappropriate shocks and the realization from stored pacemaker EGMs that even long episodes of VT can self-terminate, a strategy of prolonged detection settings has been explored. This strategy allows episodes to self-terminate without requiring device intervention and reduces inappropriate therapy for nonmalignant arrhythmias. The benefit of programming a prolonged detection duration (30 of 40 beats) was first reported in the Primary Prevention Parameters Evaluation (PREPARE) study of exclusively primary prevention subjects (n=700), which compared outcomes to a historical ICD cohort programmed at “conventional detection delays,” with about half programmed to 12 of 16 intervals within the programmed detection zone and half to 18 of 24 intervals [94]. The programming in PREPARE produced a significant reduction in inappropriate shocks for supraventricular arrhythmia and in avoidable shocks for VT. In addition, a composite endpoint, the morbidity index, which consists of shocks, syncope, and untreated sustained VT, was reduced as well. Within the limitations of a nonrandomized study, it was concluded that extending detection times reduces shocks without increasing serious adverse sequelae. In 2009, the Role of Long Detection Window Programming in Patients with Left Ventricular Dysfunction, Non-Ischemic Etiology in Primary Prevention Treated with a Biventricular ICD (RELEVANT) study confirmed and expanded the results of the PREPARE trial in a cohort of 324 primary prevention CRT-D patients with nonischemic cardiomyopathy [95]. 
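The interval-count criteria described above (e.g., 12 of 16 or 30 of 40 intervals) translate into approximate detection times via the tachycardia cycle length: time equals interval count multiplied by 60 divided by the rate in bpm. The sketch below is illustrative only (the function name is ours, not a device parameter); it assumes a regular tachycardia and ignores manufacturer-specific confirmation, redetection, and charging delays.

```python
def detection_time_seconds(intervals: int, rate_bpm: float) -> float:
    """Approximate time to satisfy an interval-count detection criterion,
    assuming a regular tachycardia at rate_bpm (cycle length = 60/rate s)."""
    return intervals * 60.0 / rate_bpm

# 30 intervals of a 200-bpm tachycardia (cycle length 300 ms): 9.0 s
print(detection_time_seconds(30, 200))  # 9.0
# 12 intervals at the same rate: 3.6 s, on the order of the
# "conventional" short detection delays described above
print(detection_time_seconds(12, 200))  # 3.6
```

This back-of-the-envelope conversion shows why prolonged-detection programming (30 of 40) waits several seconds longer than conventional settings before therapy, giving nonsustained episodes time to self-terminate.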
The subjects were treated with simplified VT management, which implies a much longer detection window for VF episodes (30 of 40) compared with the control group (12 of 16) and a monitor-only window for VT. As in PREPARE, the RELEVANT study group experienced a significantly reduced burden of ICD interventions (81% reduction) without an increased incidence of syncope. Fewer inappropriate shocks and HF hospitalizations were reported in the RELEVANT study group compared with the control group. The Multicenter Automatic Defibrillator Implantation Trial: Reduce Inappropriate Therapy (MADIT-RIT), a 3-arm study, compared a conventional programming strategy (a 1-second delay for VF [equivalent to approximately 12 intervals including detection plus delay] and a 2.5-second delay for VT detection [equivalent to approximately 16 intervals including detection plus delay]) (Arm A) to both a high-rate cutoff with a VF zone starting at 200 bpm (Arm B) (discussed in the section Rate Criteria for the Detection of Ventricular Arrhythmia; see reference [96]) and to a d