Genome complexity has been associated with poor outcome in patients with chronic lymphocytic leukemia (CLL). Previous cooperative studies established five abnormalities as the cut-off that best predicts an adverse evolution by chromosome banding analysis (CBA) and genomic microarrays (GM). However, data comparing risk stratification by the two methods are scarce. Herein, we assessed a cohort of 340 untreated CLL patients highly enriched for cases with complex karyotype (CK; 46.5%) with parallel CBA and GM studies. Abnormalities found by both techniques were compared. Prognostic stratification into three risk groups based on genomic complexity (0-2, 3-4 and ≥5 abnormalities) was also analyzed. No significant differences in the percentage of patients in each group were detected, but only moderate agreement was observed between methods when focusing on individual cases (κ=0.507; P<0.001). Discordant classification was obtained in 100 patients (29.4%), including 3% classified into opposite risk groups. Most discrepancies were technique-dependent, and no greater correlation in the number of abnormalities was achieved when different filtering strategies were applied for GM. Nonetheless, both methods showed a similar concordance index for prediction of time to first treatment (TTFT) (CBA: 0.67 vs. GM: 0.65) and overall survival (CBA: 0.55 vs. GM: 0.57). High complexity maintained its significance in the multivariate analysis for TTFT including TP53 and IGHV status, whether defined by CBA (hazard ratio [HR] 3.23; P<0.001) or GM (HR 2.74; P<0.001). Our findings suggest that both methods are useful but not equivalent for risk stratification of CLL patients. Prospective validation studies are needed to establish the prognostic value of genome complexity based on GM data.
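As an illustrative sketch only (not the study's actual analysis code), the agreement statistic reported above (κ=0.507) is Cohen's kappa, which corrects raw agreement between two paired classifications for agreement expected by chance. The patient calls below are hypothetical, assuming the three complexity-based risk groups (0-2, 3-4, ≥5 abnormalities) as categories:

```python
# Cohen's kappa: chance-corrected agreement between two paired
# categorical classifications (e.g. CBA- vs. GM-based risk groups).
from collections import Counter

def cohen_kappa(ratings_a, ratings_b):
    """Return Cohen's kappa for two equal-length lists of category labels."""
    n = len(ratings_a)
    # Observed proportion of cases where the two methods agree
    observed = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    # Expected agreement if the two methods classified independently
    expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (observed - expected) / (1 - expected)

# Hypothetical risk-group calls for 10 patients by each method
cba = ["0-2", "0-2", "3-4", ">=5", "0-2", "3-4", ">=5", "0-2", "3-4", ">=5"]
gm  = ["0-2", "3-4", "3-4", ">=5", "0-2", "0-2", ">=5", "0-2", "3-4", "3-4"]
print(round(cohen_kappa(cba, gm), 3))  # 7/10 raw agreement -> 0.545
```

A kappa around 0.5, as in the study, indicates only moderate agreement: the two methods place most patients in the same group, but a substantial fraction of concordance would be expected by chance alone.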