A real-time pattern recognition algorithm based on k-nearest neighbors and lazy learning was used to classify voluntary electromyography (EMG) signals and to simultaneously control the movements of a dexterous artificial hand. EMG signals were recorded with eight pairs of surface electrodes from the stumps of five transradial amputees and the forearms of five able-bodied participants, and were used online to control a robot hand. Seven finger movements (not involving the wrist) were investigated in this study. The first objective was to understand whether, and to what extent, it is possible to control the finger postures of a prosthetic hand continuously and in real time using surface EMG and a practical classifier, while also taking advantage of direct visual feedback of the moving hand. The second objective was to assess statistical differences in performance between participants and groups, thereby evaluating the general applicability of the proposed method. The average accuracy of the classifier was 79% for amputees and 89% for able-bodied participants. Statistical analysis revealed differences in control accuracy related to the aetiology of amputation and the type of prosthesis regularly used, as well as between able-bodied participants and amputees. These results are encouraging for the development of noninvasive EMG interfaces for the control of dexterous prostheses.
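The classification scheme named above (k-nearest neighbors as a lazy learner over multichannel EMG features) can be sketched as follows. This is a minimal illustration, not the authors' implementation: the feature dimensionality (one value per electrode pair), the synthetic training data, the Euclidean distance metric, and the choice of k = 5 are all assumptions made for the example.

```python
import numpy as np

def knn_classify(train_feats, train_labels, query, k=5):
    """Classify a feature vector by majority vote among its k nearest
    training samples (Euclidean distance). This is lazy learning: no
    model is fitted in advance; all computation happens at query time."""
    dists = np.linalg.norm(train_feats - query, axis=1)
    nearest = train_labels[np.argsort(dists)[:k]]
    return int(np.argmax(np.bincount(nearest)))

# Hypothetical data: one feature per each of the 8 electrode pairs,
# 7 movement classes (labels 0-6), 100 training windows per class,
# simulated as Gaussian clusters around random class centers.
rng = np.random.default_rng(0)
centers = rng.random((7, 8))
train_feats = np.vstack(
    [c + 0.02 * rng.standard_normal((100, 8)) for c in centers])
train_labels = np.repeat(np.arange(7), 100)

# A new window whose features lie near the class-3 center.
query = centers[3] + 0.02 * rng.standard_normal(8)
print(knn_classify(train_feats, train_labels, query, k=5))
```

Because k-NN defers all work to query time, real-time use depends on keeping the training set and feature dimensionality small enough for the distance computation to fit the control loop's latency budget.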