This paper presents the process of extracting and classifying six different EMG hand gestures using machine learning, for future use in the control of a myoelectric hand prosthesis. The process began with the acquisition of EMG data from patients, aged 26 to 65 years, with amputations at different levels of the forearm. The electromyographic signals were recorded non-invasively in two sessions, during which the patients performed the respective gestures while wearing the Myo Armband, and the data were stored in matrices. Gesture recognition based on the EMG signals focuses on three main stages: first, pre-processing of the EMG signal, where filtering techniques improve signal quality and reduce noise in the recordings; second, feature extraction, where feature vectors are obtained from the EMG signal in the time domain; and finally, recognition of the patterns in the signals using the following classifiers: SVM (Support Vector Machine), ANN (Artificial Neural Network), KNN (K-Nearest Neighbors), Naive Bayes, and Decision Trees.
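
The sketch below illustrates, in general terms, the three stages described above (pre-processing, time-domain feature extraction, and classification with the five classifiers named). It is not the authors' exact pipeline: the sampling rate, filter band, window length, feature set, and classifier hyperparameters are assumptions chosen for illustration, and the 8-channel matrix layout follows the Myo Armband's typical output.

```python
# Minimal sketch of the three-stage pipeline, assuming 8-channel Myo Armband
# recordings sampled at 200 Hz and stored as matrices of shape (n_samples, 8).
# Filter band, window sizes, features, and hyperparameters are illustrative.
import numpy as np
from scipy.signal import butter, filtfilt
from sklearn.svm import SVC
from sklearn.neural_network import MLPClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.tree import DecisionTreeClassifier

FS = 200  # assumed Myo Armband sampling rate in Hz


def preprocess(emg, low=20.0, high=95.0, order=4):
    """Band-pass filter each channel to attenuate motion artifacts and noise."""
    b, a = butter(order, [low / (FS / 2), high / (FS / 2)], btype="band")
    return filtfilt(b, a, emg, axis=0)


def time_domain_features(window):
    """Common time-domain features per channel: MAV, RMS, waveform length, zero crossings."""
    mav = np.mean(np.abs(window), axis=0)
    rms = np.sqrt(np.mean(window ** 2, axis=0))
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)
    return np.concatenate([mav, rms, wl, zc])


def extract_features(emg, win=40, step=20):
    """Slide a window over the filtered signal and build one feature vector per window."""
    return np.array([time_domain_features(emg[i:i + win])
                     for i in range(0, len(emg) - win + 1, step)])


# The five classifiers compared in the paper (hyperparameters are placeholders).
classifiers = {
    "SVM": SVC(kernel="rbf"),
    "ANN": MLPClassifier(hidden_layer_sizes=(50,), max_iter=1000),
    "KNN": KNeighborsClassifier(n_neighbors=5),
    "Naive Bayes": GaussianNB(),
    "Decision Tree": DecisionTreeClassifier(),
}

# Example usage, given labeled training/test feature matrices X_* and gesture labels y_*:
#   for name, clf in classifiers.items():
#       clf.fit(X_train, y_train)
#       print(name, clf.score(X_test, y_test))
```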