One of the most promising methods to assist amputee or paralyzed patients in the control of prosthetic devices is the brain-computer interface (BCI). A BCI enables communication between the brain and a prosthetic device through signal-processing protocols. However, because of the noisy nature of brain signals, available signal-processing protocols cannot reliably interpret brain commands and cannot be used beyond the laboratory setting. To address this challenge, we present a novel automatic brain-signal recognition protocol based on vowel articulation mode. The approach identifies the mental state of imagining open-mid and closed vowels, without imagining any movement of the oral cavity, for application in prosthetic device control. The method uses brain signals recorded from the language area (21 electrodes) while the subject performs the specific task of imagining the respective vowel. In the processing stage, the power spectral density (PSD) was computed for each brain signal, and classification was carried out with a support vector machine (SVM). Recognition accuracies between 84% and 94% were achieved for vowels grouped by articulation mode. The proposed method is promising for use by amputee or paraplegic patients.
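The PSD-plus-SVM pipeline described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the synthetic 21-channel signals, the sampling rate, the Welch estimator settings, and the RBF kernel are all assumptions standing in for the real EEG recordings and protocol details.

```python
# Sketch of a PSD + SVM classification pipeline, as described in the
# abstract. All data here are synthetic stand-ins for real EEG; the
# sampling rate and Welch parameters are illustrative assumptions.
import numpy as np
from scipy.signal import welch
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
fs = 256                                   # assumed sampling rate (Hz)
n_trials, n_channels, n_samples = 120, 21, 512  # 21 language-area channels

# Synthetic two-class "EEG": class-1 trials carry extra 10 Hz power.
X_raw = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, n_trials)
t = np.arange(n_samples) / fs
X_raw[y == 1] += 0.8 * np.sin(2 * np.pi * 10 * t)

def psd_features(trials):
    """Welch PSD per channel, concatenated into one feature vector per trial."""
    _, pxx = welch(trials, fs=fs, nperseg=128, axis=-1)
    return pxx.reshape(len(trials), -1)

X = psd_features(X_raw)
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.3, random_state=0, stratify=y)

# SVM classifier on the PSD features.
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

On this toy problem the spectral difference between classes is large, so the classifier separates them easily; with real vowel-imagery EEG, features, kernel, and preprocessing would need careful tuning.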