Eye muscles will move the cursor on the computer screen, while jaw muscles will be used to click. Voice recognition will provide command shortcuts and access to programs.

Tuesday, April 30, 2013

Amplifier Construction

Week 4

   During the week 4 lab, we learned how to use a soldering iron, which uses intense heat to join the metal leads of wires and other electrical components. Using this skill, we started on the mechanical construction portion of our project, specifically the instrumentation amplifier. DJ gave us a schematic to follow, which can be seen below as Figure 1.


Figure 1: Instrumentation Amplifier Schematic

   We focused first on soldering the capacitors, resistors, and amplifier onto the circuit board. The board offers very little surface area, so we had to work to keep adequate spacing between the resistors and capacitors. We also had to be careful not to bridge the leads of neighboring components together with solder.

   As seen in the schematic, some resistors (R1, R2, R3, R4) and the reference electrodes (REFA and REFB) need to be connected to ground, so we considered running a wire around the edge of the circuit board to act as a common ground.


Figure 2: Circuit Board in Progress


Sunday, April 21, 2013

Mechanical Parts, FAQ, and Voice Recognition

Week 3

    We decided to order small Nikotab 0X15 tab electrodes (No. 0315, 21 x 34 mm), since there is little free skin area around the eyes. We also obtained a data acquisition box (the USB-6009), which will be controlled from MATLAB. In addition, we answered questions presented to us by DJ about the physiology of EMG/EOG; these answers can be seen on our FAQ page.
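As a rough idea of what reading the USB-6009 from MATLAB might look like, here is a minimal sketch using the session-based Data Acquisition Toolbox interface; the device ID 'Dev1', the channel, and the sample rate are placeholder values we have not yet confirmed on our hardware.

% Sketch: acquire one second of signal from the USB-6009 and plot it.
s = daq.createSession('ni');                     % NI hardware session
addAnalogInputChannel(s, 'Dev1', 0, 'Voltage');  % analog input AI0 (placeholder)
s.Rate = 1000;                                   % 1 kHz sampling (placeholder)
s.DurationInSeconds = 1;                         % read one second per call
[data, time] = s.startForeground();              % blocking acquisition
plot(time, data);
xlabel('Time (s)'); ylabel('Voltage (V)');
title('Raw signal from USB-6009');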

   We also worked on the word recognition part of our project. We decided that spoken keywords will signal the start and stop of the recording process, which eliminates the need to record continuously and waste memory. Each voice recording will then be processed and plotted. To decide what the speaker said, the program will compare the new plot against plots made from previously recorded words; if the new plot matches one of the stored plots, the program will conclude that the user said the corresponding phrase.
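As a rough sketch of this comparison step, assuming we keep one stored recording per command and score an incoming recording against each by correlating their magnitude spectra (a simplification of the plot matching described above; the names templates, labels, and recognizeWord are our own placeholders):

% Sketch: pick the stored command whose spectrum best matches the new recording.
% templates - cell array of previously recorded command waveforms (8000 Hz)
% labels    - cell array of the matching command strings
% x         - the newly recorded word
function word = recognizeWord(x, templates, labels)
    N = 8192;                              % common FFT length so spectra align
    xs = abs(fft(x, N));                   % magnitude spectrum of the new word
    scores = zeros(1, numel(templates));
    for k = 1:numel(templates)
        ts = abs(fft(templates{k}, N));    % spectrum of stored template k
        c = corrcoef(xs, ts);              % similarity of the two spectra
        scores(k) = c(1, 2);
    end
    [~, best] = max(scores);               % highest correlation wins
    word = labels{best};
end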

   From there, we delved into research on MATLAB code that would enable a word recognition system. Here is a link to our first finding: http://www.mathworks.com/tagteam/60673_91805v00_WordRecognition_final.pdf
This gave us a template for our workflow and for our MATLAB program. The program has several stages; the first is training, in which we record the user saying a single command, such as "save document," several times, capturing 10 seconds of speech from the computer's built-in microphone at 8000 samples per second. The next stage, testing, loads the previously recorded speech samples while processing incoming speech; the Data Acquisition Toolbox will be used for this, and graphs of the speech samples will be made. This link also introduced Mel Frequency Cepstral Coefficients (MFCCs), which measure the energy within overlapping frequency bins of a sound spectrum. MFCC vectors can therefore be calculated from both the stored test speech and the incoming speech and then compared.
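A minimal sketch of the training capture described above (10 seconds from the built-in microphone at 8000 samples per second), using MATLAB's audiorecorder; this is our own simplified version, similar in spirit to the recording and plotting code shown in Figure 1 below:

% Sketch: record one training utterance (e.g. "save document") and plot it.
fs  = 8000;                              % 8000 samples per second
rec = audiorecorder(fs, 16, 1);          % 16-bit, one channel (built-in mic)
disp('Say the command now...');
recordblocking(rec, 10);                 % capture 10 seconds of speech
x = getaudiodata(rec);                   % samples as a double column vector
t = (0:numel(x)-1) / fs;                 % time axis in seconds
plot(t, x);
xlabel('Time (s)'); ylabel('Amplitude');
title('Training sample');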
We also found another source: http://www.ece.iit.edu/~pfelber/speechrecognition/report.pdf. This article explains Linear Predictive Coding (LPC), which can extract and store information about the points of loudness (formants) in the sound spectrum; we can use LPC to compare the formants of stored and incoming speech. The article also provides MATLAB source files for extracting, matching, recording, speech, training, and testing. Finally, we found simple MATLAB code for recording and plotting audio samples, which can be seen in Figure 1. We can use this code, along with the code from the previous sources, as a starting point for creating our own word recognition program.
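To get a feel for the LPC comparison, the sketch below (our own rough version, not Felber's source files) extracts LPC coefficients from a stored sample and from incoming speech and compares them with a simple Euclidean distance; it assumes the Signal Processing Toolbox's lpc function, and the order p = 12 is a placeholder:

% Sketch: compare stored and incoming speech via their LPC coefficients.
% stored, incoming - speech vectors sampled at 8000 Hz
function d = lpcDistance(stored, incoming)
    p = 12;                            % LPC order (placeholder value)
    aStored   = lpc(stored, p);        % 1 x (p+1) coefficient vector
    aIncoming = lpc(incoming, p);
    d = norm(aStored - aIncoming);     % smaller distance = closer match
end

A fuller version would compute this frame by frame over short windows rather than over the whole utterance.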



Figure 1: MATLAB code for recording and plotting audio samples.






Figure 2: Plot of the phrase "save document," recorded using the MATLAB code from Figure 1.




Resources:
[1] N.p. (n.d.). Developing an Isolated Word Recognition System in MATLAB. [Online]. Available: http://www.mathworks.com/tagteam/60673_91805v00_WordRecognition_final.pdf

[2] N.p. (n.d.). Record and Play Audio. [Online]. Available: http://www.mathworks.com/help/matlab/import_export/record-and-play-audio.html#bsdl2em

[3] P. Felber. (2001, April 25). Speech Recognition: Report of an Isolated Word Experiment. [Online]. Available: http://www.ece.iit.edu/~pfelber/speechrecognition/report.pdf




Tuesday, April 16, 2013

Shortcut Functions and Facial Muscles


Week 2

   After a talk with DJ, we decided to be more specific about which commands we want to implement in our program. We were inspired by the pre-existing keyboard shortcuts used in Windows; examples can be seen here: http://windows.microsoft.com/en-us/windows7/keyboard-shortcuts. We decided to include shortcuts for functions such as:


  • “Open/Close program,” covering Internet Explorer, Microsoft Word, Windows Media Player, iTunes, and an option to add other desired programs
  • “Save Document”
  • “Undo” or “Redo”
  • “Zoom in/out”
  • “Stop voice commands”
  • “Search for file”

 
These commands will give users much of the shortcut functionality of the keyboard and improve the interface and ease of use. We plan to implement this by storing recorded responses to prompts, comparing them against the voice commands given by the user, and then activating the recognized command through a section of code that interacts with the operating system, as sketched below. In light of these changes to the project direction, the group reworked the topic proposal to match.
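One way that last step might look in MATLAB, assuming the recognizer hands us the command as a string; the program names and the sendCtrlS.vbs helper are hypothetical placeholders, not something we have written yet:

% Sketch: map a recognized command string to an operating-system action.
function runCommand(cmd)
    switch lower(cmd)
        case 'open internet explorer'
            system('start iexplore');                  % launch via the Windows shell
        case 'open microsoft word'
            system('start winword');
        case 'save document'
            system('cscript //nologo sendCtrlS.vbs');  % hypothetical helper that sends Ctrl+S
        otherwise
            disp(['Unrecognized command: ' cmd]);
    end
end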
         
   We also did some research on the eye and jaw muscles to understand what happens when they move; this information lets us deduce the best placement for the electrodes. Six extraocular muscles turn the eye about its vertical, horizontal, and antero-posterior axes. The cardinal positions of gaze allow comparison of horizontal, vertical, and diagonal movement, and in these positions a muscle of one eye is "yoked" with a muscle of the other eye. We will therefore place electrodes above, below, and to the side of each eye to detect these muscle movements.

Figure 1: Extraocular Muscles

For the jaw, the masseter muscle, located on either side of the jaw, closes it; electrodes placed there will detect clenching of the left or right side of the jaw. The anterior belly of the digastric muscle, located under the chin, opens the jaw; an electrode placed there will detect when the jaw opens, activating the voice recording.

Figure 2: Jaw Muscles (Masseter and Digastric Anterior Belly)

   We also started ordering and gathering parts for the device. A list of those parts is outlined on the project proposal page.



References:

[1] N.p. (n.d.). Keyboard shortcuts. [Online]. Available: http://windows.microsoft.com/en-us/windows7/keyboard-shortcuts
[2] N.p. (n.d.). Muscles that move the lower mandible (the jaw). [Online]. Available: http://linguistics.berkeley.edu/~kjohnson/ling110/Lecture_Slides/6_MotorControl/face_tongue_muscles.pdf
[3] N.p. (n.d.). Science of massage. [Online]. Available: http://www.scienceofmassage.com/dnn/som/journal/1007/tmj-f7.jpg
[4] T.M. Montgomery. (n.d.). The Extraocular Muscles. [Online]. Available: http://www.tedmontgomery.com/the_eye/eom.html



Monday, April 8, 2013

EMG Controlled Devices

Week 1

    During our first lab period, several project topics were discussed, and our group chose EMG Controlled Devices as our project topic. This topic involves studying the electrical signals produced during muscle movement; these signals can be acquired by electrodes and used to control electrical and mechanical systems. Our project will build on a project done by a group from the previous year's class, which used EOG and EMG sensors on facial muscles to move a cursor and click.

   Our group, group 6, discussed several ideas to add to this project, including Morse code and lip reading. Our plan was to offer a hands-free way to type in addition to hands-free cursor control. However, DJ commented that a voice recognition device would be easier and more comfortable for the user, and that using sensors to read lips would prove too difficult. Our original idea also lacked an audience, since few people are quadriplegics who cannot talk and therefore cannot use a voice recognition device.

   So, our group started brainstorming about an appropriate audience and a more feasible project. Later that week, we came up with the idea of adding voice recognition to serve as shortcuts to programs, making the computer even more user-friendly for those who can't use their hands.