[P] Detect human behaviour with sensor matrix
Since I have no previous experience in machine learning (only in software development), I was wondering whether the following approach is feasible at all or whether I have to dig much deeper:
I am currently developing a software interface (Python, running on a PC) to access sensor data that is being collected by a microcontroller and sent to the PC via USB.
The sensors are arranged in a 64×64 matrix and are placed under a piece of foil (I'll call the combination the “sensor foil”). They measure pressure created by human physical activity (e.g. a hand touching the sensor foil).
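To make the data shape concrete, here is a minimal sketch of how one snapshot of the matrix could be represented on the PC side. The byte layout (one unsigned byte per sensor, row-major) is an assumption for illustration; the real USB packet format depends on the microcontroller firmware.

```python
import numpy as np

ROWS, COLS = 64, 64

# Placeholder for one packet received over USB: 4096 bytes,
# one unsigned pressure byte per sensor cell (hypothetical layout).
raw = bytes(range(256)) * 16

# Interpret the flat byte stream as a 64x64 pressure frame.
frame = np.frombuffer(raw, dtype=np.uint8).reshape(ROWS, COLS)
print(frame.shape)  # one frame = one 64x64 snapshot of the foil
```

Each such frame (or a flattened 4096-value vector of it) would be the basic unit fed into a learning system.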
What I want to achieve is to detect different body parts or objects placed on the foil: for example, a left or right hand placed flat on it, a glass set down on it, or an elbow.
Is there an approach high-level enough to let me:
– Create a fixed set of events to be detected by the system: left/right hand, elbow, and glass placed on the sensor foil
– Feed the live raw sensor data to a machine learning system running on the PC
– Train the system with the events described above, i.e. perform the physical action (like putting my left hand down) and tell the system which of the previously defined events just occurred
– After training is finished, provide the system with the live sensor data, execute one of the defined physical actions, and have the system recognize the corresponding event along with a confidence level
Is this something one of the available machine learning systems can do, considering this is a one-person proof-of-concept project?
My apologies if this request is formulated too broadly. I will gladly take any recommendations for reading up on the matter.