Stevens Institute of Technology
For their senior design project, three computer engineering seniors developed a neural network that recognizes simultaneous human emotions from multiple modes of communication.
A jaw clench. An eye turn. A vocal crack. A sigh.
These subtle, sometimes barely perceptible microexpressions and vocalizations can communicate as much about a person's mental and emotional state as the words they choose to speak.
Many brands rely on artificial intelligence to identify consumer sentiment and emotion as part of their market research and product development efforts. But current commercially available machine learning models are limited, often relying on overly simplistic techniques or single, isolated features to make their determinations.
Stevens Institute of Technology computer engineering seniors Mourad Deihim, Daniel Gural, and Jocelyn Ragukonis have devised an artificial neural network to close this gap. ...
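To illustrate the general idea of combining multiple modes of communication rather than a single, isolated feature, here is a minimal, hypothetical sketch of a late-fusion network, not the students' actual model: it assumes precomputed audio and facial feature vectors, and the layer sizes, names, and multi-label output (so several emotions can be predicted at once) are purely illustrative.

```python
import torch
import torch.nn as nn

class LateFusionEmotionNet(nn.Module):
    """Illustrative multimodal (audio + face) emotion classifier."""
    def __init__(self, audio_dim=128, face_dim=256, hidden_dim=64, num_emotions=7):
        super().__init__()
        # Separate encoder for each modality
        self.audio_encoder = nn.Sequential(nn.Linear(audio_dim, hidden_dim), nn.ReLU())
        self.face_encoder = nn.Sequential(nn.Linear(face_dim, hidden_dim), nn.ReLU())
        # Fused representation -> one logit per emotion
        self.classifier = nn.Linear(2 * hidden_dim, num_emotions)

    def forward(self, audio_feats, face_feats):
        fused = torch.cat([self.audio_encoder(audio_feats),
                           self.face_encoder(face_feats)], dim=-1)
        return self.classifier(fused)

# Usage with random stand-in features for a batch of 4 clips
model = LateFusionEmotionNet()
logits = model(torch.randn(4, 128), torch.randn(4, 256))
probs = torch.sigmoid(logits)  # independent probability per emotion, allowing several at once
```

Using a sigmoid per emotion, rather than a softmax over all of them, is one common way to let a model report simultaneous emotions instead of forcing a single label per input.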
Continue reading at Stevens Institute of Technology.