EE-Emerge team: Brandon Potter, Christine Mitroff, Sejal Karanjkar, Willie Feng-Liu, Brian Nguyen and Paulo Batilay
Abstract:
As supply chains grow more complex and consumer demand continues to rise, the need to automate tasks has never been greater. Helping Hand approaches automation differently by allowing a task to be automated without sacrificing the specific wants and needs of the user. Helping Hand is a robotic hand that uses computer vision, run through edge machine learning, to mimic the gestures of its user. By copying the user's motions, Helping Hand lets a person avoid unsafe work environments while retaining full control over the task. Because all decisions are made on the device itself, Helping Hand is a standalone, portable device that does not require a Wi-Fi connection for cloud computing. This was an important design decision, as it gives the device more versatility than comparable devices currently available. With this constraint in mind, the Helping Hand team identified and implemented several different approaches to computer vision before settling on a method, assessing each approach's efficiency, accuracy, and effectiveness under different environmental conditions. This process ultimately led the team to use a Convolutional Neural Network (CNN), the model responsible for Helping Hand's object detection and movement control.
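To make the CNN-based approach concrete, the sketch below shows one way a small gesture-classification network could be structured for on-device (edge) inference. This is a minimal illustration only: the framework (TensorFlow/Keras), the input resolution, the number of gesture classes, and the layer sizes are all assumptions, not the team's actual implementation.

```python
# Minimal sketch of a compact CNN gesture classifier for edge inference.
# Assumptions (not from the report): TensorFlow/Keras, 96x96 grayscale frames,
# and 6 hand-pose classes that a controller would map to servo commands.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_GESTURES = 6            # hypothetical number of hand poses
INPUT_SHAPE = (96, 96, 1)   # hypothetical grayscale camera frames


def build_gesture_cnn():
    """Small convolutional classifier kept shallow so it can run on edge hardware."""
    return models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(16, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, activation="relu"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(64, activation="relu"),
        layers.Dense(NUM_GESTURES, activation="softmax"),
    ])


model = build_gesture_cnn()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()
```

A model of roughly this size could be trained offline and then deployed to the device, so all classification decisions happen locally without a cloud connection, consistent with the standalone design goal described above.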