Shown is a hand 3D-printed in flexible NinjaFlex material, driven by a linear actuator in each finger and controlled by an Arduino. Objects are presented to the laptop's built-in camera, and a MATLAB script processes the image in real time and forms the grasp associated with the presented object. Here we use QR codes to identify objects and their grasps, but we are currently working on a machine-learning approach that is far more robust.
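The lookup step between QR decoding and actuation can be sketched as follows. This is a minimal illustration, not the actual MATLAB implementation: the payload strings, finger names, and actuator extension values (0.0 = open, 1.0 = fully flexed) are all hypothetical.

```python
# Hypothetical mapping from a decoded QR payload to per-finger
# linear-actuator targets (0.0 = open, 1.0 = fully flexed).
# These object names and values are illustrative placeholders.
GRASPS = {
    "mug":    {"thumb": 0.6, "index": 0.8, "middle": 0.8, "ring": 0.8, "pinky": 0.8},
    "pen":    {"thumb": 0.9, "index": 0.9, "middle": 0.3, "ring": 0.0, "pinky": 0.0},
    "sphere": {"thumb": 0.5, "index": 0.5, "middle": 0.5, "ring": 0.5, "pinky": 0.5},
}

# Default pose when the QR payload is unrecognized: leave the hand open.
OPEN_HAND = {f: 0.0 for f in ("thumb", "index", "middle", "ring", "pinky")}

def grasp_for(qr_payload: str) -> dict:
    """Return actuator targets for a decoded QR string; unknown
    objects fall back to the open hand."""
    return GRASPS.get(qr_payload, OPEN_HAND)
```

In the real pipeline the resulting targets would be serialized and sent to the Arduino (e.g. over a serial link) to drive the finger actuators.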