We're back for another day of exploring the Northeastern University labs. We checked out a handful of projects yesterday, including the HyCycle, a runner-up in the school's Capstone award. Today we checked out the winner:
iCRAFT -- that's short for eye-Controlled Robotic Arm Feeding Technology. The
project was developed by a group of seven electrical and computer
engineering students looking to create an inexpensive solution for
helping the disabled and elderly feed themselves at home. As the
Apple-esque name implies, the project utilizes eye-tracking to help the
user feed him or herself.
The hardware side of the project
involves a robotic arm and controller (which run a combined $640), a
hacked Creative webcam and IR light (around $114), three bowls, a water
bottle and a custom-built power supply. On the software side, the team
used the open-source ITU gaze tracker software, combined with a custom
GUI. The whole thing is designed to be simple to use right out of the
box -- though, being in the prototype stage, there were naturally a few
hiccups in the process. It didn't work perfectly when we demoed it
today, but it certainly wasn't much more buggy than what many companies
try to pass off as finished products.
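To give a sense of how a custom GUI like the team's might sit on top of the open-source ITU gaze tracker, here's a minimal sketch that reads gaze coordinates streamed over the network. The port number and the "timestamp x y" message layout are our own assumptions for illustration, not the tracker's documented protocol.

```python
# Minimal sketch: consuming gaze coordinates streamed over UDP.
# Port and message format are assumptions -- check the tracker's
# network settings for the actual protocol it exposes.
import socket

GAZE_PORT = 6666  # hypothetical port


def gaze_points(port=GAZE_PORT):
    """Yield (x, y) screen coordinates as they arrive from the tracker."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("127.0.0.1", port))
    while True:
        data, _addr = sock.recvfrom(1024)
        # Assumed message layout: "<timestamp> <x> <y>"
        parts = data.decode("ascii", errors="ignore").split()
        if len(parts) >= 3:
            yield float(parts[1]), float(parts[2])


if __name__ == "__main__":
    for x, y in gaze_points():
        print(f"gaze at ({x:.0f}, {y:.0f})")
```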


The whole setup worked pretty well right off the bat. We sat down and let
the system calibrate to our eyes -- a reasonably quick process, and it
managed to track our gaze even through a pair of glasses. Once calibrated, the
team's simple GUI pops up on the monitor, featuring four big blocks: a
yellow bowl one, a blue bowl two and a red bowl three. Along the bottom
is a big green rectangle labeled "Rest." The crosshair that signifies
what you're looking at has a tendency to jump around and can be slightly
distracting, but after just a few minutes we were able to get the hang
of directing the pointer to the desired location. We found that picking
blocks one or three on either end of the screen was actually easier if
we looked past the monitor's bezel.
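That jumpy crosshair is a common artifact of raw gaze data, and one routine way to tame it is to average the last few samples before drawing the cursor. The sketch below is a generic smoothing approach under that assumption, not the team's actual filter; the class name and window size are illustrative.

```python
# Simple jitter reduction: average the last N gaze samples.
from collections import deque


class GazeSmoother:
    def __init__(self, window=10):
        self._xs = deque(maxlen=window)
        self._ys = deque(maxlen=window)

    def update(self, x, y):
        """Add a raw gaze sample and return the smoothed cursor position."""
        self._xs.append(x)
        self._ys.append(y)
        return (sum(self._xs) / len(self._xs),
                sum(self._ys) / len(self._ys))


# Example: feed raw samples and draw the crosshair at the smoothed point.
smoother = GazeSmoother(window=10)
for raw in [(100, 200), (108, 195), (95, 210), (102, 198)]:
    print(smoother.update(*raw))
```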
Once you've locked in and stared at one block long enough for the
indicator gradient to fill up, the arm goes into action. It swings the
plastic spoon at its end toward the chosen bowl, lifting up and plunging
it into the food. According to the students, the system has a "scooping
algorithm" that varies where the spoon dips, so it works through the
whole bowl's contents evenly -- important for clingier foods like rice or
cooked oatmeal.
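To make the dwell-to-select interaction and the bowl-coverage idea concrete, here's a rough Python sketch of both. The dwell threshold, the grid of scoop points, and the names DwellSelector and scoop_points are illustrative assumptions on our part, not the team's actual code.

```python
import itertools
import time


# Dwell-to-select: the "gradient" fills while the gaze stays on one
# block; looking away resets it. The 1.5 s threshold is an assumption.
class DwellSelector:
    def __init__(self, dwell_seconds=1.5):
        self.dwell_seconds = dwell_seconds
        self._target = None
        self._since = None

    def update(self, target):
        """Feed the block currently under the gaze; return it once dwelled on."""
        now = time.monotonic()
        if target != self._target:
            self._target, self._since = target, now
            return None
        if target is not None and now - self._since >= self.dwell_seconds:
            self._target, self._since = None, None
            return target  # selection confirmed
        return None


# "Scooping algorithm" sketch: cycle through a grid of scoop points so
# repeated scoops cover the whole bowl instead of digging one hole.
def scoop_points(rows=3, cols=3):
    grid = [(r, c) for r in range(rows) for c in range(cols)]
    return itertools.cycle(grid)
```

A real controller would translate those grid cells into joint angles for the arm; here they simply stand in for "scoop somewhere different each time."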

The arm then slowly raises the food-filled spoon, bringing it around
and then gingerly moving it toward the user's face. Once the spoon is
extended, new options appear on the screen, allowing the user to return
the food to the bowl (in the case of a changed mind or a wrong selection)
or to get a drink from the Hydrant Bottle. One issue we encountered here
was that the arm, once extended, would sometimes interfere with the eye
tracking, making it difficult to select a new option, but the team
assured us that a higher-end camera or better positioning could easily
rectify the problem.

All told, the system cost the students about $900 to put together. Mass-produced,
the team expects that price to drop to around $800 to $850 --
that's a pretty big drop from the $3,500 to $6,000 that similar systems
often cost. But while the system is designed to work instantly, it's
certainly not a complete replacement for human help. After all, someone
still has to cut up the food and put the bowls out. Rather, the system
is meant to simplify the process and offer a greater degree of freedom
and control over the act of eating.