Sensory input is an important feature for humanoid robots. Their complex structure and functionality goals require humanoids to sense their environment in multifaceted ways, one of which is being able to sense touch and tactile feedback.
While some groups in the class worked towards building a functioning Baymax who could hug people, one problem that persisted was the lack of sensory input. For Baymax to interact with people, we felt that he must be able to sense both when he is touching something and when others are touching (or hugging) him.
While trying to decide on a project, we considered methods that could achieve the goal of Baymax being able to sense hugs. One approach we considered was using vision with facial tracking software to detect when people come up to Baymax. While this approach would have likely achieved the goal, we felt that there has already been a lot of work done in the computer vision field, and we did not have much experience dealing with tactile feedback. We thus decided to approach this issue by following the not-so-beaten path of developing a sensing "skin."
We chose to address the problem of sensory input by developing a "skin" for Baymax that can "feel." The skin consists of force sensors so that Baymax is able to sense when a person is hugging him! Once Baymax has sensed this hug, he has the capability to reciprocate the hug with the same force as the person giving the hug. Additionally, this functionality can be extended to other applications, such as manipulating objects with the help of force sensors on fingertips, sensing terrain with sensors on feet, etc.
Here are the materials we used in our final demo:
The skin setup consists of a grid of FSRs sewn onto the inside of the Baymax doll's torso. Each sensor is connected to the Photon board, which reads the voltage across its divider resistor and publishes the data over Wi-Fi to the cloud. A Python script running on a machine fetches the data and visualizes it in 3D.
Using the FSR, we created a circuit that would tell us the FSR's current resistance. The circuit is a simple voltage divider, read between the nodes marked below. The voltage read off this circuit gives the FSR's resistance at the current moment, which in turn tells us how much force is being applied to the FSR.
The Photon board's ADC maps a voltage in the 0-3.3V range to an integer between 0 and 4095. The raw integer is scaled back to volts by multiplying by a factor of 3.3/4095. To convert from the measured voltage to the resistance, we used the equation below. We then use this resistance in the formula below that to find the amount of force, in Newtons, being applied to the sensor.
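The conversion chain above (raw ADC integer → volts → resistance → force) can be sketched in Python. The divider topology, the 10 kΩ fixed resistor, and the force model are assumptions for illustration, not the project's actual circuit values or calibration curve; a real FSR's force-resistance curve comes from its datasheet.

```python
# Sketch of the raw-reading -> force conversion described above.
# Assumptions (not from the original write-up): a 10 kOhm fixed resistor
# sits between the sense node and ground, the FSR sits between the sense
# node and 3.3 V, and force is roughly proportional to the FSR's
# conductance with a hypothetical calibration constant K.

VCC = 3.3         # Photon supply voltage (volts)
ADC_MAX = 4095    # full-scale reading of the 12-bit ADC
R_FIXED = 10_000  # assumed fixed divider resistor (ohms)
K = 500.0         # hypothetical calibration constant (N * ohms)

def raw_to_volts(raw):
    """Scale the 0-4095 ADC integer back to volts."""
    return raw * VCC / ADC_MAX

def volts_to_resistance(v):
    """Invert the divider: V = VCC * R_FIXED / (R_FSR + R_FIXED)."""
    return R_FIXED * (VCC - v) / v

def resistance_to_force(r):
    """Hypothetical model: force proportional to FSR conductance."""
    return K / r

# A mid-scale reading works out to roughly half the supply voltage.
v = raw_to_volts(2048)
print(round(v, 2))  # prints 1.65
```

In this assumed topology, a harder press lowers the FSR's resistance, which raises the voltage at the sense node, so larger ADC readings correspond to larger forces.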
We configured our breadboard for seven sensors, soldered longer wire leads onto the sensors, and stitched the sensors into the plush Baymax doll. Before deciding on a good configuration for the sensors on Baymax, we tested different configurations on an inflatable ball that had about the same squish-ability as Baymax. We tested configurations until landing on a column of two, a column of three, and a column of two sensors offset to fill in the gaps of the middle column, as shown in the picture below.
The software for this project consists of two parts, the "server" Photon firmware and the "client" Python data visualization script. The data pipeline consists of the Photon reading the resistor voltages and publishing the data to the Particle cloud. The client script requests the data from the cloud via HTTP requests, converts the raw data to Newtons, and graphs the data in 3D.
The firmware for the Photon board is a C-style program responsible for reading the sensors' analog values from the board's analog inputs. After setting up the pins and the data stream to the Particle cloud, the program reads the sensor values in a tight loop. The data is published to the cloud continuously and can be fetched by the client program. For publishing, the Photon supports up to eight variables: a variable in the program can be registered as a Particle "cloud" variable through the Particle API, which tells the board and the Particle cloud server that the variable should be published. The supported data types include ints and strings.
The client program is a simple Python script that fetches the sensor data from the cloud and graphs it in a 3D triangular surface plot in real-time. The script sends formatted HTTP GET requests (using the Requests Python package) to the Particle cloud and extracts the data from the JSON object received. The script then converts the raw data into a force value in Newtons (see the conversion formulas above) and graphs the data in 3D using the Matplotlib graphing library. Each data point consists of X, Y, and Z coordinates: the X and Y values represent the sensor's position (in centimeters) from the bottom-left corner of Baymax's torso, and the Z coordinate is the corresponding force reading in Newtons. The graph is a 3D triangular surface plot, which involves a triangular interpolation between 3D points.
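The graphing step can be sketched as below. The sensor coordinates and force values are illustrative stand-ins, not the project's actual layout measurements; the plot uses Matplotlib's `plot_trisurf`, which performs the triangular interpolation described above.

```python
# Minimal sketch of the client-side 3D graphing described above. The
# (X, Y) sensor positions and force values are made up for illustration.
import matplotlib
matplotlib.use("Agg")  # headless backend so the sketch runs anywhere
import matplotlib.pyplot as plt

# Hypothetical seven-sensor grid (cm from the bottom-left torso corner):
# a column of two, a column of three, and an offset column of two.
xs = [0, 0, 5, 5, 5, 10, 10]
ys = [5, 15, 0, 10, 20, 5, 15]
zs = [0.2, 1.8, 0.0, 3.5, 0.4, 0.1, 1.1]  # force readings in Newtons

fig = plt.figure()
ax = fig.add_subplot(projection="3d")
ax.plot_trisurf(xs, ys, zs)  # triangular interpolation between 3D points
ax.set_xlabel("X (cm)")
ax.set_ylabel("Y (cm)")
ax.set_zlabel("Force (N)")
fig.savefig("baymax_forces.png")
```

For real-time updates, the real script would redraw this surface each time a fresh batch of readings arrives from the cloud.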
One challenge that we faced with the hardware was selecting the best sensors to use. One option we considered was the TakkTile sensor bars. These seemed like a good option, so we purchased one of the strips. The drawback was the overhead they required: the sensors came on a single strip and could be separated, but then had to be rewired, and the documentation, written for an older version of the sensors, was unclear on how to rewire them back together. Another drawback was their very specific interface, which could only connect to an Arduino through another board that we had purchased. The setup of these sensors got very complicated very quickly, so we decided to go for a simpler sensor that could be used much more easily.
The main hardware challenge was the durability of the FSRs. Each sensor has a round end, which receives the force and varies the resistance, and a long tail piece that carries the signal to the two leads at the end of the sensor, as shown in the picture below. This layout makes the FSR hard to configure and use, especially since it is very fragile. The plastic that holds the wires is very flimsy and heats up easily, which made it challenging to solder longer leads onto the sensor: if the metal wires got too hot, they would melt the surrounding plastic, weakening it and causing it to tear very easily. To combat this, we ordered extra sensors and soldered spares in case some broke, which they inevitably did.
As each Particle cloud variable must be fetched by the client via a separate HTTP GET request, the pipeline is I/O bound, with the HTTP requests as the speed bottleneck. Initially, we set up each sensor as its own Particle cloud variable in the firmware, and the client sent one request per sensor in sequence. While this worked, it proved extremely slow and infeasible for real-time feedback.
In order to avoid sending an HTTP request for each sensor, the firmware now packs all of the sensor readings into a single string and publishes just the string to the cloud. Thus, the client program needs to send just one HTTP request and can parse the string to get the individual values. As a result of this optimization, our graph update interval went from 2-3 seconds down to 0.3 seconds. This approach can also be scaled up very easily if more sensors or boards are added.
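The client side of this optimization can be sketched as follows. The device ID, access token, variable name `readings`, and comma-separated packing format are placeholders and assumptions for illustration; the Particle cloud's variable endpoint does return the value in the `result` field of its JSON response.

```python
# Sketch of the single-request approach described above. The packing
# format (comma-separated raw integers) is an assumption about the
# firmware; device ID and token are placeholders.

def fetch_packed_readings(device_id, token, var="readings"):
    """One HTTP GET fetches all sensor values packed into one string."""
    # Imported lazily so the parsing sketch below runs without Requests.
    import requests
    url = f"https://api.particle.io/v1/devices/{device_id}/{var}"
    resp = requests.get(url, params={"access_token": token})
    resp.raise_for_status()
    return parse_packed(resp.json()["result"])

def parse_packed(packed):
    """Split the packed string back into one raw integer per sensor."""
    return [int(field) for field in packed.split(",")]

print(parse_packed("1023,2048,0,4095,512,300,77"))
# prints [1023, 2048, 0, 4095, 512, 300, 77]
```

One request for seven sensors instead of seven requests is what cuts the update interval from a few seconds to a fraction of a second, and adding more sensors only lengthens the string rather than adding requests.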
The main lesson we learned was that building a fully functional sensing skin is a very difficult task. The sensors used must be very flexible and resilient to damage. We did not initially appreciate how true this was, so we struggled a lot with replacing sensors after they broke and trying to keep them working properly. With more time, we would have found more durable sensors to use instead of the FSRs.
We also learned that, while convenient, the Wi-Fi on the board works a lot slower than a wired connection. The original plan was to use Wi-Fi so that we could collect data without having to wire Baymax to our computer. While this was a good plan, the downside was the speed of the Wi-Fi; a wired connection through an Arduino board would have let us gather the data much faster.