Throughout the brainstorming and development of our system, we made many changes. In the very first week of idea development, our initial concept was TreeBot, an autonomous robot that would plant trees and maximize their growth in a world where deforestation had begun to cause serious environmental effects. After further discussion among ourselves, we decided that this idea would be very difficult to implement, and we had to brainstorm new ideas. We had a thorough list of futuristic ideas, ranging from super-bugs to artificial intelligence. In the end, we came to a consensus to work with an imagined reality where advanced genetic modification was commonplace among the wealthy, and our system would be a physical augmentation device that could be afforded by the non-elite.
The next step was to decide on an actual device and system that we could prototype for our project. We looked at possible genetic modifications that could be mimicked in a simpler form. The idea was that in the future, people would have cybernetic implants that allowed "telepathic" communication between the wealthy who could afford them. This idea stuck with us, because designing a system to mimic that sort of technology would be a fun project to pursue, and we had good ideas for our prototype. At first we thought of creating a braille glove that would send messages to other glove users through a braille-based language system. We also added the stipulation to our world that using phones was frowned upon by the upper class, so phone-based communication was not an option for the lower class, who could not afford the implants. Our next step was to do fieldwork on our system.
After online research and interviews, we received a lot of constructive criticism on our idea. For example: the braille language would be hard to learn; our glove wasn't discreet, so people would know when you were using the device; the world was a little hard to understand; and it was unclear how the receiving end of our messages would work. Our group then narrowed our idea to a gesture ring with similar functions, but leaning more toward the functions of an actual phone. This led to the current concept of our prototype.
In a world where the use of cellphones while in motion has been outlawed, the upper class can afford cybernetic implants that let them use phone functions through their minds. The less fortunate, however, cannot afford these implants and have no way to access their phones in public while moving. To resist this world, we have designed a small, affordable gesture ring system that allows users to access the functions on their phones without breaking the law.
After deciding conceptually what our final project would look like and how it would function, we brainstormed what hardware components we could physically use to represent this device in our world as closely as possible.
We considered various minicomputers and hobby microcontrollers like the Raspberry Pi and the Arduino. Ultimately, we decided the most important goal in bringing this device to life was to make the pieces as small as possible. So we dug deeper and decided the Flora line of wearable circuit boards would be excellent for this project, since the emphasis in their design is minimalism. We ended up using these three Flora components to demonstrate the device:
Flora v2 – The "brain" of the ring. We used it to collect the data from the accelerometer and pass that data to the Bluetooth module, which then passes it wirelessly to our Android app (more on that below).
Flora Bluetooth – This is how the ring device sends data to other devices. For the purposes of this prototype, it sends gesture signals to an Android app.
Flora Accelerometer – The accelerometer is the most essential hardware component of the ring device, since it detects motion.
The following lists everything we purchased to assemble our prototype:
- Flora v2 – $19.95
- Flora Bluetooth – $21.50
- Flora accelerometer/magnetometer – $14.95
- Conductive thread – $5.95
- AAA batteries – $4.39
- Battery holder – $1.95
In the above image, we essentially decided we would use an armband to hold the battery, Flora, and Bluetooth all securely in one place. The accelerometer is attached to a ring base with wires connecting it to the Flora for processing the accelerometer values.
Below are some photos of the prototype as we put the pieces together.
The images above show the Flora board we used. The left image shows the top side, while the right shows the underside. Notice that the Bluetooth module is soldered directly onto the backside of the Flora. We decided this would save as much space as possible in our final prototype design.
This next set of images shows the accelerometer. The first two images show how the wires were organized. We wanted to keep the lengths of the individual wires uniform, so we applied dots of hot glue to the backside of the wiring to hold them together. In the third photo, we wrapped the wire in a long strip of white electrical tape. This helped us keep the final prototype design as clean as possible.
We also decided to wrap the accelerometer in red electrical tape to conceal its appearance, and then hot-glued it to the base of a Ring Pop ring. We believed this would keep wearers of the device from actively thinking about the circuit boards they are wearing.
These images show the final design. The first image shows an overview of the prototype: a ring attached by wire to a box with an arm strap. The second image reveals the contents of the box. It holds the Flora, the Bluetooth module (soldered behind the Flora), and the battery pack powering the device.
Sightless UI (blind user inspiration): During the decision process for our prototype, we discussed similar real-world applications for a gesture ring. What came up during our research was the lack of technology for the blind. A few articles illustrated how modern smartphones are built as visual tools, with a touch screen that can only be used by people who can actually see it. These smartphones are usually adapted for blind users through screen-reading software. While this works, it's not entirely ideal for blind users. We found that technology for the blind relies on touch or sound.
We also thought of some non-screen-based technologies and UIs that perhaps weren't intentionally designed for blind individuals but would work very well for them regardless. One great example is the iPod shuffle, or any iPod with a UI designed for navigation via click wheel. These kinds of devices were and still are much easier to navigate without a screen than today's touch-based smartphones, because the UI feedback was either haptic (pressing physical buttons) or auditory (the clicking sound while dragging a finger around the click wheel). We used this as inspiration to design an audio-feedback-based user interface that would respond to finger/ring gestures.
Android: For the software component of our prototype, our group decided to develop an Android app that would hold a few phone functions we wanted to control with our ring. We decided Android would be the best choice because we all had experience writing Java and some experience with Android UI development.
We decided that the main features we wanted to implement were weather, music, and text messages. The first step was to set up the UI so that we knew which specific "button" functions we wanted. There was one main UI with buttons for playing, pausing, and skipping to the next or previous song, reading the weather, and reading text messages. We wanted to implement access to the phone's actual music library, text messages, and a weather application's information, but decided that creating static fields for each function was viable for the purpose of our prototype.
To make the UI of our prototype system more appropriate for how we intend it to be used, we switched the UI layout to match our ring's actual gesture controls. To do so, we converted the layout into four arrows representing the cardinal directions that our ring gestures would understand. From there, we implemented what each direction would do according to our previous methods. Here is a look at the gestures of our ring:
- Upward flick: cycle through the menus of our app (Music -> Weather -> Text Messages)
- Second gesture:
  - Music: play the next song
  - Weather: read tomorrow's weather
  - Texts: read the next text message
- Third gesture:
  - Music: play/pause the current song
  - Weather: read today's weather
- Fourth gesture:
  - Music: play the previous song
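The menu logic above can be sketched as a small state machine. This is an illustrative Java sketch, not the actual app code: the class and method names are ours, the announce strings stand in for the app's text-to-speech output, and only the upward flick's role (cycling menus) is documented above, so the mapping of the other actions to specific directions is an assumption.

```java
public class RingMenu {
    private static final String[] MENUS = {"Music", "Weather", "Texts"};
    private int menu = 0; // start on the Music menu

    // UP cycles through the menus; the remaining gestures trigger the
    // current menu's action. Direction assignments other than UP are
    // illustrative, not taken from the real prototype.
    public String handle(String gesture) {
        switch (gesture) {
            case "UP":
                menu = (menu + 1) % MENUS.length;
                return "Menu: " + MENUS[menu];
            case "RIGHT":
                if (menu == 0) return "Playing next song";
                if (menu == 1) return "Reading tomorrow's weather";
                return "Reading next text message";
            case "DOWN":
                if (menu == 0) return "Play/pause";
                if (menu == 1) return "Reading today's weather";
                return "No action";
            case "LEFT":
                if (menu == 0) return "Playing previous song";
                return "No action";
            default:
                return "Unknown gesture";
        }
    }
}
```

In the real app, each returned string would instead be spoken aloud so the user never needs to look at the screen.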
The three images below show how our app design changed over time and demonstrate the differences between traditional mobile UI design and our auditory-feedback UI. The two images on the left show what the UI of the mobile device would look like if it were button-based and intended for sighted users via touch screen. The image on the right completely removes the buttons for the purposes of sightless navigation via our smart ring. The two UI types function in essentially the same way, but one requires a screen while the other (ours) does not.
The following are the resources we used from the Adafruit website:
Bluetooth Android App:
Adafruit has a fully fledged Android app for the Bluetooth modules it sells, called Bluefruit LE Connect. It is essentially a Bluetooth Low Energy scanner app that lets you connect to various Bluetooth devices and interact with their services and characteristics. It also allows more advanced interactions, such as updating the firmware of Adafruit-specific Bluetooth modules.
The specific Flora Bluetooth module we used is designed to advertise a UART service by default. Connecting to that service allows the devices to communicate with each other. The Adafruit BLE app has specific options for this type of connection and communication.
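On the app side, data arriving over the UART service is just a stream of bytes, so the app has to decode those bytes into gesture events. A minimal Java sketch of that idea follows; the one-byte gesture codes are a hypothetical protocol chosen for illustration, since the actual byte format our ring sent is not documented above.

```java
public class UartDecoder {
    // Hypothetical one-byte gesture codes the ring could send over
    // the Bluetooth UART service (not the prototype's real protocol).
    public static String decode(byte b) {
        switch (b) {
            case 'U': return "UP";
            case 'D': return "DOWN";
            case 'L': return "LEFT";
            case 'R': return "RIGHT";
            default:  return "UNKNOWN"; // ignore noise or partial reads
        }
    }
}
```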
Although we had prior experience writing Bluetooth apps, we decided that rather than starting from scratch, it would be best to take the source of Adafruit's Android app, strip away all the non-essential functions, and add our ring-gesture auditory-feedback UI code. This let us focus much more on the functional components of our specific project rather than testing Bluetooth code we wrote ourselves that may or may not be compatible with Adafruit's Flora Bluetooth module. After two weeks of reading, running, and testing the source of Adafruit's Bluetooth app, we were able to fully strip or comment out the unnecessary functionality while maintaining the connection and UART communication code. Our app ended up looking similar to Adafruit's, but with many options removed; and instead of going to the UART activity upon selecting UART as the communication option, it went straight to our blank activity that waited for commands.
The accelerometer used the I2C communication protocol to send data to our Flora, and Adafruit had example code for this as well. We added a basic algorithm to detect certain gestures from the X, Y, and Z values on a single plane (hand/fingers pointing to the ground). The algorithm determines whether an up, left, down, or right gesture was just detected, and then sends that data over the Bluetooth UART to our Android app. The Android device/app waits for these Bluetooth signals to come in, activates a function or changes the menu, and announces the action taken via auditory feedback.
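The core of the gesture-detection step can be sketched as follows. This is a simplified Java illustration of the single-plane idea, not our actual Flora firmware (which ran on the microcontroller): the threshold value is made up for the example, and the axis-to-direction mapping assumes the hand is held flat with one axis for up/down flicks and another for left/right.

```java
public class GestureClassifier {
    // Illustrative magnitude threshold, not a tuned value from the prototype.
    static final double THRESHOLD = 5.0;

    // With the hand held flat, flicks show up as large readings on the
    // X or Y axis; Z (dominated by gravity) is ignored in this sketch.
    public static String classify(double x, double y, double z) {
        if (Math.abs(x) < THRESHOLD && Math.abs(y) < THRESHOLD) {
            return "NONE"; // movement too small to count as a gesture
        }
        // Whichever axis moved more decides the gesture's direction.
        if (Math.abs(x) >= Math.abs(y)) {
            return x > 0 ? "RIGHT" : "LEFT";
        }
        return y > 0 ? "UP" : "DOWN";
    }
}
```

In the real device this classification ran on the Flora, which then wrote the result to the Bluetooth UART for the app to act on.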
These images above show one of us (Dominic) wearing the device with a connected Android phone to the left.
Here is a link to the video of it working: TheOne Ring Video Demonstration
We made the video primarily for presentation purposes on ICAT day. We learned very quickly after feedback from initial presentations that our world was a bit confusing. We needed a video to clearly show and explain the quirks of our world, and how our gesture control device supports a less economically prosperous class of people.
We shot the video with a digital camcorder and used Adobe Premiere to sequence the footage.
Evaluation Plan Discussion
Once we had the device and the video, we were ready for ICAT day. At first not many people came to our exhibit, but before long we had a slow but steady stream of visitors. Most people grasped our world quickly and understood both the need our project was filling and how we went about filling it. When we moved on to actually demonstrating the device, most people struggled to get the hang of it at first, but once they got the first gesture or two to work, the others followed much more easily. Once visitors became comfortable with how to use the ring, they typically wanted to talk more about potential uses, difficulties, or what the project would look like if we were to continue.
Our project could be improved in many ways. The main struggle people had at first was not understanding what they needed to do, but rather performing the gestures precisely enough that the device actually read them correctly. This often involved accidentally doing a downward flick when meaning to do an upward one, though this could simply be because we always started them with the upward flick to demonstrate cycling through the modes. One thing we did not try was having someone who knew nothing about our world, or about what our device was supposed to do, use it to see how easily they could figure out what the device does and how to use it. While we think this is an unlikely situation, since anyone purchasing our device should know it is a motion controller for their phone, we could have tested just how intuitive our device actually is.
While we received valuable feedback from visitors to our table, in a more formal setting we would have tabulated the questions they asked, how easy they thought our device was to use, how comfortable they felt while using it, and how they would rate their overall experience. This would allow us to determine how effective our device was and whether it would be marketable. Overall, we are quite satisfied with our project and the prototype we created, and grateful for the learning opportunity it provided.