Jessica Hui's profile

Hellokitty Cuddle Buddy : movement-based interaction

CONCEPT & SKETCHES
The concept of my movement-based tangible was to represent toys in virtual reality in a way that could motivate children to treat them better. Depending on how the stuffed animal is held, the amount and speed of its movement differ because of the constraints of body movements. A stuffed animal affords many ways of being held, so the result is displayed to the child so that they understand proper nurturing or its consequences. For example...
 
- The sketch to the left illustrates a child holding her stuffed animal properly, similar to the position in which a live animal would enjoy being held. This is because of the close proximity to the child's body, which constrains the movement to a light left-right rocking motion.

- The second sketch shows a careless child holding the stuffed animal hand-to-hand, which allows for arm swinging and generates a larger, quicker movement. The stuffed animal is thus in an unfavourable position in which it would feel nauseous and unhappy if it were alive.
 
This sketch illustrates how the stuffed animal is plugged into a computer so that it can display its current emotional state. The display includes the bear itself with the corresponding emotion, along with a love meter and a movement meter that read updated interactions affecting its emotions.
The upper left sketch is how I want the display to generally look. In total there are 6 states the stuffed animal could feel.
 
Section A : These are the overall emotions of the stuffed animal dependent on the love meter, which is displayed when there is no motion (not being held).
                     1.) When the love meter is high, it would be happy and loving because of the proper nurturing
                     2.) When the love meter is moderate, it would begin to lose affection
                     3.) When the love meter is low, it would be sad because of the lack of proper attention
Section B : These emotions are displayed when the stuffed animal is in motion (being held), which differ according to the acceleration of motion.
                    1.) At the sweet spot of the movement meter (when cuddled at the perfect pace) it will show it is feeling loved, and the love meter will increase
                    2.) When there is not enough movement, it will show its boredom and disinterest, and the love meter will decrease
                    3.) When there is too much movement, it will get dizzy and nauseous, and the love meter will decrease
PROCESS
I went to the Japanese dollar store, Daiso, to look for a potential stuffed animal for this assignment, and chose this Hellokitty. This Hellokitty has a large but soft head, so I could easily hide my Arduino and accelerometer inside. It also makes designing and drawing the stuffed animal's emotions a lot easier, since there are already many consistent drawings of Hellokitty online.
This shows all 6 states of emotion expressed by Hellokitty that would be used in my display screen. I reiterated the emotions from the sketch so that the images better represent each state, leaving no room for misinterpretation. The biggest difference is the change in Hellokitty's overall emotions. The sketch illustrated affection, slight affection, and then sadness, which are hard to represent in a single icon, so I changed them to happy, sad, and mad.
These are all the technical parts used for the assignment: Arduino Uno, breadboard protoshield, accelerometer
Here is a close-up of how I connected the accelerometer to the Arduino using the protoshield, making a compact microcontroller to stuff into Hellokitty's head.
Before stuffing the parts into Hellokitty, I had to test the accelerometer with my code. This is the setup where I coded everything, while Hellokitty watches and waits for its head to be cut open.
Sadly, the accelerometer was giving strange inputs that included random characters, such as letters and symbols (garbled serial output like this often points to a baud-rate mismatch or a loose wiring connection), so I was unable to complete the assignment, since the movement part of it literally does not work. Hellokitty's head was thus spared from being cracked open.
FINISHED VIDEO
Because the accelerometer did not work properly, I changed the code so that the mouse's Y-axis position changes the movement meter, in order to present a proper demonstration of the display.
 
Here you can see how the level of "acceleration" or "handling" of Hellokitty changes its emotions. When there is no interaction, it displays Hellokitty's overall expression.
If I had more than a week to do this assignment, I would have definitely completed it. Because I only started coding for the accelerometer the day before the due date, I was unable to replace it with one that works. Beyond completing it, I would have added more decoration to the display screen to further represent Hellokitty's emotions. This could also enhance the virtual-reality experience for users, since the digital Hellokitty currently mirrors the exact appearance of its physical counterpart.
ARDUINO CODE
PROCESSING CODE