This was an exciting week for the Bixcreen Team!
This week we made major progress on our touch- and gesture-based prototypes, incorporating the feedback from our critical friends while staying true to the original research results captured in Milestone 1. Our prototype iteration involved melding two touch-screen prototypes into a single end product that reflects the best of both designs, as well as exploring two types of gesture-based interaction modalities.
Touch-Based Interaction Design
This week's touch-based design work meant iterating with the strengths and weaknesses of touch interaction in mind: simplifying the interface options, enlarging the touch buttons, and focusing each screen on a single step in our pathway. The image above reflects our thoughts on redesigning the pathways and flow for our new touch-based experience. With our new direction and melded design, we are poised to improve the overall experience while, hopefully, flattening the learning curve required to navigate our UI.
Gesture-Based Interaction Design
This week our project received tangible support from Tim Franklin, Lead Interaction Designer for the Xbox One. Tim met with two of our team members to give feedback on our current prototype ideas, discuss the limitations of the Xbox 360 and Xbox One sensors, and explain the potential of using an Xbox One Sensor instead of an Xbox 360 Sensor. Tim then helped the Bixcreen team apply for an internal Kinect for Windows v2 (K4Wv2) Software Development Kit (SDK). Since two of our team members already work at Microsoft, we were instantly approved for two Kinect for Windows v2 kits, which we received only two days after our meeting with Tim. This is particularly exciting given that the K4Wv2 SDK won't be available to the public until the middle of this summer! Big kudos to Tim!
With the new sensor available to our team, new functionality can be implemented into our design, so we will need to explore our newly available options. These include upgraded hand states (particularly grip/release and press functionality), a more robust voice detection system and mic array, access to more gesture options, more stable cursor control, and a longer, more accurate skeletal tracking range (from 3.5 m to 4 m). This week we explored a Four Corners approach to gesturing, in which the user raises one arm toward a corner of the screen to identify the ticket type: Adult, Child, Senior, or Student. The Unique Gesture model instead incorporates four distinct poses to select a particular movie ticket type. The illustrations above show these gesture models, starting with the Unique Gesture method followed by the Four Corners method.
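To make the Four Corners idea concrete, here is a minimal sketch (in Python, for illustration only) of how a raised hand might be mapped to a ticket type. It assumes joint positions have already been read from the sensor in camera-space meters; the function name, the `REACH` threshold, and the corner-to-ticket mapping are all hypothetical placeholders, not our final design.

```python
# Illustrative sketch of the Four Corners classification logic.
# Assumes hand and spine joint positions (in meters) have already
# been read from the sensor each frame.

from typing import Optional, Tuple

# Hypothetical corner-to-ticket assignment (not our final mapping).
CORNER_TO_TICKET = {
    ("left", "upper"): "Adult",
    ("right", "upper"): "Child",
    ("left", "lower"): "Senior",
    ("right", "lower"): "Student",
}

REACH = 0.25  # assumed minimum offset (m) from the chest to count as a reach


def classify_ticket(hand: Tuple[float, float],
                    spine: Tuple[float, float]) -> Optional[str]:
    """Map a hand extended toward a screen corner to a ticket type.

    Returns None when the hand is too close to the body midline,
    i.e. no deliberate corner gesture is being made.
    """
    hx, hy = hand
    sx, sy = spine
    dx, dy = hx - sx, hy - sy
    # Require the hand to be clearly offset both horizontally and
    # vertically, so resting or ambiguous poses select nothing.
    if abs(dx) < REACH or abs(dy) < REACH:
        return None
    side = "left" if dx < 0 else "right"
    vertical = "upper" if dy > 0 else "lower"
    return CORNER_TO_TICKET[(side, vertical)]
```

For example, a hand held well up and to the left of the chest would select the upper-left corner's ticket type, while a hand hanging near the body would select nothing at all. A dead zone like `REACH` is one simple way to avoid accidental selections while the user is just standing in front of the screen.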
We closed out the week by putting the finishing touches on the two prototypes for Monday's pilot usability test. The task lists are written, the technology is ready, the time has come. Stay tuned for next week!