When we ask “Why?”, in science, we actually mean “How?”
In stage two of the project, I put together a conceptual prototype that drew on a handful of cutting-edge research findings:
In this third stage of the project, with feedback from the professor and classmates, we identified an interesting aspect of the concept to prototype that has not received much attention in the research community: textural displays. More specifically, we identified the essential interaction that the textural display facilitates: using haptic feedback to quickly get non-nominal (ordinal, interval, ratio, etc.) information at a glance. We also identified different methods that could achieve similar results.
Feedback from textural display prototypes
The interaction raised the question: Which manifestations of textural displays would be most useful? To answer this question, I quickly put together three low-fidelity prototypes to get informal feedback from MUX lab members and other grad students.
In the image, the prototypes are blocks of balsa wood covered with masking tape. On the left, the bubble prototype has an indentation, highlighted in light blue, that simulates an inflatable diaphragm to render non-nominal data. The prototype could then be rotated to a different corner of the balsa wood block with bubble wrap underneath the tape to simulate different levels of inflation.
On the right, instead of using bubble wrap, I used sandpaper to simulate a “roughness” display. Highlighted in turquoise is one grade of sandpaper, while the black sandpaper on the bottom half of the block is another grade. The “active” part of the display (one grade of sandpaper) would be exposed while the other grades would be covered with tape. As with the bubble prototype, the block is rotated to simulate changes in the display’s roughness.
Finally, in the middle, a piece of metal (highlighted purple) is slid up and down over exposed balsa wood (coloured yellow) to simulate a slider display. When rendering non-nominal data, the experimenter would need to move the slider themselves to simulate an active display.
The feedback was fairly consistent:
There were a few individual differences as well: some people thought it was easier to tell apart different levels using sandpaper, while others thought it would be easier with bubbles. One big downside of using sandpaper (aside from its unpleasant texture) is that people’s haptic abilities vary widely and decline over time: different people feel textures with varying accuracy and, as we age, our ability to distinguish textures becomes less reliable. Thus, an inflatable diaphragm or a roughness display may not be suitable for most use cases.
On the other hand, the slider prototype, using clearly distinguishable textures for the slider and the exposed surface, would offer the best of all three prototypes: the slider renders non-nominal data well, and the textured surfaces could potentially allow users to quickly interpret the display at a glance, especially if the display’s surface area fits under a finger.
Preparing for a pilot research study
To wrap up the project, I’ll take these preliminary findings and run a small pilot study. After reading the experiment by Hameed et al. as a starting point and discussing the experimental design with Oliver, here’s an outline of the experiment.
For the purposes of this study, “quickly” means under 2 seconds. The procedures are as follows:
The 2-back test consists of presenting the participant with a series of images and asking them to respond whenever the current image matches the one shown two images earlier. There’s an excellent example at the cognitive fun! website.
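As a minimal sketch (not the actual study software), the core 2-back matching logic can be expressed as follows; the function name and example stimuli are my own illustration:

```python
def two_back_matches(stimuli):
    """Return the indices where the current stimulus matches
    the one presented two positions earlier (2-back hits)."""
    return [i for i in range(2, len(stimuli))
            if stimuli[i] == stimuli[i - 2]]

# Example: positions 2 and 4 repeat the stimulus shown two steps before.
sequence = ["cat", "dog", "cat", "bird", "cat", "fish"]
print(two_back_matches(sequence))  # → [2, 4]
```

In the study, a participant’s clicks would be compared against these indices to score hits, misses, and false alarms.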
This is a controlled within-subjects experiment with cognitive load (2-back test) and time as the independent variables, and deviation from the rendered value as the dependent variable. The data will be analyzed using a two-way ANOVA to determine the main and interaction effects. The timer data will be binned (1, 2, 4 seconds, etc.) to satisfy the ANOVA requirement for categorical independent variables.
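As a quick illustration of the binning step (the exact bin edges here are an assumption, not the final design), continuous timer readings can be mapped to categorical levels like so:

```python
def bin_time(seconds, edges=(1, 2, 4)):
    """Map a continuous response time (in seconds) to a categorical
    bin label so it can serve as a categorical factor in the
    two-way ANOVA. Bins: <=1s, <=2s, <=4s, and >4s."""
    for edge in edges:
        if seconds <= edge:
            return f"<={edge}s"
    return f">{edges[-1]}s"

print(bin_time(0.8))  # → <=1s
print(bin_time(3.1))  # → <=4s
print(bin_time(5.0))  # → >4s
```

Each observation would then carry two categorical factors (cognitive-load condition and time bin) plus the continuous deviation measure, which is the shape a two-way ANOVA expects.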
Hopefully, I’ll have enough time to build the prototypes, the 2-back with prompt and record system, recruit participants, and run the study in time for next Wednesday’s presentation!
 Gray, R., Spence, C., Ho, C., and Tan, H. Z. Efficient multimodal cuing of spatial attention. Proceedings of the IEEE 101, 9 (2013), 2113–2122.
 Hameed, S., Ferris, T., Jayaraman, S., and Sarter, N. Using informative peripheral visual and tactile cues to support task and interruption management. Human Factors: The Journal of the Human Factors and Ergonomics Society 51, 2 (2009), 126–135.
 Mutlu, B. Human-Computer Interaction: Experimental Design. University of Wisconsin-Madison. 9 April 2014. <http://hci.cs.wisc.edu/courses/hci/lectures/fall2011/HCI-Week05-Lecture06.pdf>
 Sarter, N. Multimodal support for interruption management: Models, empirical findings, and design recommendations. Proceedings of the IEEE 101, 9 (Sept 2013), 2105–2112.
We believe each other into being.