I recently recorded a new 10-minute walkthrough of the Tobii Studio 3.2 eye tracking software, in which I present all the main parts of the software. Tobii also recently made it possible to use Tobii Studio 3.2 for free for 30 days. The trial version does not support eye tracking, but you can use the mouse to simulate an eye tracker. The trial version also comes with a sample project that includes 20 recordings, so you can play around with all the features in Tobii Studio even if you don’t have access to an eye tracker or your own data. You can download the trial version of Tobii Studio and the sample project here.
We spend more and more time on mobile devices such as tablets and smartphones, accessing web pages and using apps and games. From a design, usability, and user experience perspective, it is worth keeping in mind that mobile devices are not the same as computers. We use them in quite different ways: the interaction is different (usually a touch screen instead of keyboard and mouse), the screen and the device are smaller and handheld, and the general usage is different – we use them more often and in shorter sessions. Users also seem to have higher expectations of the user experience of mobile apps, as revealed for example in a 2010 study by Harris Interactive on behalf of Effective UI: 73% of mobile app users agree that they expect a company’s mobile app to be easier to use than its website. All in all, this increases the need to perform user tests on content designed for, and used on, mobile devices. However, conducting usability tests on mobile devices is slightly more challenging than conducting them on a computer. These challenges are mainly practical and technical, but also methodological.
Tobii just launched a new solution for user testing on mobile devices and tablets: the Tobii Mobile Device Stand for X2. This is a follow-up to the existing mobile device stand for the Tobii X60/X120 Eye Trackers. The new mobile device stand is only compatible with the new small Tobii X2 Eye Trackers, the X2-30 and X2-60 Compact Editions. The major improvement is that, thanks to the small size of the eye tracker, it is no longer necessary to place the eye tracker upside down as in the old solution; this makes the solution easier to use and also makes the eye tracking more robust and accurate.
The stand includes everything needed to start testing content on mobile devices: a stand where the mobile device can be attached and rotated, a robust scene camera holder with an HD scene camera that records the screen of the mobile device, a calibration plate used when calibrating the participant, and various other components. Since the eye tracker tracks anything placed on the stand and the scene camera records the object, the stand can be used to test pretty much anything – even books, brochures, and papers, as long as they fit on the stand (up to 31.9 cm (12.6”) in height). In the Tobii Studio software the eye tracking data is automatically overlaid on top of the scene camera video, so you can use all the available analysis tools.
I have tried the new solution a few times, and it is easy to use and can be assembled quickly. It is possible to use the stand in eight different configurations, but usually one configuration works for most studies and setups. Thus, to use it you only need to assemble the stand, connect the eye tracker to the Tobii Studio computer, and enter a few values in the Tobii X-config tool. It is also straightforward to use during a recording, and the procedure is pretty much the same as when conducting an eye tracking test on a screen; the main difference is that the calibration procedure is manual – you have to ask the test participant to look at five dots on the calibration plate.
Learn more about how to perform user tests on mobile devices
For advice on how to conduct usability studies in general, and with eye tracking, on mobile devices, you can read the Tobii white paper Using the Tobii Mobile Device Stand in Usability Testing on Mobile Devices. Please keep in mind that it was written for the previous mobile device stand, and many of the technical limitations mentioned are no longer valid. The paper nevertheless gives a good introduction to usability testing and eye tracking on mobile devices and provides a set of 44 practical methodological guidelines. It covers all the important steps involved in planning, the actual testing, and the analysis of the collected eye tracking data. You can download the paper as a PDF here.
To learn more about the Tobii Mobile Device Testing Solution, you can watch one of the two videos below (one introducing the solution, and a longer training video explaining how to assemble and use the Mobile Device Stand for X2), or visit the Tobii website: http://www.tobii.com/mobile-testing
Introducing: Mobile Device Testing Solution for Tobii X2 Eye Trackers:
How to assemble and use the Mobile Device Stand for Tobii X2 Eye Trackers:
I have recently visited a few labs that use Tobii eye trackers together with an EEG system from EGI and E-Prime, and in this blog post I want to share some information about how you can combine these in the same E-Prime experiment.
E-Prime supports both Tobii eye trackers and EGI EEG equipment via two different extensions (E-Prime Extensions for Net Station and E-Prime Extensions for Tobii). Using the Tobii extension, E-Prime can communicate directly with a Tobii eye tracker and also record the eye tracking data in E-Prime (as a tab-delimited text file). The Net Station extension for E-Prime can send events, such as responses and stimulus names, from E-Prime to the EEG acquisition computer running the Net Station software from EGI.
To use a Tobii eye tracker and an EGI EEG system in the same experiment, a two-computer setup is needed: one PC running E-Prime with both extensions installed, and one Mac running Net Station (see picture below). The eye tracker is connected to the E-Prime computer via Ethernet (for example using the USB-Ethernet adapter provided by Tobii). Additionally, the Net Station and E-Prime computers are connected via another Ethernet connection (a direct LAN cable connection is recommended). In this setup, E-Prime presents the stimuli, collects and saves the eye tracking data, and sends trial information to Net Station about which stimulus was presented and when. See the setup overview below:
In E-Prime, both the Tobii and the Net Station extensions should be installed and configured correctly; at a minimum, you need to add the IP address of the Net Station Mac and the name/serial number of the Tobii eye tracker. In the E-Prime experiment, the Tobii and Net Station package calls should be combined, along with a few additional InLine scripts for both Net Station and Tobii. The Net Station package calls send information about the trials to the Net Station Mac, while the Tobii package calls calibrate the eye tracker and record the eye tracking data. This gives you two different data files (eye tracking data and EEG data) with identical event information (time stamps and which stimulus was presented when) in both files, which should make the combined analysis of the data easier.
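As a rough illustration of the combined analysis this enables, the sketch below aligns shared trial events from the two recordings on a common timeline. Note that the event labels, timestamps, and data layout here are invented for illustration only; the real files come from E-Prime's and Net Station's own export formats.

```python
# Illustrative sketch: aligning trial events logged by two systems
# (eye tracking on the E-Prime PC, EEG on the Net Station Mac).
# All labels and timestamps below are hypothetical examples.

def align_events(eye_events, eeg_events):
    """Pair events with the same label and return, per label,
    the clock offset (EEG time minus eye tracking time)."""
    eeg_by_label = {label: t for label, t in eeg_events}
    pairs = []
    for label, t_eye in eye_events:
        if label in eeg_by_label:
            pairs.append((label, eeg_by_label[label] - t_eye))
    return pairs

# Hypothetical event logs: (stimulus label, timestamp in ms on each clock)
eye_events = [("img01", 1000), ("img02", 3000), ("img03", 5000)]
eeg_events = [("img01", 2500), ("img02", 4500), ("img03", 6500)]

offsets = align_events(eye_events, eeg_events)
# A constant offset means the two clocks differ only by a fixed shift,
# so fixations and EEG epochs can be mapped onto one common timeline.
for label, offset in offsets:
    print(label, offset)
```

The key point is that because both files contain the same event labels, matching them is trivial; without shared events you would have to synchronize the two systems' clocks some other way.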
Disclaimer! Please note that I have only tested this script and this setup in a couple of different labs. If you intend to use the experiment for real, please make sure you conduct adequate testing of your setup and the script. I also do not know whether this way of combining the Net Station and Tobii extensions in E-Prime is officially supported and tested by the makers of E-Prime. To be able to run the sample script you must have purchased and installed both the Net Station and Tobii extensions for E-Prime.
To learn more about how to combine eye tracking with different EEG systems and similar devices, please visit Estefania Dominguez’s blog at http://gazesync.com
A very simple way to illustrate the additional information you get in a usability study by using eye tracking is to watch the video below (http://bit.ly/gazemouse), which I just uploaded to YouTube. In this video you observe a person completing a task on a website (www.spotify.com); the task was to sign up for the paid service (called Premium). First you will see the person completing the task without the person’s eye movements shown; only the mouse movements can be seen. In the second video you can also see the eye movements in addition to the mouse movements.
As you can see in the video, the eye tracking data does provide more information about the person’s behavior while completing this task. In this study we found a clear pattern: even though several participants in the study clicked on the right button (Premium), they hesitated and compared all the available sign-up options (four in total) on the web page. Based on this insight, the web page has now been optimized and has only one sign-up option (see www.spotify.com), instead of the four on the old web page.
In the same study we also compared the value of different cues when using the retrospective think aloud (RTA) method in web usability testing: an un-cued RTA, a video-cued RTA, a gaze plot-cued RTA, and a gaze video-cued RTA. The findings suggest that using a gaze plot or gaze video cue stimulates participants to produce the highest number of words and comments, and to mention more usability problems during the interview. Read the paper here (pdf).
Returning to the Premium problem described above, none of the participants in the un-cued group mentioned anything about this problem during the interview. They just said the web page was easy to use and that they had no comments about the first page. In other words, without the additional information that eye tracking provided, we would not have spotted this issue on the Spotify web page.
Today, June 22, 2010, Tobii launched a revolutionary new eye tracker: Tobii Glasses (see http://www.tobiiglasses.com). This is the first head-mounted, or wearable, eye tracker from Tobii. Just like the other eye trackers from Tobii, Glasses takes eye tracking to the next level. Unlike other wearable eye tracking systems, Glasses is very lightweight (the glasses weigh 75 grams and the recording assistant 200 grams) – just like wearing a pair of sunglasses and carrying an iPod. But that is not the only revolutionary thing about Tobii Glasses: the real leap forward is the ease of use and the ability to aggregate eye tracking data, which enables you to conduct quantitative eye tracking studies – something truly unique for a wearable eye tracker. To make this possible, Tobii has developed a new technology called AOA-Track, based on small markers that emit infrared light. By placing these IR markers around objects and/or areas you want to study in depth, Tobii Glasses automatically knows when a person is looking within these areas and automatically aggregates the data from all the participants in an eye tracking study. This is truly amazing! Previously, this kind of research required a lot of manual coding work; now it all happens automatically, and you can use the analysis software Tobii Studio to analyze the data just as if it had been recorded with a stationary eye tracker. In other words, you can create visualizations like heat maps and gaze plots, calculate statistics, etc., automatically with data from a Tobii Glasses recording.
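To make concrete what this kind of aggregation means, here is a toy sketch: given rectangular areas of interest and gaze samples from several participants, count how many samples fall inside each area. Everything here is invented for illustration; in a real Tobii Glasses study the areas are defined by the IR markers and the aggregation is done by Tobii Studio, not by hand.

```python
# Toy illustration of aggregating gaze data over areas of interest (AOIs)
# across participants. AOIs and gaze samples are hypothetical examples.

def hits_per_aoi(aois, recordings):
    """Count gaze samples falling inside each AOI, summed over participants."""
    counts = {name: 0 for name in aois}
    for samples in recordings:          # one list of (x, y) samples per participant
        for x, y in samples:
            for name, (x0, y0, x1, y1) in aois.items():
                if x0 <= x <= x1 and y0 <= y <= y1:
                    counts[name] += 1
    return counts

# Two hypothetical shelf areas and gaze samples from two participants
aois = {"shelf_area_A": (0, 0, 10, 10), "shelf_area_B": (20, 0, 30, 10)}
recordings = [
    [(5, 5), (25, 5), (50, 50)],   # participant 1 (last sample hits no AOI)
    [(6, 4), (7, 3)],              # participant 2
]

print(hits_per_aoi(aois, recordings))
```

These per-area counts are the raw material behind aggregate visualizations such as heat maps and AOI statistics.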
The device also includes a microphone and, of course, a camera filming everything the participant is looking at, so it can also be used for qualitative studies (you simply watch the recording as a video with the gaze point overlaid). It might even enable new research methodologies – what do you say about “Concurrent Think Aloud Walking” in usability research, for example?
To learn more about Tobii Glasses, please visit these resources online:
These are some examples from Tobii Glasses recordings we did in a Supermarket.
Watch a shopper in a supermarket being eye tracked in the YouTube video below
This gaze plot shows how a person searched for a product on the shampoo shelf. IR markers have been placed on the shelf to enable us to collect the data.
This is another shelf; the heat map shows the aggregated data from 30 recordings and reveals where people look when deciding which product to buy. IR markers have been placed on the shelf (try to find them!) to enable us to aggregate the data and produce this heat map.
This year we will have a super exciting conference on eye tracking in user experience research – EyeTrackUX 2010 – to be held in Leuven (near Brussels), Belgium on June 2-3, 2010, in co-operation with the Centre for User Experience Research (CUO) at IBBT / K.U. Leuven. We just launched the website at www.eyetrackux.com and the call for speakers, so register and submit your speaking proposal today! On June 1, the day before the conference, we will arrange a separate full-day course on how to use eye tracking in web UX research and present 7 successful methods you can use.
Additionally I’m happy to announce that we have a confirmed speaker from Google at EyeTrackUX 2010: Anne Aula, Senior User Experience Researcher. She will talk about:
How and why Google uses eyetracking in user experience research. By Anne Aula, Senior User Experience Researcher, Google Inc.
“In this talk, I will talk about the various ways in which we use eyetracking at Google. I will highlight the reasoning behind key methodological decisions – such as using eyetracking as a qualitative supplement to think aloud studies or running more controlled experiments where eyetracking is the main source of data. I will also give examples of more controlled experiments we’ve run to explore how changes in pupil size predict the relevance of search results and how eye and mouse movements are coordinated when users are scanning search results. Throughout the talk, I will focus on the main question: in a fast paced product development environment, when is eyetracking worth the trouble?”
A new library of eye tracking publications has been made available by Tobii. In this unique library you can find publications and papers on the use of eye tracking within many different research fields. Every item has a title, abstract, and keywords, and in most cases a link to the full-text paper, which makes it easy to find publications within your field of interest. The library already contains more than 270 publications, and more are added all the time. Access the library here: http://www.diigo.com/list/tobiieyetracking
There are a number of exciting Tobii eye tracking events being arranged this spring. Looking forward to seeing you all there!
Course: Designing behavioral experiments with Tobii eye trackers and eye tracking software
Date and time: Wednesday, May 26th 2010, 9:00am – 5:00pm
Place: Tobii Technology office, Danderyd, Stockholm, Sweden Registration and more information here
Conference: Emotions – Arousal – Pupil Dilation
Date and time: Thursday, May 27th 2010 09:00 – Friday, May 28th 2010 1:00pm
Place: Uppsala University, Uppsala, Sweden Registration and more information here
Course: Baking with Tobii: 7 delicious recipes for successful eye tracking use in web usability studies
Date and time: Tuesday June 1st 2010
Place: Leuven, Belgium Registration and more information here
Conference: EyeTrackUX 2010 – Tobii’s Eye tracking conference on User Experience
Date and time: Wednesday June 2nd 2010 – Thursday June 3rd
Place: K.U. Leuven, Leuven, Belgium Registration and more information here
March 22nd, 2010 | Category: Events
Tobii has now released an extensive new white paper describing how eye tracking can be used for user tests on mobile devices. There are several challenges when using an eye tracker together with a mobile phone, both in the physical setup and in human physiology. The paper explains the challenges and presents three different setups that work.
You can read and download the white paper Using eye tracking to test mobile devices on Scribd here
Recently, another video clip was released on YouTube by Herbstwerbung, showing how a Tobii X120 eye tracker was used to track eye movements when using an iPhone. Watch the clip below:
A lot of people have asked us for advice on how to use Tobii eye trackers to study user behavior on mobile devices like mobile phones and PDAs. Next week we will release an extensive white paper on this topic, in which we compare a few different eye tracker setups. We have already published a video on YouTube describing these setups and their pros and cons. Watch the clip below, and return to my blog next week to download and read the white paper.