Pre-Conference Workshops & User Group Meetings

Prior to ECEM 2017, several pre-conference workshops will be offered that will give students and researchers the opportunity to delve deeply into special topics relevant to eye movement research. The workshops will start on Saturday, August 19th at 9:00am. They will be scheduled as two three-hour sessions plus a 30-minute break (i.e. 6 hours in total, from 9:00am to 12:30pm and from 1:30pm to 4:30pm). Between 12:30pm and 1:30pm, lunch will be available. Some workshops will continue on the morning of Sunday, August 20th, from 9:00am to 12:30pm.

For a workshop to take place, at least 15 participants need to register (60 maximum). The registration fee is 70€ for 6-hour workshops and 90€ for 9-hour workshops.

Workshop enrollment will open with the conference registration phase, starting on the 15th of May.

The following pre-conference workshops will be hosted:

Programming and analysing eye-tracking experiments in Python → 6h, room: Senatssaal [K.11.07]
Eye tracking in Mixed or Virtual Realities → 6h, room: K6 [K.11.17]
Analysing Eye Movement Experiments using Linear Mixed Models in R → 9h, room: K8 [K.11.10]
Combining Eye-Tracking and EEG: An Introduction → 4-5h, room: K5 [K.11.20]
COGAIN Workshop: Gaze interaction - methods and best practices → 6h, room: K3 [K.12.18]

User Group Meetings

Tobii user group meetings will be held in room 3 (HS 28).
SR Research user group meetings will be held in room 4 (HS 26).

A detailed map of the pre-conference area can be found here:äne/Grifflenberg-ZentrRäume.pdf

Programming and analysing eye-tracking experiments in Python (6h, room: Senatssaal [K.11.07])

Edwin Dalmaijer, University of Oxford, UK

In this hands-on workshop, you will learn how to script eye-tracking experiments and analyses in Python. This versatile and open-source programming language is free, easy to use, and very popular in science and industry. We will start from a beginner's level, so no prior programming experience is required. The programme covers the basics of Python, and will build up to a point where you will use the PyGaze and PsychoPy libraries to create a simple experiment. After collecting some (simulated) data, we will turn to analysis and visualisation using the NumPy, SciPy, and Matplotlib packages. This will be a brief introduction to how to handle common file formats, how to process pupil data, and how to detect and visualise fixations. At the end of the workshop, you should have a basic proficiency in Python, on which you can build further knowledge gained from documentation or more advanced courses. All the required software and materials will be made available, but please do bring your own computer.
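To give a flavour of the kind of analysis the workshop builds toward, the sketch below (an illustration for this announcement, not actual workshop material) implements a minimal dispersion-based (I-DT) fixation detector with NumPy; the threshold values are arbitrary examples, not recommendations:

```python
import numpy as np

def detect_fixations(x, y, t, max_dispersion=35.0, min_duration=0.1):
    """Minimal dispersion-based (I-DT) fixation detector.

    x, y: gaze coordinates in pixels; t: timestamps in seconds.
    max_dispersion: (max(x)-min(x)) + (max(y)-min(y)) limit in pixels.
    min_duration: minimum fixation duration in seconds.
    Returns a list of (start_time, end_time, centroid_x, centroid_y).
    """
    fixations = []
    start, n = 0, len(t)
    while start < n:
        # Grow the window until the dispersion exceeds the threshold.
        end = start + 1
        while end < n:
            xs, ys = x[start:end + 1], y[start:end + 1]
            if (xs.max() - xs.min()) + (ys.max() - ys.min()) > max_dispersion:
                break
            end += 1
        if t[end - 1] - t[start] >= min_duration:
            xs, ys = x[start:end], y[start:end]
            fixations.append((t[start], t[end - 1], xs.mean(), ys.mean()))
            start = end  # continue after the detected fixation
        else:
            start += 1  # too short: slide the window forward
    return fixations
```

Real eye-tracking data would of course need blink handling and calibration-aware thresholds, which is exactly the kind of detail the workshop covers.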

The workshop will be taught by Edwin Dalmaijer (University of Oxford), who is the developer of several useful software packages, including PyGaze and parts of OpenSesame.

Eye tracking in Mixed or Virtual Realities (6h, room: K6 [K.11.17])

Gabe Diaz, Rochester Institute of Technology, USA

The ability to incorporate eye tracking into computationally generated contexts presents new opportunities for research into gaze behavior. The aim of this workshop is to help interested researchers incorporate eye-tracking technology into their laboratories, and into their next funding proposals.

The full list of anticipated topics includes:
- A review of existing equipment.
- Associated costs.
- A quick overview of some related research.
- What existing analysis software can and cannot do for you.
- Methods for measuring and reducing spatial and temporal inaccuracies.
- Algorithms for identifying what someone is looking at.
- Algorithms for identifying where someone is looking relative to a fixed point in space.
- Algorithms for the classification of gaze events (e.g. fixation, head tracking, VOR, etc.).
- Relevant metrics to include in your next publication involving eye tracking in VR.
- What types of students to hire, and what skills they should develop.
- Issues that may be raised by shrewd reviewers of your related funding proposal, and how to address them.
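On the topic of gaze-event classification, a common baseline (sketched here as a generic illustration, not as the workshop's own algorithms) is a velocity-threshold (I-VT) classifier that labels each sample as saccade or fixation from its angular velocity; the 30 deg/s threshold is an arbitrary example value:

```python
import numpy as np

def classify_ivt(x, y, t, velocity_threshold=30.0):
    """Minimal velocity-threshold (I-VT) sample classifier.

    x, y: gaze angles in degrees; t: timestamps in seconds.
    Samples whose point-to-point angular velocity exceeds
    velocity_threshold (deg/s) are labelled 'saccade', all others
    'fixation'. Returns one label per sample (the first sample
    inherits the second sample's label).
    """
    dx, dy, dt = np.diff(x), np.diff(y), np.diff(t)
    velocity = np.sqrt(dx ** 2 + dy ** 2) / dt
    labels = np.where(velocity > velocity_threshold, "saccade", "fixation")
    return np.concatenate([labels[:1], labels])
```

In VR this gets harder, because head movement, VOR, and pursuit all blur the picture; velocity alone is then no longer sufficient, which is why the workshop devotes a topic to it.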

Because example data and code will be provided in the form of Jupyter notebooks, attendees are encouraged to arrive with a copy of Continuum’s Anaconda installed on their machine.

This presentation is intended to educate and facilitate adoption of an emerging technology. If a topic you are interested in is not listed, or if you are a researcher or manufacturer who feels you can contribute knowledge on one or more of these topics (in the form of literature), you are encouraged to contact the organizer at gabriel.diaz{at}

Analysing Eye Movement Experiments using Linear Mixed Models in R (9h, room: K8 [K.11.10])

Denis Drieghe, University of Southampton, UK

Linear mixed models (LMMs) are an increasingly popular way to analyse eye movement data, as they can simultaneously account for multiple uncontrolled effects in the data, such as individual variability or variability in the stimuli. In this workshop, we will focus on replacing analyses of eye movement data based on the general linear model (e.g. ANOVAs, regressions) with analyses based on LMMs.

The workshop will start by exploring multiple regression in R, since from there it is a logical step to venture into LMMs. No pre-existing knowledge of LMMs is required. However, participants will be expected to have mastered the basic skills of working in R: getting data into R, accessing variables and managing subsets of data, using simple functions, and basic plotting tools. If you still need to acquire these skills before the workshop, there are many excellent free guides for beginners available online. If you prefer to work from a book, a fairly inexpensive one is by Zuur, Ieno and Meesters (2009), which is available in paperback, as a Kindle edition, and in a Chinese translation.

The workshop will focus on introducing LMMs and on analyzing experiments with relatively simple designs (no more than three continuous or categorical factors). The goal is not to cover a comprehensive set of complex designs, but to spend a considerable part of the workshop on exercises with the designs that are covered.
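The workshop itself works in R, but the core idea translates directly to other environments. As a hedged illustration (my sketch, with hypothetical simulated data, not workshop material), the snippet below fits a random-intercept model, the Python statsmodels analogue of lme4's `duration ~ condition + (1 | subject)`:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical data: 20 subjects x 2 conditions x 10 trials, with a
# random intercept per subject and a true condition effect of 30 ms.
rng = np.random.default_rng(0)
subjects = np.repeat(np.arange(20), 20)
condition = np.tile(np.repeat([0, 1], 10), 20)
subject_offset = rng.normal(0, 20, 20)[subjects]
duration = 200 + 30 * condition + subject_offset + rng.normal(0, 10, 400)
data = pd.DataFrame({"subject": subjects,
                     "condition": condition,
                     "duration": duration})

# Random-intercept LMM: fixed effect of condition, grouping by subject.
model = smf.mixedlm("duration ~ condition", data, groups=data["subject"])
result = model.fit()
print(result.params["condition"])  # fixed-effect estimate for condition
```

The point of the LMM, as the workshop stresses, is that subject (and, with crossed random effects, stimulus) variability is modelled explicitly rather than averaged away as in an ANOVA over subject means.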

Alain Zuur, Elena Ieno, & Erik Meesters (2009). A Beginner’s Guide to R. Springer. ISBN: 978-0387938363

Combining Eye-Tracking and EEG: An Introduction (4-5h, room: K5 [K.11.20])

Olaf Dimigen, Humboldt-Universität zu Berlin, Germany

Update (May 11th), target audience & workshop duration: The workshop is aimed at beginning users of this method combination, i.e., students and researchers who do not have prior experience with running combined eye-tracking/EEG experiments, but who would like to learn more about the possibilities and limitations of this still relatively new methodological approach. The duration of the workshop will be 4-5 hours.

Although natural vision involves 2-4 saccades per second, most EEG data are recorded under the rather artificial condition of sustained visual fixation. An alternative approach to EEG analysis, summarized in the present workshop, is to co-record EEG and eye-tracking data during more natural viewing situations and to use the on- or offsets of eye movements as time-locking points for the EEG signal, yielding saccade- and fixation-related potentials (SRPs/FRPs). However, recording high-resolution eye movements alongside the EEG is also useful in other, more traditional EEG research contexts, e.g. for controlling fixation, for detecting hidden signal distortions from microsaccades, for co-recording the pupil diameter, for using saccades as extremely fast behavioral responses, or for enhancing EEG-based brain-computer interfaces.

The workshop aims to provide a basic introduction to this method combination and its advantages, existing limitations, and applications in oculomotor and neurocognitive research. Talks will cover the relevant theoretical and practical issues, including the historical and recent development of this technique, the scalp-recordable subcomponents of SRPs and FRPs, and the various methodological challenges related to conducting and analyzing co-registration experiments (e.g. laboratory setup, hardware requirements, suitable processing pipelines, data synchronization and integration, ocular artifact correction, control of confounds, and response modelling). I will also show examples from several lines of research (e.g. microsaccades, sentence reading, scene viewing) on how this technique can be used to gain new insights. The hands-on part of the workshop will start with a brief introduction to the EEGLAB toolbox. We will then together analyze simple, co-registered eye-tracking/EEG datasets using a combination of EEGLAB, the EYE-EEG toolbox, and custom MATLAB scripts.
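The hands-on part uses EEGLAB/EYE-EEG in MATLAB; purely as an illustration of the underlying idea of fixation-related potentials (a sketch under assumed array shapes, not workshop code), the core epoch-and-average step looks like this:

```python
import numpy as np

def fixation_related_potentials(eeg, fixation_onsets, srate,
                                tmin=-0.2, tmax=0.6):
    """Cut fixation-locked epochs from continuous EEG and average them.

    eeg: (n_channels, n_samples) continuous EEG.
    fixation_onsets: fixation onset times in samples, assumed already
    synchronized to the EEG clock (e.g. via shared trigger pulses).
    Returns (epochs, frp): epochs is (n_epochs, n_channels, n_times),
    frp is the average over epochs (the fixation-related potential).
    """
    pre = int(round(tmin * srate))   # negative: samples before onset
    post = int(round(tmax * srate))  # samples after onset
    epochs = []
    for onset in fixation_onsets:
        start, stop = onset + pre, onset + post
        if start >= 0 and stop <= eeg.shape[1]:  # skip edge-crossing epochs
            epoch = eeg[:, start:stop]
            # Baseline-correct using the pre-fixation interval.
            epoch = epoch - epoch[:, :-pre].mean(axis=1, keepdims=True)
            epochs.append(epoch)
    epochs = np.stack(epochs)
    return epochs, epochs.mean(axis=0)
```

The hard parts the workshop actually addresses, such as clock synchronization, overlap between successive fixations, and ocular artifact correction, all happen before and around this simple step.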

To benefit from the workshop, it is helpful if participants have some basic familiarity either with the acquisition/analysis of EEG data or the acquisition/analysis of eye-tracking data. Some programming experience in MATLAB is also recommended, but not required. For an optimal experience, bring your own laptop with MATLAB (version 2010a or newer) and the latest version of EEGLAB installed on it.

Please email me at olaf.dimigen{at} if you have specific questions that you would like to see covered in the workshop.

COGAIN Workshop: Gaze interaction - methods and best practices (6h, room: K3 [K.12.18])

Carlos H. Morimoto, University of São Paulo, Brazil

Gaze-based interfaces have been around for decades. Though very successful in helping people with disabilities, the use of eye movements in general-purpose computer applications is still very challenging. From Jacob's seminal "What you look at is what you get" work, gaze-based interfaces have come a long way, along with eye-tracking technology. Today, lightweight wearable eye trackers might finally enable everyday gaze interaction applications. This course will give you insight into the major gaze interaction paradigms and some practical experience in designing novel gaze-based interfaces.

Target Audience:
Beginners and intermediate eye-tracking researchers interested in designing gaze interaction systems.

Main Topics:
• Taming eye trackers for computer interaction
• History of gaze interaction: methods and applications
• Gaze interaction paradigms: avoiding the Midas touch problem
• User interface design guidelines
• Usability vs User Experience: beyond speed
• Prototyping gaze-based interaction systems
• Discount evaluation methods for gaze-based interfaces
• Gaze interaction design workshop: practice exercise
• Wrap-up
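As a taste of the Midas touch topic (a generic illustration for this announcement, not course material), the classic mitigation is a dwell-time trigger: a target is selected only after gaze rests on it continuously for some interval, so merely looking around activates nothing. A minimal sketch, with hypothetical target rectangles and an arbitrary 0.8 s dwell:

```python
def dwell_select(gaze_samples, targets, dwell_time=0.8):
    """Minimal dwell-time selection trigger.

    gaze_samples: iterable of (timestamp_seconds, x, y).
    targets: dict mapping name -> (x, y, w, h) screen rectangles.
    Returns the list of (timestamp, target_name) selections.
    """
    selections = []
    current, since = None, None
    for ts, x, y in gaze_samples:
        # Which target, if any, does this sample fall on?
        hit = None
        for name, (tx, ty, w, h) in targets.items():
            if tx <= x < tx + w and ty <= y < ty + h:
                hit = name
                break
        if hit != current:
            current, since = hit, ts  # gaze moved: restart the dwell timer
        elif hit is not None and since is not None and ts - since >= dwell_time:
            selections.append((ts, hit))
            since = None  # gaze must leave and return before reselecting
    return selections
```

Dwell time trades speed for safety, which is exactly the usability-versus-user-experience tension listed among the course topics.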
