OpenEDS Workshop on Eye Tracking for AR/VR
We are pleased to announce the fourth instalment of the OpenEDS workshop (openeds.org) on Eye Tracking for AR/VR, to be hosted at the annual ETRA conference (https://etra.acm.org/2022/) in Seattle between June 8th and 11th, 2022.
The OpenEDS workshop is hosted annually by the Eye Tracking Research team at Reality Labs Research.
How can I contribute?
1. Spread the word! openeds.org (https://research.facebook.com/meta-openeds-workshop-2022/)
2. Submit your paper! Key topics for the workshop are outlined below; other related areas are also encouraged.
3. Attend the workshop! Hear talks from leaders in the field (see below), build connections, and advocate for your eye tracking research goals for AR and VR.
Workshop Topics:
* Gaze estimation techniques that are robust to identity variance, sensor slippage and sensor noise
* Methods to incorporate uncertainty estimates for near-eye gaze tracking
* Fusing multi-sensor data streams for near-eye gaze tracking
* Multi-modal sensor interactions, for example combining eye tracking with head, body and hand tracking to improve interactions in AR/VR environments
* Privacy-preserving approaches to gaze tracking
* Eye movement analysis for biometrics, security and privacy
Confirmed Invited Speakers:
* Prof. Dr.-Ing. Dorothea Kolossa (https://www.ruhr-uni-bochum.de/ika/mitarbeiter/kolossa_conference_articles.htm), Cognitive Signal Processing Group, Ruhr University Bochum
* Dr. Melissa Hunfalvay (https://righteye.com/staff/dr-melissa-hunfalvay/), Chief Science Officer & Co-Founder, RightEye, LLC
* David Zee, MD, Paul and Betty Cinquegrana Professor, Departments of Neurology, Neuroscience, Ophthalmology, Otolaryngology, The Johns Hopkins University School of Medicine
More information, paper submission and registration:
https://research.facebook.com/meta-openeds-workshop-2022/
Michael J. Proulx, PhD
Reader in Psychology, University of Bath
Visiting Researcher, Reality Labs Research (Oculus/Facebook/Meta)
[log in to unmask] | Twitter: @MichaelProulx (https://twitter.com/MichaelProulx)
https://researchportal.bath.ac.uk/en/persons/michael-proulx
Director, Crossmodal Cognition Lab | https://sites.google.com/view/ccc-collective/home
Deputy Director, REal and Virtual Environments Augmentation Labs | http://www.bath.ac.uk/reveal
Co-Investigator, CAMERA 2.0 | https://www.camera.ac.uk/
UKRI CDT in ART-AI | https://cdt-art-ai.ac.uk/
--
EYE-MOVEMENT mailing list ([log in to unmask])
N.B. Replies are sent to the list, not the sender
To unsubscribe, etc. see http://www.jiscmail.ac.uk/files/eye-movement/introduction.html
Other queries to list owner at [log in to unmask]