EYE-MOVEMENT Archives
EYE-MOVEMENT@JISCMAIL.AC.UK
EYE-MOVEMENT June 2013

Subject: Call for Contributions: Challenge on Automatic Object Identification (AOI) and Tracking
From: Thies Pfeiffer <[log in to unmask]>
Reply-To: Eye-movement mailing list <[log in to unmask]>
Date: Wed, 12 Jun 2013 09:21:40 +0200
Content-Type: text/plain
Parts/Attachments: text/plain (235 lines)

[Apologies for multiple postings.]

1. Call for Challenge on Automatic Object Identification (AOI) and Tracking

as part of the

SAGA 2013:
1st INTERNATIONAL WORKSHOP ON SOLUTIONS FOR AUTOMATIC GAZE DATA ANALYSIS
  - uniting academics and industry.

24-26 October 2013, Bielefeld University, Germany
Cognitive Interaction Technology Center of Excellence

Workshop Website:
http://saga.eyemovementresearch.com/challenge/

===========================================================================

Important Dates:


August 15th, 2013:   Deadline for 2-page abstract sketching your
                      approach.
September 2nd, 2013: Notification of acceptance for challenge.
October 2nd, 2013:   Submission of the final abstracts and final
                      results.

October 24-26, 2013: Challenge results presentation takes place at the
                      SAGA 2013 Workshop at Bielefeld University,
                      Germany.

===========================================================================

We are very pleased to publish this call for challenge contributions as
part of the SAGA 2013 1st International Workshop on Solutions for
Automatic Gaze Data Analysis. The challenge will focus on software
solutions for automatic object recognition as a trailblazer for
vision-based object and person tracking algorithms. Automatic, real-time
recognition and tracking of objects or persons in video sequences is a
key requirement for many application fields, such as mobile service
robotics, Human-Robot Interaction (HRI), Computer Vision, Digital Image
Processing, autonomous assistance and surveillance systems (e.g., driver
assistance systems), and Eye Tracking. Applications range from tracking
objects (e.g., manipulation or recognition of objects in dynamic scenes)
and body parts (e.g., head or hand tracking for facial-expression and
gesture classification) to persons (e.g., person re-identification or
visual following).

Although many efficient tracking methods have been introduced for
different tasks in recent years, most are restricted to particular
environmental settings and therefore cannot be applied to general
application fields. This is due to a range of factors: 1.) Often,
underlying assumptions about the environment cannot be met, including a
static background, constant lighting, and homogeneous or invariant
appearances. These idealized conditions rarely hold for object tracking
in highly dynamic environments, such as those common in mobile
scenarios. 2.) Object models cannot be applied because of the high
variance in the appearance of tracked persons or objects. 3.) Most
algorithms are computationally quite expensive (large systems often
impose hard computational constraints on the algorithms used).

===========================================================================

Details on the SAGA 2013
CHALLENGE on Automatic Object Identification (AOI) and Tracking:


In order to drive research on software solutions for the automatic
annotation of videos, we offer a special challenge on this topic.
The purpose of the challenge is to encourage the community to work on a
set of specific software solutions and research questions and to
continuously improve on earlier results obtained for these problems over
the years. This will hopefully not only push the field as a whole and
increase the impact of work published in it, but also contribute open
source hardware, methods and data analysis software back to the
community.

For the challenge we address this topic on the basis of eye-tracking
data. To this end, we provide a set of test videos (each 2-3 minutes
long) and separate text files with the corresponding gaze data on the
workshop website, for which solutions should be written. These gaze
videos, recorded by a scene camera attached to an eye-tracking system,
show people looking at objects or interacting with them in mobile
applications. The gaze data contains a time-stamped list of x- and
y-positions of the gaze points (in the coordinate system of the scene
video). For selected videos, frame counter information will also be
available to assist with synchronization of the video and the gaze data.
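As an illustration, gaze data of this kind can be read with a few lines
of code. The exact column layout of the challenge files is defined on the
workshop website; the whitespace-separated "timestamp x y" format and the
function name below are assumptions made for this sketch.

```python
# Sketch: parsing a time-stamped gaze data file. Assumed (hypothetical)
# format: one whitespace-separated "timestamp_ms x y" record per line;
# the actual column layout is specified on the workshop website.
from typing import List, Tuple

def load_gaze_data(path: str) -> List[Tuple[float, float, float]]:
    """Return a list of (timestamp_ms, x, y) gaze samples."""
    samples = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 3:
                continue  # skip blank or malformed lines
            t, x, y = (float(v) for v in parts[:3])
            samples.append((t, x, y))
    return samples
```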

For the challenge we are looking for semi- and fully-automatic software
solutions for the recognition and tracking of objects over the whole
video sequence. The software should provide the coordinates of the
tracked objects and use this information, together with the time-stamped
list of 2D gaze coordinates in the eye-tracking file, to automatically
calculate object-specific gaze data, such as the number of fixations and
cumulative fixation durations. There are no restrictions on how the
relevant objects are marked or on which techniques are used to track
them. The only constraint is that your software solution must be able to
read and process the provided videos and report gaze-specific data for
the selected objects, either as a text file (which can serve as input
for a statistical program such as SPSS, Matlab, R or MS Excel) or via
some kind of visualization.
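To illustrate the kind of object-specific gaze data meant here, the
sketch below aggregates fixation counts and cumulative fixation
durations per object. It assumes fixations have already been detected
from the raw gaze samples and, for brevity, that each object is
represented by a single static bounding box; in practice the tracker
would supply one box per frame. All names are illustrative, not part of
the challenge specification.

```python
# Sketch: aggregating object-specific gaze statistics. Assumes detected
# fixations, each given as (start_ms, end_ms, x, y), and one static
# axis-aligned bounding box per object (a simplification for the sketch).
from typing import Dict, List, Tuple

Fixation = Tuple[float, float, float, float]  # (start_ms, end_ms, x, y)
Box = Tuple[float, float, float, float]       # (left, top, right, bottom)

def object_gaze_stats(
    fixations: List[Fixation],
    boxes: Dict[str, Box],
) -> Dict[str, Tuple[int, float]]:
    """Return {object_name: (fixation_count, cumulative_duration_ms)}."""
    stats = {name: (0, 0.0) for name in boxes}
    for start, end, x, y in fixations:
        for name, (left, top, right, bottom) in boxes.items():
            if left <= x <= right and top <= y <= bottom:
                count, duration = stats[name]
                stats[name] = (count + 1, duration + (end - start))
    return stats
```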

All submissions will be evaluated by an independent jury according to 
the evaluation criteria (see below). Additionally, there is a live 
session scheduled for the third day in which all selected solutions can 
be demonstrated to the interested workshop participants. The three best 
solutions will receive an award.

Prize money:

1st prize: €1,000
2nd prize:   €500
3rd prize:   €250

We would like to thank our premium sponsor SensoMotoric Instruments
(SMI) for the contribution of the prize money.

The SAGA challenge features test videos recorded with different devices
from
- SensoMotoric Instruments (SMI) [SMI EyeTracking Glasses]
- Tobii Technologies [Tobii Glasses]
- Applied Science Laboratories (ASL)
   / Engineering Systems Technologies (EST) [ASL Mobile Eye-XG]

===========================================================================

Submissions:

In order to allow more time for the implementation process, a two-step
submission procedure has been devised for the challenge. The decision on
acceptance to the challenge will be based on a preliminary abstract. The
final evaluation and ranking of the software solutions will be based on
the final abstract and the final results for a test set of videos,
including some similar to those on the website:

a) Preliminary submissions should consist of a 2-page abstract
describing the implementation details of your proposed software
solution, including the following:

- description of the underlying techniques and implementations
- description of object selection and tracking processes

b) Final submissions shall extend the preliminary submission to a 3-page
paper by adding the following details:

- number of fixations and cumulative fixation duration details for the
   specified objects
- performance data (such as computation time, number of selected
   objects, parallel tracking of several objects in the scene)
- snapshot of the results

We will use results based on manual annotation to evaluate the submitted
results. The following evaluation criteria will be applied:

- quality of the automated benchmark results (region and pixel based)
   compared to the results given by manual annotation
- conceptual innovation
- performance (such as computation time, number of selected objects,
   parallel tracking of several objects in the scene)
- robustness (such as tracking performance and the general scope of
   the application)
- usability
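For the region-based criterion, one common way to compare an
automatically tracked region against a manually annotated one is
intersection-over-union (IoU). The sketch below illustrates that metric
as an assumption; the challenge's official evaluation procedure is
defined on the workshop website.

```python
# Sketch: intersection-over-union (IoU) between a tracked bounding box
# and a manually annotated ground-truth box, both axis-aligned and given
# as (left, top, right, bottom). Illustrative only.
from typing import Tuple

Box = Tuple[float, float, float, float]  # (left, top, right, bottom)

def iou(a: Box, b: Box) -> float:
    """Overlap score in [0.0, 1.0]; 1.0 means identical regions."""
    left = max(a[0], b[0])
    top = max(a[1], b[1])
    right = min(a[2], b[2])
    bottom = min(a[3], b[3])
    inter = max(0.0, right - left) * max(0.0, bottom - top)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```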

The test videos and a corresponding description of them can be found on
the workshop website. Additionally, you can find a detailed description
of how we perform the manual annotation. The exact description for the
challenge, including the evaluation criteria and the required format for
the results, will appear on the workshop website within the next 3
weeks. Please check the website regularly for updates.

Abstracts will be peer-reviewed by at least two members of an
international program committee. We will provide templates on the 
workshop website. We are currently pursuing possible options for 
publication of a special issue in a journal or as an edited volume.

Please Note: All challenge participants must register separately for
access to the challenge material and the video download.

===========================================================================

We would like to thank our commercial sponsors:

Premium Sponsors
- SensoMotoric Instruments (SMI) [challenge]
   / SMI Eye Tracking Glasses (www.eyetracking-glasses.com)

Sponsors
- Tobii Technologies [live demo workshop session]
   / Tobii Glasses (http://www.tobii.com/en/eye-tracking-research/global/products/hardware/tobii-glasses-eye-tracker/)

===========================================================================

Challenge Organising Committee:

Workshop Organisers:
- Kai Essig
- Thies Pfeiffer
- Pia Knoeferle
- Helge Ritter
- Thomas Schack
- Werner Schneider

All from the
Cognitive Interaction Technology Center of Excellence
at Bielefeld University

Scientific Board:
- Thomas Schack
- Helge Ritter
- Werner Schneider

Jury of the Challenge:
- Kai Essig
- Thies Pfeiffer
- Pia Knoeferle
- Denis Williams (SensoMotoric Instruments, SMI)

Please visit the website periodically for updates:
http://saga.eyemovementresearch.com/about-saga/

For additional questions, please contact: [log in to unmask]

We look forward to receiving your submissions and to welcoming you to
Bielefeld in October, 2013!

On behalf of the organisers

Thies Pfeiffer

--
EYE-MOVEMENT mailing list ([log in to unmask])
N.B. Replies are sent to the list, not the sender
To unsubscribe, etc. see http://www.jiscmail.ac.uk/files/eye-movement/introduction.html
Other queries to list owner at [log in to unmask]
