Scene Background Modeling Contest
in conjunction with ICPR 2016 (http://www.icpr2016.org)
CALL FOR PARTICIPATION
Scene Background Modeling Contest (http://pione.dinf.usherbrooke.ca/sbmc2016/)
Contest results submission deadline: *July 18, 2016*
Contest paper submission deadline: *July 19, 2016*
MOTIVATIONS
In scene analysis, the availability of an initial background model that describes the scene without foreground objects is a prerequisite, or at least helpful, for many applications, including video surveillance, video segmentation, video compression, video inpainting (or video completion), privacy protection for videos, and computational photography.
Only a few scene background modeling methods have specifically addressed initialization, also referred to as bootstrapping, background estimation, background reconstruction, initial background extraction, or background generation. However, many challenges remain unsolved, including sudden illumination changes, night videos, low frame rates, and videos taken by PTZ cameras; thus, further progress on background model learning is strongly needed.
The aim of the SBMC2016 contest (http://pione.dinf.usherbrooke.ca/sbmc2016/) is to advance the development of algorithms and methods for scene background modeling through objective evaluation on a common dataset.
DATASET
The SBMnet dataset (http://scenebackgroundmodeling.net) provides a diverse set of 80 videos. They have been selected to cover a wide range of scene background modeling challenges and are representative of typical indoor and outdoor visual data captured in surveillance, smart environment, and video database scenarios. SBMnet includes the following challenge categories: Basic, Intermittent Motion, Clutter, Jitter, Illumination Changes, Background Motion, Very Long, and Very Short videos.
PERFORMANCE EVALUATION
The SBMnet dataset also provides tools to compute performance metrics commonly adopted for the problem (namely, AGE, pEPs, pCEPs, PSNR, MS-SSIM, and CQM), making it possible to identify algorithms that are robust across the various challenges. The SBMnet website also includes an automated system that allows authors to upload their results and see how their method compares with others.
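Purely for illustration, the Python sketch below shows how some of the simpler metrics (AGE, pEPs, pCEPs, and PSNR) are typically computed when comparing an estimated background image against a ground-truth background. This is not the official contest code: the 20 gray-level error threshold and the 4-connected clustering rule are assumptions, and border handling is simplified, so the tools provided on the SBMnet website remain the reference implementation.

    # Illustrative sketch of AGE, pEPs, pCEPs, and PSNR (not the official
    # SBMnet evaluation code). Assumes 8-bit grayscale images of equal size.
    import numpy as np

    def age(gt, cb):
        # Average Gray-level Error: mean absolute difference between the
        # ground-truth background (gt) and the computed background (cb).
        return np.mean(np.abs(gt.astype(float) - cb.astype(float)))

    def peps(gt, cb, tau=20):
        # Percentage of Error Pixels: fraction of pixels whose absolute
        # gray-level difference exceeds the threshold tau (assumed value).
        err = np.abs(gt.astype(float) - cb.astype(float)) > tau
        return err.mean()

    def pceps(gt, cb, tau=20):
        # Percentage of Clustered Error Pixels: error pixels whose
        # 4-connected neighbours are also error pixels (assumed rule).
        # np.roll wraps around at the borders, a simplification here.
        err = np.abs(gt.astype(float) - cb.astype(float)) > tau
        clustered = (err
                     & np.roll(err, 1, axis=0) & np.roll(err, -1, axis=0)
                     & np.roll(err, 1, axis=1) & np.roll(err, -1, axis=1))
        return clustered.mean()

    def psnr(gt, cb):
        # Peak Signal-to-Noise Ratio for 8-bit images.
        mse = np.mean((gt.astype(float) - cb.astype(float)) ** 2)
        return float('inf') if mse == 0 else 10 * np.log10(255.0 ** 2 / mse)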
PARTICIPATION
– Researchers from both academia and industry are welcome to submit results.
– Results must be reported for each video of each category.
– Only one set of tuning parameters should be used for all videos.
– Methods published in the past can be submitted as long as extensive evaluation over all 8 video categories is performed.
– Although not mandatory, participants are strongly encouraged to make a software implementation of the proposed method available, together with all the parameter settings required to run it.
Results from all submissions that meet certain minimum quality standards will be reported and maintained on the contest website. The website will serve as the platform for disseminating all information to participants, including procedural details, the dataset, announcements, etc., and it will remain available after the competition so that the research community can continue to use the dataset and the evaluation protocol in their research.
PUBLICATION
All participants whose results meet the minimum quality standards will be invited to submit a regular paper describing their method and results in the ICPR 2016 format. Submitted papers will undergo blind peer review to ensure originality and technical soundness.
IMPORTANT DATES
May 16, 2016 Registration for the competition opens
May 16, 2016 Publication of the dataset
July 7, 2016 Registration for the competition closes
July 18, 2016 Deadline to submit contest results
July 19, 2016 Deadline to submit contest paper
August 5, 2016 Notification of acceptance
September 5, 2016 Deadline to submit camera ready contest paper
MAIN ORGANIZERS
Pierre-Marc Jodoin, University of Sherbrooke, Canada
e-mail: Pierre-Marc.Jodoin@USherbrooke.ca
home page: http://www.dmi.usherb.ca/~jodoin/
Lucia Maddalena, National Research Council, Italy
e-mail: lucia.maddalena@cnr.it
home page: http://www.na.icar.cnr.it/~maddalena.l/