As in previous years, VBS 2026 will be part of the International Conference on MultiMedia Modeling 2026 (MMM 2026) in Prague, Czech Republic, and will be organized as a special side event to the Welcome Reception. It will be a moderated session in which participants solve Known-Item Search (KIS), Ad-Hoc Video Search (AVS), and Visual Question Answering (VQA) tasks that are issued as live presentations of scenes of interest, either as a visual clip or as a textual description (an overview of task types can be found here). Results need to be submitted to the VBS server (DRES), which evaluates the correctness of submissions and ranks all teams.

Changes Compared to Previous Years

VBS 2026 will have a few changes:

  • Virtual attendance is no longer allowed
  • There is no Novice Session anymore (only the developers themselves participate)
  • In terms of task types, KISV tasks will be in the minority; more tasks will be of type KIST, KISC, VQA, and AVS

Datasets

VBS 2026 will use the entire V3C dataset (the Vimeo Creative Commons Collection), in collaboration with NIST and TRECVID 2025 (which also runs the Ad-Hoc Video Search (AVS) task), as well as the extended marine video (underwater/scuba diving) dataset and the LapGynLHE dataset (laparoscopic gynecology surgeries). However, most tasks will be issued from V3C.

V3C consists of three shards:

  • V3C1: 7,475 video files, amounting to 1,000 hours of video content (1,082,659 predefined segments) and 1.3 TB in size; it was also used in previous years.
  • V3C2: an additional 9,760 video files, amounting to 1,300 hours of video content (1,425,454 predefined segments) and 1.6 TB in size.
  • V3C3: an additional 11,215 video files, amounting to 1,500 hours of video content (1,635,580 predefined segments).

To download the dataset (provided by NIST), please complete this data agreement form and send a scan to angela.ellis@nist.gov with CC to gawad@nist.gov and ks@itec.aau.at. NIST's response email will contain the link for downloading the data.

The marine video (underwater) dataset has been provided by Prof. Sai-Kit Yeung (many thanks!) and is available for download here.

The LapGynLHE dataset (laparoscopic gynecology) is available for download via SFTP. Please contact Klaus Schoeffmann for the data agreement form and (after signing the form) the username and password.

VBS Server and Testing

VBS uses the Distributed Retrieval Evaluation Server (DRES) to evaluate found segments for correctness. DRES will run as a public service on a server at Klagenfurt University, Austria. Participants submit found segments to the server via a simple HTTP-based protocol, as described in the Client Examples; the exact format is defined in the OpenAPI specification. The server is connected to a projector on-site and presents the current scores of all teams live (in addition to presenting the task descriptions). The server and example tasks from previous years are provided here; for Angular systems, another description is provided here.
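
To illustrate the submission workflow, the following is a minimal sketch of a client in Python. The endpoint paths, parameter names, and response fields used here are assumptions for illustration only; the authoritative routes and payloads are defined in the OpenAPI specification mentioned above.

    # Minimal sketch of a DRES submission client (hypothetical endpoints and
    # field names; consult the DRES OpenAPI specification for the real API).
    import requests

    DRES_BASE = "https://dres.example.org/api"  # placeholder server URL

    def login(username: str, password: str) -> str:
        """Log in and return a session token (field name is an assumption)."""
        r = requests.post(f"{DRES_BASE}/login",
                          json={"username": username, "password": password})
        r.raise_for_status()
        return r.json()["sessionId"]

    def submit_segment(session: str, video_id: str, start_ms: int, end_ms: int) -> dict:
        """Submit a found segment as a video item plus a temporal range in milliseconds."""
        r = requests.post(f"{DRES_BASE}/submit",
                          params={"session": session},
                          json={"item": video_id, "start": start_ms, "end": end_ms})
        r.raise_for_status()
        return r.json()  # e.g., the judgement of whether the submission is correct

    # Example usage (hypothetical credentials and video identifier):
    # token = login("myteam", "secret")
    # print(submit_segment(token, "some_video_id", 123000, 131000))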

Participation

Anyone with an exploratory video search tool that allows for retrieval, interactive browsing, and exploration in a video collection may participate. The only requirement is to submit an extended demo paper about the system by the submission deadline.

There are no restrictions on the features a system may use, except that recording the presentation screen during the competition is not allowed. This means that, in addition to interactive content search, you may also use any form of automatic content search.

Participants are expected to implement the functionality to send results and logs to the VBS server via a REST API, which requires some integration effort.
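
In addition to submissions, result and interaction logs are reported to the server. The sketch below is purely illustrative: the endpoint path and the log fields (timestamp, query, ranked results) are assumptions and not the official DRES log schema, which is defined in the OpenAPI specification.

    # Illustrative result-log sketch only: endpoint path and field names are
    # assumptions, not the official DRES log schema (see the OpenAPI spec).
    import time
    import requests

    DRES_BASE = "https://dres.example.org/api"  # placeholder server URL

    def send_result_log(session: str, query_text: str, ranked_items: list) -> None:
        """Report the current ranked result list for a query to the server."""
        payload = {
            "timestamp": int(time.time() * 1000),  # client time in milliseconds
            "query": query_text,
            "results": [{"item": item, "rank": rank}
                        for rank, item in enumerate(ranked_items, start=1)],
        }
        r = requests.post(f"{DRES_BASE}/log/result",
                          params={"session": session},
                          json=payload)
        r.raise_for_status()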

Paper Submission Instructions

To participate, please submit an extended demo paper (6+2 pages in Springer LNCS format, where the 2 additional pages may be used for references) by the deadline via the MMM 2026 Submission System (please select the "Video Browser Showdown" track). The submission should include a detailed description of the video search tool (including a screenshot) and explain how it supports interactive search in video data. Submissions will be peer-reviewed to ensure high quality. Accepted papers will be published in the proceedings of the MMM conference. In the public VBS session, each system needs to be presented, typically as a concise introductory video or sometimes as a poster; this depends on the local situation and will be announced a few weeks before the competition.

Existing Tools and Results

To give new teams an easy start, we also provide existing tools and results, which are described here.

Journal Paper

We plan to write a joint journal paper after the VBS competition, to which each participating team is expected to contribute. The winning team will have the honor of being in charge of the journal paper (as the main author).
