POPART PROJECT 

Previz for all productions



Project


POPART aims to democratize previz for all kinds of film and TV production.

Previz is not only for highly complex VFX shots. Previz gives you more artistic control, reduces production risk and optimizes your budget. It contributes to film-making from the planning stage to on-set shooting with actors, and even through the post-production steps.

POPART proposes an original approach that combines a quick and easy setup with robust, precise camera tracking. Finally, everything can be optimized at full quality from the RAW data collected during the shoot.

Public project deliverables to the European Commission can be found here.

News

  • 11th October 2016 - POPART project has been presented during the invited talk "Camera Tracking for Visual Effects in Movies" at the 5th Croatian Computer Vision Workshop (CCVW 2016) in Osijek (Croatia).
  • 24th August 2016 - Final review with a demonstration of the Previz system at Equippe AS Studio in Grimstad (Norway).
  • 27th June 2016 - Join us at CVPR's Monday morning poster session at poster booth #60 to discuss our paper "Detection and Accurate Localization of Circular Fiducials under Highly Challenging Conditions".
  • 10th June 2016 - We will live demo the POPART technology at the Grimstad short film festival.
  • 12th May 2016 - The paper "Robustness of 3D point positions to camera baselines in markerless AR systems" has been presented at MMSys 2016.
  • 11th May 2016 - We present our demo "Immersed gaming in Minecraft" at the ACM Multimedia Systems Conference (MMSys 2016) in Klagenfurt (Austria).
  • 14th April 2016 - We will present the POPART project in the Band Pro Booth #C10408 at NAB 2016, Las Vegas. Come visit us if you are there.
  • 13th March 2016 - The paper "Detection and Accurate Localization of Circular Fiducials under Highly Challenging Conditions" has been accepted at CVPR 2016, Las Vegas.
  • 18th December 2015 - Press Release. POPART technology demonstrated at Cap Digital, Paris.
  • October 2015 - Demonstration of mixed reality technologies on a series of public art performances in Vienna, a collaboration of performance artists and researchers.

Product

POPART delivers a fully integrated commercial solution, including software and hardware, for on-set previz, as well as a free solution for 3D reconstruction, photomodeling and camera tracking.

Camera Tracking
RIG

The POPART camera tracking solution provides a camera rig with two witness cameras shutter-synchronized with the main camera. We provide scripts to conform the data and perform fully automatic camera tracking of your footage, as well as scripts to generate scenes for your preferred camera tracking software for manual corrections.

Previz
Software & Hardware

The POPART previz solution provides real-time previsualization on-set, with full integration to improve your post-production.

Post-Production & Previz Servicing

LABO and Mikros Image propose Previz integrated with Post-Production servicing.

Open Source

We build fully integrated software for 3D reconstruction, photomodeling and camera tracking. We aim to provide a strong software basis with state-of-the-art computer vision algorithms that can be tested, analyzed and reused. Links between academia and industry are a requirement to provide cutting-edge algorithms with the robustness and quality required all along the visual effects and shooting process.

The POPART software builds on a set of core libraries, both newly written and adapted from existing software. All of these core libraries are based upon open standards and released as open source. This open approach enables both us and other users to achieve a high degree of integration and easy customisation for any studio pipeline.

Beyond our project objectives, open source is a way of life. We love to exchange ideas, improve our own work while contributing improvements for others, and discover new collaboration opportunities that expand everybody’s horizon.

  OpenMVG

Open Multiple View Geometry

The OpenMVG library solves classical problems in Multiple View Geometry and provides a full Structure-from-Motion pipeline with a strong focus on accuracy. Its scope is not limited to the creative sectors: it can be used in a wide spectrum of fields such as urban planning, industrial design, archeology and medical applications.
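As a flavour of the multiple-view geometry problems such a library addresses, here is a minimal numpy sketch of the normalized eight-point algorithm, which estimates the fundamental matrix relating two views from point matches. This is an illustrative re-implementation for readers, not OpenMVG's own code:

```python
import numpy as np

def fundamental_8pt(x1, x2):
    """Estimate the fundamental matrix F (with x2h^T F x1h = 0)
    from N >= 8 point correspondences x1, x2 of shape (N, 2)."""
    def normalize(pts):
        # Hartley normalization: centre the points and scale them so
        # the mean distance to the origin is sqrt(2).
        c = pts.mean(axis=0)
        s = np.sqrt(2) / np.mean(np.linalg.norm(pts - c, axis=1))
        T = np.array([[s, 0, -s * c[0]],
                      [0, s, -s * c[1]],
                      [0, 0, 1.0]])
        h = np.column_stack([pts, np.ones(len(pts))])
        return h @ T.T, T

    p1, T1 = normalize(np.asarray(x1, float))
    p2, T2 = normalize(np.asarray(x2, float))
    # Each correspondence contributes one row of the system A f = 0.
    A = np.column_stack([p2[:, :1] * p1, p2[:, 1:2] * p1, p1])
    _, _, Vt = np.linalg.svd(A)
    F = Vt[-1].reshape(3, 3)
    # Enforce the rank-2 constraint of a fundamental matrix.
    U, S, Vt = np.linalg.svd(F)
    F = U @ np.diag([S[0], S[1], 0.0]) @ Vt
    F = T2.T @ F @ T1                      # undo the normalization
    return F / np.linalg.norm(F)
```

A production SfM pipeline wraps such an estimator in a robust loop (e.g. RANSAC) to reject bad matches before triangulating points.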

  MayaMVG

Photomodeling plugin - Autodesk © Maya

MayaMVG allows graphic artists to do photomodeling on top of a 3D reconstruction (point cloud and cameras) with pixel precision.

  ofxMVG

OpenFX Camera Localization and Calibration plugins.

The CameraLocalizer plugin estimates the camera pose of an image with respect to an existing 3D reconstruction generated by openMVG. The plugin accepts multiple clips as input to localize a rig of cameras (multiple cameras rigidly fixed together). The LensCalibration plugin estimates the distortion parameters for a given camera/lens combination.
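To illustrate the kind of pose estimation involved in camera localization, here is a small numpy sketch of the Direct Linear Transform, which recovers a projection matrix from 2D-3D correspondences. This is a textbook toy, not the plugin's actual implementation, which relies on openMVG's localization machinery:

```python
import numpy as np

def estimate_projection_dlt(X, x):
    """Direct Linear Transform: recover a 3x4 projection matrix P
    from N >= 6 correspondences between 3D points X (N, 3) and their
    2D projections x (N, 2), so that x ~ P [X; 1]."""
    Xh = np.column_stack([X, np.ones(len(X))])
    rows = []
    for (u, v), Xw in zip(np.asarray(x, float), Xh):
        # u = (p1 . Xw) / (p3 . Xw) gives the linear row [Xw, 0, -u Xw];
        # v gives [0, Xw, -v Xw].
        rows.append(np.concatenate([Xw, np.zeros(4), -u * Xw]))
        rows.append(np.concatenate([np.zeros(4), Xw, -v * Xw]))
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    P = Vt[-1].reshape(3, 4)      # null vector of the system
    return P / np.linalg.norm(P)

def reproject(P, X):
    """Project 3D points X (N, 3) with P and dehomogenize."""
    h = np.column_stack([X, np.ones(len(X))]) @ P.T
    return h[:, :2] / h[:, 2:]
```

With known intrinsics, P can then be decomposed into rotation and translation to obtain the camera pose.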

  CCTag

Concentric Circles Tag

This library detects and identifies CCTag markers. The marker system delivers sub-pixel precision while remaining largely robust to challenging shooting conditions (e.g. motion blur, occlusion and poor lighting).
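The real CCTag detection algorithm is considerably more involved, but the following toy sketch conveys one underlying idea: the ratios of the radii at which a ray from the marker centre crosses black/white edges form a scale-invariant signature, so a marker can be identified at any distance. The signature table here is hypothetical, for illustration only:

```python
import numpy as np

# Hypothetical marker IDs mapped to normalized edge-radius signatures.
SIGNATURES = {
    0: [0.25, 0.50, 0.75, 1.00],
    1: [0.35, 0.55, 0.80, 1.00],
    2: [0.30, 0.60, 0.85, 1.00],
}

def identify(edge_radii, signatures=SIGNATURES):
    """Scale-normalize the measured edge radii (divide by the
    outermost radius) and return the ID of the closest signature."""
    r = np.sort(np.asarray(edge_radii, float))
    r = r / r[-1]                                # scale invariance
    return min(signatures,
               key=lambda k: np.linalg.norm(r - signatures[k]))
```

For example, measured radii of 7, 11, 16 and 20 pixels normalize to (0.35, 0.55, 0.80, 1.00) and match marker 1, as would the same marker seen twice as large.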

  PopSIFT

Scale-Invariant Feature Transform (SIFT)

This library provides a GPU implementation of SIFT, processing HD images at 25 fps on recent graphics cards.
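As an illustration of SIFT's first stage (not PopSIFT's CUDA implementation), here is a small numpy/scipy sketch of difference-of-Gaussians keypoint detection, the step that finds scale-space extrema before descriptors are computed:

```python
import numpy as np
from scipy.ndimage import gaussian_filter, maximum_filter, minimum_filter

def dog_keypoints(img, sigmas=(1.0, 1.6, 2.56, 4.1), thresh=0.01):
    """Minimal difference-of-Gaussians detector: blur the image at
    increasing scales, subtract adjacent levels, and keep pixels that
    are extrema across both space and scale with sufficient contrast.
    Returns (scale index, row, col) triples."""
    levels = [gaussian_filter(img.astype(float), s) for s in sigmas]
    dog = np.stack([b - a for a, b in zip(levels, levels[1:])])
    # A pixel is kept if it equals the max (or min) of its 3x3x3
    # neighbourhood in the (scale, y, x) volume and passes the threshold.
    maxima = (dog == maximum_filter(dog, size=3)) & (dog > thresh)
    minima = (dog == minimum_filter(dog, size=3)) & (dog < -thresh)
    s, y, x = np.nonzero(maxima | minima)
    return list(zip(s, y, x))
```

The full SIFT pipeline adds sub-pixel refinement, edge-response rejection, orientation assignment and 128-dimensional descriptors; the GPU implementation parallelizes all of these stages.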

Open Datasets

We have released two datasets for evaluating 3D reconstruction and camera tracking.

Levallois Town Hall

This dataset consists of 596 rendered images along with a 3D ground truth. It allows 3D reconstruction solutions to be run on these 596 images and the reconstruction quality to be measured against the 3D ground truth. We also provide the rendering of a virtual camera, with different rendering options, to evaluate camera tracking.
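Comparing a reconstruction to a ground truth typically requires aligning the two point clouds first, since an image-based reconstruction is only defined up to a similarity transform. Here is a minimal numpy sketch of one such evaluation, using the closed-form Umeyama alignment followed by an RMSE; this is an assumed metric for illustration, not necessarily the protocol used with this dataset:

```python
import numpy as np

def align_and_rmse(est, gt):
    """Align an estimated point cloud est (N, 3) to a ground truth
    gt (N, 3) with the best similarity transform (Umeyama's closed
    form), then return the root-mean-square error."""
    mu_e, mu_g = est.mean(0), gt.mean(0)
    E, G = est - mu_e, gt - mu_g
    # Cross-covariance between the two centred clouds.
    U, S, Vt = np.linalg.svd(G.T @ E / len(est))
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:
        D[2, 2] = -1            # avoid a reflection
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / E.var(0).sum()   # optimal scale
    t = mu_g - s * R @ mu_e
    aligned = est @ (s * R).T + t
    return np.sqrt(np.mean(np.sum((aligned - gt) ** 2, axis=1)))
```

In practice the correspondence between estimated and ground-truth points must itself be established (e.g. by nearest neighbour) before this alignment step.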

Green Screen Studio

This dataset is an example of an everyday production: professional cameras in a green screen studio with an artificial lighting environment. It contains raw video files shot on a RED EPIC, with additional witness camera footage shot in a mixed reality studio setting with CCTag fiducial markers attached to the wall and ceiling. The goal of this dataset is to provide reference footage for improving the open source CCTag fiducial marker library. It also contains a reference camera track and all material required to build a virtual composite using the POPART camera tracking system.


Toulouse Capitole

This dataset collects 56 images taken on the Place du Capitole in Toulouse (France) and 4 related videos taken while moving around the square. The aim of the dataset is to test and evaluate the camera tracking algorithms developed for the POPART project and to help the reproducibility of the experiments. These algorithms are based on model tracking of an existing 3D reconstruction of the scene: first, a collection of still images is taken and an SfM pipeline is used to reconstruct the scene, producing a 3D point cloud. Camera tracking then reduces to camera localization: each frame is individually localized w.r.t. the 3D point cloud using the photometric information (SIFT features) associated with each point, which aligns the point cloud to the current frame and thus yields the camera pose.
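The SIFT-based association between a frame and the point cloud can be sketched as a nearest-neighbour descriptor match with Lowe's ratio test. This toy numpy version stands in for the real matcher:

```python
import numpy as np

def ratio_test_match(desc_frame, desc_cloud, ratio=0.8):
    """Match each frame descriptor to its nearest point-cloud
    descriptor, keeping the match only when the best distance is
    clearly smaller than the second best (Lowe's ratio test).
    Returns (frame index, cloud index) pairs."""
    matches = []
    for i, d in enumerate(desc_frame):
        dist = np.linalg.norm(desc_cloud - d, axis=1)
        j, k = np.argsort(dist)[:2]       # two nearest neighbours
        if dist[j] < ratio * dist[k]:     # unambiguous match only
            matches.append((i, int(j)))
    return matches
```

The resulting 2D-3D correspondences feed a pose solver (plus a robust outlier-rejection loop) to compute the camera pose for that frame.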


DOI: 10.5281/zenodo.49896

Cap Digital Courtyard

This dataset collects 197 images taken in the courtyard of the Cap Digital building in Paris (France) and 2 video sequences taken with a camera rig composed of one main camera and two witness cameras on the sides looking outwards. As with the Toulouse Capitole dataset, the aim is to test and evaluate the camera tracking algorithms developed for the POPART project, which localize each frame against a 3D point cloud reconstructed beforehand from still images, and to help the reproducibility of the experiments.


DOI: 10.5281/zenodo.59370

This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 644874.