Papers

Many of the papers related to INSIGHT are available for download at Philippe Martinez's ResearchGate page, as well as on Kevin Cain's ResearchGate and Academia sites.

Select Publications & Conference Courses

Etude pluridisciplinaire de chapelles funéraires thébaines de l'époque ramesside (Multidisciplinary study of Theban funerary chapels of the Ramesside period) Matthias Alfeld, Kevin Cain, Catherine Defeyt, Pauline Martinetto, Philippe Martinez, Jared Murnan, Silvia Pedetti, Philippe Walter, 2017

This report details recent collaborations around the work of the French Mission in Thebes, Egypt.

The Eye of the Medusa – XRF Imaging Reveals Unknown Traces of Antique Polychromy Matthias Alfeld, Maud Mulliez, Philippe Martinez, Kevin Cain, Philippe Walter, 2016

The colorful decoration of statues and buildings in antiquity is commonly described by the term Antique Polychromy. It is well known among scholars but less so among the general public, and its exact form is the subject of ongoing research.

Computation & cultural heritage: fundamentals and applications K. Cain (Chair), G. Downing, P. Debevec, B. Brown, G. Ward, M. Glencross, P. Cignoni International Conference on Computer Graphics and Interactive Techniques archive ACM SIGGRAPH 2009 Courses table of contents New Orleans, Louisiana Year of Publication: 2009

This SIGGRAPH course introduces the broad aims of graphics researchers working in archaeology and cultural heritage.

Efficient Field Capture of Epigraphy via Photometric Stereo J. Paterson† and K. Cain‡, †Oxford 3D Technology, ‡Institute for the Study and Integration of Graphical Heritage Techniques, The 7th International Symposium on Virtual Reality, Archaeology and Cultural Heritage, VAST (2006)

Oxford computer vision post-doctoral researcher James Paterson developed a novel implementation of photometric stereo, which was deployed during the Maya Skies field capture. Photometric stereo has long been known as a useful image-based modeling technique in the lab; James' variant successfully extended the technique to field work by relaxing the requirement for a fixed camera. A paper from the project, written with Kevin Cain, presents detailed reconstructions of a panel from the Platform of Eagles and Jaguars at Chichen Itza. This panel, which Kevin was able to photograph in a few minutes, boasts sub-millimetric detail difficult to discern with the human eye.
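To make the technique concrete, here is a minimal sketch of classical Lambertian photometric stereo, not the paper's field-capture variant: given several images of a static surface lit from known directions, per-pixel normals and albedo fall out of a least-squares solve. The array shapes and the synthetic lighting directions below are assumptions made purely for the example.

```python
import numpy as np

def photometric_stereo(images, light_dirs):
    """Minimal Lambertian photometric stereo.

    images:     (k, h, w) array of grayscale intensities, one per light.
    light_dirs: (k, 3) array of unit lighting direction vectors.
    Returns per-pixel unit normals (h, w, 3) and albedo (h, w).
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                             # (k, h*w) intensities
    # Solve L @ G = I for G = albedo * normal at every pixel (least squares).
    G, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)    # (3, h*w)
    albedo = np.linalg.norm(G, axis=0)                    # (h*w,)
    normals = G / np.maximum(albedo, 1e-8)                # unit normals
    return normals.T.reshape(h, w, 3), albedo.reshape(h, w)

# Tiny synthetic example: a flat patch lit from three known directions.
lights = np.array([[0.0, 0.0, 1.0],
                   [0.5, 0.0, 0.866],
                   [0.0, 0.5, 0.866]])
true_normal = np.array([0.0, 0.0, 1.0])
shading = lights @ true_normal                  # Lambertian shading per light
imgs = shading[:, None, None] * np.ones((3, 4, 4))
n, rho = photometric_stereo(imgs, lights)
print(n[0, 0], rho[0, 0])                       # ~[0, 0, 1], albedo ~1
```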

A Perceptually Validated Model for Surface Depth Hallucination M. Glencross, C. Jay, J. Liu, F. Melendez, R. Hubbold, The University of Manchester; G. Ward, Dolby Canada

CGI rendering pioneer Greg Ward joined the Maya Skies capture team to test his 'texture hallucination' approach to generating plausible, realistic 3D models from photographs. Greg's data from the Southern Venus Platform at Chichen are incorporated in the paper and video linked above. Additional photometric stereo data sets were captured on site by Mark Eakle, including warrior columns from the Temple of the Warriors and inscriptions from the Pyramid of Kukulkan.

Fast Traversal of Time-Indexed Panoramas A. Hui, K. Cain, P. Martinez, INSIGHT ACM SIGGRAPH 2008 New Tech Demos: Traversing Complex Environments Using Time-Indexed High-Dynamic-Range Panoramas (Theme: Complexity and Accessibility)

Panoramic photography has proved to be a popular, simple tool for recording archaeological sites. Antonio Hui's automated capture system enabled him to shoot time-lapse panoramas at Chichen Itza from many points simultaneously, and in a SIGGRAPH 2008 New Tech Demo he presented a system to tame the huge amounts of data that resulted. The system demonstrates practical capture and display of multi-viewpoint, synchronized time-lapse panoramas in high dynamic range, and the capture technique performed well during demanding archaeological site work at Chichén Itzá, México, for the National Science Foundation project Maya Skies. The demonstration features a novel viewer that allows the user to quickly traverse the image data via embedded links drawn in the interface. In timed tests, users identified environmental elements and searched for scene features much more rapidly with this viewer than with a typical image viewer, and the speed advantage scales logarithmically with the size of the data set.
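The demo's internal data structures are not described here; purely as an illustration of link-based traversal, a set of panoramas can be treated as a graph whose nodes store links to neighboring viewpoints, so that reaching any feature takes only a handful of clicks. All node names and the layout below are invented for the sketch, not taken from the demo.

```python
from collections import deque

# Hypothetical layout: each panorama node lists links to the neighboring
# viewpoints visible from it (names are invented for illustration).
panorama_links = {
    "great_plaza":        ["kukulkan_n", "warriors_w", "venus_platform"],
    "kukulkan_n":         ["great_plaza", "kukulkan_e"],
    "kukulkan_e":         ["kukulkan_n", "ballcourt_s"],
    "warriors_w":         ["great_plaza", "warriors_colonnade"],
    "warriors_colonnade": ["warriors_w"],
    "venus_platform":     ["great_plaza", "ballcourt_s"],
    "ballcourt_s":        ["kukulkan_e", "venus_platform"],
}

def hops_to(start, target, links):
    """Breadth-first search: number of link clicks needed to reach target."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, depth = queue.popleft()
        if node == target:
            return depth
        for nxt in links.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append((nxt, depth + 1))
    return None

print(hops_to("great_plaza", "ballcourt_s", panorama_links))  # 2 clicks
```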

Dense Depth from Stereo Y. He, Geometry Systems Inc.

Youda He of Geometry Systems Inc. created an image-based scan rig for field acquisition of sculptured panels. The system uses a pair of digital SLR cameras and novel structured lighting.
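The rig's processing pipeline is not detailed here; as an illustration of the general approach only, a rectified stereo pair can be turned into a dense depth map roughly as follows, using OpenCV's semi-global block matcher. The focal length, baseline, and synthetic image pair below are placeholder assumptions, not values from the rig.

```python
import cv2
import numpy as np

# Placeholder calibration; a real rig would use its own calibrated values.
FOCAL_PX = 2400.0      # focal length in pixels (assumed)
BASELINE_M = 0.25      # camera separation in meters (assumed)

# In the field these would be rectified photos from the two SLRs, e.g.:
#   left  = cv2.imread("left.jpg", cv2.IMREAD_GRAYSCALE)
#   right = cv2.imread("right.jpg", cv2.IMREAD_GRAYSCALE)
# For a self-contained sketch, synthesize a textured pair with a known shift.
rng = np.random.default_rng(0)
left = (rng.random((240, 320)) * 255).astype(np.uint8)
right = np.roll(left, -8, axis=1)            # 8-pixel disparity everywhere

# Semi-global block matching; numDisparities must be a multiple of 16.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=64, blockSize=5)
# compute() returns fixed-point disparities scaled by 16.
disparity = matcher.compute(left, right).astype(np.float32) / 16.0

# Triangulate: depth = focal_length * baseline / disparity.
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = FOCAL_PX * BASELINE_M / disparity[valid]
print(np.median(disparity[valid]), np.median(depth[valid]))
```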

Long-Baseline 3D Reconstruction M. Glencross, F. Melendez, The University of Manchester; G. Ward, Dolby Canada

Researcher Mashhuda Glencross and her colleagues brought to Chichen a new system for developing 3D models from sets of photographs. Mashhuda's graduate student Francho Melendez has continued the work, and has included data from the Platform of the Eagles and Jaguars at Chichen Itza in his doctoral thesis at the University of Manchester.

Portable Scanning Rigs for Maya Field Archaeology M. Eakle, K. Cain

While 3D capture equipment is fairly light and flexible, the difficulty of articulating a scanner at a complex site such as a Maya pyramid structure requires special consideration. Our rigs were designed to be stable enough to allow bracketed, multiple-exposure scanning.

Radiance and Autodesk Maya integration K. Ibrahim

INSIGHT Visual Effects Technical Director Ken Ibrahim (The Matrix, Pirates of the Caribbean) worked on site at Chichen Itza with Radiance inventor Greg Ward (Dolby Canada) to create a plug-in to export Alias Maya scenes for rendering in Radiance. Ken's MEL script can be used to export geometry with simple shaders for rendering in Radiance. Notes and examples are available in the download link above, including a distribution of Radiance itself.
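The MEL script itself is not reproduced here, but to give a sense of the target format: a Radiance scene description pairs a material definition (for example a simple plastic shader) with polygon primitives. The snippet below is a hand-written illustration of the kind of output such an exporter might produce; the material values, names, and file paths are invented for the example.

```python
# Illustrative only: write a one-polygon Radiance scene (.rad) by hand,
# pairing a simple 'plastic' material with a polygon primitive.
scene = """\
void plastic stone_mat
0
0
5 0.55 0.5 0.45 0.0 0.08

stone_mat polygon panel_face
0
0
12
    0.0 0.0 0.0
    1.0 0.0 0.0
    1.0 0.0 1.0
    0.0 0.0 1.0
"""

with open("panel.rad", "w") as f:
    f.write(scene)

# Rendering would then proceed with the usual Radiance tools
# (oconv to build an octree, then rpict or rvu to render).
```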