
AHRC Collaborative Doctoral Partnership

Research Studentship 2019

The National Gallery, Scientific Department & Imperial College London

Multimodal analytical imaging of Old Master Paintings: addressing the challenges of registration, mosaic construction and image resolution

Applications are invited for a Collaborative Doctoral Partnership PhD studentship, to be undertaken at Imperial College London (Electrical and Electronic Engineering Department) and the National Gallery (Scientific Department). This studentship will be jointly supervised by Professor Pier Luigi Dragotti at Imperial College London (ICL) and Dr Catherine Higgitt at the National Gallery (NG). The studentship is for a three-year (full-time) project entitled ‘Multimodal analytical imaging of Old Master Paintings: addressing the challenges of registration, mosaic construction and image resolution’, to commence on 1 October 2019. The student may also apply to the Student Development Fund (see below) to allow a (remunerated) placement of up to 6 months duration at the National Gallery during the PhD to further develop and expand their skills. The student will spend concentrated periods of time both at Imperial College London and at the National Gallery. This is an exciting interdisciplinary project involving close collaboration between engineers with expertise in signal and image processing, conservation scientists, conservators and curators. The student will also have the opportunity to interact with researchers involved in an EPSRC-funded joint-research project between ICL, NG and University College London (http://gow.epsrc.ac.uk/NGBOViewGrant.aspx?GrantRef=EP/R032785/1).

Summary of Project:

In the art historical study of paintings and to inform their conservation, there is a long tradition of using a range of imaging techniques to improve understanding of an artist's creative process, working methods, palette and materials. These techniques range from visible-light images under different lighting conditions or magnification, through images acquired using other forms of radiation (e.g. infrared reflectograms or X-radiographs), to image sets generated using new spectroscopic imaging methods such as macro X-ray fluorescence scanning (MA-XRF) or hyperspectral imaging (HSI). However, to harness the wealth of information contained within these very large multimodal datasets, an essential first step is to accurately align the images. Registration and mosaicking normally involve finding common, invariant features between images and aligning the images using these 'control points'. However, with paintings, each modality may contain both shared and unique features, making registration particularly challenging. Various approaches have been developed for registering multimodal data from paintings, but they may fail when the spatial resolution of the data differs (e.g. for MA-XRF data), are not automatic (important when handling the very large HSI and MA-XRF datasets increasingly available in the field), and are not invariant to geometric transformation or colour inconsistency.
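
To give a concrete (if simplified) picture of the control-point idea, the short Python sketch below registers two modalities of the same painting region using OpenCV's ORB features, descriptor matching and a RANSAC-estimated homography. The file names and parameter values are placeholders, and this is only the kind of baseline the project would improve upon, not the project's own method.

# Minimal sketch of feature-based registration between two imaging modalities,
# e.g. a visible-light photograph and an infrared reflectogram of the same
# painting region. File names and parameter values are placeholders.
import cv2
import numpy as np

visible = cv2.imread("visible.png", cv2.IMREAD_GRAYSCALE)
infrared = cv2.imread("infrared.png", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and descriptors in each modality (ORB is one common choice;
# truly multimodal data generally needs more robust, modality-invariant features).
orb = cv2.ORB_create(nfeatures=5000)
kp_vis, des_vis = orb.detectAndCompute(visible, None)
kp_ir, des_ir = orb.detectAndCompute(infrared, None)

# Match descriptors and keep the strongest matches as candidate control points.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des_ir, des_vis), key=lambda m: m.distance)[:500]

# Estimate a homography with RANSAC to reject mismatched control points,
# then warp the infrared image into the coordinate frame of the visible image.
src = np.float32([kp_ir[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_vis[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
registered = cv2.warpPerspective(infrared, H, visible.shape[::-1])
cv2.imwrite("infrared_registered.png", registered)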

This project aims to facilitate processing and interpretation of multimodal datasets from paintings by developing new registration methods that automatically extract features common to different modalities and that are resilient to variations in acquisition conditions, spatial resolution, geometric distortions, etc. The project will also develop methods to enhance the spatial resolution of those modalities whose native resolution is much lower than that of the visible image, by leveraging correlations among modalities. Performance will be benchmarked against current approaches. The optimised algorithms, which will both enhance the spatial resolution of low-resolution modalities and automatically register and mosaic multimodal images, will be packaged as open-source, user-friendly software tools to allow wide adoption by, and adaptation for, a variety of arts and humanities end-users, greatly facilitating use of the numerous and diverse technical images now generated in their research.
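
For illustration only, the sketch below shows one simple way of exploiting correlation between modalities: a low-resolution map (e.g. an MA-XRF elemental channel) is upsampled with guidance from the high-resolution visible image using a basic guided filter (after He et al.). The function name, parameters and filtering choices are illustrative assumptions, not the methods this project will develop or benchmark.

# Illustrative sketch: upsampling a low-resolution modality (e.g. an MA-XRF
# elemental map) with guidance from a registered high-resolution visible image,
# via a basic guided filter. Shows the general idea of exploiting
# inter-modality correlation; a real pipeline would be considerably more advanced.
import numpy as np
from scipy.ndimage import uniform_filter, zoom

def guided_upsample(lowres_map, guide, radius=8, eps=1e-3):
    """Upsample `lowres_map` to the size of `guide` using a guided filter."""
    # Bring the low-resolution modality onto the guide's pixel grid (bicubic).
    scale = (guide.shape[0] / lowres_map.shape[0],
             guide.shape[1] / lowres_map.shape[1])
    p = zoom(lowres_map.astype(np.float64), scale, order=3)
    I = guide.astype(np.float64)

    size = 2 * radius + 1
    mean_I = uniform_filter(I, size)
    mean_p = uniform_filter(p, size)
    corr_Ip = uniform_filter(I * p, size)
    var_I = uniform_filter(I * I, size) - mean_I ** 2

    # Local linear model p ~ a*I + b, fitted in each window; the high-resolution
    # detail of the guide is transferred to the upsampled map through `a`.
    a = (corr_Ip - mean_I * mean_p) / (var_I + eps)
    b = mean_p - a * mean_I
    return uniform_filter(a, size) * I + uniform_filter(b, size)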

Such tools, besides facilitating registration specifically, will assist more in-depth data interpretation by identifying features unique to a single modality, which may relate to concealed or altered features in a painting. By improving our ability to extract and visualise information contained within multimodal image sets, this research opens up the possibility of gaining unprecedented insights into the creation, history and condition of Old Master paintings, whilst also offering new ways to interact with art and to present it on modern media devices. The methods will be applicable to a wide range of image modalities and will both improve on current practice and be an essential prerequisite to the broader use of advanced signal processing methods in the cultural heritage sector, allowing the rich variety of digital data now being generated to be fully exploited. The results obtained are expected to stimulate further, broader exploration of such methods in the arts and humanities field.

Funding:

This Collaborative Doctoral Partnership PhD studentship is funded by the AHRC. The full studentship award for students with UK residency* includes fees and a stipend of approximately £16,000 per annum, plus an additional stipend payment of approximately £500 per annum for Collaborative Doctoral students, for 3 years. In addition, a Student Development Fund (equivalent to 0.5 years of stipend payments) is available to support the cost of training, work placements, and other development opportunities. Students with EU residency are eligible for a fees-only studentship award. International applicants are normally not eligible to apply for this studentship. The student will receive additional support towards further research expenses from The National Gallery over the course of the research studentship. When appropriate, further support to attend conferences will be provided by Imperial College London. Both partners and the CDP consortium will provide opportunities for training and career development.

*UK residency means: having settled status in the UK (that is, no restriction on how long you can stay in the UK); having been “ordinarily resident” in the UK for 3 years prior to the start of the studentship (that is, you must have been normally residing in the UK apart from temporary or occasional absences); and not having been resident in the UK wholly or mainly for the purposes of full-time education.

Eligibility:

Applicants must have a good first degree (usually a minimum 2:1) or a Masters degree (or other equivalent experience) in Electrical/Electronic Engineering, Mathematics, Physics or related areas. They should be highly motivated individuals with a keen interest in conducting interdisciplinary research. The project would suit a candidate with an interest in developing cutting-edge scientific techniques and complex data processing methods and applying them to challenging questions such as those posed by the cultural heritage sector. Students must also meet the eligibility requirements for postgraduate study at Imperial College London.

Further Information and application:

Interested applicants should contact the main supervisors, Professor Pier Luigi Dragotti ([log in to unmask]) and Dr Catherine Higgitt ([log in to unmask]), ideally by 15th June 2019, including a covering letter and their CV in the email.
--
Joseph Padfield
Conservation Scientist
Scientific Department
The National Gallery
Trafalgar Square
London WC2N 5DN
+44 (0)20 7747 2553
http://research.ng-london.org.uk
http://www.twitter.com/JoePadfield
