Funding: Bench Fees / Research Training & Support Grant.
Funding body: The Engineering and Physical Sciences Research Council (EPSRC).
Applications are invited for a fully funded 3.5-year full-time PhD studentship (including home tuition fees, annual stipend and consumables) starting on 1 February 2024.
Aim of the project
Ultrasound elastography is a non-invasive technique for assessing mechanical properties of tissue using ultrasound-based displacement tracking. However, accurately estimating tissue displacements in all directions remains challenging with current ultrasound technology because the sound waves are emitted and received from one direction. We have developed a new multi-transducer ultrasound approach that uses a diversity of directions in transmit and receive to achieve much more isotropic performance. This is highly effective for imaging and it could be transformative for elastography. Critically, the approach should allow much more accurate estimation of the displacement field from ultrasound data. The goal of this project is to explore the potential of this coherent multi-transducer approach for elastography.
The particular aims are:
- Improve 2D/3D tracking estimates by developing advanced quantitative ultrasound techniques to exploit the coherent use of multiple transducers.
- Devise an accurate approach to calculate the full strain field.
- Quantify the benefits on breast ultrasound elastography.
Ultrasound elastography is an emerging non-invasive imaging technique for detecting alterations in the mechanical properties of tissue. Usually, ultrasound radiofrequency (RF) data are collected from tissue, first to track displacements and then to assess mechanical properties such as strain and elasticity. Elastography is therefore highly dependent on the quality of the ultrasound data and images, and the resulting strain or elasticity maps suffer the same limitations as speckle tracking. Indeed, artefacts such as shadows, reverberation, low resolution, and poor signal-to-noise ratio (SNR) at large imaging depths degrade elastography images in practice. All of these limitations are imposed by the way conventional ultrasound operates, using a single small hand-held transducer.
Lateral displacements are more challenging to assess due to limitations in lateral resolution, the lack of phase information, and poor SNR. In practice, only one of the three displacement components is accurately estimated, and hence only one of the nine components of the strain tensor. In addition, when 2D imaging is performed, out-of-plane motion can induce estimation artefacts. Yet tissue motion and deformation are not limited to a single dimension, and most tissues exhibit anisotropic mechanical and functional properties. A much more accurate estimation of displacements could be possible with a coherent multi-transducer approach, which yields multi-view images with significant improvements in resolution and sensitivity [3, 4].
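As background to the displacement-tracking problem described above, the following is a minimal 1-D sketch of conventional window-based speckle tracking (normalised cross-correlation with parabolic subsample refinement). It is an illustrative assumption for a single RF line, not the 2D/3D multi-transducer method this project will develop; all function names and parameters are hypothetical.

```python
import numpy as np

def track_displacement(pre, post, win=64, hop=32):
    """Estimate axial displacement (in samples) per window by
    normalised cross-correlation with parabolic subsample refinement.
    Positive lag means scatterers appear at larger sample indices in `post`."""
    shifts = []
    max_lag = win // 2
    for start in range(0, len(pre) - win, hop):
        a = pre[start:start + win]
        b = post[start:start + win]
        # Zero-mean, unit-variance normalisation per window
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        corr = np.correlate(b, a, mode="full")      # lags -(win-1) .. win-1
        lags = np.arange(-(win - 1), win)
        keep = np.abs(lags) <= max_lag              # restrict the search range
        corr, lags = corr[keep], lags[keep]
        k = int(np.argmax(corr))
        # Parabolic interpolation around the peak for subsample precision
        if 0 < k < len(corr) - 1:
            y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
            denom = y0 - 2 * y1 + y2
            frac = 0.5 * (y0 - y2) / denom if denom != 0 else 0.0
        else:
            frac = 0.0
        shifts.append(lags[k] + frac)
    return np.array(shifts)

# Demo: speckle-like RF line displaced by a uniform 3-sample shift
rng = np.random.default_rng(0)
pre = rng.standard_normal(1024)
post = np.roll(pre, 3)
est = track_displacement(pre, post)
```

In real elastography the window size, overlap, and search range trade resolution against variance, and lateral/elevational tracking is far less reliable than this axial example, which is precisely the limitation the multi-transducer approach targets.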
This project aims to develop and evaluate advanced quantitative ultrasound techniques that exploit the coherent use of multiple ultrasound transducers and improve both the accuracy and precision of 2D/3D tracking estimates, enabling calculation of the full strain field and transforming ultrasound elastography methods. A comprehensive framework for algorithm verification and validation will be built using ultrasound simulations and experimental data obtained in a controlled setup with phantoms and ex vivo animal tissue.
The methodology comprises three main steps:
1) Design and implement novel 2D and 3D tracking algorithms for multi-transducer ultrasound data. The student will investigate 2D/3D displacement estimation methods using both RF data and the signal envelope, and identify a speckle-tracking method suited to the unique features of the coherent use of multiple ultrasound transducers. How to exploit the spatial diversity of the multi-view data will then be investigated and incorporated into the tracking algorithm.
2) Propose and define strategies to calculate the full strain field leveraging the advanced tracking algorithms developed. The student will explore possible strategies to accurately calculate the displacement gradient in order to reconstruct the entire strain tensor. A validation framework will be developed, comprising finite element modelling, ultrasound simulations and ex vivo experiments in animal tissue. Axial, lateral, and elevational displacements will be measured and the strain components calculated.
3) Quantify the potential improvements on ultrasound elastography to assess connectivity of tissues and lesions using breast tissue as an example application. Different lesion sizes and levels of lesion bonding will be considered and assessed by the new elastography method. Finally, this project will investigate whether this accurate strain field estimation could offer additional information about the connectivity of tissues and lesions and explore new avenues for clinical applications.
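Once the displacement components are available (step 2), the small-strain tensor follows from the symmetrised displacement gradient, ε = ½(∇u + ∇uᵀ). The sketch below illustrates this relationship with central finite differences in 2D; it assumes displacement fields sampled on a regular grid, and all names and grid parameters are illustrative rather than the project's actual validation framework.

```python
import numpy as np

def strain_tensor_2d(ux, uy, dx=1.0, dy=1.0):
    """Small-strain tensor from a 2-D displacement field:
    eps = 0.5 * (grad(u) + grad(u)^T).
    Axis 0 = axial (y, depth), axis 1 = lateral (x)."""
    dux_dy, dux_dx = np.gradient(ux, dy, dx)   # gradients along axis 0, axis 1
    duy_dy, duy_dx = np.gradient(uy, dy, dx)
    eps_xx = dux_dx                            # lateral normal strain
    eps_yy = duy_dy                            # axial normal strain
    eps_xy = 0.5 * (dux_dy + duy_dx)           # shear strain
    return eps_xx, eps_yy, eps_xy

# Demo: uniform 1% axial compression, no lateral motion
ny, nx = 64, 64
dy, dx = 10.0 / (ny - 1), 10.0 / (nx - 1)
y = np.linspace(0.0, 10.0, ny)[:, None] * np.ones((1, nx))
uy = -0.01 * y                 # axial displacement grows linearly with depth
ux = np.zeros_like(uy)
exx, eyy, exy = strain_tensor_2d(ux, uy, dx=dx, dy=dy)
```

In practice, direct differentiation amplifies tracking noise, so least-squares or regularised gradient estimators are typically preferred; recovering all components of ε is exactly what requires the accurate lateral and elevational displacements that the multi-transducer approach is intended to provide.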
References
[1] Ophir, J., Cespedes, I., Ponnekanti, H., Yazdi, Y., & Li, X. (1991). Elastography: a quantitative method for imaging the elasticity of biological tissues. Ultrasonic Imaging, 13(2), 111-134.
[2] Sigrist, R. M., Liau, J., El Kaffas, A., Chammas, M. C., & Willmann, J. K. (2017). Ultrasound elastography: review of techniques and clinical applications. Theranostics, 7(5), 1303.
[3] Peralta, L., Gomez, A., Luan, Y., Kim, B. H., Hajnal, J. V., & Eckersley, R. J. (2019). Coherent multi-transducer ultrasound imaging. IEEE Transactions on Ultrasonics, Ferroelectrics, and Frequency Control, 66(8), 1316-1330.
[4] Peralta, L., Ramalli, A., Reinwald, M., Eckersley, R. J., & Hajnal, J. V. (2020). Impact of aperture, depth, and acoustic clutter on the performance of coherent multi-transducer ultrasound imaging. Applied Sciences, 10(21), 7655.
Informal email enquiries from interested students to the supervisor are encouraged (contact details below).
Dr Laura Peralta - email: email@example.com
The studentship is fully funded for 3.5 years. This includes home tuition fees, a stipend and generous project consumables.
Stipend: Students will receive a tax-free stipend at the UKRI rate of £20,622 (AY 2023/24) per year as a living allowance.
Research Training Support Grant (RTSG): A generous project allowance will be provided for research consumables and for attending UK and international conferences.
Candidates who meet the eligibility requirements for Home Fee status will be eligible to apply for this project. Home students will be eligible for a full UKRI award, including fees and stipend, if they satisfy the UKRI criteria below, including residency requirements. To be classed as a Home student, candidates must meet the following criteria:
- be a UK National (meeting residency requirements), or
- have settled status, or
- have pre-settled status (meeting residency requirements), or
- have indefinite leave to remain or enter.
Prospective candidates should hold a first-class or upper second-class (2:1) Master's-level qualification in Biomedical Engineering, Physics, Engineering, Computer Science, Mathematics, or a related programme.
Preference will be given to candidates with a background conducive to multidisciplinary research, preferably with programming skills.
We welcome eligible applicants from all backgrounds; successful candidates will join diverse and friendly research groups.
Please submit an application for the Biomedical Engineering and Imaging Science Research MPhil/PhD (Full-time) programme using the King’s Apply system. Please include the following with your application:
- A PDF copy of your CV should be uploaded to the Employment History section.
- A 500-word personal statement outlining your motivation for undertaking postgraduate research should be uploaded to the Supporting statement section.
Funding information: Please choose Option 5, “I am applying for a funding award or scholarship administered by King’s College London”, and under “Award Scheme Code or Name” enter BMEIS_DTP_LPP. Failure to include this code may result in your application not being considered for this funding.
The closing date is 30 October 2023 (please note that applications may close earlier if a suitable candidate is found).