Data-driven microscopic pose and depth estimation for optical microrobot manipulation
File(s)
ACS-Revision-Dan_BL.pdf (1.93 MB)
Accepted version
Author(s)
Zhang, Dandan
Lo, Frank P-W
Zheng, Jian-Qing
Bai, Wenjia
Yang, Guang-Zhong
Type
Journal Article
Abstract
Optical microrobots have a wide range of applications in biomedical research for both in vitro and in vivo studies. In most microrobotic systems, the video captured by a monocular camera is the only means of visualizing the movements of microrobots, and such a system can, in general, capture only planar motion. Accurate depth estimation is essential for 3D reconstruction and autofocusing of microplatforms, while pose and depth estimation are necessary to enhance the 3D perception of microrobotic systems and to enable dexterous micromanipulation and other tasks. In this paper, we propose a data-driven method for pose and depth estimation in an optically manipulated microrobotic system. Focus measurement is used to obtain features for Gaussian Process Regression (GPR), which enables precise depth estimation. For mobile microrobots with varying poses, a novel method is developed based on a deep residual neural network that incorporates prior domain knowledge about the optical microrobots encoded via GPR. The method can simultaneously track microrobots with complex shapes and estimate their pose and depth values. Cross-validation demonstrates the submicron accuracy of the proposed method and its precise pose and depth perception for microrobots. We further demonstrate the generalizability of the method by adapting it to microrobots of different shapes using transfer learning with few-shot calibration. Intuitive visualization based on the pose and depth estimation results is provided to facilitate effective human-robot interaction during micromanipulation.
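As a rough illustration of the depth-estimation idea described in the abstract (focus-derived features fed to Gaussian Process Regression), the sketch below fits a minimal from-scratch GPR to a synthetic focus-versus-depth calibration curve. The Gaussian focus profile, the kernel length scale, the depth range, and the micrometre units are all illustrative assumptions for demonstration, not the paper's actual pipeline.

```python
import numpy as np

def gpr_fit_predict(x_train, y_train, x_query, length=0.3, noise=1e-4):
    """Minimal Gaussian Process Regression with an RBF kernel (1-D inputs)."""
    def kern(a, b):
        return np.exp(-0.5 * ((a[:, None] - b[None, :]) / length) ** 2)
    K = kern(x_train, x_train) + noise * np.eye(len(x_train))  # + noise jitter
    alpha = np.linalg.solve(K, y_train)
    K_q = kern(x_query, x_train)
    mean = K_q @ alpha                                          # posterior mean
    var = 1.0 - np.einsum("ij,ji->i", K_q, np.linalg.solve(K, K_q.T))
    return mean, np.sqrt(np.maximum(var, 0.0))                  # posterior std

# Synthetic calibration curve: assume the focus measure peaks at the focal
# plane (z = 0) and decays smoothly with defocus. The Gaussian profile, the
# depth range, and the units (micrometres) are illustrative assumptions.
depths = np.arange(1.0, 6.5, 0.5)        # calibration depths, one side of focus
focus = np.exp(-(depths / 4.0) ** 2)     # assumed focus-measure profile

# Defocus blur is symmetric about the focal plane, so a single focus value
# determines only |z|; calibration therefore stays on one side of focus.
query = np.array([np.exp(-(3.0 / 4.0) ** 2)])  # focus value at z = 3 um
mean, std = gpr_fit_predict(focus, depths, query)
```

In the paper's setting the scalar focus feature would come from the microscope image itself (e.g. a sharpness statistic over the tracked microrobot region) rather than a closed-form curve, and the GPR would be calibrated from a recorded focal stack.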
Date Issued
2020-09-25
Date Acceptance
2020-09-01
Citation
ACS Photonics, 2020, 7 (11), pp.3003-3014
URI
http://hdl.handle.net/10044/1/88045
URL
https://pubs.acs.org/doi/10.1021/acsphotonics.0c00997
DOI
https://doi.org/10.1021/acsphotonics.0c00997
ISSN
2330-4022
Publisher
American Chemical Society
Start Page
3003
End Page
3014
Journal / Book Title
ACS Photonics
Volume
7
Issue
11
Copyright Statement
© 2020 American Chemical Society. This document is the Accepted Manuscript version of a Published Work that appeared in final form in ACS Photonics, after peer review and technical editing by the publisher. To access the final edited and published work see https://doi.org/10.1021/acsphotonics.0c00997
Sponsor
Engineering & Physical Science Research Council (EPSRC)
Identifier
http://gateway.webofknowledge.com/gateway/Gateway.cgi?GWVersion=2&SrcApp=PARTNER_APP&SrcAuth=LinksAMR&KeyUT=WOS:000592916800009&DestLinkType=FullRecord&DestApp=ALL_WOS&UsrCustomerID=1ba7043ffcc86c417c072aa74d649202
Grant Number
EP/P012779/1
Subjects
Science & Technology
Technology
Physical Sciences
Nanoscience & Nanotechnology
Materials Science, Multidisciplinary
Optics
Physics, Applied
Physics, Condensed Matter
Science & Technology - Other Topics
Materials Science
Physics
optical microrobot
pose estimation
depth estimation
deep learning
image processing
Publication Status
Published
Date Publish Online
2020-09-25