Arruda, Ermano ORCID: 0000-0001-5177-6552 (2021). Generative and predictive models for robust manipulation. University of Birmingham. Ph.D.
Arruda2020PhD.pdf — Text, Accepted Version (2MB). Available under licence: All rights reserved.
Abstract
Probabilistic modelling of manipulation skills, perception and uncertainty poses many challenges at different stages of a typical robot manipulation pipeline. This thesis is about devising algorithms and strategies for improving robustness in object manipulation skills acquired from demonstration and derived from learnt physical models in non-prehensile tasks such as pushing. Manipulation skills can be made robust in different ways: first, by improving time performance for grasp synthesis; second, by employing active perceptual strategies that exploit generated grasp action hypotheses to gather task-relevant information for grasp generation more efficiently; and finally, by exploiting predictive uncertainty in learnt physical models. Hence, robust manipulation skills emerge from the interplay of a triad of capabilities: generative modelling for action synthesis, active perception, and learning and exploiting uncertainty in physical interactions. This thesis addresses these problems by
• Showing how parametric models for approximating multimodal distributions can be used as a computationally faster method for generative grasp synthesis.
• Exploiting generative methods for dexterous grasp synthesis and investigating how active vision strategies can be applied to improve grasp execution safety and success rate while using fewer camera views of an object for grasp generation.
• Outlining methods to model and exploit predictive uncertainty from learnt forward models to achieve robust, uncertainty-averse non-prehensile manipulation, such as push manipulation.
In particular, the thesis: (i) presents a framework for generative grasp synthesis with applications to real-time grasp synthesis for multi-fingered robot hands; (ii) describes a sensorisation method for under-actuated hands, such as the Pisa/IIT SoftHand, which allows the aforementioned grasp synthesis framework to be deployed on this type of robotic hand; (iii) provides an active vision approach for view selection that uses generative grasp synthesis methods to perform perceptual predictions in order to improve grasp performance, taking into account grasp execution safety and contact information; and (iv) finally, going beyond prehensile skills, provides an approach to model and exploit predictive uncertainty from learnt physics applied to push manipulation. Experimental results are presented in simulation and on real robot platforms to validate the proposed methods.
Type of Work: | Thesis (Doctorates > Ph.D.)
---|---
Award Type: | Doctorates > Ph.D.
Supervisor(s): |
Licence: | All rights reserved
College/Faculty: | Colleges (2008 onwards) > College of Engineering & Physical Sciences
School or Department: | School of Computer Science
Funders: | Other
Other Funders: | Self-funded
Subjects: | Q Science > QA Mathematics > QA75 Electronic computers. Computer science; Q Science > QA Mathematics > QA76 Computer software; T Technology > T Technology (General); Z Bibliography. Library Science. Information Resources > Z665 Library Science. Information Science
URI: | http://etheses.bham.ac.uk/id/eprint/10305