Intelligent Perception and Control Lab


We develop intelligent systems through the design of high-performance sensors and the development of tailored, effective algorithms.


News & Events

Research


The research interests of our lab mainly include: robotics; intelligent perception; robotic control; artificial intelligence and its applications; and optimization algorithms.


Read More

Latest Publications

Numerical Simulation Assisted Design of a Soft-rigid-hybrid Tactile Finger for Surface Recognition

Haptic perception endows human beings with the ability to perceive various objects and interact with surrounding environments. Although a variety of embedded-type tactile sensors have been designed, the mechanical characteristics and underlying dynamics of the contact process are seldom explored, which significantly impairs sensor performance. In this work, we proposed a numerical-simulation-assisted design methodology to develop a soft-rigid-hybrid tactile finger that fully mimics the structure and appearance of human fingers. The mechanism underlying the dynamic interactions in gripping and contact scenarios was analyzed to reveal the stress distribution within the soft-rigid-hybrid finger, based on which the arrangement of sensing elements, i.e., their number, positions, and orientations, was determined. Further, the designed finger was fabricated and extended to construct a tactile gripper, with which robotic exploration of cylindrical objects was implemented. Excellent agreement of the tactile signals was found between simulation and experiment during the manipulation process. When combined with our perception model, the finger achieved an accuracy of 93.64% in the recognition of eight surfaces. In addition to contributing to the novel design of tactile grippers, this work provides a systematic methodology for designing tactile sensors from the essential mechanics.

Read More

Surface Recognition With a Tactile Finger Based on Automatic Features Transferred From Deep Learning

To date, numerous tactile sensors and algorithms have been developed to tackle various perception problems. However, excessive effort is devoted to the definition, extraction, and analysis of hand-crafted features in order to improve perception accuracy. To address this problem, in this article we designed a tactile finger containing four sensing elements (SEs) to perceive both dynamic and static stimuli, and proposed a novel signal processing pipeline. The pipeline mainly consists of time-series signal conversion, an automatic deep feature extractor, and a shallow recognition model. When the tactile finger was applied to explore 16 surfaces on a robotic platform, the four-channel signals were converted and concatenated into a time-frequency image via the continuous wavelet transform (CWT). A deep feature extraction network was constructed from a pretrained deep learning (DL) model, ResNet101, to extract the required features, which act as high-level representations of the most discriminative components of the tactile images. Finally, these features were fed into a shallow machine learning (ML) model, i.e., an extreme learning machine (ELM), achieving an accuracy as high as 92.38%. In this manner, the powerful learning capability of DL models was transferred directly to the new recognition model while the tedious feature extraction procedures were alleviated. In addition, several relevant issues, such as the layer depth, the DL model type, and the choice of shallow recognition model, are discussed to reveal their influence on performance.

Read More
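The shallow recognition model named above, the extreme learning machine, admits a compact closed-form training rule: a fixed random hidden layer followed by output weights solved by least squares. A minimal NumPy sketch on synthetic two-class features standing in for the deep tactile features (all data, dimensions, and hyperparameters here are illustrative assumptions, not the paper's):

```python
import numpy as np

def train_elm(X, y, n_hidden=100, n_classes=2, seed=0):
    """Train an extreme learning machine: random hidden weights,
    output weights solved in closed form by least squares."""
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))   # fixed random input weights
    b = rng.normal(size=n_hidden)                 # fixed random biases
    H = np.tanh(X @ W + b)                        # hidden-layer activations
    T = np.eye(n_classes)[y]                      # one-hot targets
    beta = np.linalg.pinv(H) @ T                  # least-squares output weights
    return W, b, beta

def predict_elm(X, W, b, beta):
    return np.argmax(np.tanh(X @ W + b) @ beta, axis=1)

# Synthetic, well-separated "feature" clusters (hypothetical data).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(loc=-2.0, size=(50, 8)),
               rng.normal(loc=+2.0, size=(50, 8))])
y = np.array([0] * 50 + [1] * 50)

W, b, beta = train_elm(X, y)
acc = np.mean(predict_elm(X, W, b, beta) == y)
```

Because only the output layer is trained, and in closed form, training is a single pseudo-inverse rather than an iterative optimization, which is what makes ELM attractive as the lightweight back end of the pipeline.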

Sim2Real Neural Controllers for Physics-Based Robotic Deployment of Deformable Linear Objects

Deformable linear objects (DLOs), such as rods, cables, and ropes, play important roles in daily life. However, manipulating DLOs is challenging because large geometrically nonlinear deformations may occur during the manipulation process. The problem is made even more difficult because the different deformation modes (e.g., stretching, bending, and twisting) may cause elastic instabilities during manipulation. In this paper, we formulate a physics-guided data-driven method to solve a challenging manipulation task: accurately deploying a DLO (an elastic rod) onto a rigid substrate along various prescribed patterns. Our framework combines machine learning, scaling analysis, and physical simulation to develop a physics-based neural controller for deployment. We explore the complex interplay between the gravitational and elastic energies of the manipulated DLO and obtain a control method for DLO deployment that is robust against friction and material properties. Out of the numerous geometrical and material properties of the rod and substrate, we show through physical analysis that only three non-dimensional parameters are needed to describe the deployment process. Therefore, the essence of the control law for the manipulation task can be constructed with a low-dimensional model, drastically increasing computation speed. The effectiveness of our optimal control scheme is demonstrated through a comprehensive robotic case study comparing against a heuristic control method for deploying rods along a wide variety of patterns. In addition, we showcase the practicality of our control scheme by having a robot accomplish challenging high-level tasks such as mimicking human handwriting, cable placement, and tying knots.

Read More
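The scaling analysis above balances the gravitational and elastic energies of the rod. One standard length scale arising from that balance for gravity-loaded elastic rods is the gravito-bending length, L_gb = (EI / (λg))^(1/3), where EI is the bending stiffness and λ the mass per unit length; whether this is one of the paper's three non-dimensional parameters is our assumption, and the rod parameters below are hypothetical:

```python
import math

def gravito_bending_length(E, r, rho, g=9.81):
    """Gravito-bending length of a solid circular rod:
    L_gb = (E*I / (lam*g))**(1/3), with I = pi*r^4/4 the second
    moment of area and lam = rho*pi*r^2 the mass per unit length."""
    I = math.pi * r**4 / 4.0          # second moment of area [m^4]
    lam = rho * math.pi * r**2        # linear mass density [kg/m]
    return (E * I / (lam * g)) ** (1.0 / 3.0)

# Hypothetical silicone-like rod: E = 1 MPa, radius 1.6 mm, density 1200 kg/m^3.
L_soft = gravito_bending_length(1e6, 1.6e-3, 1200.0)
# Same geometry but 100x stiffer: gravity bends it over a longer length scale.
L_stiff = gravito_bending_length(1e8, 1.6e-3, 1200.0)
```

Since L_gb scales as E^(1/3), a 100-fold increase in modulus stretches the characteristic sag length by only a factor of 100^(1/3) ≈ 4.6, which illustrates why a handful of non-dimensional groups can summarize rods of widely varying stiffness.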

Design of a Flexible Data Glove for Gesture Recognition

As a newly emerged assistive device, data gloves can help amputees rebuild their sense of haptic perception, empower robots with dexterous manipulation, and even enhance humans' capability for remote sensing and control. To date, they have enabled a wide range of potential applications, e.g., tele-operation, medical rehabilitation, and virtual reality. In this paper, a low-cost and easy-to-fabricate flexible data glove was designed, consisting of a 5-row by 4-column array of piezo-resistive sensing elements (SEs), two flexible electronic circuits, and two protective polydimethylsiloxane (PDMS) layers. Based on the design of a 'switch' structure and a simplified wiring layout, the stimulus locations and force values can be determined conveniently with a dynamic scanning algorithm, even though there are only 9 signal output paths. After experimental verification of its perception performance, a recognition model was established based on an extreme learning machine (ELM) algorithm to recognize 10 hand gestures, one of the glove's potential applications, in an experiment with 20 subjects. It achieved a recognition accuracy of 92.37% with a standard deviation of 1.80% across individuals, which validates the performance of the designed data glove.

Read More
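The dynamic scanning described above resolves all 20 elements of the 5 × 4 array from only 9 output paths, plausibly 5 row lines plus 4 column lines (the exact wiring is our assumption, not specified above). A toy simulation of such a row-by-row scan, with a hypothetical pressure frame:

```python
import numpy as np

def scan_array(read_columns, n_rows=5, n_cols=4, threshold=0.5):
    """Row-by-row scan of an n_rows x n_cols piezo-resistive array.

    `read_columns(row)` models the hardware: drive one row line and
    return the n_cols column readings. Returns (row, col, value)
    triples for every element whose reading exceeds the threshold,
    so location and force value are recovered in one pass.
    """
    contacts = []
    for r in range(n_rows):
        cols = read_columns(r)          # one reading per column line
        for c in range(n_cols):
            if cols[c] > threshold:
                contacts.append((r, c, cols[c]))
    return contacts

# Hypothetical frame: a single press at row 2, column 3 with value 0.9.
frame = np.zeros((5, 4))
frame[2, 3] = 0.9

contacts = scan_array(lambda r: frame[r])
```

Driving one row at a time is what lets 5 + 4 wires address 5 × 4 elements: each reading is uniquely indexed by the active row and the responding column.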