Verified Assured Learning for Unmanned Embedded Systems (VALUES)


This project develops a research framework for autonomous navigation using a 1/5-scale unmanned ground vehicle, with the long-term goal of combining learning-based control with physics-based models. To this end, a high-fidelity simulation environment (built on CARLA) was developed, along with the architecture needed for fast integration with the robot hardware. The motion planning architecture uses a hybrid-system formalism in which the system switches between different actions. Each action encodes a closed-loop controlled motion, parametrized by a desired velocity and a lateral offset with respect to a given route. A simple greedy action-enumeration policy selects the optimal action according to a user-defined reward function, subject to collision constraints.
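The greedy action-enumeration step can be sketched as follows. This is a minimal illustration, not the project's implementation: the action discretization, the reward function, and the collision check below are all hypothetical placeholders.

```python
import itertools

# Hypothetical discretization of the action space: each action is a
# closed-loop motion parametrized by desired velocity (m/s) and lateral
# offset (m) with respect to the reference route.
VELOCITIES = [2.0, 4.0, 6.0]
LATERAL_OFFSETS = [-1.0, 0.0, 1.0]

def reward(velocity, lateral_offset):
    """Illustrative user-defined reward: favor speed, penalize deviation."""
    return velocity - 1.5 * abs(lateral_offset)

def collision_free(velocity, lateral_offset):
    """Stand-in for the collision check against predicted trajectories.

    Here we pretend the fast lane-keeping action is blocked by an obstacle.
    """
    return not (velocity == 6.0 and lateral_offset == 0.0)

def select_action():
    """Greedy enumeration: score every collision-free action, keep the best."""
    feasible = [
        (v, d) for v, d in itertools.product(VELOCITIES, LATERAL_OFFSETS)
        if collision_free(v, d)
    ]
    return max(feasible, key=lambda a: reward(*a))
```

With the obstacle blocking fast lane-keeping, the greedy policy falls back to the fast action with a lateral offset, which is exactly the switching behavior the hybrid-system formalism encodes.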

JHU Map Overlook in CARLA JHU Map in CARLA

JHU Map Rainy Weather CARLA JHU Map Sunset Weather CARLA

JHU UGV Autonomous Navigation Planner Intersection

Surgical Robotic System for Autonomous Minimally Invasive Orthopaedic Surgery


We developed a dexterous robotic system for minimally invasive, autonomous debridement of osteolytic bone lesions in confined spaces. The system is distinguished from state-of-the-art orthopaedic systems by combining a rigid-link robot with a flexible continuum robot, which enhances reach in the difficult-to-access spaces often encountered in surgery. The continuum robot is equipped with flexible debriding instruments and fiber Bragg grating sensors. The surgeon plans the procedure on the patient's preoperative computed tomography, and the robotic system performs the task autonomously under the surgeon's supervision. An optimization-based controller generates control commands on the fly to execute the task while satisfying physical and safety constraints.
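One common way such on-the-fly constrained control is realized is a damped least-squares resolved-rates step with actuator limits. The sketch below is an assumed stand-in for the paper's optimization-based controller, with hypothetical limit values; it only illustrates the "track the task while respecting physical constraints" structure.

```python
import numpy as np

def resolved_rates_step(jacobian, tip_error, qdot_max=0.1, damping=1e-3):
    """One damped least-squares resolved-rates step with joint-rate limits.

    Solves min ||J qdot - tip_error||^2 + damping ||qdot||^2 in closed form,
    then clamps joint velocities to a hypothetical actuator limit. This is an
    illustrative simplification of a constrained optimization-based controller.
    """
    J = np.asarray(jacobian, dtype=float)
    e = np.asarray(tip_error, dtype=float)
    # Closed-form damped least-squares solution of J qdot ~= tip_error.
    qdot = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(J.shape[0]), e)
    # Enforce simple physical constraints (actuator velocity limits).
    return np.clip(qdot, -qdot_max, qdot_max)
```

A full implementation would add the safety constraints (e.g. workspace and anatomy-avoidance terms) as inequality constraints in a quadratic program rather than a post-hoc clamp.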

Concurrent Control of Positioning and Continuum Robots Debriding Simulated Hard Bone

Pelvic Osteolysis Femoral Osteonecrosis

Data-Driven Shape Sensing of Continuum Robots


This work proposes a data-driven, learning-based approach for shape sensing and distal-end position estimation (DPE) of continuum robots in constrained environments using Fiber Bragg Grating (FBG) sensors. The approach uses only the sensory data from an unmodeled, uncalibrated sensor embedded in the continuum manipulator (CM) to estimate the shape and distal-end position. A deep neural network increases shape-sensing accuracy in the presence of disturbances compared to mechanics-based, model-dependent approaches. A variety of flexible, small-size (< 0.5 mm outside diameter) sensors with different stiffnesses, bending capabilities, and sensing resolutions were designed and embedded into the continuum robot.
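The core idea, learning a direct map from raw, uncalibrated sensor readings to distal-end position, can be illustrated with a linear least-squares baseline on synthetic data. The data, dimensions, and linear map below are all invented for illustration; the actual work learns a nonlinear map with a deep neural network.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: raw wavelength shifts from 3 hypothetical FBG
# gratings and the corresponding 2D distal-end position, related here by an
# unknown linear map plus measurement noise.
true_map = np.array([[0.80, -0.20, 0.10],
                     [0.05,  0.90, -0.30]])       # position (2) x gratings (3)
shifts = rng.normal(size=(200, 3))                 # raw sensor readings
tips = shifts @ true_map.T + 0.01 * rng.normal(size=(200, 2))

# Fit the sensor-to-position map directly from paired samples: no kinematic
# model and no sensor calibration are used, only (reading, position) data.
learned_map, *_ = np.linalg.lstsq(shifts, tips, rcond=None)

def predict_tip(reading):
    """Estimate the distal-end position from a raw FBG reading."""
    return np.asarray(reading) @ learned_map
```

Replacing the least-squares fit with a deep network lets the learned map absorb the nonlinear, disturbance-dependent behavior that mechanics-based models struggle with.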

Data-Driven Sensing of Continuum Robots Fiber Bragg Grating Sensors

Learning-Based Collision Detection in Continuum Robots


Conventional collision detection algorithms for continuum robots rely on some combination of an exact constrained-kinematics model of the robot, geometric assumptions such as constant-curvature behavior, a priori knowledge of the environmental constraint geometry, and/or additional sensors to scan the environment or sense contacts. We proposed a data-driven machine learning approach that uses only the available sensory information, without requiring prior geometric assumptions or a model of the continuum robot or its surrounding environment. The proposed algorithm was implemented and evaluated on a non-constant-curvature continuum robot equipped with Fiber Bragg Grating (FBG) optical sensors for shape sensing. Results demonstrate successful detection of collisions (using only the fiber optics, not vision) in constrained environments with soft and hard obstacles of unknown stiffness and location.
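A minimal data-driven detector in this spirit learns the statistics of the sensor readings during collision-free motion and flags readings that deviate strongly, using no robot or environment model. The sensor data, channel count, and threshold below are invented for illustration and this anomaly-style detector is a simplified stand-in for the paper's learned classifier.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic stand-in training data: 8 hypothetical FBG channels recorded
# over 500 collision-free motions. A contact produces a localized deviation
# on some channels; free-space readings stay near their nominal statistics.
free_space = rng.normal(loc=0.0, scale=0.05, size=(500, 8))
mean, std = free_space.mean(axis=0), free_space.std(axis=0)

def collision_detected(reading, z_threshold=4.0):
    """Flag a contact when any channel deviates strongly from free space.

    Purely data-driven: the decision uses only statistics learned from
    collision-free sensory data, with no kinematic or environment model.
    """
    z = np.abs((np.asarray(reading) - mean) / std)
    return bool(np.any(z > z_threshold))
```

A learned classifier generalizes this by replacing the per-channel threshold with a decision boundary trained on labeled contact and no-contact examples.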

Hard Foam Collision Detection Soft Foam Collision Detection

Gelatin Phantom Collision Detection Finger Collision Detection