Latest Events

Reach Out and Touch Something

Haptic rendering is the application of forces in a virtual environment to a user interface (such as a flight control stick, a haptic glove, or surgical robot hand controls). The virtual environment can represent a physical environment, it can be fully synthetic (as in a video game), or it can be an augmented-reality combination of physical and synthetic components. Motions of the hand-control device interact with this virtual environment, and kinesthetic feedback is provided to the user.
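
As a concrete illustration (mine, not the speaker's), the "hello world" of haptic rendering is the virtual wall: when the device tip penetrates a virtual surface, a penalty force proportional to penetration depth, plus damping, is fed back to the user. A minimal sketch, assuming a hypothetical 1-DoF device API with read_position(), read_velocity(), and apply_force():

```python
# Virtual-wall penalty force: a minimal haptic-rendering sketch.
# The device API referenced below is hypothetical; real SDKs differ.
K_WALL = 2000.0   # wall stiffness [N/m]
B_WALL = 5.0      # damping [N*s/m]; stabilizes contact
WALL_X = 0.0      # wall at x = 0; free space is x > 0

def wall_force(x, v):
    """Push the user out of the wall in proportion to penetration."""
    penetration = WALL_X - x
    if penetration <= 0.0:
        return 0.0                     # no contact, no force
    f = K_WALL * penetration - B_WALL * v
    return max(f, 0.0)                 # never pull the user inward

# Servo loop (commented out because the device is hypothetical).
# Haptic rendering typically runs near 1 kHz: much slower and the
# wall feels soft or starts to buzz.
# while True:
#     x, v = device.read_position(), device.read_velocity()
#     device.apply_force(wall_force(x, v))
```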

Towards Continual Topological Mapping with Introspection

For robust, life-long autonomous operation in dynamic, unstructured environments, mobile robots must contend with vast amounts of continually evolving data. The exploring robot must adapt to its environment and refine its workspace representation with new observations. The key competency we seek is "introspection": the ability to determine what is perplexing, which can in turn drive active information acquisition or human disambiguation. The talk explores this in the context of place recognition and semantic mapping.
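
To make "introspection" concrete (my illustration, not necessarily the speaker's method), one common proxy in place recognition is to check whether the best match is decisively better than the runner-up; an ambiguous score distribution marks the observation as perplexing and can trigger active data gathering or a human query. A minimal sketch:

```python
import numpy as np

def perplexing(similarity_scores, ratio_threshold=1.2):
    """Flag a query as perplexing when the top place-match score is
    not decisively better than the second best (a ratio-test heuristic).
    similarity_scores: scores against all mapped places (higher = better).
    Returns (best_match_index, is_perplexing)."""
    scores = np.asarray(similarity_scores, dtype=float)
    best = int(np.argmax(scores))
    s = np.sort(scores)[::-1]
    if len(s) < 2 or s[1] <= 0:
        return best, False
    return best, bool(s[0] / s[1] < ratio_threshold)

print(perplexing([0.9, 0.3, 0.2]))   # (0, False): confident match
print(perplexing([0.9, 0.85, 0.2]))  # (0, True): ask for help
```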

Reformulating planning as probabilistic inference: where it helps and where it does not

Reformulating planning problems as probabilistic inference problems is interesting, but it does not necessarily solve the fundamental problems. In this talk I will review three variations on this theme where the reformulation has led to novel theoretical insights and efficient algorithms: stochastic optimal control and model-free reinforcement learning, multi-agent POMDPs, and relational MDPs. I will conclude with some questions and first steps on a problem we are currently working on: how to plan efficiently under uncertainty over the existence of objects.
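
As one concrete instance of the reformulation (a standard textbook version, not necessarily the one covered in the talk), treating exp(reward) as the likelihood of an "optimality" variable relaxes the hard Bellman max into a log-sum-exp backup, giving soft value iteration; a sketch on a toy MDP:

```python
import numpy as np

# Toy MDP: 3 states, 2 actions. P[a, s, s'] is the transition
# probability, R[s, a] the reward. In the planning-as-inference view,
# exp(R) acts as a likelihood, and the Bellman max relaxes to a
# log-sum-exp ("soft") backup.
P = np.array([
    [[0.9, 0.1, 0.0], [0.0, 0.9, 0.1], [0.0, 0.0, 1.0]],  # action 0
    [[0.1, 0.9, 0.0], [0.1, 0.0, 0.9], [0.0, 0.1, 0.9]],  # action 1
])
R = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
gamma = 0.9

V = np.zeros(3)
for _ in range(500):                                    # soft value iteration
    Q = R + gamma * np.einsum('ast,t->sa', P, V)
    m = Q.max(axis=1)
    V = m + np.log(np.exp(Q - m[:, None]).sum(axis=1))  # log-sum-exp backup

policy = np.exp(Q - V[:, None])   # posterior over actions per state
print(np.round(policy, 3))        # rows sum to 1
```

Here V is exactly the log-normalizer of the inference problem and the resulting policy is the posterior over actions; introducing a temperature and driving it to zero recovers standard value iteration.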

Learning from Demonstrations for Robotic Manipulation

I will talk about the research I have done with my collaborators on teaching robots to perform manipulation tasks from human demonstrations. Of particular interest are tasks involving deformable objects, where it is hard to perceive the full state of the system and to model its dynamics. I will also describe some techniques we have developed to solve the associated perception and motion-planning problems.
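
As a toy version of the trajectory-transfer idea behind this line of work (my simplification: a least-squares affine warp stands in for the non-rigid registration used in practice), the demonstrated path is warped by the same map that registers landmark points of the demo scene onto the new scene:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine map (A, b) with dst ≈ src @ A.T + b.
    A linear stand-in for non-rigid (e.g. thin-plate-spline)
    registration, which handles the stretching of deformable objects."""
    src_h = np.hstack([src, np.ones((len(src), 1))])   # homogeneous coords
    M, *_ = np.linalg.lstsq(src_h, dst, rcond=None)    # shape (d+1, d)
    return M[:-1].T, M[-1]

def transfer_trajectory(demo_points, test_points, demo_path):
    """Warp a demonstrated end-effector path into the new scene."""
    A, b = fit_affine(demo_points, test_points)
    return demo_path @ A.T + b

# Example: the object moved and stretched; the path moves with it.
demo_pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0]])
test_pts = np.array([[0.2, 0.1], [1.4, 0.1], [0.2, 1.3]])
path = np.array([[0.5, 0.5], [0.8, 0.2]])
print(transfer_trajectory(demo_pts, test_pts, path))
```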

Self-Manipulation and Dynamic Transitions for a Legged Robot

As dynamic robot behaviors become more capable and better understood, a need arises for an equally capable and well-understood variety of transitions between these behaviors. For legged robots, we introduce a new formalism for understanding behavioral components as self-manipulation, and then build up a hybrid system that topologically defines the space of dynamic transitions as a cellular complex. Our primary motivation is not to facilitate numerical simulation but rather to promote design insight: behavior design, controller design, and platform design.
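
A crude way to see the combinatorial structure (my illustration, not the formalism from the talk): each of a legged robot's n potential contacts is either active or free, giving 2^n hybrid modes, and single touchdown or liftoff events connect modes that differ in exactly one contact. The sketch below enumerates that mode graph:

```python
from itertools import product

def mode_graph(n_contacts):
    """Enumerate contact modes (each contact active=1 or free=0) and
    the touchdown/liftoff transitions flipping exactly one contact.
    A toy stand-in for the complex of hybrid dynamic transitions."""
    modes = list(product((0, 1), repeat=n_contacts))
    edges = [(a, b) for a in modes for b in modes
             if sum(x != y for x, y in zip(a, b)) == 1]
    return modes, edges

modes, edges = mode_graph(4)             # e.g. a quadruped's four feet
print(len(modes), "modes,", len(edges) // 2, "transition pairs")
# -> 16 modes, 32 transition pairs (the vertices/edges of a 4-cube)
```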

Next-generation soft wearable robots

Next-generation wearable robots will use soft materials such as textiles and elastomers to provide a more conformal, unobtrusive, and compliant interface with the human body. These robots will augment the capabilities of healthy individuals (e.g., improved walking efficiency, increased grip strength) in addition to assisting patients who suffer from physical or neurological disorders. This talk will focus on two projects that demonstrate the design, fabrication, and control principles required to realize such systems.
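
For a flavor of the control side (a generic sketch, not the speaker's controller, with made-up numbers), soft exosuits are often force-controlled against a gait-phase-indexed assistance profile; a minimal version commands a smooth tension bump timed to push-off:

```python
import numpy as np

def assist_force(gait_phase, peak_n=150.0, center=0.5, width=0.08):
    """Desired cable tension [N] as a Gaussian bump over gait phase
    (0 = heel strike, 1 = next heel strike). Peak magnitude and
    timing here are illustrative placeholders, not tuned values."""
    return peak_n * np.exp(-0.5 * ((gait_phase - center) / width) ** 2)

# An inner force loop would track this setpoint with the actuator, e.g.
#   u = kp * (assist_force(phase) - measured_tension)
phases = np.linspace(0.0, 1.0, 11)
print(np.round(assist_force(phases), 1))
```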