Matthew Pan

Ph.D.

Research

3D Visualization for Orthopedic Surgical Intervention

Summary

I was involved in developing a software program called “MotionStation3D” to assist researchers and surgeons in their computer-assisted surgery and joint kinematics studies. MotionStation3D is built upon National Instruments’ LabVIEW software development environment and is designed to record and process motion data from multiple sensing instruments, display 3D spatial data, and send commands to actuate simulator hardware. Integration of the Visualization Toolkit (VTK), an open-source, freely available 3D computer graphics package, allows MotionStation3D to represent structures in real 3D space and provides a powerful engine for assisting motion, implantation and surgical procedures. MotionStation3D can:

  • Represent specimens and testing apparatus in real time using 3D models reconstructed from CT scans and/or pre-built 3D models;
  • Simulate motion of a tool/specimen in a virtual environment with the use of 3D trackers (e.g., Flock of Birds and Optotrak); and
  • Provide an interface between trackers and specimens in the virtual VTK 3D world.
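
Under the hood, interfacing a tracker with the virtual scene amounts to converting each sensor’s reported pose (a position plus an orientation quaternion) into the homogeneous transform applied to the corresponding 3D model every frame. A minimal Python sketch of that conversion (MotionStation3D itself is built in LabVIEW/VTK; the function names here are illustrative):

```python
import math

def quat_to_matrix(w, x, y, z):
    """Convert a unit quaternion to a 3x3 rotation matrix."""
    return [
        [1 - 2*(y*y + z*z), 2*(x*y - w*z),     2*(x*z + w*y)],
        [2*(x*y + w*z),     1 - 2*(x*x + z*z), 2*(y*z - w*x)],
        [2*(x*z - w*y),     2*(y*z + w*x),     1 - 2*(x*x + y*y)],
    ]

def pose_to_transform(position, quaternion):
    """Build the 4x4 homogeneous transform a renderer (e.g., VTK)
    would apply to a tracked model each frame."""
    R = quat_to_matrix(*quaternion)
    tx, ty, tz = position
    return [
        R[0] + [tx],
        R[1] + [ty],
        R[2] + [tz],
        [0.0, 0.0, 0.0, 1.0],
    ]

# Example: a tracker reporting a 90-degree rotation about z at (10, 0, 5)
q = (math.cos(math.pi / 4), 0.0, 0.0, math.sin(math.pi / 4))
T = pose_to_transform((10.0, 0.0, 5.0), q)
```

In the real system, this transform is pushed to the rendered specimen/tool model on every tracker update so the virtual scene mirrors the physical setup.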

Implant Position Tracking in MotionStation3D

A Low-Cost Laparoscopic Robot for Image-Guided Telesurgery

Summary

While attending the University of Waterloo, I was a member of a three-person team working to design and construct a low-cost laparoscopic surgical robot for training and research purposes.

With the only FDA-approved laparoscopic robot (the da Vinci, Intuitive Surgical) costing $1.5 million USD, robots like ours that offer similar dexterity and features provide an affordable platform for surgical training, trialling new surgical approaches, and research development.

Features

  • Five-degree-of-freedom mechanical design enforcing a remote center of rotation
  • Counter-weighted arms for motor efficiency
  • Designed to be controlled by multiple types of controllers (e.g., the Novint Falcon)
  • Tool actuation mechanism that grasps and operates conventional laparoscopic tools interchangeably
  • Image guidance and 3D visualization of the workspace through the open-source 3D Slicer

3D Render

Rapid-Prototyped High Fidelity Haptic Displays

Summary

One of the first projects I was involved with in my master’s research was the development of low-cost, rapid-prototyped haptic displays to support our research in affective haptics. The displays were designed not only to be expressive, conveying emotional and contextual content through synthetic touch, but also to be worn by the user for mobile applications and field studies. The prototypes were built with minimal time and monetary investment (< $100 USD) as part of a physical brainstorming exercise, making them “disposable” and allowing us to be more flexible with new ideas during the prototyping stages.

Haptic Sleeve

A sleeve constructed using a compression sleeve, pager motors and an Arduino LilyPad.

Physiologically-Triggered Audio Bookmarking

Summary

This work explores a novel interaction paradigm driven by implicit, low-attention user control, accomplished by monitoring a user’s physiological state. With a team of engineers and computer scientists, I designed and prototyped this interaction for a first use case of bookmarking an audio stream, to holistically explore this implicit interaction concept. A user’s galvanic skin response (GSR) is monitored for orienting responses (ORs) to external interruptions; our prototype automatically bookmarks the media so that the user can attend to the interruption, then resume listening from the point at which they were interrupted.
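
As a rough illustration of the detection idea: an orienting response appears as a rapid rise in skin conductance relative to a recent baseline, and each detected rise triggers a bookmark. A minimal sketch (the threshold, window, sampling rate, and function name are illustrative, not the study’s tuned algorithm):

```python
def detect_orienting_responses(gsr, fs=10, rise_threshold=0.1, window_s=3.0):
    """Flag sample indices where skin conductance rises by more than
    `rise_threshold` (microsiemens) above the minimum of a recent
    window -- a crude stand-in for an orienting-response detector.
    Each flagged index would trigger an audio bookmark."""
    window = int(window_s * fs)
    bookmarks = []
    for i in range(window, len(gsr)):
        baseline = min(gsr[i - window:i])   # recent minimum as baseline
        if gsr[i] - baseline > rise_threshold:
            bookmarks.append(i)             # candidate bookmark index
    return bookmarks

# Flat signal with a sudden conductance rise halfway through
signal = [1.0] * 50 + [1.3] * 50
marks = detect_orienting_responses(signal)
```

A real implementation would also debounce repeated detections and filter out motion artifacts before bookmarking.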

Algorithm Schematic

Flowchart of orienting response detection process.

Haptic Feedback in Implicit Bookmarking

Summary

In this project, I examined how haptic feedback could enable an implicit human-computer interaction in the context of an audio stream listening use case (see above).  Here, I investigated two uses of haptic feedback to support this implicit interaction and mitigate effects of noisy (false-positive) bookmarking: (a) low-attention notification when a bookmark is placed, and (b) focused-attention display of bookmarks during resumptive navigation. The contributions of this work include an approach to handling noisy data in implicit HCI, an implementation of haptic notifications that signal implicit system behavior, and discussion of user mental models that may be active in this context.

Haptic Wrist Device

Affect Recognition in Music Listening through Kalman Filtering

Summary

In this project, I investigated the development of a system to model a user’s affective state using multiple physiological input channels. To provide a framework for this exploration, I used the detection of preferences in music listening as a use case. The movement of music storage away from ‘hard’ physical media formats (e.g., compact discs, cassettes) to ‘soft’ digital files (e.g., mp3, wma) has allowed music to be easily amassed and transferred to digital music players and online services. No longer are users restricted to listening to what they buy on CD; instead, they can explore and personalize playlists using internet music services such as Pandora, Last.fm, Spotify and 8tracks, which encourage exploration of expansive music collections spanning a vast assortment of genres. Given these vast collections, such services rely on user self-reports and ratings (e.g., thumbs up, thumbs down) to learn which music selections are liked or disliked. This information is then used to algorithmically personalize and tailor playlists for the user.

To aid in this categorization, the approach I investigated in this project is similar in nature to the one used in the bookmarking application above: to reduce this cognitive demand, the system automatically recognizes user music preference from physiological signals and acts appropriately on this information. To do this, I developed an unscented Kalman filter that categorizes emotional content to determine time-varying user preference during music listening.

Model for music preference detection

Diagram showing input (u – features drawn from music), states (x – user preference for the music) and outputs (y – physiological measures) of the system model.
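
Using that notation (u: music features, x: latent preference, y: physiological measures), a simplified linear analogue of the filter can be sketched as follows. The project used an *unscented* Kalman filter to handle non-linear physiological response models; this scalar linear version and all parameter values are illustrative only:

```python
def kalman_preference(y_measurements, u_inputs, a=0.9, b=0.1, c=1.0,
                      q=0.01, r=0.5):
    """Scalar Kalman filter estimating latent preference x from noisy
    physiological output y, driven by music features u.
    Model: x[k+1] = a*x[k] + b*u[k] + w,   y[k] = c*x[k] + v,
    with process noise variance q and measurement noise variance r."""
    x, p = 0.0, 1.0                       # state estimate and its variance
    estimates = []
    for y, u in zip(y_measurements, u_inputs):
        # predict step
        x = a * x + b * u
        p = a * a * p + q
        # update step with measurement y
        k = p * c / (c * c * p + r)       # Kalman gain
        x = x + k * (y - c * x)
        p = (1 - k * c) * p
        estimates.append(x)
    return estimates

# Constant music features and a steady physiological reading of 1.0:
# the preference estimate should converge toward 1.0.
est = kalman_preference([1.0] * 50, [1.0] * 50)
```

The unscented variant replaces the linear predict/update equations with sigma-point propagation through the non-linear state and measurement models, but the predict-correct structure is the same.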

Coiled Nylon-Actuated Robot Manipulator

Summary

In a joint project between the CARIS and Molecular Mechatronics labs (one of the labs that developed the coiled nylon artificial muscle actuator), we built several robotic hand prototypes as part of a demonstration of how these nylon actuators can be used in robotics applications.

Nylon-Actuated Hand - Prototype 2

Understanding the Role of Gaze in Robot-to-Human Handovers

Summary

In this work, we provide empirical evidence that using human-like gaze cues during human-robot handovers can improve the timing and perceived quality of the handover event. Handovers serve as the foundation of many human-robot tasks. Fluent, legible handover interactions require appropriate nonverbal cues to signal handover intent, location and timing. Inspired by observations of human-human handovers, we implemented gaze behaviors on a PR2 humanoid robot. The robot handed over water bottles to a total of 102 naïve subjects while varying its gaze behaviour: no gaze, gaze designed to elicit shared attention at the handover location, and the shared attention gaze complemented with a turn-taking cue. We compared subject perception of and reaction time to the robot-initiated handovers across the three gaze conditions. Results indicate that subjects reach for the offered object significantly earlier when a robot provides a shared attention gaze cue during a handover. We also observed a statistical trend of subjects preferring handovers with turn-taking gaze cues over the other conditions. Our work demonstrates that gaze can play a key role in improving the user experience of human-robot handovers, helping to make handovers fast and fluent.

Experimental Conditions

Collaborative Human-Focused Assistive Robotics for Manufacturing

Summary

New developments, innovations, and advancements in robotic technology are paving the way for the use of intelligent robots to enable, support, and enhance the capabilities of human workers in manufacturing environments. While the vast majority of current industrial robots have little to no direct interaction with humans, we envision that future industrial robots will assist people in the workplace, support workers in a variety of tasks, improve manufacturing quality and processes, and increase productivity. CHARM is a large multi-institutional project in collaboration with General Motors of Canada Limited (GM), which aims to advance both the vision and the technology for safe human-robot interaction (HRI) in vehicle manufacturing industries.  We investigate both (1) robotic technology development: communication, control, and perception; and (2) system design methodology: interaction design, information coordination (situational awareness), and integration.

Human-Robot Collaborative Lifting

Summary

Technological advances are leading towards robot assistants that are helpful and affordable. However, before such robots can be deployed widely, both design and control strategies fostering safe and intuitive interaction between robots and their non-expert human users are required. A cooperative task often identified as a potential application for robot assistants is the cooperative manipulation of a large or awkward object. In this work, we propose a controller for a robot assistant that helps a human raise and lower a physically large object. Fifty human subjects independently participated in the tuning of our proposed controller across four studies. Results show that user preference in tuning this controller is independent of load length and can be mapped linearly across the controller’s natural frequency-damping ratio space. Additionally, we find evidence for a universal, ‘one-size-fits-most’ tuning that is preferred by (or at least acceptable to) a majority of users, such that customizing the controller tuning to individual users may not be necessary.
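
The natural frequency-damping ratio tuning space can be illustrated with a second-order admittance model, in which the robot’s end of the load responds to the human’s applied force like a virtual mass-spring-damper. A minimal simulation sketch (unit virtual mass; all values are illustrative, not our tuned controller):

```python
def admittance_step(x, v, f_human, wn=4.0, zeta=0.8, dt=0.01):
    """One semi-implicit Euler step of a second-order admittance law:
        a = f_human - 2*zeta*wn*v - wn**2 * x
    wn (rad/s) and zeta are the natural frequency and damping ratio --
    the two parameters users tuned in the studies. Returns the updated
    position x and velocity v of the virtual unit mass."""
    a = f_human - 2.0 * zeta * wn * v - wn * wn * x
    v += a * dt
    x += v * dt
    return x, v

# Apply a constant 1 N upward force: the virtual mass should settle
# at the static deflection f / wn**2 = 1/16.
x, v = 0.0, 0.0
for _ in range(5000):
    x, v = admittance_step(x, v, f_human=1.0)
```

Sweeping `wn` and `zeta` changes how compliant or sluggish the lift feels, which is exactly the space the user-preference studies mapped.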

Experiment Setup Diagram

Characterization of Handover Orientations used by Humans for Efficient Robot to Human Handovers

Summary

We conducted a user study to measure and compare natural handover orientations with giver-centered and receiver-centered handover orientations for twenty common objects. We use a distance minimization approach to compute mean handover orientations from the data. The computed means of the receiver-centered orientations can then be used by robot givers to achieve more efficient and socially acceptable handovers. Furthermore, we introduce the notion of an affordance axis for comparing handover orientations, and offer a definition for computing them. Observable patterns were found in receiver-centered handover orientations. Comparisons show that depending on the object, natural handover orientations may not be receiver-centered; thus, robots may need to distinguish between good and bad handover orientations when learning from natural handovers.
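
As a rough sketch of computing a mean handover orientation, one can average unit quaternions after sign alignment (q and -q encode the same rotation); this approximates the distance-minimizing mean when the observed orientations are tightly clustered. The study’s exact distance-minimization formulation differs, so treat this as illustrative:

```python
import math

def mean_orientation(quats):
    """Approximate distance-minimizing mean of unit quaternions
    (w, x, y, z): sign-align each quaternion with the first, sum,
    and renormalize. A reasonable approximation for tightly
    clustered handover orientations."""
    ref = quats[0]
    acc = [0.0, 0.0, 0.0, 0.0]
    for q in quats:
        dot = sum(a * b for a, b in zip(ref, q))
        sign = 1.0 if dot >= 0 else -1.0     # resolve the q / -q ambiguity
        for i in range(4):
            acc[i] += sign * q[i]
    norm = math.sqrt(sum(a * a for a in acc))
    return tuple(a / norm for a in acc)

# Two small, opposite rotations about z average to the identity.
q_pos = (math.cos(0.05), 0.0, 0.0, math.sin(0.05))
q_neg = (math.cos(0.05), 0.0, 0.0, -math.sin(0.05))
m = mean_orientation([q_pos, q_neg])
```

Such a mean, computed over receiver-centered observations of one object, is what a robot giver could adopt as its default presentation orientation for that object.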

Automatic Detection of Handovers

Summary

From the perspective of a robot, something as effortless and mundane to us as receiving an object from a human is a difficult task. Much of this difficulty lies in the robot’s need to detect non-verbal cues from the participating human in order to infer the timing and location of the handover. This work explores one of the first tasks a robot must complete in order to receive an object from a human giver: detecting, through recognition of non-verbal cues, when a collaborating human initiates a handover. The importance of this capability is apparent: recognizing that a handover is in progress is a necessary prerequisite to performing and completing the object handover.

Previously, obtaining the data necessary to detect non-verbal cues was a challenge in itself: kinematic data had to be extracted from raw video footage or captured in motion capture studios. Thankfully, the availability of inexpensive, off-the-shelf sensors such as the Microsoft Kinect has given robots access to human body kinematic data from which cues relating to handovers and other interactions can be detected.

To learn handover cues from kinematic data, we propose using support vector machines (SVMs). SVMs are recognized as a robust method in pattern recognition and classification and have been applied to numerous classification and regression problems with exceptionally good performance. Their advantages for classification tasks include exploitation of the kernel trick (allowing non-linear decision functions), built-in regularization, and large-margin separation for better generalization. Their robustness in binary classification makes SVMs a good fit for this work; hence, we apply SVMs for automated recognition of when a handover occurs using human body kinematics.
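
To illustrate the classification machinery, here is a from-scratch linear SVM trained by sub-gradient descent on the hinge loss. The project’s actual classifiers and feature set may differ (kernelized SVMs, richer skeletal features); the feature names and values below are illustrative:

```python
import random

def train_linear_svm(X, y, lam=0.01, lr=0.1, epochs=200, seed=0):
    """Train a linear SVM by stochastic sub-gradient descent on the
    regularized hinge loss. X: list of feature vectors (e.g., wrist
    speed, arm extension toward the receiver); y: labels in {-1, +1}
    for 'handover initiated' vs. not."""
    rng = random.Random(seed)
    w = [0.0] * len(X[0])
    b = 0.0
    order = list(range(len(X)))
    for _ in range(epochs):
        rng.shuffle(order)
        for i in order:
            margin = y[i] * (sum(wj * xj for wj, xj in zip(w, X[i])) + b)
            if margin < 1:   # inside margin or misclassified: hinge is active
                w = [wj - lr * (lam * wj - y[i] * xj)
                     for wj, xj in zip(w, X[i])]
                b += lr * y[i]
            else:            # correctly classified: regularization only
                w = [wj * (1 - lr * lam) for wj in w]
    return w, b

def svm_predict(w, b, x):
    return 1 if sum(wj * xj for wj, xj in zip(w, x)) + b >= 0 else -1

# Toy linearly separable "handover" vs. "no handover" feature vectors
X = [[2.0, 2.0], [3.0, 1.0], [2.5, 2.5],
     [-2.0, -2.0], [-3.0, -1.0], [-2.5, -2.0]]
y = [1, 1, 1, -1, -1, -1]
w, b = train_linear_svm(X, y)
```

In practice, each feature vector would be computed from a window of Kinect skeletal frames, and a kernel would let the classifier capture non-linear cue combinations.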

Handover-MotionTracking
