<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Projects | Chris Yee WONG, Ph.D., P.Eng.</title><link>https://chrisywong.github.io/projects/</link><atom:link href="https://chrisywong.github.io/projects/index.xml" rel="self" type="application/rss+xml"/><description>Projects</description><generator>HugoBlox Kit (https://hugoblox.com)</generator><language>en-us</language><lastBuildDate>Sun, 19 May 2024 00:00:00 +0000</lastBuildDate><image><url>https://chrisywong.github.io/media/icon_hu_327dd8bf0c86e4f9.png</url><title>Projects</title><link>https://chrisywong.github.io/projects/</link></image><item><title>Virtual reality and psHRI</title><link>https://chrisywong.github.io/projects/vr/</link><pubDate>Wed, 01 Jan 2025 00:00:00 +0000</pubDate><guid>https://chrisywong.github.io/projects/vr/</guid><description>&lt;p&gt;The goal is to use virtual reality as a tool for studying psHRI by achieving contextually-rich but low-cost interactions.&lt;/p&gt;
</description></item><item><title>Robot-to-Human Grasping</title><link>https://chrisywong.github.io/projects/graspr2h/</link><pubDate>Tue, 30 May 2023 00:00:00 +0000</pubDate><guid>https://chrisywong.github.io/projects/graspr2h/</guid><description>&lt;p&gt;During interactions between a human and a robot, there may be times when the robot must purposefully come into contact with or grasp the human (e.g., a robot grasps a human’s hand to physically guide them to perform a task, to teach a motion, or to provide stability and support). Depending on how the robot grasps the human (e.g., grasp location, orientation, force, and open/closed grip), different grasp types may elicit different emotional responses from the human. The goal of the project is to investigate which robot-to-human (R2H) contact/grasp factors affect the perceived safety and comfort of the interaction, and how these differ from similar human-to-human (H2H) contact and grasping.&lt;/p&gt;
</description></item><item><title>Robots in Retail</title><link>https://chrisywong.github.io/projects/retail/</link><pubDate>Mon, 01 May 2023 00:00:00 +0000</pubDate><guid>https://chrisywong.github.io/projects/retail/</guid><description>&lt;p&gt;The goal of this project is to explore how robots can be used in retail spaces to assist shoppers, especially those who may have visual or mobility impairments.
Through discussion, we discovered many misconceptions surrounding the blind and visually impaired (BVI) population. These misconceptions drive misinformed research into robotic assistants that are misaligned with the unmet needs of the BVI population and are an inefficient use of precious resources. Our current work discusses these misconceptions as well as the potential for robots to assist BVI shoppers in retail spaces
.&lt;/p&gt;</description></item><item><title>Comfort in psHRI</title><link>https://chrisywong.github.io/projects/pshricomfort/</link><pubDate>Tue, 01 Feb 2022 00:00:00 +0000</pubDate><guid>https://chrisywong.github.io/projects/pshricomfort/</guid><description>&lt;p&gt;Users must feel comfortable when interacting with robots before mass adoption can occur. This early-stage project examines several factors that may influence comfort during interactions between robots and humans.&lt;/p&gt;
&lt;p&gt;For example, how does humanoid robot &lt;em&gt;form&lt;/em&gt; impact user comfort during psHRI? Particularly with the goal of &lt;em&gt;successful ageing&lt;/em&gt; in mind, are there factors that are specific to the elderly population?&lt;/p&gt;
&lt;p&gt;By comparing participant microbehaviours, physiological signals, and questionnaire answers when performing the same task with different robots, we hope to determine if certain humanoid robot features affect user comfort in order to guide future robotic assistant designs.&lt;/p&gt;
</description></item><item><title>Sensor Observability Analysis</title><link>https://chrisywong.github.io/projects/soa/</link><pubDate>Sat, 01 Jan 2022 00:00:00 +0000</pubDate><guid>https://chrisywong.github.io/projects/soa/</guid><description>
&lt;p&gt;Sensor Observability Analysis, akin to the kinematic manipulability index, is a novel performance metric for articulated robotic mechanisms.
The goal is to analyse and evaluate how well distributed, robot-mounted directional or axial sensors can observe specific task-space axes as a function of joint configuration.
For example, joint torque sensors are often used in serial robot manipulators and assumed to be perfectly capable of estimating end effector forces, but certain joint configurations may cause one or more task-space axes to be unobservable as a result of how the joint torque sensors are aligned.
The proposed sensor observability analysis provides a method to analyse the cumulative quality with which the sensors, in a given robot configuration, can observe the task space.
The resultant metrics can then be used in optimization and in null-space control to avoid sensor observability singular configurations or to maximize sensor observability in particular directions.
Parallels are drawn between sensor observability and the traditional kinematic Jacobian for the particular case of joint torque sensors in serial robot manipulators.
Compared to kinematic analysis using the Jacobian in serial manipulators, sensor observability analysis is shown to be more generalizable in terms of analysing non-joint-mounted sensors and can potentially be applied to sensor types other than force sensing, e.g., link-mounted proximity sensors.
We demonstrate the utility and importance of sensor observability in physical interactions through simulations and experiments with a custom 3-DOF robot and the Baxter robot.&lt;/p&gt;
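&lt;p&gt;As a rough illustration of the joint torque sensor example above, the following Python sketch shows how a task-space force direction can become unobservable from joint torques in certain configurations. It is not the formulation from the papers: the planar 2R arm, the link lengths, and the use of the smallest singular value of the transposed Jacobian as an observability proxy are all assumptions made purely for illustration.&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;# Illustrative sketch only (assumed); not the published sensor observability index.
# For joint torque sensors on a serial arm, tau = J(q)^T F, so the ability to
# observe an end-effector force F from joint torques degrades as J^T loses rank.
import numpy as np

L1, L2 = 0.4, 0.3  # assumed link lengths [m], chosen arbitrarily

def jacobian_2r(q1, q2):
    """Translational Jacobian of a planar 2R arm at joint angles (q1, q2)."""
    s1, c1 = np.sin(q1), np.cos(q1)
    s12, c12 = np.sin(q1 + q2), np.cos(q1 + q2)
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def torque_force_observability(q1, q2):
    """Smallest singular value of J^T: a crude proxy for how well the two joint
    torque sensors can jointly observe an arbitrary planar end-effector force."""
    return np.linalg.svd(jacobian_2r(q1, q2).T, compute_uv=False).min()

# Bent elbow: both task-space force directions are observable from joint torques.
print(torque_force_observability(0.3, 1.2))  # clearly nonzero

# Fully extended arm: forces directed along the arm produce no joint torque,
# so one task-space direction becomes unobservable and the proxy drops to ~0.
print(torque_force_observability(0.3, 0.0))  # approximately zero
&lt;/code&gt;&lt;/pre&gt;
&lt;p&gt;In the fully extended configuration the two Jacobian columns become parallel, so forces along the arm generate no joint torques and the proxy collapses to zero, mirroring the unobservable task-space axis described above.&lt;/p&gt;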
&lt;p&gt;&lt;strong&gt;Related Research Items&lt;/strong&gt;:&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;C. Y. Wong and W. Suleiman, &amp;ldquo;Sensor Observability Analysis for Maximizing Task-Space Observability of Articulated Robots,&amp;rdquo; in &lt;em&gt;IEEE Transactions on Robotics&lt;/em&gt;, 2024. (Accepted)
&lt;/li&gt;
&lt;li&gt;C. Y. Wong and W. Suleiman, &amp;ldquo;Sensor Observability Index: Evaluating Sensor Alignment for Task-Space Observability in Robotic Manipulators,&amp;rdquo; &lt;em&gt;2022 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS)&lt;/em&gt;, Kyoto, Japan, 2022, pp. 1276-1282.
&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>Touch in pHRI</title><link>https://chrisywong.github.io/projects/phri/</link><pubDate>Fri, 27 Sep 2019 00:00:00 +0000</pubDate><guid>https://chrisywong.github.io/projects/phri/</guid><description>&lt;p&gt;Development of pHRI techniques with a particular focus on multimodal touch interpretation and intention detection, with the goal of achieving safe, comfortable, and intuitive autonomous pHRI.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Related Research Items:&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;
&lt;p&gt;C. Y. Wong, S. Samadi, W. Suleiman and A. Kheddar, &amp;ldquo;Touch Semantics for Intuitive Physical Manipulation of Humanoids,&amp;rdquo; in &lt;em&gt;IEEE Transactions on Human-Machine Systems&lt;/em&gt;, vol. 52, no. 6, pp. 1111-1121, Dec. 2022, doi: 10.1109/THMS.2022.3207699.&lt;/p&gt;
&lt;/li&gt;
&lt;li&gt;
&lt;p&gt;C. Y. Wong, L. Vergez and W. Suleiman, &amp;ldquo;Vision- and Tactile-Based Continuous Multimodal Intention and Attention Recognition for Safer Physical Human–Robot Interaction,&amp;rdquo; in &lt;em&gt;IEEE Transactions on Automation Science and Engineering&lt;/em&gt;, vol. 21, no. 3, pp. 3205-3215, July 2024, doi: 10.1109/TASE.2023.3276856.&lt;/p&gt;
&lt;/li&gt;
&lt;/ul&gt;</description></item><item><title>Single Cell Manipulation</title><link>https://chrisywong.github.io/projects/singlecellsurgery/</link><pubDate>Sun, 01 Sep 2013 00:00:00 +0000</pubDate><guid>https://chrisywong.github.io/projects/singlecellsurgery/</guid><description/></item><item><title>Hybrid Quadruped Posture Generation and Step Climbing</title><link>https://chrisywong.github.io/projects/mht/</link><pubDate>Sat, 01 Sep 2012 00:00:00 +0000</pubDate><guid>https://chrisywong.github.io/projects/mht/</guid><description/></item></channel></rss>