
Announcing GazeSense_Bridge Package

Robotics researchers and developers can now rely on the GazeSense™ bridge to measure attention towards objects in ROS

https://eyeware.tech/gazesense/

GazeSense™ is an application developed by Eyeware Tech SA (http://www.eyeware.tech) that provides 3D eye tracking using consumer 3D sensors. GazeSense™ lets you define virtual 3D objects with respect to the camera and world coordinate systems and measure people's attention towards those objects.

The bridge currently implements an example in which the attention-sensing objects are defined in terms of 3D primitives (e.g., planes, cylinders, points). The tracking parameters, such as head pose, together with the measured attention towards the virtual 3D objects, are published on the gs_persons topic, which other ROS nodes can subscribe to. Currently, the GazeSense™ bridge does not yet support reading RGBD camera data from a ROS node. If this feature would help you, let us know by email at products@eyeware.tech or by simply submitting your request on our Trello Eyeware Products Board / Ideas & Requests (https://trello.com/b/HLiqqYs4/eyeware-products). Markers are also provided for visualization within rviz (http://wiki.ros.org/rviz).
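As an illustration of how a downstream node might interpret this feed, the sketch below picks an attention label from a map of per-object attention scores. The function name, field layout, and threshold are assumptions for illustration only, not the actual gazesense_bridge message definition:

```python
def attention_label(scores, threshold=0.5):
    """Return the name of the most-attended object, or None.

    `scores` maps object names to attention scores in [0, 1].
    This structure is a hypothetical stand-in for the per-person
    data carried on the gs_persons topic.
    """
    if not scores:
        return None
    name, best = max(scores.items(), key=lambda kv: kv[1])
    return name if best >= threshold else None

# Example: the person is mostly looking at the screen.
print(attention_label({"screen": 0.8, "cylinder": 0.1, "robot": 0.05}))  # prints "screen"
```

A subscriber callback on gs_persons could apply logic like this to each tracked person before acting on the result.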

GazeSense ROS plugin Github repo: https://github.com/eyeware/eyeware-ros/tree/master/gazesense_bridge
GazeSense ROS wiki page: http://wiki.ros.org/eyeware-ros
Find us on ROS Index at: https://index.ros.org/p/gazesense_bridge/github-eyeware-eyeware-ros/#melodic

What does GazeSense do on ROS? It offers a perception mechanism: GazeSense provides robotics researchers and developers with real-time signals on attention towards objects. You can tell whether a person interacting with the robot is looking at one of the defined objects, at the robot itself, or at neither. The attention label (‘what is the person looking at’) is then published as a topic into the ROS framework.

Sensing attention is particularly useful for people working in human-robot interaction (HRI), since attention is key to understanding engagement and intention, and to building rapport.

The first iteration of the GazeSense ROS plugin focuses on a hard-coded example that allows a user to:

  • define objects and feed them into GazeSense;
  • capture the GazeSense signals on attention towards objects and publish them into ROS.
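To make the first bullet concrete, here is a sketch of how objects might be described as 3D primitives before being fed to GazeSense. The class names and fields are hypothetical, not the bridge's actual API; they only illustrate the kind of primitive-based object definition involved:

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical descriptions of attention-sensing objects as 3D
# primitives; names and fields are illustrative, not the bridge's API.
@dataclass
class PlaneObject:
    name: str
    center: Tuple[float, float, float]  # metres, camera frame
    normal: Tuple[float, float, float]  # unit normal of the plane
    width: float
    height: float

@dataclass
class CylinderObject:
    name: str
    base: Tuple[float, float, float]    # centre of the base circle
    axis: Tuple[float, float, float]    # direction of the cylinder axis
    radius: float
    length: float

# Example: a screen in front of the camera and a mug on the table.
screen = PlaneObject("screen", (0.0, 0.0, 0.6), (0.0, 0.0, -1.0), 0.5, 0.3)
mug = CylinderObject("mug", (0.2, 0.3, 0.5), (0.0, -1.0, 0.0), 0.04, 0.1)
```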

Why did we create a GazeSense ROS plugin available for all robotics researchers and developers using ROS?

Because creating truly robust and general-purpose robot software is hard, and no one can hope to do it alone. We are thus joining the world-class, collaborative robotics software development platform and the vibrant ROS community of roboticists, to contribute to producing robust solutions and to build on each other’s work.

If you are looking for more features, such as a more ‘plug-and-play’ object definition for the GazeSense ROS bridge, let us know by email at products@eyeware.tech or by simply submitting your request on our Ideas & Requests Trello Board.

Your feedback is most appreciated!

Thanks,
Alexandra Petrus
