Tutorial

Tutorial Title: Human Motion Tracking with Microsoft Kinect


Instructor: Wenbing Zhao, Professor, Cleveland State University, USA


Dr. Zhao is a Full Professor in the Department of Electrical Engineering and Computer Science at Cleveland State University (CSU). He earned his Ph.D. at the University of California, Santa Barbara, in 2002. Dr. Zhao has been doing research on smart and connected health since 2010 and on distributed systems since 1998. He has an active sponsored research grant on building a Kinect-based system to enhance safe patient handling in nursing homes, and he teaches a course on Kinect application development at CSU. Dr. Zhao has over 150 peer-reviewed publications and a US patent (pending) on privacy-aware selective human activity tracking using programmable depth cameras. He has served on several research panels for the US National Science Foundation, as the Program Chair for the IEEE Smart World Congress (Toulouse, France) in 2016, and as a member of the technical program committee for many IEEE conferences.


Keywords: Human-Centered Cyber-Physical Systems, Depth Cameras, Wearable Computing, Gesture and Activity Recognition, Machine Learning


This tutorial will enable participants to understand cutting-edge depth sensing and wearable computing technologies, learn various algorithms for human activity recognition, and learn how to build human-centered cyber-physical systems with real-time sensing, decision making, and feedback.


This tutorial covers emerging topics on how to use the Microsoft Kinect and commercial off-the-shelf wearable devices, such as smart watches and the Microsoft Band, to build innovative human-centered cyber-physical systems. The tutorial addresses the following topics: (1) human motion tracking with Microsoft Kinect; (2) gesture and activity recognition algorithms, including both rule-based and machine-learning based approaches; (3) activity and sleep tracking using inertial sensors; (4) physiological sensing using smart watches and the Microsoft Band, such as heart rate, heart rate variability, and skin conductance level; (5) applications of Microsoft Kinect and wearable devices; (6) human-computer interaction, including natural user interfaces, context-aware sensing, and haptic feedback; (7) open research issues.

Outline of the tutorial:

·         Introduction to human-centered cyber-physical systems

-  Many exciting systems are waiting to be developed that can improve the standard of living, reduce the cost of healthcare, improve safety, etc.

·         Introduction to modern cyber-enabled gadgets

-  Microsoft Kinect: depth-sensing camera with a microphone array

-  Microsoft Band: inertial sensing as well as physiological tracking

-  Smart watches (Apple Watch, Android Wear watches, and the Pebble smart watch): inertial sensing and heart rate tracking

-  Smartphones: convenient to use, and they bridge Bluetooth connections from wearable devices

·         Survey of existing applications of modern cyber-enabled gadgets

-  Healthcare (physical therapy, operating room assistance, and fall detection and prevention)

-  Virtual reality and gaming

-  Robotics control and interaction

-  Workplace safety training

-  Speech and sign language recognition

-  Education and performing arts

·         Programming cyber-enabled gadgets

-  Kinect software development kit (SDK): depth sensing, skeleton tracking, and using the microphone array

-  Microsoft Band SDK: heart rate sensing, skin temperature sensing, measuring galvanic skin response (skin conductance level), using the microphone, ambient light sensing, inertial sensing, and haptic feedback

-  Android Wear watch SDK (identical to the Android SDK): inertial sensing, heart rate sensing, and haptic feedback

-  Pebble SDK: inertial sensing, heart rate sensing, and haptic feedback

·         Human motion/activity tracking and recognition

-  Algorithmic (rule-based) gesture/activity recognition based on skeleton data

-  Machine-learning based gesture/activity recognition based on skeleton data

-  Activity intensity and step counting using inertial sensing data

-  Sleep tracking using inertial sensing data

-  Gait tracking using skeleton data and/or inertial sensing data

-  Context-aware recognition
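The rule-based gesture recognition covered above can be illustrated with a minimal sketch. This example assumes skeleton joints arrive as (x, y, z) tuples keyed by joint name; the joint names and the "hand above head" rule are illustrative choices for this sketch, not the exact Kinect SDK identifiers or a specific rule from the tutorial.

```python
# A toy rule-based gesture check on skeleton data. Joint names and
# coordinates are hypothetical; a real Kinect skeleton exposes ~20-25
# joints in camera space, with y increasing upward.

def is_hand_raised(joints, margin=0.05):
    """Rule: the right hand is raised if its y-coordinate exceeds the
    head's y-coordinate by at least `margin` (meters)."""
    hand_y = joints["hand_right"][1]
    head_y = joints["head"][1]
    return hand_y > head_y + margin

skeleton = {
    "head": (0.0, 1.6, 2.0),
    "hand_right": (0.2, 1.8, 2.0),
}
print(is_hand_raised(skeleton))  # True: the hand is above the head
```

Real systems typically chain several such per-frame rules with timing constraints (e.g. the hand must stay raised for N consecutive frames) to suppress jitter.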
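Step counting from inertial sensing data, also listed above, is commonly done by peak detection on the accelerometer magnitude. The sketch below is a deliberately simple version of that idea; the 11.0 m/s² threshold and the synthetic readings are assumptions for illustration, not values from the tutorial.

```python
# Step counting via peak detection on |a|, the accelerometer magnitude.
# Each sample is an (ax, ay, az) reading in m/s^2; at rest, |a| ~ 9.8.
import math

def count_steps(samples, threshold=11.0):
    """Count local maxima of |a| that exceed `threshold`.
    Each peak above the threshold is treated as one step."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in samples]
    steps = 0
    for i in range(1, len(mags) - 1):
        if mags[i] > threshold and mags[i] > mags[i - 1] and mags[i] >= mags[i + 1]:
            steps += 1
    return steps

# Synthetic trace: three acceleration spikes over a gravity baseline.
readings = [(0, 0, 9.8), (0, 0, 12.5), (0, 0, 9.8)] * 3
print(count_steps(readings))  # 3
```

Production pedometers add low-pass filtering and a minimum inter-step interval so that sensor noise and hand gestures are not counted as steps.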

·         Voice/speech recognition

-  Simple keyword-based and grammar-based voice recognition

-  Speaker identification and speech recognition with CMU Sphinx

-  Cloud-based speech recognition services
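The keyword-based mode mentioned above can be sketched at its simplest as keyword spotting over an already-recognized transcript. This toy stands in for the keyword/grammar modes of engines such as CMU Sphinx; the command vocabulary here is an assumption for illustration, and real engines match keywords against the audio stream itself rather than a finished transcript.

```python
# Toy keyword spotting: scan a recognized transcript for command words.
# The COMMANDS vocabulary is hypothetical.

COMMANDS = {"start", "stop", "next", "back"}

def spot_keywords(transcript, commands=COMMANDS):
    """Return the command words found in the transcript, in order."""
    return [w for w in transcript.lower().split() if w in commands]

print(spot_keywords("Please start the session and then stop recording"))
# ['start', 'stop']
```

Grammar-based recognition generalizes this by constraining the recognizer to a small formal grammar of allowed phrases, which sharply improves accuracy for command-and-control interfaces.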

·         Virtual/mixed reality interfaces with the Unity platform

-  Introduction to the Unity 3D platform

-  Scene design and scripting

-  Controlling avatars using real-time motion sensing data

·         User interface design on wearable devices

-  Pebble watch interface

-  Android Wear watch interface

·         Open research issues

-  Continuous monitoring of one or more users across multiple rooms/buildings via a federation of multiple human motion tracking systems

-  Non-intrusive user identification (biometrics)

-  System usability and long-term stability

-  Tracking cognitive, behavioral, and functional patterns of seniors and children with autism


Copyright © www.iscsic.org 2024-2025. All Rights Reserved.