Tutorial

Tutorial Title: Human Motion Tracking with Microsoft Kinect


Instructor: Wenbing Zhao, Professor, Cleveland State University, USA


Dr. Zhao is a Full Professor in the Department of Electrical Engineering and Computer Science at Cleveland State University (CSU). He earned his Ph.D. degree at the University of California, Santa Barbara, in 2002. Dr. Zhao has been doing research on smart and connected health since 2010 and on distributed systems since 1998. He has an active sponsored research grant on building a Kinect-based system to enhance safe patient handling in nursing homes, and has been teaching a course on Kinect application development at CSU. Dr. Zhao has over 150 peer-reviewed publications and a pending US patent on privacy-aware selective human activity tracking using programmable depth cameras. He has served on several research panels for the US National Science Foundation, as the Program Chair for the IEEE Smart World Congress (Toulouse, France) in 2016, and as a member of the technical program committee for many IEEE conferences.


Keywords: Human-Centered Cyber-Physical Systems, Depth Cameras, Wearable Computing, Gesture and Activity Recognition, Machine Learning


This tutorial will enable participants to understand cutting-edge depth sensing and wearable computing technologies, learn various algorithms for human activity recognition, and learn how to build human-centered cyber-physical systems with real-time sensing, decision making, and feedback.


This tutorial covers emerging topics on how to use the Microsoft Kinect and commercial off-the-shelf wearable devices, such as smart watches and the Microsoft Band, to build innovative human-centered cyber-physical systems. It will cover the following topics: (1) human motion tracking with the Microsoft Kinect; (2) gesture and activity recognition algorithms, including both rule-based and machine-learning-based approaches; (3) activity and sleep tracking using inertial sensors; (4) physiological sensing using smart watches and the Microsoft Band, such as heart rate, heart rate variability, and skin conductance level; (5) applications of the Microsoft Kinect and wearable devices; (6) human-computer interaction, including natural user interfaces, context-aware sensing, and haptic feedback; (7) open research issues.

Outline of the tutorial:

·         Introduction to human-centered cyber-physical systems

-  Many exciting systems are waiting to be developed that can improve the standard of living, reduce the cost of healthcare, improve safety, etc.

·         Introduction to modern cyber-enabled gadgets

-  Microsoft Kinect: depth-sensing camera with a microphone array

-  Microsoft Band: inertial sensing as well as physiological tracking

-  Smart watches (Apple Watch, Android Wear watches, and the Pebble smart watch): inertial sensing and heart rate tracking

-  Smart phones: bridge Bluetooth connections from wearable devices; convenient to use

·         Survey of existing applications of modern cyber-enabled gadgets

-  Healthcare (physical therapy, operating room assistance, and fall detection and prevention)

-  Virtual reality and gaming

-  Robotics control and interaction

-  Workplace safety training

-  Speech and sign language recognition

-  Education and performing arts

·         Programming cyber-enabled gadgets

-  Kinect software development kit (SDK): depth sensing, skeleton tracking, and use of the microphone array

-  Microsoft Band SDK: heart rate sensing, skin temperature sensing, galvanic skin response (skin conductance level), microphone use, ambient light sensing, inertial sensing, and haptic feedback

-  Android Wear watch SDK (identical to the Android SDK): inertial sensing, heart rate sensing, and haptic feedback

-  Pebble SDK: inertial sensing, heart rate sensing, and haptic feedback

·         Human motion/activity tracking and recognition

-  Algorithmic (rule-based) gesture/activity recognition based on skeleton data

-  Machine-learning-based gesture/activity recognition based on skeleton data

-  Activity intensity and step counting using inertial sensing data

-  Sleep tracking using inertial sensing data

-  Gait tracking using skeleton data and/or inertial sensing data

-  Context-aware recognition
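As a small taste of the rule-based approach listed above, the sketch below checks a "hand raised above head" pose from 3D skeleton joint positions. The joint names, coordinate convention (y pointing up, in meters), and frame representation are illustrative assumptions; the Kinect SDK itself is typically programmed in C#/C++, so plain Python is used here as a sketch only.

```python
# Minimal rule-based gesture check on skeleton data (illustrative sketch).
# A skeleton frame is modeled as a dict of joint name -> (x, y, z).

def hand_raised(frame, margin=0.05):
    """Return True if either hand is above the head by `margin` meters."""
    head_y = frame["head"][1]
    return (frame["hand_left"][1] > head_y + margin or
            frame["hand_right"][1] > head_y + margin)

def detect_gesture(frames, n_frames=5):
    """A gesture fires when the rule holds for n_frames consecutive frames,
    which suppresses single-frame tracking noise."""
    streak = 0
    for frame in frames:
        streak = streak + 1 if hand_raised(frame) else 0
        if streak >= n_frames:
            return True
    return False
```

Requiring the rule to hold over several consecutive frames is a common way to make such rules robust against jitter in the skeleton stream.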

·         Voice/speech recognition

-  Simple keyword-based and grammar-based voice recognition

-  Speaker identification and speech recognition with CMU Sphinx

-  Cloud-based speech recognition services
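The grammar-based item above can be illustrated with a toy command matcher that pairs an action word with an object word. This is purely illustrative and operates on already-recognized text; real systems such as the Kinect speech engine or CMU Sphinx match a grammar against audio, and the word sets here are made up for the example.

```python
# Toy command grammar: a phrase matches if it contains one known action
# and one known object (e.g. "please open the door" -> ("open", "door")).
ACTIONS = {"open", "close", "start", "stop"}
OBJECTS = {"door", "window", "music"}

def match_command(phrase):
    """Return (action, object) if the phrase fits the grammar, else None."""
    words = phrase.lower().split()
    action = next((w for w in words if w in ACTIONS), None)
    obj = next((w for w in words if w in OBJECTS), None)
    if action and obj:
        return (action, obj)
    return None
```

Constraining recognition to a small grammar like this is what makes simple keyword/grammar-based voice interfaces reliable even with imperfect recognizers.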

·         Virtual/mixed reality interfaces with the Unity platform

-  Introduction to the Unity 3D platform

-  Scene design and scripting

-  Controlling avatars using real-time motion sensing data

·         User interface design on wearable devices

-  Pebble watch interface

-  Android Wear watch interface

·         Open research issues

-  Continuous monitoring of one or more users across multiple rooms/buildings via a federation of multiple human motion tracking systems

-  Non-intrusive user identification (biometrics)

-  System usability and long-term stability

-  Tracking cognitive, behavioral, and functional patterns of seniors and of children with autism
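To make the step-counting item in the outline concrete, here is a minimal sketch: a threshold-crossing detector over the acceleration magnitude. The threshold value and sample format are illustrative assumptions, not the algorithm of any particular device; production pedometers add filtering, adaptive thresholds, and timing constraints on top of this idea.

```python
import math

def count_steps(samples, threshold=11.0):
    """Count steps as upward crossings of `threshold` (m/s^2) by the
    acceleration magnitude. `samples` is a list of (ax, ay, az) tuples;
    at rest the magnitude sits near gravity (~9.8 m/s^2), and each
    step produces a brief spike above it."""
    steps = 0
    above = False
    for ax, ay, az in samples:
        mag = math.sqrt(ax * ax + ay * ay + az * az)
        if mag > threshold and not above:
            steps += 1       # count only the upward crossing, not the plateau
            above = True
        elif mag < threshold:
            above = False    # re-arm the detector once the spike has passed
    return steps
```

Using the magnitude rather than a single axis makes the count independent of how the device is oriented on the body.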



Important Dates

Submission deadline (full paper): Sept. 20, 2017
Submission deadline (abstract): Sept. 1, 2017
Author notification: before Sept. 30, 2017
Final version: before Oct. 5, 2017
Registration: before Oct. 5, 2017
Main conference: Oct. 20-22, 2017

Oral Presentation Guidelines


· Oral presentations of volunteered papers are 17 minutes, with 3 minutes for discussion (20 minutes in total, including Q&A).

· All presenters must bring their PowerPoint/PDF presentation on a USB memory key one hour before their presentation to have it preloaded on the session computers. Individual presentations then begin with the click of a mouse.

· All events are conducted in English.

· Practice your presentation beforehand and time it.

· Use active words and short sentences; words should reinforce the visual material.

· Our events are designed to be as interactive and frank as possible. All speakers and participants are encouraged to participate in their own personal capacity.


· Speak loudly and clearly.  

· You may wish to bring business cards to share your contact information with other conference attendees.

Best Oral Presentation Award


Purpose: This award recognizes the extra effort it takes to prepare a top-caliber presentation, and is intended to encourage an even higher standard of presentations.

The Award: The recipient will receive a certificate.

Number of Awards: Each session will select one best oral presentation.

Student best presentation: The award is open to any presenter under 35 years of age.

The criteria for the Best Oral Presentation Award are:

• Clarity of the submitted abstract

• Importance of the work

• Novelty of the work

• Level of completion of the work

• Uniqueness or originality of the research topic

• Ability of the presenter to explain the work

Media Partners

