Collaborative Research

Before joining Clarkson University, Prof. Imtiaz had the following collaborations:

Personal Automatic Cigarette Tracker 2.0. Collaboration with the University of Alabama

• Design and development of multi-sensory wearable sensor systems to objectively monitor behavioral and physiological manifestations of cigarette smoking in free-living conditions. Major developments include a low-power chest device, a hand device, an instrumented lighter, an egocentric camera, and a Raspberry Pi-based smart IoT charger.

• Application of computational intelligence to extract information on smoking habits. Major accomplishments include an SVM-based machine learning model to detect smoking from heart rate parameters, an RCNN-based deep learning model to recognize smoking events in full-day images, an image classifier to categorize the smoking environment and smoking context, SVM and CNN-LSTM based models to classify smoking and non-smoking hand gestures and smoke inhalations, and methods to extract smoking metrics.
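As a rough illustration of the heart-rate-based detection step, the sketch below windows an R-R interval stream into simple statistical features of the kind an SVM classifier might consume. The window length and the specific features (mean HR, SDNN, RMSSD) are illustrative assumptions, not the project's actual design.

```python
import numpy as np

def hr_window_features(rr_intervals_ms, window_size=30):
    """Split an R-R interval stream (ms) into fixed-size windows and
    compute simple per-window features: mean heart rate, SDNN, RMSSD.
    Window size and feature choice are illustrative assumptions."""
    feats = []
    n = len(rr_intervals_ms) // window_size
    for i in range(n):
        w = np.asarray(rr_intervals_ms[i * window_size:(i + 1) * window_size], float)
        mean_hr = 60000.0 / w.mean()              # beats per minute
        sdnn = w.std(ddof=1)                      # overall R-R variability
        rmssd = np.sqrt(np.mean(np.diff(w) ** 2)) # beat-to-beat variability
        feats.append([mean_hr, sdnn, rmssd])
    return np.array(feats)

# Example: a synthetic 90-beat stream yields a 3x3 feature matrix,
# one row per 30-beat window.
rng = np.random.default_rng(0)
rr = 800 + 50 * rng.standard_normal(90)  # ~75 bpm with jitter
X = hr_window_features(rr)
print(X.shape)  # (3, 3)
```

Each row of `X` would then be a single training or test sample for the classifier.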

Automatic Ingestion Monitor 2.0. Collaboration with the University of Alabama

• Design and development of multi-sensory, wearable, camera-based sensor systems to objectively monitor the eating behavior of an individual in free-living conditions. Major developments include a low-power egocentric eyeglass camera and an ear-mounted sensor.

• Application of computational intelligence to extract information on food intake. Major accomplishments include the development of an SVM-based machine learning model to detect eating episodes and determine the type of food.

• Design and development of a low-power stereo camera to monitor cooking and eating environments in rural areas.

Monitoring of Infant Feeding. Collaboration with the University of Alabama

• Design and development of an intelligent infant bottle (comprising an OmniVision camera, an IMU, a pressure sensor, and other components) to monitor the milk intake of an infant.

• Development of computer algorithms to assess the nutritive sucking patterns of infants.
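One way to picture the sucking-pattern assessment is as event detection on the bottle's pressure signal. The sketch below uses a simple rising-edge threshold detector and computes a suck rate; the threshold approach and all signal parameters are illustrative assumptions, not the project's actual algorithm.

```python
import numpy as np

def suck_events(pressure, threshold):
    """Return indices where the pressure signal rises above the
    threshold, i.e. candidate individual sucks. A plain threshold
    detector is an illustrative assumption."""
    above = pressure > threshold
    rises = np.flatnonzero(~above[:-1] & above[1:]) + 1
    return rises

def suck_rate(events, duration_s):
    """Sucks per second over the recording."""
    return len(events) / duration_s

# Synthetic example: a 10 s, 50 Hz pressure trace with one suck per second.
fs = 50
t = np.arange(0, 10, 1.0 / fs)
pressure = np.maximum(0.0, np.sin(2 * np.pi * 1.0 * t))
ev = suck_events(pressure, threshold=0.5)
rate = suck_rate(ev, 10.0)
print(len(ev), rate)  # 10 events, 1.0 sucks/s
```

Inter-event intervals from `ev` could then be used to characterize suck bursts and pauses.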

• Automatic extraction of breastfeeding statistics from egocentric images captured by the mother's eyeglass camera.

Personalized Prosthesis Controller Design. Collaboration with the University of Alabama

• Design and development of a measurement exoskeleton to support the development of robotic lower-limb prostheses. Development of computer algorithms to assess human gait and walking speed from wearable and exoskeleton sensors.

• Development of a YOLO-based deep learning model to recognize obstacles and stair negotiation during human walking from eye-level egocentric images.

• Development of a LIDAR-based sensor to obtain ambulatory velocity profiles.

• Development of an ANN to predict treadmill walking speed from a chest-mounted IMU.
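The IMU-to-speed mapping can be sketched as windowed feature extraction feeding a small feedforward network. The feature set, network size, and untrained weights below are all illustrative assumptions intended only to show the data flow, not the project's actual architecture.

```python
import numpy as np

def imu_features(accel_window):
    """Per-window features from a chest-mounted accelerometer segment
    (N x 3 array): per-axis mean and standard deviation. The feature
    set is an illustrative assumption."""
    a = np.asarray(accel_window, float)
    return np.concatenate([a.mean(axis=0), a.std(axis=0)])

def mlp_predict(x, W1, b1, W2, b2):
    """One-hidden-layer feedforward regression: tanh hidden units,
    linear output (predicted walking speed)."""
    h = np.tanh(W1 @ x + b1)
    return float(W2 @ h + b2)

# Toy example with fixed (untrained) weights, just to show shapes.
rng = np.random.default_rng(1)
window = rng.standard_normal((100, 3))  # 100 IMU samples, 3 axes
x = imu_features(window)                # 6 features per window
W1 = rng.standard_normal((8, 6)) * 0.1  # 8 hidden units
b1 = np.zeros(8)
W2 = rng.standard_normal(8) * 0.1
b2 = 0.0
speed = mlp_predict(x, W1, b1, W2, b2)
print(x.shape)  # (6,)
```

In practice the weights would be trained on windows labeled with reference treadmill speeds.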