{"id":350,"date":"2024-10-14T16:12:22","date_gmt":"2024-10-14T16:12:22","guid":{"rendered":"https:\/\/sites.clarkson.edu\/avhbac\/?page_id=350"},"modified":"2024-10-14T16:12:22","modified_gmt":"2024-10-14T16:12:22","slug":"prosthetic-hand","status":"publish","type":"page","link":"https:\/\/sites.clarkson.edu\/avhbac\/prosthetic-hand\/","title":{"rendered":"Prosthetic Hand"},"content":{"rendered":"\n<h2 class=\"wp-block-heading\"><strong>Vision-Enabled Pediatric Prosthetic Hand<\/strong><\/h2>\n\n\n\n<h3 class=\"wp-block-heading has-medium-font-size\"><strong>Empowering children with advanced, low-cost prosthetics.<\/strong><\/h3>\n\n\n\n<div class=\"wp-block-media-text is-stacked-on-mobile\" style=\"grid-template-columns:15% auto\"><figure class=\"wp-block-media-text__media\"><img loading=\"lazy\" decoding=\"async\" width=\"277\" height=\"500\" src=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/Hand-Design-2-2.png\" alt=\"Prosthetic Hand design, digital drawing\" class=\"wp-image-351 size-full\" srcset=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/Hand-Design-2-2.png 277w, https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/Hand-Design-2-2-166x300.png 166w\" sizes=\"auto, (max-width: 277px) 100vw, 277px\" \/><\/figure><div class=\"wp-block-media-text__content\">\n<p>Our research presents an &#8216;AI vision&#8217;-based digital design of the control hardware for a novel pediatric prosthesis&nbsp;to assist children with upper limb disabilities. This prosthetic hand will have an anthropomorphic appearance, a soft structure, multi-articulating functionality for grasping a wide range of objects, and a lower weight, with a size similar to the natural hand of the target population: children aged 5-10 years old. 
<span style=\"margin: 0px;padding: 0px\">This machine-vision-based&nbsp;<em>system-on-an-FPGA<\/em>&nbsp;research has immense societal importance, as children in early and middle childhood with limb loss are underserved by current prosthetic hand options.<\/span> Because children grow constantly, their hand prostheses must be replaced frequently, which, given the high cost of commercial devices, is unaffordable for many families. The majority of prostheses&nbsp;available to them are myoelectric and priced at around 14,000 USD. Our goal is to present an alternative: a smart, vision-controlled prosthetic hand customizable for a range of body\/arm sizes. At the same time, we keep the cost and power budget low to maintain accessibility in developing countries.<\/p>\n<\/div><\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p><strong>Introduction to the Problem<\/strong>: Every year, children with congenital or acquired limb loss face significant physical and emotional challenges due to a lack of affordable prosthetics. Commercial prostheses, while functional, are often out of reach for many families, with costs that can exceed $14,000. Furthermore, existing solutions such as myoelectric prostheses come with technical limitations that hinder mass adoption.<\/p>\n\n\n\n<p><strong>Our Solution<\/strong>:<br>Our team has developed a prosthetic hand designed specifically for children aged 5-10, integrating AI-based vision control. With its lightweight, customizable, 3D-printed design, this hand offers a functional, affordable, and scalable alternative to conventional prostheses. 
By leveraging machine vision and sensors, our design allows real-time object detection and automatic grasping without extensive user training.<\/p>\n\n\n\n<ul class=\"wp-block-list\">\n<li><strong>AI Vision Control<\/strong>: A wrist-mounted camera provides real-time object detection and distance approximation for precise grasping.<\/li>\n\n\n\n<li><strong>Low-Cost 3D Printing<\/strong>: The hand is printed from polylactic acid (PLA), a thermoplastic polyester, and thermoplastic polyurethane (TPU), combining durability with flexibility.<\/li>\n\n\n\n<li><strong>Power Efficiency<\/strong>: An FPGA-based control system reduces overall power consumption, keeping the hand affordable and usable even in resource-constrained environments.<\/li>\n<\/ul>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>Design &amp; Materials<\/summary>\n<p>Our prosthetic hand is 3D-printed in-house using a combination of PLA and TPU, creating a soft, flexible structure. This allows for a lightweight design while ensuring the hand can handle the physical stresses of daily use. TPU\u2019s flexibility is used in key areas like the fingertips to enhance the hand&#8217;s grasping ability, while PLA provides durability in other structural parts.<\/p>\n<\/details>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>Actuation &amp; Control<\/summary>\n<p>The prosthetic hand uses three degrees of actuation to replicate the movements required for everyday activities. The hand\u2019s movements are controlled by an AI vision system embedded within the wrist, which uses a micro camera to detect objects in real time. 
This system enables precise, automatic grasping, allowing children to use the prosthesis with minimal training.<\/p>\n<\/details>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<details class=\"wp-block-details is-layout-flow wp-block-details-is-layout-flow\"><summary>Real-World Testing<\/summary>\n<p>Our prototypes have undergone rigorous lab testing, with real-world testing set to begin soon. Certified prosthetists and clinicians are currently reviewing the design, and we will begin testing with a small group of children with upper limb differences. The prosthetic is designed to be customizable to accommodate the child\u2019s growth, ensuring long-term use without requiring frequent replacement.<\/p>\n<\/details>\n<\/div>\n<\/div>\n\n\n\n<hr class=\"wp-block-separator has-alpha-channel-opacity\"\/>\n\n\n\n<p><strong>The Research Team &amp; Collaborators<\/strong><\/p>\n\n\n\n<div class=\"wp-block-columns is-layout-flex wp-container-core-columns-is-layout-9d6595d7 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<p><strong>PI Imtiaz<\/strong><br>With over a decade of experience in machine vision and sensor development, PI Imtiaz leads the research team at Clarkson University. His expertise in designing low-power AI systems has resulted in numerous innovations, from wearable sensors to heart rate monitors. He is currently applying these principles to prosthetic development, creating a robust, vision-enabled control unit for pediatric prostheses.<\/p>\n<\/div>\n\n\n\n<div class=\"wp-block-column is-layout-flow wp-block-column-is-layout-flow\">\n<p><strong>CoPI Kevin Fite<\/strong><br>Kevin Fite brings a wealth of experience in assistive technologies, specifically in designing and controlling upper-extremity prosthetic limbs. 
His research at the Laboratory for Intelligent Automation focuses on low-cost, customizable solutions for individuals with limb loss. Kevin has been instrumental in guiding the development of this prosthetic hand\u2019s mechanical and actuation system.<\/p>\n<\/div>\n<\/div>\n\n\n\n<p><strong>Collaborators<\/strong>: This project is a collaboration between Clarkson University\u2019s AI Vision Lab, Lab for Intelligent Automation, and the Center for Advanced PCB Design and Manufacture. <\/p>\n\n\n\n<figure class=\"wp-block-gallery has-nested-images columns-default is-cropped wp-block-gallery-1 is-layout-flex wp-block-gallery-is-layout-flex\">\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"277\" height=\"500\" data-id=\"351\" src=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/Hand-Design-2-2.png\" alt=\"Prosthetic Hand design, digital drawing\" class=\"wp-image-351\" srcset=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/Hand-Design-2-2.png 277w, https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/Hand-Design-2-2-166x300.png 166w\" sizes=\"auto, (max-width: 277px) 100vw, 277px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"390\" height=\"687\" data-id=\"352\" src=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/ISO-HAND-DESIGN-1.png\" alt=\"Prosthetic Hand design, digital drawing\" class=\"wp-image-352\" srcset=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/ISO-HAND-DESIGN-1.png 390w, https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/ISO-HAND-DESIGN-1-170x300.png 170w\" sizes=\"auto, (max-width: 390px) 100vw, 390px\" \/><\/figure>\n\n\n\n<figure class=\"wp-block-image size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"821\" height=\"653\" data-id=\"353\" 
src=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/PrintInPlace_finger-1.png\" alt=\"Prosthetic Hand design, digital drawing\" class=\"wp-image-353\" srcset=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/PrintInPlace_finger-1.png 821w, https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/PrintInPlace_finger-1-300x239.png 300w, https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/PrintInPlace_finger-1-768x611.png 768w\" sizes=\"auto, (max-width: 821px) 100vw, 821px\" \/><\/figure>\n<\/figure>\n\n\n\n<p>Upon detecting an object, the Time-of-Flight (TOF) sensor (VL6180x) accurately calculates its distance. If the object is within 70-90mm of the hand, a &#8220;Hand Close&#8221; signal is sent to the motor controller to initiate the closing action. As the hand closes, pressure sensors provide feedback, and if the pressure exceeds a predefined threshold, the closing motion halts to prevent overexertion. Once the hand is fully closed, the accelerometer (ADXL345) is activated, awaiting a specific gesture. Upon detecting the gesture, the system sends a &#8220;Hand Open&#8221; signal to the motor controller to reopen the hand. 
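<\/p>\n\n\n\n<p>The grasp-and-release logic above can be expressed as a small state machine. The sketch below is a minimal illustration: the 70-90&nbsp;mm trigger window and the sensor roles follow the description, while the function name, state labels, and the normalized pressure threshold value are assumptions, not the implemented firmware.<\/p>\n\n\n\n

```python
# Minimal sketch of the grasp/release state machine described above.
# The 70-90 mm window and sensor roles follow the text; the pressure
# threshold value and all identifiers here are illustrative assumptions.

CLOSE_MIN_MM, CLOSE_MAX_MM = 70, 90   # TOF (VL6180x) trigger window
PRESSURE_LIMIT = 0.8                  # assumed normalized grip threshold

def control_step(state, tof_mm, pressure, gesture_detected):
    '''Advance the hand controller by one sensor-polling cycle.'''
    if state == 'OPEN':
        # Object detected within range: issue the "Hand Close" signal.
        if tof_mm is not None and CLOSE_MIN_MM <= tof_mm <= CLOSE_MAX_MM:
            return 'CLOSING'
    elif state == 'CLOSING':
        # Pressure feedback halts the closing motion to prevent overexertion.
        if pressure >= PRESSURE_LIMIT:
            return 'CLOSED'
    elif state == 'CLOSED':
        # An ADXL345 gesture triggers the "Hand Open" signal.
        if gesture_detected:
            return 'OPEN'
    return state
```
\n\n\n\n<p>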
A video demonstration of how the previous design functions is shown below.<\/p>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Vision Controlled Sensorized Prosthetic Hand\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/7qkj1d9e5f4?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<figure class=\"wp-block-embed is-type-video is-provider-youtube wp-block-embed-youtube wp-embed-aspect-16-9 wp-has-aspect-ratio\"><div class=\"wp-block-embed__wrapper\">\n<iframe loading=\"lazy\" title=\"Vision Controlled Sensorized Prosthetic Hand\" width=\"640\" height=\"360\" src=\"https:\/\/www.youtube.com\/embed\/yInMzl4Ef7A?feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" referrerpolicy=\"strict-origin-when-cross-origin\" allowfullscreen><\/iframe>\n<\/div><\/figure>\n\n\n\n<p><strong>Image Collection<\/strong>. To develop highly efficient deep neural networks, 2,000 images were initially captured from the wrist position to train models on a desktop computer before customizing them for a resource-constrained processor. This pilot study captured images of six object classes (ball, cup, bottle, pen, spoon, keys); more images are required across a wider variety of everyday object classes, lighting conditions, and backgrounds. This data collection does not necessarily require participants from the amputee population. 
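<\/p>\n\n\n\n<p>As a concrete illustration of the collection protocol above, a small bookkeeping sketch is shown below. The six class names come from the text; the 2000-image target is paraphrased from the pilot study, and the function names and reviewer identifiers are assumptions.<\/p>\n\n\n\n

```python
# Illustrative bookkeeping for the pilot image-collection protocol above.
# Class names come from the text; all identifiers and the dual-review
# assignment scheme are assumptions, not the team's actual tooling.

CLASSES = ('ball', 'cup', 'bottle', 'pen', 'spoon', 'keys')
TARGET_IMAGES = 2000  # pilot-study capture target from the text

def class_counts(labels):
    '''Tally captured images per object class to spot gaps in coverage.'''
    counts = {c: 0 for c in CLASSES}
    for lab in labels:
        counts[lab] += 1
    return counts

def review_queue(image_ids, reviewers=('reviewer_1', 'reviewer_2')):
    '''Queue every image for both reviewers, as in the manual review step.'''
    return [(img, r) for img in image_ids for r in reviewers]
```
\n\n\n\n<p>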
After image collection, as in the pilot study, two reviewers will manually inspect the images to verify that the cameras can capture usable images under all lighting conditions, including dim environments.<\/p>\n\n\n<div class=\"wp-block-image\">\n<figure class=\"aligncenter size-full\"><img loading=\"lazy\" decoding=\"async\" width=\"620\" height=\"542\" src=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/prosthetic-hand-sample_images.png\" alt=\"examples of prosthetic hand in a testing environment.\" class=\"wp-image-354\" srcset=\"https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/prosthetic-hand-sample_images.png 620w, https:\/\/sites.clarkson.edu\/avhbac\/wp-content\/uploads\/sites\/70\/2024\/10\/prosthetic-hand-sample_images-300x262.png 300w\" sizes=\"auto, (max-width: 620px) 100vw, 620px\" \/><\/figure>\n<\/div>","protected":false},"excerpt":{"rendered":"<p>Vision-Enabled Pediatric Prosthetic Hand Empowering children with advanced, low-cost prosthetics. Our research is to present an &#8216;AI vision&#8217; based digital design of the control hardware for a novel pediatric prosthesis&nbsp;to assist children with upper limb disabilities. 
This prosthetic hand will have an anthropomorphic appearance, soft structure, multi-articulating functionality for grasping a wide range of objects, [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"advgb_blocks_editor_width":"","advgb_blocks_columns_visual_guide":"","footnotes":""},"class_list":["post-350","page","type-page","status-publish","hentry"],"coauthors":[],"author_meta":{"author_link":"https:\/\/sites.clarkson.edu\/avhbac\/author\/moeler\/","display_name":"moeler"},"relative_dates":{"created":"Posted 1 year ago","modified":"Updated 1 year ago"},"absolute_dates":{"created":"Posted on October 14, 2024","modified":"Updated on October 14, 2024"},"absolute_dates_time":{"created":"Posted on October 14, 2024 4:12 pm","modified":"Updated on October 14, 2024 4:12 pm"},"featured_img_caption":"","featured_img":false,"series_order":"","_links":{"self":[{"href":"https:\/\/sites.clarkson.edu\/avhbac\/wp-json\/wp\/v2\/pages\/350","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/sites.clarkson.edu\/avhbac\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/sites.clarkson.edu\/avhbac\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/sites.clarkson.edu\/avhbac\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/sites.clarkson.edu\/avhbac\/wp-json\/wp\/v2\/comments?post=350"}],"version-history":[{"count":1,"href":"https:\/\/sites.clarkson.edu\/avhbac\/wp-json\/wp\/v2\/pages\/350\/revisions"}],"predecessor-version":[{"id":355,"href":"https:\/\/sites.clarkson.edu\/avhbac\/wp-json\/wp\/v2\/pages\/350\/revisions\/355"}],"wp:attachment":[{"href":"https:\/\/sites.clarkson.edu\/avhbac\/wp-json\/wp\/v2\/media?parent=350"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}