{"id":5128,"date":"2024-09-10T08:01:17","date_gmt":"2024-09-10T08:01:17","guid":{"rendered":"https:\/\/ieee-robio.org\/2024\/?page_id=5128"},"modified":"2024-12-02T10:57:24","modified_gmt":"2024-12-02T10:57:24","slug":"invited-talks","status":"publish","type":"page","link":"https:\/\/ieee-robio.org\/2024\/invited-talks\/","title":{"rendered":"Invited Talks"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-page\" data-elementor-id=\"5128\" class=\"elementor elementor-5128\" data-elementor-post-type=\"page\">\n\t\t\t\t<div class=\"elementor-element elementor-element-c655aee e-flex e-con-boxed e-con e-parent\" data-id=\"c655aee\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-3928018 elementor-widget elementor-widget-heading\" data-id=\"3928018\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Plenary Talks<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-e5030d5 e-flex e-con-boxed e-con e-parent\" data-id=\"e5030d5\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-2dd9328 e-con-full e-flex e-con e-child\" data-id=\"2dd9328\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-d9e2e4b elementor-widget elementor-widget-image\" data-id=\"d9e2e4b\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img fetchpriority=\"high\" decoding=\"async\" width=\"341\" height=\"455\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/10\/Yunhui_liu_uniformed_1-e1730559514666.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5674\" alt=\"\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-80ebbc9 e-con-full e-flex e-con e-child\" data-id=\"80ebbc9\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-5ca6699 elementor-widget elementor-widget-text-editor\" data-id=\"5ca6699\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 11th 9:00-10:00, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e442bd7 elementor-widget elementor-widget-heading\" data-id=\"e442bd7\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Yunhui Liu<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-568f8df elementor-widget elementor-widget-heading\" data-id=\"568f8df\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, The Chinese University of Hong Kong, China<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-3b9b20d elementor-widget elementor-widget-heading\" data-id=\"3b9b20d\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Towards Vision-Driven Robots<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c7347d1 elementor-widget elementor-widget-text-editor\" data-id=\"c7347d1\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Although robots are being widely used in various sectors, their performance and intelligence level are still far below humans\u2019 expectations. One of the major reasons is that robots are not skillful in coordinating visual information collected by vision with their actions. Efficient and effective coordination of eyes with arms\/hands in grasping\/manipulation, and with legs or wheels in walking or moving are crucial for robots to act reliably, robustly, and efficiently in natural environments. Vision-driven robots, i.e. robots that are driven by visual information or feedback, will be the key paradigms for robots towards real-world applications. This talk presents technical challenges in vision-driven robots and demonstrates our latest results on 3D visual sensing and perception, vision-driven robot grasping and manipulation, etc. Applications of the vision-driven robotics technologies in manufacturing, logistics and healthcare will be introduced as well.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-61429e8 e-flex e-con-boxed e-con e-parent\" data-id=\"61429e8\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-603a819 e-con-full e-flex e-con e-child\" data-id=\"603a819\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-4daf249 elementor-widget elementor-widget-text-editor\" data-id=\"4daf249\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 12th 9:00-10:00, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-15966c7 elementor-widget elementor-widget-heading\" data-id=\"15966c7\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Yasuhisa Hirata<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ad6a31e elementor-widget elementor-widget-heading\" data-id=\"ad6a31e\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, Tohoku University, Japan<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e6657eb elementor-widget elementor-widget-heading\" data-id=\"e6657eb\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Envisioning a Future Society with AI-Enabled Robots<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d707153 elementor-widget elementor-widget-text-editor\" data-id=\"d707153\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>This talk introduces our Moonshot project, part of Japan&#8217;s National Research and Development (R&amp;D) program. The Moonshot program supports high-risk, high-impact R&amp;D aimed at achieving ambitious goals and addressing challenges such as the super-aging population. The objective of our project is to develop adaptable AI-enabled robots that can be deployed in various settings. Currently, we are working on a range of assistive robots called the Robotic Nimbus, which can alter their shape and form based on the user\u2019s condition, environment, and the task at hand. These robots are designed to provide appropriate assistance, particularly for the elderly and disabled, empowering them to act independently.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-88f6517 e-con-full e-flex e-con e-child\" data-id=\"88f6517\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-67aa187 elementor-widget elementor-widget-image\" data-id=\"67aa187\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"341\" height=\"455\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/10\/hirata_uniformed.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5528\" alt=\"\" srcset=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/10\/hirata_uniformed.jpg 341w, https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/10\/hirata_uniformed-225x300.jpg 225w\" sizes=\"(max-width: 341px) 100vw, 341px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-16b60fd e-flex e-con-boxed e-con e-parent\" data-id=\"16b60fd\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-88edeb1 e-con-full e-flex e-con e-child\" data-id=\"88edeb1\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-804c123 elementor-widget elementor-widget-image\" data-id=\"804c123\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img decoding=\"async\" width=\"768\" height=\"1023\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/12\/Jianwei_Zhang-e1733055770429-768x1023.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5976\" alt=\"\" srcset=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/12\/Jianwei_Zhang-e1733055770429-768x1023.jpg 768w, https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/12\/Jianwei_Zhang-e1733055770429-225x300.jpg 225w\" sizes=\"(max-width: 768px) 100vw, 768px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-e42eb7d e-con-full e-flex e-con e-child\" data-id=\"e42eb7d\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-fc5dc93 elementor-widget elementor-widget-text-editor\" data-id=\"fc5dc93\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 13th 9:00-10:00, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-2fa2646 elementor-widget elementor-widget-heading\" data-id=\"2fa2646\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Jianwei Zhang<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9494ea8 elementor-widget elementor-widget-heading\" data-id=\"9494ea8\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, University of Hamburg, Germany<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-09ed616 elementor-widget elementor-widget-heading\" data-id=\"09ed616\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">The Convergence of Embodied AI and Modular Control Towards Generalist Robots<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9100601 elementor-widget elementor-widget-text-editor\" data-id=\"9100601\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Traditional modular control approaches in robotics primarily rely on manual programming and analytical models with hand-crafted rules for action planning and execution. While functional for specific tasks, these methods limit the dexterity and adaptability of robots in complex, open-ended environments. The emergence of embodied AI marks a rapid advancement in developing general-purpose robotic manipulation. Large multimodal models (LMMs) facilitate action planning by combining bottom-up skills, enabling robots to generate versatile and effective task sequences. In this talk, I will introduce foundational concepts inspired by cognitive systems that allow robots to better comprehend multimodal scenarios by integrating knowledge and learning. Next, I will also explore how LMMs learning techniques can be integrated into intelligent robotic systems. Finally, I will outline the key modules required to elevate a robot\u2019s intelligence and adaptability and a hybrid architecture provides a balanced approach, avoiding the challenges of purely end-to-end training while enhancing physical interpretability. In parallel, I will showcase robotic platforms demonstrating capabilities in dexterous manipulation and robust dynamic locomotion, emphasizing their potential for general human-service applications.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-e92db02 e-flex e-con-boxed e-con e-parent\" data-id=\"e92db02\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t\t\t<div class=\"elementor-element elementor-element-6a52387 elementor-widget elementor-widget-heading\" data-id=\"6a52387\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Keynote Talks<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-1bcf593 e-flex e-con-boxed e-con e-parent\" data-id=\"1bcf593\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-a9768ff e-con-full e-flex e-con e-child\" data-id=\"a9768ff\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-69540fb elementor-widget elementor-widget-image\" data-id=\"69540fb\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"341\" height=\"454\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/10\/hashimoto-e1728984975170.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5572\" alt=\"\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-618d714 e-con-full e-flex e-con e-child\" data-id=\"618d714\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-d7eca5b elementor-widget elementor-widget-text-editor\" data-id=\"d7eca5b\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 11th 13:00-13:30, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-080aab7 elementor-widget elementor-widget-heading\" data-id=\"080aab7\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Koichi Hashimoto<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5d63830 elementor-widget elementor-widget-heading\" data-id=\"5d63830\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, Tohoku University, Japan<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-1196828 elementor-widget elementor-widget-heading\" data-id=\"1196828\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">3D point cloud-based visual servo<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ef87a94 elementor-widget elementor-widget-text-editor\" data-id=\"ef87a94\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Visual servo refers to the control of a robot using visual feedback, typically from cameras or depth sensors. The goal is to minimize the error between the current and desired states, where \u201cstate\u201d can refer to the position, orientation, or configuration of an object or the robot itself. In position-based visual servo (PBVS), the robot\u2019s end-effector is guided by estimating the target&#8217;s 3D position and orientation (pose). In image-based visual servo (IBVS), the control is done directly in the image plane without explicitly computing the pose. A 3D point cloud is a collection of data points representing a 3D surface or object. It is often captured using sensors such as depth cameras, LiDAR, or stereo vision systems. In visual servo, this data is used to estimate the geometry and pose of objects in the robot\u2019s workspace (3DBVS). The advantage of using point clouds is that they provide rich spatial information about the environment, allowing for more accurate tracking and manipulation in 3D space compared to 2D image data. This naturally suggests PBVS for robot control. However, several points of discussion exist on estimating the target position and orientation. The talk will introduce recent approaches to visual servo based on 3D point cloud sensors.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-2512516 e-flex e-con-boxed e-con e-parent\" data-id=\"2512516\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-07369ca e-con-full e-flex e-con e-child\" data-id=\"07369ca\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-3b28717 elementor-widget elementor-widget-text-editor\" data-id=\"3b28717\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 11th 13:30-14:00, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-4951fae elementor-widget elementor-widget-heading\" data-id=\"4951fae\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Weiwei Wan<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cefa0f7 elementor-widget elementor-widget-heading\" data-id=\"cefa0f7\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, Osaka University, Japan<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8fae1e1 elementor-widget elementor-widget-heading\" data-id=\"8fae1e1\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">AI-Driven Robot Manipulators for Laboratory Automation<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-9d511fb elementor-widget elementor-widget-text-editor\" data-id=\"9d511fb\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Biological and chemical laboratories demand significant human labor and flexibility to respond to dynamic conditions such as biological growth or chemical reactions. Such requirements present challenges for traditional automation. In this talk, I will introduce two research projects from our lab focused on automating laboratory tasks through AI-driven robotic manipulators. These systems leverage AI for recognition, reinforcement learning to generate task sequences, and motion planning to autonomously create flexible task or action sequences. The flexibility enables the developed robotic manipulator systems to adapt to the unique demands of biological and chemical experiments. The developed systems have been deployed to real experimental scenarios, contributing to discoveries of new mechanisms and facilitating experimental processes that require high adaptability.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-d425282 e-con-full e-flex e-con e-child\" data-id=\"d425282\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-82fecae elementor-widget elementor-widget-image\" data-id=\"82fecae\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"768\" height=\"1025\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/weiwei_Wan-e1731487026966-768x1025.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5782\" alt=\"\" srcset=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/weiwei_Wan-e1731487026966-768x1025.jpg 768w, https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/weiwei_Wan-e1731487026966-225x300.jpg 225w, https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/weiwei_Wan-e1731487026966-1151x1536.jpg 1151w\" sizes=\"(max-width: 768px) 100vw, 768px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-17aa0a2 e-flex e-con-boxed e-con e-parent\" data-id=\"17aa0a2\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-c74cb32 e-con-full e-flex e-con e-child\" data-id=\"c74cb32\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-75b3674 elementor-widget elementor-widget-image\" data-id=\"75b3674\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"341\" height=\"454\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/\u56fe\u72471-e1730559465221.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5681\" alt=\"\" srcset=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/\u56fe\u72471-e1730559465221.jpg 341w, https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/\u56fe\u72471-e1730559465221-225x300.jpg 225w\" sizes=\"(max-width: 341px) 100vw, 341px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-93a0ba4 e-con-full e-flex e-con e-child\" data-id=\"93a0ba4\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-5398316 elementor-widget elementor-widget-text-editor\" data-id=\"5398316\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 12th 13:00-13:30, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-61184d4 elementor-widget elementor-widget-heading\" data-id=\"61184d4\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Hao Liu<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-eccd49b elementor-widget elementor-widget-heading\" data-id=\"eccd49b\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, Shenyang Institute of Automation, Chinese Academy of Sciences, China<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-bff3ea6 elementor-widget elementor-widget-heading\" data-id=\"bff3ea6\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Augmented Sensing and Autonomous Control of Flexible Surgical Robots<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-8fcba9b elementor-widget elementor-widget-text-editor\" data-id=\"8fcba9b\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p><span style=\"font-size: 18px; color: var( --e-global-color-secondary ); font-family: var( --e-global-typography-text-font-family ), Sans-serif; font-weight: var( --e-global-typography-text-font-weight );\">Surgery through human body cavities such as the digestive tract and blood vessels is more minimally invasive and is an important direction in the development of modern medicine. The flexible surgical robot has excellent environmental adaptability and dexterous manipulating capability, and are an important enabling technology for operations within the human body lumens. It has been widely studied around the world and has achieved preliminary clinical applications. However, restricted by its small size and soft tissue cavity environment, the sensing ability of flexible surgical robot is rather weak. And its manipulating performance towards complex cavity environments and surgical tasks is still very limited. The report will review the current research status of flexible surgical robots and share the speaker&#8217;s long-term research progress in the sensing and intelligent control methods of flexible surgical robots.<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-8d92e8f e-flex e-con-boxed e-con e-parent\" data-id=\"8d92e8f\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-13a9782 e-con-full e-flex e-con e-child\" data-id=\"13a9782\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-59b7098 elementor-widget elementor-widget-text-editor\" data-id=\"59b7098\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 12th 13:30-14:00, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-a850cbc elementor-widget elementor-widget-heading\" data-id=\"a850cbc\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Xinyu Liu <\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d1b0ce4 elementor-widget elementor-widget-heading\" data-id=\"d1b0ce4\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, University of Toronto, Canada<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e3dcdb3 elementor-widget elementor-widget-heading\" data-id=\"e3dcdb3\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Robotic Manipulation of Small Model Organisms<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-ab909f3 elementor-widget elementor-widget-text-editor\" data-id=\"ab909f3\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Robotic manipulation has become an enabling technology for experimental studies of living biological samples such as cells, tissues, and organisms. In this talk, I will introduce our recent research on developing robotic devices and systems for performing a variety of manipulation tasks on small model organisms including Drosophila larva and C. elegans. We have designed novel microfluidic devices for controlling the position and orientation of swimming\/crawling organisms, developed new computer vision algorithms and learning-based models for characterizing the morphological and molecular features of these organisms, and invented automated robotic systems for applying multimodal stimulations on and injecting genetic materials into the organism bodies. These innovative robotic tools have enabled new studies on neuroscience, development, and genetics of Drosophila and C. elegans. I will present our results on both technology development and biological application, and will also briefly discuss the future directions in this area.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-4ab3400 e-con-full e-flex e-con e-child\" data-id=\"4ab3400\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-37f519a elementor-widget elementor-widget-image\" data-id=\"37f519a\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"341\" height=\"455\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/10\/xinyu_liu_uniformed.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5529\" alt=\"\" srcset=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/10\/xinyu_liu_uniformed.jpg 341w, https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/10\/xinyu_liu_uniformed-225x300.jpg 225w\" sizes=\"(max-width: 341px) 100vw, 341px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-e1882fa e-flex e-con-boxed e-con e-parent\" data-id=\"e1882fa\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-ddc12ff e-con-full e-flex e-con e-child\" data-id=\"ddc12ff\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-86fb2f7 elementor-widget elementor-widget-image\" data-id=\"86fb2f7\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"341\" height=\"454\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/Antoine-e1731488347685.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5796\" alt=\"\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-a3ad13f e-con-full e-flex e-con e-child\" data-id=\"a3ad13f\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-1140026 elementor-widget elementor-widget-text-editor\" data-id=\"1140026\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 13th 13:00-13:30, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-cc85a0f elementor-widget elementor-widget-heading\" data-id=\"cc85a0f\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Antoine Ferreira<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-6b932d6 elementor-widget elementor-widget-heading\" data-id=\"6b932d6\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, INSA Centre Val de Loire, France<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-d084b2e elementor-widget elementor-widget-heading\" data-id=\"d084b2e\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">AI-powered Navigation of Magnetic Microrobots for Targeted Drug Delivery <\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-e14616b elementor-widget elementor-widget-text-editor\" data-id=\"e14616b\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Microscale robots introduce great perspectives into many medical applications such as drug delivery. Fully automatic microrobots\u2019 real-time detection and tracking using medical imagers are investigated for future clinical translation. Ultrasound imaging has been employed to monitor single agents and collective swarms of microrobots in vitro and ex vivo in controlled experimental conditions. However, low contrast and spatial resolution still limit the effective employment of such a method in a medical microrobotic scenario due to uncertainties associated with the position of microrobots. The positioning error arises due to the inaccuracy of the US-based visual feedback, which is provided by the detection and tracking algorithms. The application of deep learning networks is a promising solution to detect and track real-time microrobots in noisy clinical imagers. In this presentation, the navigation performance of endovascular magnetic microrobots with different geometries, materials and sizes are investigated in clinical settings with state-of-the-art deep learning detection and tracking research.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-427dd90 e-flex e-con-boxed e-con e-parent\" data-id=\"427dd90\" data-element_type=\"container\">\n\t\t\t\t\t<div class=\"e-con-inner\">\n\t\t<div class=\"elementor-element elementor-element-e8deae7 e-con-full e-flex e-con e-child\" data-id=\"e8deae7\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-c94166c elementor-widget elementor-widget-text-editor\" data-id=\"c94166c\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>December 13th 13:30-14:00, <span dir=\"ltr\" role=\"presentation\">Thai Chitlada 1<\/span><span dir=\"ltr\" role=\"presentation\">, 2\/F<\/span><\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-7677d4a elementor-widget elementor-widget-heading\" data-id=\"7677d4a\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Jackrit Suthakorn<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-c3e264f elementor-widget elementor-widget-heading\" data-id=\"c3e264f\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Professor, Mahidol University, Thailand<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-dc1b88d elementor-widget elementor-widget-heading\" data-id=\"dc1b88d\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Transforming Patient Care through Medical Robotics Research in Surgical, Rehabilitation, and Hospital Service Robotics<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-fe2e81c elementor-widget elementor-widget-text-editor\" data-id=\"fe2e81c\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>Medical robotics research is transforming patient care by advancing surgical precision, improving rehabilitation, and hospital services. This keynote presentation explores advances in surgical robotics that enable minimally invasive procedures with high accuracy, improving patient outcomes and reducing recovery time. Rehabilitation robotics offers tailored support, enabling patient mobility and independence, while hospital service robots optimize logistics, alleviating healthcare staff workload. By integrating artificial intelligence, sensor technology, and biomechanical design, medical robotics not only enhances safety and efficiency but also advances a patient-centered approach. These advancements promise to reshape healthcare, providing accessible, effective solutions across different clinical environments. At the forefront of this advancement is the Center for Biomedical and Robotics Technology (BART LAB), a leading research center dedicated to developing cutting-edge robotic technologies tailored to address real-world clinical challenges. Collaborating closely with hospitals and medical professionals, BART LAB ensures that its research aligns with the practical demands of healthcare providers. By prioritizing patient safety and efficacy from the outset, BART LAB adheres to ISO standards and regulatory guidelines, building trust within the medical community. Clinical trials rigorously test and validate these technologies, bridging the gap between lab research and practical patient care. This patient-centered approach drives significant advancements in medical robotics, enhancing precision, accessibility, and effectiveness across healthcare applications. Robotics is transforming surgery, rehabilitation, and hospital services, improving patient outcomes and operational efficiency as research and collaboration propel the development of adaptive, intelligent systems.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t<div class=\"elementor-element elementor-element-bca0453 e-con-full e-flex e-con e-child\" data-id=\"bca0453\" data-element_type=\"container\">\n\t\t\t\t<div class=\"elementor-element elementor-element-94de37a elementor-widget elementor-widget-image\" data-id=\"94de37a\" data-element_type=\"widget\" data-widget_type=\"image.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<img loading=\"lazy\" decoding=\"async\" width=\"343\" height=\"444\" src=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/Jackrit-e1731489255134.jpg\" class=\"attachment-medium_large size-medium_large wp-image-5797\" alt=\"\" srcset=\"https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/Jackrit-e1731489255134.jpg 343w, https:\/\/ieee-robio.org\/2024\/wp-content\/uploads\/2024\/11\/Jackrit-e1731489255134-232x300.jpg 232w\" sizes=\"(max-width: 343px) 100vw, 343px\" \/>\t\t\t\t\t\t\t\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Plenary Talks December 11th 9:00-10:00, Thai Chitlada 1, 2\/F Yunhui Liu Professor, The Chinese University of Hong Kong, China Towards Vision-Driven Robots Although robots are being widely used in various sectors, their performance and intelligence level are still far below humans\u2019 expectations. One of the major reasons is that robots are not skillful in coordinating&#8230; <\/p>\n<div class=\"clear\"><\/div>\n<p><a href=\"https:\/\/ieee-robio.org\/2024\/invited-talks\/\" class=\"gdlr-info-font excerpt-read-more\">Read More<\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"footnotes":""},"class_list":["post-5128","page","type-page","status-publish","hentry"],"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/ieee-robio.org\/2024\/wp-json\/wp\/v2\/pages\/5128","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ieee-robio.org\/2024\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/ieee-robio.org\/2024\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/ieee-robio.org\/2024\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ieee-robio.org\/2024\/wp-json\/wp\/v2\/comments?post=5128"}],"version-history":[{"count":10,"href":"https:\/\/ieee-robio.org\/2024\/wp-json\/wp\/v2\/pages\/5128\/revisions"}],"predecessor-version":[{"id":5985,"href":"https:\/\/ieee-robio.org\/2024\/wp-json\/wp\/v2\/pages\/5128\/revisions\/5985"}],"wp:attachment":[{"href":"https:\/\/ieee-robio.org\/2024\/wp-json\/wp\/v2\/media?parent=5128"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}