{"id":169556,"date":"2024-08-20T09:22:50","date_gmt":"2024-08-20T00:22:50","guid":{"rendered":"http:\/\/ee.presscat.kr\/?post_type=research-achieve&#038;p=169556"},"modified":"2026-04-13T09:21:57","modified_gmt":"2026-04-13T00:21:57","slug":"professor-yong-man-ros-research-team-wins-outstanding-paper-award-at-ai-top-tier-conference-acl-2024","status":"publish","type":"research-achieve","link":"http:\/\/ee.presscat.kr\/en\/research-achieve\/professor-yong-man-ros-research-team-wins-outstanding-paper-award-at-ai-top-tier-conference-acl-2024\/","title":{"rendered":"Professor Yong Man Ro\u2019s Research Team Wins Outstanding Paper Award at AI Top tier Conference (ACL 2024)"},"content":{"rendered":"<p><strong><span style=\"color: #000000\">Professor Yong Man Ro\u2019s Research Team Wins Outstanding Paper Award at AI Top tier Conference (ACL 2024)<\/span><\/strong><\/p>\n<p><span style=\"color: #000000\">\u00a0 \u00a0<img fetchpriority=\"high\" decoding=\"async\" class=\"alignnone size-full wp-image-169557\" src=\"http:\/\/ee.presscat.kr\/wp-content\/uploads\/2024\/08\/1.png\" alt=\"\" width=\"457\" height=\"246\" title=\"\"><\/span><\/p>\n<p><span style=\"color: #000000\">&lt;(from left) Se Jin Park ph.d. candidate, Chae Won Kim ph.d. candidate&gt;<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"color: #000000\">PhD students Se Jin Park and Chae Won Kim from Professor Yong-Man Ro\u2019s research team in the School of Electrical Engineering at KAIST have won the Outstanding Paper Award at the ACL (Association for Computational Linguistics) 2024 conference, held in Bangkok. 
<\/span><\/p>\n<p><span style=\"color: #000000\">ACL is recognized as the world\u2019s leading conference in the field of Natural Language Processing (NLP) and is one of the top-tier international conferences in Artificial Intelligence (AI).<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"color: #000000\">Their award-winning paper, titled &#8220;Let&#8217;s Go Real Talk: Spoken Dialogue Model for Face-to-Face Conversation,&#8221; introduces an innovative model designed to make interactions between humans and AI more natural and human-like.<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"color: #000000\">Unlike traditional text-based or speech-based dialogue models, this research developed a Human Multimodal LLM (Large Language Model) that enables AI to comprehend both visual cues and vocal signals from humans. It also allows the AI to respond in conversation with human-like facial expressions and speech.<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"color: #000000\">This breakthrough opens up new possibilities for making human-AI interaction more intuitive and effective by processing visual and auditory signals simultaneously during conversation.<\/span><\/p>\n<p>&nbsp;<\/p>\n<p><span style=\"color: #000000\"><img decoding=\"async\" class=\"alignnone  wp-image-169559\" src=\"http:\/\/ee.presscat.kr\/wp-content\/uploads\/2024\/08\/Inline-image-2024-08-19-07.30.20.729.png\" alt=\"\" width=\"671\" height=\"553\" title=\"\"><\/span><\/p>\n<div><span style=\"color: #000000\">The paper was also delivered as an oral presentation at the ACL 2024 conference in Bangkok, where it garnered significant attention.<\/span><\/div>\n<div>\u00a0<\/div>\n<div>\n<p><span style=\"color: #000000\">Professor Yong Man Ro stated, &#8220;This research marks a significant advancement in human-AI interaction, and we hope this technology will be widely applied in various real-world applications.
<\/span><\/p>\n<p><span style=\"color: #000000\">This award is yet another example of the international recognition of the excellence of AI research at KAIST\u2019s School of Electrical Engineering.&#8221;<\/span><\/p>\n<\/div>\n<p>&nbsp;<\/p>\n","protected":false},"excerpt":{"rendered":"<p>1233<\/p>\n","protected":false},"featured_media":169563,"template":"","research_category":[],"class_list":["post-169556","research-achieve","type-research-achieve","status-publish","has-post-thumbnail","hentry"],"acf":[],"_links":{"self":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/research-achieve\/169556","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/research-achieve"}],"about":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/types\/research-achieve"}],"wp:featuredmedia":[{"embeddable":true,"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/media\/169563"}],"wp:attachment":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/media?parent=169556"}],"wp:term":[{"taxonomy":"research_category","embeddable":true,"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/research_category?post=169556"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}