{"id":132853,"date":"2022-07-14T11:28:49","date_gmt":"2022-07-14T02:28:49","guid":{"rendered":"http:\/\/192.249.19.202\/?post_type=ai-in-signal&#038;p=132853"},"modified":"2026-04-07T09:03:50","modified_gmt":"2026-04-07T00:03:50","slug":"ee-prof-yoo-chang-dong-and-kweon-in-so-s-team-give-oral-presentation-at-eccv-2022","status":"publish","type":"ai-in-signal","link":"http:\/\/ee.presscat.kr\/en\/ai-in-signal\/ee-prof-yoo-chang-dong-and-kweon-in-so-s-team-give-oral-presentation-at-eccv-2022\/","title":{"rendered":"EE Profs. Yoo, Chang Dong and Kweon, In So\u2019s Team Gives Oral Presentation at ECCV 2022"},"content":{"rendered":"<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\">Title:\u00a0EE Professors Yoo, Chang Dong and Kweon, In So\u2019s Research Team Gives Oral Presentation on Self-supervised Learning at ECCV 2022<\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\">KAIST EE Professors Yoo, Chang Dong and Kweon, In So\u2019s team conducted joint research and proposed a self-supervised learning method that is remarkably robust and performs well even with only a small amount of labeled data.<\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\"><img fetchpriority=\"high\" decoding=\"async\" class=\"alignnone size-full wp-image-131892\" src=\"http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/\uc5f0\uad6c\uc9c4_2022-eccv_\ud64d\ubcf4.png\" alt=\"\" width=\"438\" height=\"123\" title=\"\" srcset=\"http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/\uc5f0\uad6c\uc9c4_2022-eccv_\ud64d\ubcf4.png 438w, http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/\uc5f0\uad6c\uc9c4_2022-eccv_\ud64d\ubcf4-300x84.png 300w\" sizes=\"(max-width: 438px) 100vw, 438px\" \/><\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\">&lt;(From left) EE Professors Yoo, Chang Dong and Kweon, In So and Researchers Chaoning Zhang and Kang 
Zhang&gt;<\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\">ECCV began in 1990 and has since focused on introducing the latest findings in artificial intelligence and machine learning research on vision and signal processing. It has long been a renowned conference on computer vision and deep learning,\u00a0<\/span><span style=\"font-family: verdana, geneva, sans-serif\">and its 2022 edition received 5,803 submissions, of which only 1,650 (28%) were accepted and just 158 (2.7%) were selected for oral presentation. <\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\">The team\u2019s paper \u201cDecoupled Adversarial Contrastive Learning for Self-supervised Adversarial Robustness\u201d earned the oral presentation honor and will be presented on Oct. 23, 2022, in Tel Aviv, Israel.<\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\">While artificial intelligence is making progress in various domains, it has yet to win full trust from humans. Reliable learning should encompass both learning from little data and robust learning, and attempts at this objective have been made by combining self-supervised learning and adversarial learning. This work uses distillation to combine the two efficiently and proposes an adversarial learning framework capable of self-supervised learning without labels.<\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\">
The work is a joint effort by Professors Yoo, Chang Dong and Kweon, In So and their team, and it opens up opportunities for providing high-performance services based on robust artificial intelligence that learns from little data.<\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\">This research was supported by IITP (Institute of Information and Communications Technology Planning and Evaluation), funded by MSIT (Ministry of Science and ICT).<\/span><\/p>\n<p dir=\"ltr\"><span style=\"font-family: verdana, geneva, sans-serif\"><img decoding=\"async\" class=\"alignnone size-full wp-image-131894\" src=\"http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/dataURItoBlob.png\" alt=\"\" width=\"559\" height=\"184\" title=\"\" srcset=\"http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/dataURItoBlob.png 559w, http:\/\/ee.presscat.kr\/wp-content\/uploads\/2022\/07\/dataURItoBlob-300x99.png 300w\" sizes=\"(max-width: 559px) 100vw, 559px\" \/><\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>1543<\/p>\n","protected":false},"featured_media":0,"template":"","class_list":["post-132853","ai-in-signal","type-ai-in-signal","status-publish","hentry"],"acf":[],"_links":{"self":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/ai-in-signal\/132853","targetHints":{"allow":["GET"]}}],"collection":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/ai-in-signal"}],"about":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/types\/ai-in-signal"}],"wp:attachment":[{"href":"http:\/\/ee.presscat.kr\/en\/wp-json\/wp\/v2\/media?parent=132853"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}