<h1>WACV 2026: Paper Accepted</h1>
<p>Posted November 11, 2025</p>
<p>The following paper has been accepted at the <a href="https://wacv.thecvf.com/">Winter Conference on Applications of Computer Vision 2026 (WACV 2026)</a>.</p>
<p><strong>H. Cui, W. Hua, R. Huang, S. Jia, T. Hayama</strong>,<br />
<em>&#8220;SasMamba: A Lightweight Structure-Aware Stride State Space Model for 3D Human Pose Estimation&#8221;</em>,<br />
in Proc. WACV 2026, Tucson, Arizona, USA, 2026</p>
<p><a href="https://arxiv.org/abs/2511.08872">https://arxiv.org/abs/2511.08872</a></p>
<p>WACV is one of the major international conferences in computer vision, and this acceptance reflects international recognition of the academic significance and technical contributions of this work. The paper will be presented in March 2026 in Tucson, Arizona, USA.</p>
<hr />
<h2>Paper Overview</h2>
<p><strong>SasMamba</strong> proposes a lightweight, structure-aware state space model for <strong>3D human pose estimation</strong>. Unlike conventional transformer-based or heavy temporal models, it introduces a <em>stride-aware state space formulation</em> that efficiently captures long-range temporal dependencies while explicitly accounting for skeletal structure.</p>
<p>By integrating structural priors of the human body with a compact state space design, SasMamba strikes a favorable balance between <strong>accuracy, computational efficiency, and model size</strong>. Experimental results demonstrate that the approach maintains competitive performance on benchmark datasets while significantly reducing computational cost, making it suitable for real-time and resource-constrained applications.</p>