{"id":684,"date":"2024-12-19T13:31:20","date_gmt":"2024-12-19T05:31:20","guid":{"rendered":"https:\/\/511cvlab.sinkers.cn\/?page_id=684"},"modified":"2026-04-13T12:23:43","modified_gmt":"2026-04-13T04:23:43","slug":"publications","status":"publish","type":"page","link":"https:\/\/cv.nirc.top\/zh\/publications\/","title":{"rendered":"Publications"},"content":{"rendered":"<div class=\"wp-block-group has-border-color has-tertiary-border-color has-global-padding is-layout-constrained wp-container-core-group-is-layout-375ed4bb wp-block-group-is-layout-constrained\" style=\"border-width:2px;border-radius:9px;padding-top:var(--wp--preset--spacing--50);padding-right:var(--wp--preset--spacing--50);padding-bottom:var(--wp--preset--spacing--50);padding-left:var(--wp--preset--spacing--50);box-shadow:6px 6px 9px rgba(0, 0, 0, 0.2)\">\n<div class=\"wp-block-group is-layout-grid wp-container-core-group-is-layout-e2bd5cb0 wp-block-group-is-layout-grid\">\n<div class=\"wp-block-group is-content-justification-center is-nowrap is-layout-flex wp-container-core-group-is-layout-0a12bad6 wp-block-group-is-layout-flex\">\n<div class=\"wp-block-outermost-icon-block items-justified-center\"><div class=\"icon-container has-icon-color has-contrast-color\" style=\"color:#232d3f;width:35px;transform:rotate(0deg) scaleX(1) scaleY(1)\"><svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewbox=\"0 0 24 24\" aria-hidden=\"true\"><path d=\"M14.5 5.5h-7V7h7V5.5ZM7.5 9h7v1.5h-7V9Zm7 3.5h-7V14h7v-1.5Z\"><\/path><path d=\"M16 2H6a2 2 0 0 0-2 2v12a2 2 0 0 0 2 2h10a2 2 0 0 0 2-2V4a2 2 0 0 0-2-2ZM6 3.5h10a.5.5 0 0 1 .5.5v12a.5.5 0 0 1-.5.5H6a.5.5 0 0 1-.5-.5V4a.5.5 0 0 1 .5-.5Z\"><\/path><path d=\"M20 8v11c0 .69-.31 1-.999 1H6v1.5h13.001c1.52 0 2.499-.982 2.499-2.5V8H20Z\"><\/path><\/svg><\/div><\/div>\n\n\n\n<p class=\"has-text-align-center\">Research Highlights<\/p>\n<\/div>\n<\/div>\n\n\n\n<p 
class=\"has-x-small-font-size\">Over the past five years, our team has published more than 40 high-quality academic papers, including 25 at top-tier CCF-A conferences and journals. <mark style=\"background-color:rgba(0, 0, 0, 0)\" class=\"has-inline-color has-custom-high-light-color\">Notable achievements include a Distinguished Paper Award at AAAI 2023, an Oral presentation at ICCV 2023, an Oral at ECAI 2023, and three Highlights at CVPR 2026. <\/mark>Our work has appeared in premier venues including CVPR (8 papers), ICCV (3), AAAI (3), ACM Multimedia (3), NeurIPS (2), IJCAI, WWW, and CSCW, as well as in leading journals such as IEEE TIP and TCYB; our team also secured 1st Place in the ICCV 2025 HANDS Challenge. This body of work covers a range of topics including 3D hand and human pose estimation, high-fidelity 3D rendering, generative modeling, gesture recognition, multimodal fusion, and category-agnostic pose learning.<\/p>\n<\/div>\n\n\n\n<div style=\"height:50px\" aria-hidden=\"true\" class=\"wp-block-spacer\"><\/div>\n\n\n\n<div class=\"wp-block-query is-layout-flow wp-block-query-is-layout-flow\">\n<div class=\"wp-block-query is-layout-flow wp-block-query-is-layout-flow\">\n<div class=\"wp-block-group is-layout-flow wp-block-group-is-layout-flow\">\n<div class=\"wp-block-group has-global-padding is-layout-constrained wp-container-core-group-is-layout-fbc3937f wp-block-group-is-layout-constrained\"><ul class=\"wp-block-post-template is-layout-flow wp-container-core-post-template-is-layout-5291b468 wp-block-post-template-is-layout-flow\"><li class=\"wp-block-post post-2665 post type-post status-publish format-standard has-post-thumbnail hentry category-uncategorized tag-highlight\">\n\n<div class=\"wp-block-columns alignwide is-layout-flex wp-container-core-columns-is-layout-cbe57604 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center 
is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><figure class=\"aligncenter wp-block-post-featured-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1498\" height=\"560\" src=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Liu.jpg\" class=\"attachment-post-thumbnail size-post-thumbnail wp-post-image\" alt=\"Clay-to-Stone: Phase-wise 3D Gaussian Splatting for Monocular Articulated Hand-Object Manipulation Modeling\" title=\"Clay-to-Stone: Phase-wise 3D Gaussian Splatting for Monocular Articulated Hand-Object Manipulation Modeling\" style=\"object-fit:cover;\" srcset=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Liu.jpg 1498w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Liu-300x112.jpg 300w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Liu-1024x383.jpg 1024w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Liu-768x287.jpg 768w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Liu-18x7.jpg 18w\" sizes=\"auto, (max-width: 1498px) 100vw, 1498px\" \/><\/figure><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-9703d62a wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\"><h3 class=\"wp-block-post-title has-small-font-size\"><a href=\"https:\/\/cv.nirc.top\/zh\/2026\/clay-to-stone-phase-wise-3d-gaussian-splatting-for-monocular-articulated-hand-object-manipulation-modeling\/\" target=\"_self\" >Clay-to-Stone: Phase-wise 3D Gaussian Splatting for Monocular Articulated Hand-Object Manipulation Modeling<\/a><\/h3>\n\n\n<div class=\"acf-post-field-block\">\n  <span>Xingyu Liu*, Pengfei Ren*, Qi Qi\u2020, Haifeng Sun\u2020, Zirui Zhuang, Jianxin Liao, Jingyu Wang\u2020<\/span>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-0f13be2a wp-block-group-is-layout-flex\"><div style=\"padding-right:0px;\" class=\"is-acf-field 
is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CVPR<\/div><\/div>\n\n<div style=\"padding-right:8px;padding-left:8px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">2026 &#8211; Highlight<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">(<\/p>\n\n\n<div style=\"padding-right:0px;padding-left:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CCF-A<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">)<\/p>\n\n\n<div style=\"padding-right:0;padding-left:var(--wp--preset--spacing--40)\" class=\"taxonomy-post_tag has-link-color wp-elements-cf4548bfb3ce851c875ebef3c1d25115 wp-block-post-terms has-text-color has-custom-color has-x-small-font-size\"><a href=\"https:\/\/cv.nirc.top\/zh\/tag\/highlight\/\" rel=\"tag\">Highlight<\/a><\/div><\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-bd0213bd wp-block-group-is-layout-flex\" style=\"padding-right:0;padding-left:0\"><div style=\"border-radius:100px; padding-right:0;padding-left:0;\" class=\"is-acf-field is-display-inline-block wp-block-mfb-meta-field-block has-background has-tertiary-background-color has-x-small-font-size\"><div class=\"prefix\"> <\/div><div class=\"value\"><a href=\"https:\/\/github.com\/ru1ven\/ARGS\" target=\"\">Code<\/a><\/div><div class=\"suffix\"> <\/div><\/div>\n\n\n\n\n\n<a style=\"border-radius:100px;border-width:0px; padding-top:0;padding-bottom:0;padding-left:0.5rem;padding-right:0.5rem;\" class=\"wp-block-read-more has-background has-tertiary-background-color has-x-small-font-size\" href=\"https:\/\/cv.nirc.top\/zh\/2026\/clay-to-stone-phase-wise-3d-gaussian-splatting-for-monocular-articulated-hand-object-manipulation-modeling\/\" target=\"_blank\">ProjectPage<span class=\"screen-reader-text\">\uff1aClay-to-Stone: Phase-wise 3D Gaussian Splatting for Monocular Articulated 
Hand-Object Manipulation Modeling<\/span><\/a><\/div>\n<\/div>\n<\/div>\n\n<\/li><li class=\"wp-block-post post-2662 post type-post status-publish format-standard has-post-thumbnail hentry category-uncategorized\">\n\n<div class=\"wp-block-columns alignwide is-layout-flex wp-container-core-columns-is-layout-cbe57604 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><figure class=\"aligncenter wp-block-post-featured-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1520\" height=\"526\" src=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Wang.jpg\" class=\"attachment-post-thumbnail size-post-thumbnail wp-post-image\" alt=\"MGDHand: Multi-Granularity Prior-to-Inertial Distillation Framework for\" title=\"MGDHand: Multi-Granularity Prior-to-Inertial Distillation Framework for\" style=\"object-fit:cover;\" srcset=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Wang.jpg 1520w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Wang-300x104.jpg 300w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Wang-1024x354.jpg 1024w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Wang-768x266.jpg 768w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_X_Wang-18x6.jpg 18w\" sizes=\"auto, (max-width: 1520px) 100vw, 1520px\" \/><\/figure><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-9703d62a wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\"><h3 class=\"wp-block-post-title has-small-font-size\"><a href=\"https:\/\/cv.nirc.top\/zh\/2026\/mgdhand-multi-granularity-prior-to-inertial-distillation-framework-for\/\" target=\"_self\" >MGDHand: Multi-Granularity Prior-to-Inertial Distillation Framework for<\/a><\/h3>\n\n\n<div class=\"acf-post-field-block\">\n  <span>Xinyi Wang, Pengfei Ren\u2020, 
Haoyang Zhang\u2020, Hanling Zhan, Yingxi Li, Liang Xie, Yue Gao, Erwei Yin<\/span>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-0f13be2a wp-block-group-is-layout-flex\"><div style=\"padding-right:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CVPR<\/div><\/div>\n\n<div style=\"padding-right:8px;padding-left:8px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">2026<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">(<\/p>\n\n\n<div style=\"padding-right:0px;padding-left:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CCF-A<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">)<\/p>\n\n\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-bd0213bd wp-block-group-is-layout-flex\" style=\"padding-right:0;padding-left:0\">\n\n\n\n\n\n<a style=\"border-radius:100px;border-width:0px; padding-top:0;padding-bottom:0;padding-left:0.5rem;padding-right:0.5rem;\" class=\"wp-block-read-more has-background has-tertiary-background-color has-x-small-font-size\" href=\"https:\/\/cv.nirc.top\/zh\/2026\/mgdhand-multi-granularity-prior-to-inertial-distillation-framework-for\/\" target=\"_blank\">ProjectPage<span class=\"screen-reader-text\">\uff1aMGDHand: Multi-Granularity Prior-to-Inertial Distillation Framework for<\/span><\/a><\/div>\n<\/div>\n<\/div>\n\n<\/li><li class=\"wp-block-post post-2658 post type-post status-publish format-standard has-post-thumbnail hentry category-uncategorized tag-highlight\">\n\n<div class=\"wp-block-columns alignwide is-layout-flex wp-container-core-columns-is-layout-cbe57604 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><figure 
class=\"aligncenter wp-block-post-featured-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1276\" height=\"666\" src=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_H_Chang.jpg\" class=\"attachment-post-thumbnail size-post-thumbnail wp-post-image\" alt=\"OMG-Bench: A New Challenging Benchmark for Skeleton-based Online Micro\" title=\"OMG-Bench: A New Challenging Benchmark for Skeleton-based Online Micro\" style=\"object-fit:cover;\" srcset=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_H_Chang.jpg 1276w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_H_Chang-300x157.jpg 300w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_H_Chang-1024x534.jpg 1024w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_H_Chang-768x401.jpg 768w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_H_Chang-18x9.jpg 18w\" sizes=\"auto, (max-width: 1276px) 100vw, 1276px\" \/><\/figure><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-9703d62a wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\"><h3 class=\"wp-block-post-title has-small-font-size\"><a href=\"https:\/\/cv.nirc.top\/zh\/2026\/omg-bench-a-new-challenging-benchmark-for-skeleton-based-online-micro\/\" target=\"_self\" >OMG-Bench: A New Challenging Benchmark for Skeleton-based Online Micro<\/a><\/h3>\n\n\n<div class=\"acf-post-field-block\">\n  <span>Haochen Chang, Pengfei Ren\u2020, Buyuan Zhang, Da Li, Tianhao Han, Haoyang Zhang, Liang Xie, Hongbo Chen, Erwei Yin\u2020<\/span>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-0f13be2a wp-block-group-is-layout-flex\"><div style=\"padding-right:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CVPR<\/div><\/div>\n\n<div style=\"padding-right:8px;padding-left:8px;\" class=\"is-acf-field is-text-field 
wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">2026<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">(<\/p>\n\n\n<div style=\"padding-right:0px;padding-left:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CCF-A<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">)<\/p>\n\n\n<div style=\"padding-right:0;padding-left:var(--wp--preset--spacing--40)\" class=\"taxonomy-post_tag has-link-color wp-elements-cf4548bfb3ce851c875ebef3c1d25115 wp-block-post-terms has-text-color has-custom-color has-x-small-font-size\"><a href=\"https:\/\/cv.nirc.top\/zh\/tag\/highlight\/\" rel=\"tag\">Highlight<\/a><\/div><\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-bd0213bd wp-block-group-is-layout-flex\" style=\"padding-right:0;padding-left:0\"><div style=\"border-radius:100px; padding-right:0;padding-left:0;\" class=\"is-acf-field is-display-inline-block wp-block-mfb-meta-field-block has-background has-tertiary-background-color has-x-small-font-size\"><div class=\"prefix\"> <\/div><div class=\"value\"><a href=\"https:\/\/omg-bench.github.io\" target=\"\">Code<\/a><\/div><div class=\"suffix\"> <\/div><\/div>\n\n\n\n<div style=\"border-radius:100px; padding-right:0;padding-left:0;margin-right:0;margin-left:0;\" class=\"is-acf-field is-display-inline-block wp-block-mfb-meta-field-block has-background has-tertiary-background-color has-x-small-font-size\"><div class=\"prefix\"> <\/div><div class=\"value\"><a href=\"https:\/\/arxiv.org\/pdf\/2512.16727\" target=\"\">PDF<\/a><\/div><div class=\"suffix\"> <\/div><\/div>\n\n<a style=\"border-radius:100px;border-width:0px; padding-top:0;padding-bottom:0;padding-left:0.5rem;padding-right:0.5rem;\" class=\"wp-block-read-more has-background has-tertiary-background-color has-x-small-font-size\" href=\"https:\/\/cv.nirc.top\/zh\/2026\/omg-bench-a-new-challenging-benchmark-for-skeleton-based-online-micro\/\" 
target=\"_blank\">ProjectPage<span class=\"screen-reader-text\">\uff1aOMG-Bench: A New Challenging Benchmark for Skeleton-based Online Micro<\/span><\/a><\/div>\n<\/div>\n<\/div>\n\n<\/li><li class=\"wp-block-post post-2655 post type-post status-publish format-standard has-post-thumbnail hentry category-uncategorized\">\n\n<div class=\"wp-block-columns alignwide is-layout-flex wp-container-core-columns-is-layout-cbe57604 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><figure class=\"aligncenter wp-block-post-featured-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1520\" height=\"638\" src=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_S_Hao.jpg\" class=\"attachment-post-thumbnail size-post-thumbnail wp-post-image\" alt=\"A Temporal and Content Co-Awareness Latent Diffusion for Controllable Hand\" title=\"A Temporal and Content Co-Awareness Latent Diffusion for Controllable Hand\" style=\"object-fit:cover;\" srcset=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_S_Hao.jpg 1520w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_S_Hao-300x126.jpg 300w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_S_Hao-1024x430.jpg 1024w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_S_Hao-768x322.jpg 768w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_S_Hao-18x8.jpg 18w\" sizes=\"auto, (max-width: 1520px) 100vw, 1520px\" \/><\/figure><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-9703d62a wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\"><h3 class=\"wp-block-post-title has-small-font-size\"><a href=\"https:\/\/cv.nirc.top\/zh\/2026\/a-temporal-and-content-co-awareness-latent-diffusion-for-controllable-hand\/\" target=\"_self\" >A Temporal and Content Co-Awareness Latent Diffusion 
for Controllable Hand<\/a><\/h3>\n\n\n<div class=\"acf-post-field-block\">\n  <span>Shuang Hao*, Pengfei Ren*, Haifeng Sun\u2020, Ting Pan, Qi Qi, Lei Zhang, Cong Liu, Jianxin Liao, Jingyu Wang\u2020<\/span>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-0f13be2a wp-block-group-is-layout-flex\"><div style=\"padding-right:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CVPR<\/div><\/div>\n\n<div style=\"padding-right:8px;padding-left:8px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">2026<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">(<\/p>\n\n\n<div style=\"padding-right:0px;padding-left:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CCF-A<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">)<\/p>\n\n\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-bd0213bd wp-block-group-is-layout-flex\" style=\"padding-right:0;padding-left:0\">\n\n\n\n\n\n<a style=\"border-radius:100px;border-width:0px; padding-top:0;padding-bottom:0;padding-left:0.5rem;padding-right:0.5rem;\" class=\"wp-block-read-more has-background has-tertiary-background-color has-x-small-font-size\" href=\"https:\/\/cv.nirc.top\/zh\/2026\/a-temporal-and-content-co-awareness-latent-diffusion-for-controllable-hand\/\" target=\"_blank\">ProjectPage<span class=\"screen-reader-text\">\uff1aA Temporal and Content Co-Awareness Latent Diffusion for Controllable Hand<\/span><\/a><\/div>\n<\/div>\n<\/div>\n\n<\/li><li class=\"wp-block-post post-2651 post type-post status-publish format-standard has-post-thumbnail hentry category-uncategorized\">\n\n<div class=\"wp-block-columns alignwide is-layout-flex wp-container-core-columns-is-layout-cbe57604 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column 
is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><figure class=\"aligncenter wp-block-post-featured-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1536\" height=\"546\" src=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_T_Han.jpg\" class=\"attachment-post-thumbnail size-post-thumbnail wp-post-image\" alt=\"UST-Hand: An Uncertainty-aware Spatiotemporal Point Cloud Interaction Network for 3D Self-supervised Hand Pose Estimation\" title=\"UST-Hand: An Uncertainty-aware Spatiotemporal Point Cloud Interaction Network for 3D Self-supervised Hand Pose Estimation\" style=\"object-fit:cover;\" srcset=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_T_Han.jpg 1536w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_T_Han-300x107.jpg 300w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_T_Han-1024x364.jpg 1024w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_T_Han-768x273.jpg 768w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2026\/03\/26cvpr_T_Han-18x6.jpg 18w\" sizes=\"auto, (max-width: 1536px) 100vw, 1536px\" \/><\/figure><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-9703d62a wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\"><h3 class=\"wp-block-post-title has-small-font-size\"><a href=\"https:\/\/cv.nirc.top\/zh\/2026\/ust-hand-an-uncertainty-aware-spatiotemporal-point-cloud-interaction-network-for-3d-self-supervised-hand-pose-estimation\/\" target=\"_self\" >UST-Hand: An Uncertainty-aware Spatiotemporal Point Cloud Interaction Network for 3D Self-supervised Hand Pose Estimation<\/a><\/h3>\n\n\n<div class=\"acf-post-field-block\">\n  <span>Tianhao Han, Haoyang Zhang, Liang Xie, Haochen Chang, Kun Gao, Yuan Cheng, Pengfei Ren\u2020, Erwei Yin\u2020<\/span>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-0f13be2a 
wp-block-group-is-layout-flex\"><div style=\"padding-right:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CVPR<\/div><\/div>\n\n<div style=\"padding-right:8px;padding-left:8px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">2026<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">(<\/p>\n\n\n<div style=\"padding-right:0px;padding-left:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CCF-A<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">)<\/p>\n\n\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-bd0213bd wp-block-group-is-layout-flex\" style=\"padding-right:0;padding-left:0\">\n\n\n\n\n\n<a style=\"border-radius:100px;border-width:0px; padding-top:0;padding-bottom:0;padding-left:0.5rem;padding-right:0.5rem;\" class=\"wp-block-read-more has-background has-tertiary-background-color has-x-small-font-size\" href=\"https:\/\/cv.nirc.top\/zh\/2026\/ust-hand-an-uncertainty-aware-spatiotemporal-point-cloud-interaction-network-for-3d-self-supervised-hand-pose-estimation\/\" target=\"_blank\">ProjectPage<span class=\"screen-reader-text\">\uff1aUST-Hand: An Uncertainty-aware Spatiotemporal Point Cloud Interaction Network for 3D Self-supervised Hand Pose Estimation<\/span><\/a><\/div>\n<\/div>\n<\/div>\n\n<\/li><li class=\"wp-block-post post-2499 post type-post status-publish format-standard has-post-thumbnail hentry category-multi-view-3d-pose-estimation\">\n\n<div class=\"wp-block-columns alignwide is-layout-flex wp-container-core-columns-is-layout-cbe57604 wp-block-columns-is-layout-flex\">\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-block-column-is-layout-flow\" style=\"flex-basis:33.33%\"><figure class=\"aligncenter wp-block-post-featured-image\"><img loading=\"lazy\" decoding=\"async\" 
width=\"2250\" height=\"802\" src=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2025\/10\/arch_overview-1-1.jpg\" class=\"attachment-post-thumbnail size-post-thumbnail wp-post-image\" alt=\"Unified 2D-3D Discrete Priors for Noise-Robust and Calibration-Free Multiview 3D Human Pose Estimation\" title=\"Unified 2D-3D Discrete Priors for Noise-Robust and Calibration-Free Multiview 3D Human Pose Estimation\" style=\"object-fit:cover;\" srcset=\"https:\/\/cv.nirc.top\/wp-content\/uploads\/2025\/10\/arch_overview-1-1.jpg 2250w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2025\/10\/arch_overview-1-1-300x107.jpg 300w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2025\/10\/arch_overview-1-1-1024x365.jpg 1024w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2025\/10\/arch_overview-1-1-768x274.jpg 768w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2025\/10\/arch_overview-1-1-1536x547.jpg 1536w, https:\/\/cv.nirc.top\/wp-content\/uploads\/2025\/10\/arch_overview-1-1-2048x730.jpg 2048w\" sizes=\"auto, (max-width: 2250px) 100vw, 2250px\" \/><\/figure><\/div>\n\n\n\n<div class=\"wp-block-column is-vertically-aligned-center is-layout-flow wp-container-core-column-is-layout-9703d62a wp-block-column-is-layout-flow\" style=\"flex-basis:66.66%\"><h3 class=\"wp-block-post-title has-small-font-size\"><a href=\"https:\/\/cv.nirc.top\/zh\/2025\/unified-2d-3d-discrete-priors-for-noise-robust-and-calibration-free-multiview-3d-human-pose-estimation\/\" target=\"_self\" >Unified 2D-3D Discrete Priors for Noise-Robust and Calibration-Free Multiview 3D Human Pose Estimation<\/a><\/h3>\n\n\n<div class=\"acf-post-field-block\">\n  <span>Geng Chen*, Pengfei Ren*, Xufeng Jian, Haifeng Sun\u2020, Menghao Zhang, Qi Qi, Zirui Zhuang, Jing Wang, Jianxin Liao, Jingyu Wang\u2020<\/span>\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-0f13be2a wp-block-group-is-layout-flex\"><div style=\"padding-right:0px;\" class=\"is-acf-field is-text-field 
wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">NeurIPS<\/div><\/div>\n\n<div style=\"padding-right:8px;padding-left:8px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">2025<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">(<\/p>\n\n\n<div style=\"padding-right:0px;padding-left:0px;\" class=\"is-acf-field is-text-field wp-block-mfb-meta-field-block has-x-small-font-size\"><div class=\"value\">CCF-A<\/div><\/div>\n\n\n<p class=\"has-x-small-font-size\">)<\/p>\n\n\n<\/div>\n\n\n\n<div class=\"wp-block-group is-layout-flex wp-container-core-group-is-layout-bd0213bd wp-block-group-is-layout-flex\" style=\"padding-right:0;padding-left:0\">\n\n\n\n<div style=\"border-radius:100px; padding-right:0;padding-left:0;margin-right:0;margin-left:0;\" class=\"is-acf-field is-display-inline-block wp-block-mfb-meta-field-block has-background has-tertiary-background-color has-x-small-font-size\"><div class=\"prefix\"> <\/div><div class=\"value\"><a href=\"https:\/\/neurips.cc\/virtual\/2025\/loc\/san-diego\/poster\/117592\" target=\"_blank\" rel=\"noreferrer noopener\">PDF<\/a><\/div><div class=\"suffix\"> <\/div><\/div>\n\n<a style=\"border-radius:100px;border-width:0px; padding-top:0;padding-bottom:0;padding-left:0.5rem;padding-right:0.5rem;\" class=\"wp-block-read-more has-background has-tertiary-background-color has-x-small-font-size\" href=\"https:\/\/cv.nirc.top\/zh\/2025\/unified-2d-3d-discrete-priors-for-noise-robust-and-calibration-free-multiview-3d-human-pose-estimation\/\" target=\"_blank\">ProjectPage<span class=\"screen-reader-text\">\uff1aUnified 2D-3D Discrete Priors for Noise-Robust and Calibration-Free Multiview 3D Human Pose Estimation<\/span><\/a><\/div>\n<\/div>\n<\/div>\n\n<\/li><\/ul><\/div>\n<\/div>\n\n\n\n<div class=\"wp-block-group has-global-padding is-content-justification-left is-layout-constrained wp-container-core-group-is-layout-1f128930 
wp-block-group-is-layout-constrained\" style=\"padding-top:var(--wp--preset--spacing--50);padding-bottom:var(--wp--preset--spacing--50)\"><nav class=\"wp-block-query-pagination is-content-justification-center is-layout-flex wp-container-core-query-pagination-is-layout-a89b3969 wp-block-query-pagination-is-layout-flex\" aria-label=\"Pagination\">\n\n\n<div class=\"wp-block-query-pagination-numbers\"><span aria-current=\"page\" class=\"page-numbers current\">1<\/span>\n<a class=\"page-numbers\" href=\"?query-12-page=2\">2<\/a>\n<a class=\"page-numbers\" href=\"?query-12-page=3\">3<\/a>\n<span class=\"page-numbers dots\">&hellip;<\/span>\n<a class=\"page-numbers\" href=\"?query-12-page=6\">6<\/a><\/div>\n\n<a href=\"\/zh\/wp-json\/wp\/v2\/pages\/684?query-12-page=2\" class=\"wp-block-query-pagination-next\">Next page<span class='wp-block-query-pagination-next-arrow is-arrow-chevron' aria-hidden='true'>\u00bb<\/span><\/a>\n<\/nav><\/div>\n<\/div>\n\n\n<\/div>\n\n\n\n<p><\/p>","protected":false},"excerpt":{"rendered":"<p>Research Highlights Over the past five years, our team has published more than 40 high-quality papers, including 25 in top-tier CCF-A venues. Notable achievements include a Distinguished Paper Award at AAAI 2023, an Oral presentation at ICCV 2023, an Oral at ECAI 2023, and three Highlights at CVPR 2026. 
Our work has been published in premier [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"_acf_changed":false,"footnotes":"","_links_to":"","_links_to_target":""},"class_list":["post-684","page","type-page","status-publish","hentry"],"acf":[],"_links":{"self":[{"href":"https:\/\/cv.nirc.top\/zh\/wp-json\/wp\/v2\/pages\/684","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/cv.nirc.top\/zh\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/cv.nirc.top\/zh\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/cv.nirc.top\/zh\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/cv.nirc.top\/zh\/wp-json\/wp\/v2\/comments?post=684"}],"version-history":[{"count":81,"href":"https:\/\/cv.nirc.top\/zh\/wp-json\/wp\/v2\/pages\/684\/revisions"}],"predecessor-version":[{"id":2690,"href":"https:\/\/cv.nirc.top\/zh\/wp-json\/wp\/v2\/pages\/684\/revisions\/2690"}],"wp:attachment":[{"href":"https:\/\/cv.nirc.top\/zh\/wp-json\/wp\/v2\/media?parent=684"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}