Jian Meng @ Cornell

I am a fifth-year Ph.D. candidate in the Department of Electrical Engineering at Cornell Tech, advised by Dr. Jae-sun Seo. I received my Bachelor of Science degree in Electrical Engineering from Portland State University, where I was advised by Dr. Christopher Teuscher.

I have also collaborated and interned with several top-tier industrial research labs and universities, including Meta Reality Labs, Texas Instruments, Arm, KAIST, Georgia Tech, and the University of Pennsylvania.

My up-to-date resume can be found here.

Education

2023 - present: ——— Ph.D., Electrical Engineering, Cornell University, New York City, USA.

2019 - 2023: ——— Ph.D., Electrical Engineering, Arizona State University, Tempe, USA.

2016 - 2019: ——— B.S., Electrical Engineering, Portland State University, Portland, USA.

Research Focus

  • Energy-efficient contrastive unsupervised learning.
  • Hardware-aware neuromorphic algorithm design.
  • Neural network compression and sparsification.
  • Spiking neural networks and event-based computer vision.
  • Energy-efficient neural network accelerator hardware design.

Open-Source Tools

  • Torch2Chip: End-to-end solution for deep learning model deployment with customized compression algorithms

    [Paper] [Code]

Selected Publications

The full publication list is available here.

* = Equal contribution

  • [ECCV’24] Jian Meng, Yuecheng Li, Chenghui Li, Syed Shakib Sarwar, Dilin Wang, and Jae-sun Seo, “POCA: Post-training Quantization with Temporal Alignment for Codec Avatars,” ECCV, 2024. [Collaborated with Meta Reality Labs].

  • [IJCNN’24] Ahmed Hasssan, Jian Meng, and Jae-sun Seo, “Spiking Neural Network with Learnable Threshold for Event-Based Classification and Object Detection,” IEEE International Joint Conference on Neural Networks (IJCNN), 2024.

  • [ICONS’24] Ahmed Hasssan, Jian Meng, Anupreetham, and Jae-sun Seo, “IM-SNN: Memory-Efficient Spiking Neural Network with Low-Precision Membrane Potentials and Weights,” ACM International Conference on Neuromorphic Systems (ICONS), 2024.

  • [MLSys’24] Jian Meng, Yuan Liao, Anupreetham Anupreetham, Ahmed Hasssan, Shixing Yu, Han-sok Suh, Xiaofeng Hu, and Jae-sun Seo, “Torch2Chip: An End-to-end Customizable Deep Neural Network Compression and Deployment Toolkit for Prototype Hardware Accelerator Design”, The Seventh Annual Conference on Machine Learning and Systems, 2024.

  • [NeurIPS’23] Jian Meng, Li Yang, Kyungmin Lee, Jinwoo Shin, Deliang Fan, Jae-sun Seo, “Slimmed Asymmetrical Contrastive Learning and Cross Distillation for Lightweight Model Training”, Thirty-Seventh Conference on Neural Information Processing Systems, 2023.

  • [NeurIPS’22] Jian Meng*, Li Yang*, Deliang Fan, Jae-sun Seo, “Get More at Once: Alternating Sparse Training with Gradient Correction”, Thirty-Sixth Conference on Neural Information Processing Systems, 2022. (link)(code)

  • [CVPR’22] Jian Meng, Li Yang, Jinwoo Shin, Deliang Fan, and Jae-sun Seo, “Contrastive Dual Gating: Learning Sparse Features With Contrastive Learning,” Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022. (link) [poster] [Invited to IBM AI Hardware Research Forum]

  • [ACM TRETS] Han-sok Suh, Jian Meng, Ty Nguyen, Vijay Kumar, Yu Cao, and Jae-sun Seo, “Algorithm-Hardware Co-Optimization for Energy-Efficient Drone Detection on Resource-Constrained FPGA,” ACM Transactions on Reconfigurable Technology and Systems (TRETS).

  • [DATE’23] Wangxin He, Jian Meng, Sujan Kumar Gonugondla, Shimeng Yu, Naresh R. Shanbhag, and Jae-sun Seo, “PRIVE: Efficient RRAM Programming with Chip Verification for RRAM-based In-Memory Computing Acceleration,” Design, Automation and Test in Europe Conference (DATE), 2023.

  • [IEEE SSCM] Jae-sun Seo, Jyotishman Saikia, Jian Meng, Wangxin He, Han-sok Suh, Anupreetham, Yuan Liao, Ahmed Hasssan, and Injune Yeo, “Advances in Digital vs. Analog AI Accelerators,” IEEE Solid-State Circuits Magazine, 2022.

  • [ESSCIRC’22] Shreyas K. Venkataramanaiah, Jian Meng, Han-Sok Suh, Injune Yeo, Jyotishman Saikia, Sai Kiran Cherupally, Yichi Zhang, Zhiru Zhang, and Jae-sun Seo, “A 28nm 8-bit Floating-Point Tensor Core based CNN Training Processor with Dynamic Activation/Weight Sparsification,” IEEE European Solid-State Circuits Conference (ESSCIRC), 2022.

  • [DAC’22] Fan Zhang, Li Yang, Jian Meng, Jae-sun Seo, Yu Cao, and Deliang Fan, “XMA: A Crossbar-aware Multi-task Adaption Framework via Shift-based Mask Learning Method,” ACM/IEEE Design Automation Conference (DAC), 2022.

  • [DATE’22] Fan Zhang, Li Yang, Jian Meng, Jae-sun Seo, Yu Cao, and Deliang Fan, “XST: A Crossbar Column-wise Sparse Training for Efficient Continual Learning,” IEEE Design, Automation & Test in Europe (DATE), 2022. [Best IP (Interactive Presentations) Paper Award]. (link)

  • [IRPS’22] Jian Meng, Injune Yeo, Wonbo Shim, Li Yang, Deliang Fan, Shimeng Yu, and Jae-sun Seo, “Sparse and Robust RRAM-based Efficient In-memory Computing for DNN Inference,” IEEE International Reliability Physics Symposium (IRPS), 2022. (link)

  • [IEEE MICRO] Jian Meng, Wonbo Shim, Li Yang, Injune Yeo, Deliang Fan, Shimeng Yu, and Jae-sun Seo, “Temperature-Resilient RRAM-based In-Memory Computing for DNN Inference,” IEEE Micro, vol. 42, no. 1, pp. 89-98, January/February 2022. [Invited to IBM AI Hardware Research Forum]. (link) [poster]

  • [IEEE JETCAS] Arnab Neelim Mazumder, Jian Meng, Hasib-Al Rashid, Utteja Kallakuri, Xin Zhang, Jae-sun Seo, and Tinoosh Mohsenin, “A Survey on the Optimization of Neural Network Accelerators for Micro-AI On-Device Inference,” IEEE Journal on Emerging and Selected Topics in Circuits and Systems (JETCAS), vol. 11, no. 4, pp. 532-547, December 2021. (link)

  • [FPT’21] Han-sok Suh, Jian Meng, Ty Nguyen, Shreyas K. Venkataramanaiah, Vijay Kumar, Yu Cao, and Jae-sun Seo, “Algorithm-Hardware Co-Optimization for Energy-Efficient Drone Detection on Resource-Constrained FPGA,” IEEE International Conference on Field-Programmable Technology (FPT), 2021. (link)

  • [FPL’21] Jian Meng, Shreyas Kolala Venkataramanaiah, Chuteng Zhou, Patrick Hansen, Paul Whatmough and Jae-sun Seo, “FixyFPGA: Efficient FPGA Accelerator for Deep Neural Networks with High Element-Wise Sparsity and without External Memory Access”, International Conference on Field Programmable Logic and Applications (FPL), 2021. (link)

  • [IEEE TCAS-II] Jian Meng, Li Yang, Xiaochen Peng, Shimeng Yu, Deliang Fan, and Jae-sun Seo, “Structured Pruning of RRAM Crossbars for Efficient In-Memory Computing Acceleration of Deep Neural Networks,” IEEE Transactions on Circuits and Systems II (TCAS-II), vol. 68, no. 5, pp. 1576-1580, May 2021. (link)

Work Experience

  • 2024.5 - 2024.8 ——— Research Scientist, Meta Reality Labs
    • Manager: Yuecheng Li
  • 2023.5 - 2023.8 ——— Research Scientist, Meta Reality Labs
    • Manager: Yuecheng Li
  • 2019.9 - Present ——— Research Assistant, Seo Lab, Arizona State University / Cornell University
  • 2022.1 - 2022.5 ——— Teaching Assistant, Arizona State University.
  • 2021.5 - 2021.8 ——— System Engineer, Kilby Lab, Texas Instruments.
  • 2018.9 - 2019.5 ——— Teaching Assistant, Portland State University.
  • 2018.1 - 2019.5 ——— Undergraduate Tutor, Portland State University.

Professional Service

  • Reviewer

    • Reviewer of NeurIPS (2022, 2023, 2024), CVPR (2023, 2024), ICCV (2023), ICLR (2023, 2024), TNNLS (2022, 2023)
    • 2022 Reviewer of IEEE Journal of Exploratory Solid-State Computational Devices and Circuits.
    • 2021 Reviewer of ACM Transactions on Reconfigurable Technology and Systems (TRETS).
    • 2021 Reviewer of IEEE Transactions on Circuits and Systems II: Express Briefs (TCAS-II).
    • 2021 Reviewer of IEEE Journal on Emerging and Selected Topics in Circuits and Systems (JETCAS).

Teaching Experience