I am a PhD student in the Department of Computer Science at the State University of New York at Binghamton, advised by Shiqi Zhang. At the AIR Lab, I primarily work on Augmented Reality, Mixed Reality, and Virtual Reality immersive technologies for enabling fluid human-robot teamwork. I am a full-stack roboticist passionate about building end-to-end robotic systems that bridge the observability gap in human-robot teams.

Previously, I interned as an Applied Scientist at Amazon Robotics and as a Research Intern at NEC Labs America, working on challenging problems in human-robot collaboration and autonomous cars.

News

04/28/2022: I will be returning to Amazon Robotics as an Applied Scientist Intern for Summer 2022!
03/23/2022: Scholarship awarded by the AAMAS 2022 committee!
01/31/2022: Paper on context-aware robot planning under partial observability accepted to ICRA!
12/19/2021: Paper on learning visualization strategy for Augmented Reality accepted to AAMAS!
12/01/2021: Co-offered a 3-week course on "AI and Robotics" for the Lyceum program with AIR and ACSR lab members.
09/29/2021: Presented our paper on learning to guide human attention in 360-degree video at IROS!
06/01/2021: Presented our paper on AR-based human-multi-robot collaboration at ICRA!
05/23/2021: Joined Amazon Robotics as an Applied Scientist Intern!
03/25/2021: Patent on AR-based human-multi-robot collaboration accepted by the USPTO!

Publications

  • Saeid Amiri, Kishan Chandan, and Shiqi Zhang, Reasoning with Scene Graphs for Robot Planning under Partial Observability, IEEE International Conference on Robotics and Automation (ICRA) - RA-L Option, 2022

    Paper
  • Kishan Chandan, Jack Albertson, and Shiqi Zhang, Augmented Reality Visualizations using Imitation Learning for Collaborative Warehouse Robots, International Conference on Autonomous Agents and Multiagent Systems (AAMAS) - Extended Abstract, 2022

    Paper
  • Kishan Chandan, Jack Albertson, Xiaohan Zhang, Xiaoyang Zhang, Yao Liu, and Shiqi Zhang, Learning to Guide Human Attention on Mobile Telepresence Robots with 360-Degree Vision, IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), 2021

    Paper Webpage
  • Kishan Chandan, Vidisha Kudalkar, Xiang Li, and Shiqi Zhang, ARROCH: Augmented Reality for Robots Collaborating with a Human, IEEE International Conference on Robotics and Automation (ICRA), 2021, Xi’an, China

    Paper Webpage
  • Yohei Hayamizu, Saeid Amiri, Kishan Chandan, Keiki Takadama, and Shiqi Zhang, Guiding Robot Exploration in Reinforcement Learning via Automated Planning, International Conference on Automated Planning and Scheduling (ICAPS), 2021, Guangzhou, China

    Paper
  • Kishan Chandan, Xiaohan Zhang, Jack Albertson, Xiaoyang Zhang, Yao Liu, and Shiqi Zhang, Guided 360-Degree Visual Perception for Mobile Telepresence Robots, The RSS Workshop on Closing the Academia to Real-World Gap in Service Robotics, 2020

    Paper
  • Kishan Chandan, Xiang Li, and Shiqi Zhang, Negotiation-based Human-Robot Collaboration via Augmented Reality, The AAAI Fall Symposium on AI for Human-Robot Interaction (AI-HRI), 2019, Arlington, Virginia, USA

    Paper

Research Experience

Applied Scientist Intern

Amazon Robotics

Research Intern

NEC Labs America

Research Assistant

Autonomous Intelligent Robotics Lab at SUNY Binghamton

Open-Source Projects

SUGAR2 Simulator

Simulator for human-multi-robot teams and their communication mediated by Augmented Reality

GitHub Code

Patents

  • Shiqi Zhang, Kishan Chandan, Human-Robot Collaboration via Augmented Reality. US Provisional Patent 62/902,830

    Paper