My employment type:

Full-time employee

My work level:

Entry level

I am open to relocation
I'm looking for a position:

Top Executive

Desired salary

$90,000 per year

Work experience

  • Lead Software Developer
    Blast AI 20.05.2023 - 15.09.2023

    • Led a team of 3 engineers to develop a web app that assesses users’ loan-approval chances using machine learning.
    • Built a prototype model leveraging OpenAI’s API to estimate a user’s likelihood of loan approval.
    • Processed user data through the model to generate a confidence score for loan-approval likelihood.
    • Secured stakeholder funding for the prototype and won a spot at the 2023 CogX startup summit in London.

  • Robotics Software Engineer | Intern
    Void Robotics 01.03.2023 - 30.06.2023

    • Implemented algorithms for the navigation stack of a mobile robot using ROS 2 Humble.
    • Used RViz for real-time sensor-data analysis, map creation, and pathfinding.
    • Automated the development workspace using GitHub Actions, improving development efficiency.

  • Graduate Research Assistant
    Collaborative Robotics Lab 01.06.2021 - 15.05.2022

    • Co-developed pneumatic haptic interfaces for communicating robot learning to human teachers.
    • Implemented an imitation learning algorithm (Ensemble DAgger) on a UR-10 robotic arm.
    • Conducted user studies and surveys comparing robot-communication methods against 3 baselines.
    • Improved learning outcomes by approximately 7% and reduced teaching time by about 20% compared to existing baselines.

  • Robotics Controls and GUI Intern
    Advanced Controls Systems Lab 15.05.2020 - 20.08.2020

    • Used C++ for motion planning of the WidowX 200 robotic arm within the Gazebo simulation environment.
    • Ran MATLAB as a dedicated ROS node to operate and control the robotic arm.
    • Designed and implemented a GUI for robot manipulation across 3 environments: real world, Gazebo, and Simulink.

  • Product Management Intern
    ABB 20.05.2019 - 09.08.2019

    • Conducted competitive analysis to determine market share and assess product quality.
    • Developed and enhanced training modules for future sales employees.
    • Formalized and streamlined the product, technology, and research development processes.
    • Obtained Yellow Belt certification in Lean Six Sigma training.


Education

  • Virginia Tech
    17.08.2021 - 10.05.2024 - M.Eng in Computer Science; M.S. in Mechanical Engineering
  • Virginia Tech
    17.08.2017 - 15.05.2021 - B.S. in Mechanical Engineering


Skills

C/C++, HTML, JavaScript, Python, general computer skills




Projects

  • PaperPalooza (Computer Science Capstone Project)

    • Developed a web application, PaperPalooza, supporting academic research and reference management.
    • Integrated APIs for plagiarism checks, citation generation, and scholarly searches.
    • Implemented an AI chatbot for instant academic support and queries.
    • Designed a responsive UI for seamless access on both desktop and mobile platforms.

  • Real-World Language-driven Zero-Shot Object Navigation

    • Developed a language-driven navigation agent that operates zero-shot, with no task-specific training, in real-world settings.
    • Used an Intel RealSense camera on a Viam Rover to navigate to unseen objects.
    • Integrated ImageBind embeddings combining visual, depth, and textual data for navigation.
    • Achieved object navigation in unknown environments with minimal impact from depth-data error.

  • BookstoreBrew: Interactive E-Commerce Bookstore Platform

    • Created an e-commerce bookstore platform using Vue.js, improving the book browsing and purchasing experience.
    • Integrated SQL databases and RESTful APIs, optimizing data handling for real-time inventory and user management.
    • Implemented a Pinia store for state management, ensuring responsive, seamless interaction across the platform.

  • Automotive Perception Using a Car’s Dashcam

    • Implemented lane detection using the Hough transform for real-time navigation systems.
    • Developed a vehicle-tracking system employing the KLT (Kanade–Lucas–Tomasi) approach for dynamic traffic management.
    • Applied YOLOv4 for accurate pedestrian and vehicle detection in surveillance applications.

  • Autonomous Indoor Mapping and Navigation System

    • Used gmapping SLAM for accurate indoor occupancy-grid mapping.
    • Localized a TurtleBot3 with AMCL; navigated using move_base, navfn, and dwa_local_planner.
    • Combined these components for autonomous indoor exploration and mapping.

  • Robotic Manipulation and Perception System in ROS

    • Programmed a Fetch robot for precise object grasping on tabletops using ROS.
    • Used point-cloud perception for robotic object recognition.
    • Integrated the MoveIt! package for manipulation planning in ROS simulations.

  • Tendon-Based Actuation for a Soft-Robotic Bat Head (Senior Design)

    During my senior year of undergraduate studies, I participated in a design project focused on creating a tendon-actuated soft robotic bat pinnae and noseleaf. As part of a team of seven mechanical engineering seniors, one faculty advisor, and two graduate students, my primary responsibilities included system integration, performance testing, selecting appropriate servo motors as actuators, and implementing code in the microcontroller. Additionally, I helped choose the right silicone material for the bat pinnae and noseleaf and determined the material properties of molded silicone using an Instron machine. Our innovative approach led to a more robust, approximately 28% smaller, and 61% lighter Batbot compared to previous iterations, closely mimicking bat pinnae and noseleaf movements by adding degrees of freedom. This hands-on experience provided valuable insights into teamwork, problem-solving, and system design.

  • Agbot Challenge (Winner)

    • Helped design a control strategy for autonomous soil-testing equipment.
    • Performed position and speed control of multiple Dynamixel servos and DC motors using C.
    • Created ROS nodes, published messages, subscribed to messages, and built ROS packages from source.
    • Connected a Raspberry Pi and an Arduino Mega via ROS to control the Dynamixel servos, DC motors, and electronics.


Publications

  • Wrapping Haptic Displays Around Robot Arms to Communicate Learning

    Humans can leverage physical interaction to teach robot arms. As the human kinesthetically guides the robot through demonstrations, the robot learns the desired task. While prior works focus on how the robot learns, it is equally important for the human teacher to understand what their robot is learning. Visual displays can communicate this information; however, we hypothesize that visual feedback alone misses out on the physical connection between the human and robot. In this paper we introduce a novel class of soft haptic displays that wrap around the robot arm, adding signals without affecting that interaction. We first design a pneumatic actuation array that remains flexible in mounting. We then develop single and multi-dimensional versions of this wrapped haptic display, and explore human perception of the rendered signals during psychophysical tests and robot learning. We ultimately find that people accurately distinguish single-dimensional feedback with a Weber fraction of 11.4%, and identify multi-dimensional feedback with 94.5% accuracy. When physically teaching robot arms, humans leverage the single- and multi-dimensional feedback to provide better demonstrations than with visual feedback: our wrapped haptic display decreases teaching time while increasing demonstration quality. This improvement depends on the location and distribution of the wrapped haptic display.

  • Wrapped Haptic Display for Communicating Physical Robot Learning

    Physical interaction between humans and robots can help robots learn to perform complex tasks. The robot arm gains information by observing how the human kinesthetically guides it throughout the task. While prior works focus on how the robot learns, it is equally important that this learning is transparent to the human teacher. Visual displays that show the robot's uncertainty can potentially communicate this information; however, we hypothesize that visual feedback mechanisms miss out on the physical connection between the human and robot. In this work we present a soft haptic display that wraps around and conforms to the surface of a robot arm, adding a haptic signal at an existing point of contact without significantly affecting the interaction. We demonstrate how soft actuation creates a salient haptic signal while still allowing flexibility in device mounting. Using a psychophysics experiment, we show that users can accurately distinguish inflation levels of the wrapped display with an average Weber fraction of 11.4%. When we place the wrapped display around the arm of a robotic manipulator, users are able to interpret and leverage the haptic signal in sample robot learning tasks, improving identification of areas where the robot needs more training and enabling the user to provide better demonstrations. See videos of our device and user studies here:


Programming languages

  • C/C++ - level 8/10
  • Python - level 8/10
  • Java - level 8/10
  • JavaScript - level 8/10
  • SQL - level 7/10
