wiki:Other/Summer/2024/rPV

Remotely Piloted Vehicles

Team Members: Dhruv Ramaswamy, Nandini Venkatesh, Daniel Mahany.

Advisor: Dr. Richard Martin

Project Objective

Low-latency networking is an emerging technology for remote sensing and for the control of vehicles and robots. In this project, interns will develop software for remotely piloted vehicles with both mecanum and front-wheel steering over low-latency networks. Additional sensing, such as extra cameras, range sensors, and audio feedback, will be added to aid the pilots. At the end of the project, interns will demonstrate remotely piloting a vehicle from anywhere on the Internet with speed, precision, and accuracy in a modeled urban environment. Interns will also evaluate the strengths and weaknesses of remote piloting interfaces for ground-based vehicles.

Week 1

https://docs.google.com/presentation/d/1KgfRrQT7PE0PJOVMWC2rKG9Tb7BbIIxIYtdgFP0HuoM/edit#slide=id.g2e9657b10b6_0_1187

  • Set up GitLab and ORBIT accounts.
  • Re-assembled Servo Steer car and wrote code to get it driving.
  • Accessed the Raspberry Pi terminal via SSH to run code wirelessly.
  • Explored camera options for Servo Steer car.
  • Researched methods for wireless data transmission between the microcontroller and a laptop.
    • Started establishing a client-server connection between a laptop and the Raspberry Pi on the Servo Steer car.

Goals for next week:

  1. Establish the client-server connection between our laptop and the Servo Steer car.
    • Begin developing client-side controls.
  2. Set up the mecanum hardware and electronics (eventually focusing on maneuverability and efficiency).

Week 2

https://docs.google.com/presentation/d/15SKymwatZo4Q-aR_amo0FfayNvq5fdqTo6Vjl0Z1Jw4/edit#slide=id.g2eb1fe2e89d_0_410

  • Fully assembled hardware and electronics for the mecanum robot. We got the mecanum car to drive in all directions including strafing and moving diagonally.
  • Established the client-server connection with our laptop and the Servo Steer car.
    • Created a server (running on the Raspberry Pi) in Python to listen for UDP packets and drive the motors through GPIO.
    • Created a client (using Electron and React) that provides the user interface and sends UDP packets to the server.
  • Used ZeroTier to set up a VPN that creates a virtual LAN, which should let us connect to the server from anywhere on the Internet.
  • Learned bash scripting and Linux to run the server code and expedite development.
  • Updated the GitLab repository; used package managers (pip, npm), build tools (Vite), and linters (ESLint).
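The server side described above can be sketched as a small UDP loop. The packet format (JSON with left/right motor values), the port number, and the function names below are illustrative assumptions, not the project's actual code:

```python
import json
import socket


def parse_command(packet: bytes):
    """Decode a UDP control packet, e.g. b'{"left": 0.5, "right": -0.5}',
    into (left, right) motor values clamped to [-1, 1]."""
    cmd = json.loads(packet)
    clamp = lambda v: max(-1.0, min(1.0, float(v)))
    return clamp(cmd.get("left", 0.0)), clamp(cmd.get("right", 0.0))


def serve(set_motors, host="0.0.0.0", port=5005):
    """Listen for UDP control packets and forward each command to
    set_motors, which on the Pi would write to GPIO (e.g. via gpiozero)."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind((host, port))
    while True:
        data, _addr = sock.recvfrom(1024)  # blocking recv, no busy-wait
        set_motors(*parse_command(data))
```

UDP suits low-latency control because a dropped packet never delays the ones behind it; a stale steering command is better discarded than retransmitted.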

Challenges/Issues:

  • Server efficiency: eliminated busy-waiting code (such as a hot "while True" polling loop) that blocked the main thread and hogged the CPU.
  • While assembling the mecanum hardware, we had to reverse-engineer the pin mapping of our OSOYOO motor drivers.
  • Figuring out the optimal architecture to minimize latency.
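The busy-wait problem above typically comes from polling a socket in a tight loop. One standard fix, sketched here with assumed names, is to let select() put the thread to sleep until a packet is actually readable:

```python
import select
import socket


def wait_for_packets(sock, handle, timeout=1.0, max_packets=None):
    """Process UDP packets without spinning: select() sleeps the thread
    until the socket is readable (or the timeout elapses), so an idle
    server uses almost no CPU. max_packets bounds the loop for testing."""
    seen = 0
    while max_packets is None or seen < max_packets:
        readable, _, _ = select.select([sock], [], [], timeout)
        if readable:
            data, _addr = sock.recvfrom(1024)
            handle(data)
            seen += 1
```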

Goals for next week:

  1. Further develop the client/server software.
    • Begin developing client-side controls for steering (potentially implementing virtual joysticks).
    • Begin sending camera frames to the client side.
  2. Adapt the client and server code from the servo steer car for use with the mecanum car.
  3. Experiment with low-latency networks and ZeroTier to reduce lag.
  4. Add features that improve robot functionality and aid the user (object detection, sensor integration, emergency stop).

Week 3

https://docs.google.com/presentation/d/1dE6sDB7KTBhIhoNsi63dWFy2VHPEhrmBQT2i97j46TE/edit#slide=id.g2ecb81f61a9_0_383

  • Adapted the client and server code from the servo steer car for use with the mecanum car.
  • Created a client-side web application using ……
    • Full movement controls for the mecanum car were added.
    • Experimented with various cameras and tested different pre-processing and post-processing algorithms in OpenCV to evaluate their impact on latency.
    • Integrated camera feed from the vehicles in the client's web application.
  • Configured the ZeroTier VPN to establish a connection between the client and server over multiple networks/hotspots.
    • This connection was sometimes inconsistent between multiple networks.
    • On Thursday, we successfully used ZeroTier to allow one of our team members to control the mecanum car remotely from home. This suggests that our previous issues could have been due to firewalls on or near the main lab network.
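A simple way to quantify how a pre- or post-processing step affects latency, as in the OpenCV experiments above, is to average its wall-clock cost over many frames. This harness uses NumPy arrays as stand-in frames; with OpenCV the stages would be calls like cv2.resize or cv2.cvtColor:

```python
import time

import numpy as np


def mean_latency_ms(stage, frame, runs=100):
    """Average wall-clock time of one processing stage, in ms per frame."""
    start = time.perf_counter()
    for _ in range(runs):
        stage(frame)
    return (time.perf_counter() - start) / runs * 1000.0


frame = np.zeros((480, 640, 3), dtype=np.uint8)  # fake 480p BGR frame
downscale = lambda f: f[::2, ::2]     # cheap 2x decimation
grayscale = lambda f: f.mean(axis=2)  # naive BGR -> grayscale

for name, stage in [("downscale", downscale), ("grayscale", grayscale)]:
    print(f"{name}: {mean_latency_ms(stage, frame):.3f} ms/frame")
```

Per-stage numbers like these make it easy to see which part of the pipeline dominates the glass-to-glass latency budget.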

Challenges/Issues:

  • We experienced issues with certain cameras connecting to the Raspberry Pi.
  • We faced inconsistencies with ZeroTier, likely due to configuration issues.

Goals for next week:

  1. Further develop the user interfaces for both vehicles.
  2. Confirm our suspicions about the ZeroTier inconsistencies.
  3. Add ultrasonic distance sensors for the emergency stop mechanism.
  4. Begin working on the academic poster.

Week 4

https://docs.google.com/presentation/d/1Dc46WQi3UIjuc6GAXkPZOoEWaIXUsZ9bxaexI6z-T0E/edit?usp=sharing

  • Updated UI for both vehicles.
    • Finished key mappings
    • Fixed small bugs
  • Added the emergency stop feature to the mecanum car.
    • The car now stops just before colliding with an object.
    • When too close to an object, the driver can still move in every direction except forward.
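The emergency stop reduces to gating the forward command on the latest range reading. A minimal sketch is below; the 0.25 m threshold and the command names are assumptions (the real ranges were tuned experimentally), and on the robot the distance would come from the ultrasonic sensor rather than a plain float:

```python
STOP_DISTANCE_M = 0.25  # assumed stop range; tuned experimentally in practice


def apply_emergency_stop(command, distance_m):
    """Block only forward motion inside the stop range; reversing,
    strafing, and turning stay available so the driver can back away."""
    if command == "forward" and distance_m < STOP_DISTANCE_M:
        return "stop"
    return command
```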

Challenges/Issues:

  • We experienced issues with the ultrasonic sensor and experimented extensively with different detection and stop ranges.

Goals for next week:

  1. Improve the car's functionality
  2. Add sound transmission between the server and the client
  3. Work on academic poster and final presentations

Week 5

https://docs.google.com/presentation/d/1vQq6wfQSIRbO_n_My5IGZVRohouu_YvSkFY8rMYQKk8/edit#slide=id.g2ecb81f61a9_0_5015

  • Further updated the UI.
    • Limited abrupt servo turns to prevent Raspberry Pi crashes.
    • Created a ping feature.
    • Created a toggle button for safety mode and added a list of keyboard shortcuts.
    • Debounced sliders, buttons, and error messages to avoid flooding the server with data.
  • Improved video streaming.
    • Added the ability to retry fetching the video feed.
    • Shows error states when the feed is unavailable.
  • Fixed bugs in the Python server script.
    • Changed the pin factory for more accurate ultrasonic sensor readings.
    • Debugged threading errors.
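The debouncing above lives in the JavaScript client, but the idea translates directly: drop events that arrive within a short window of the last accepted one (strictly speaking, this leading-edge variant is throttling). A sketch in Python, with an assumed 50 ms window:

```python
import time


def debounce(min_interval_s):
    """Drop calls that arrive within min_interval_s of the last accepted one."""
    def decorator(fn):
        last = [0.0]  # timestamp of the last accepted call

        def wrapper(*args, **kwargs):
            now = time.monotonic()
            if now - last[0] < min_interval_s:
                return None  # event dropped, server never sees it
            last[0] = now
            return fn(*args, **kwargs)

        return wrapper
    return decorator


@debounce(0.05)  # at most one slider update every 50 ms
def send_slider_value(value):
    return value  # in the real client this would send a UDP packet
```

This caps the command rate at the source, so a rapidly dragged slider cannot flood the Raspberry Pi with packets.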

Final Product

  • Final Presentation

https://docs.google.com/presentation/d/1D70dHCT41cXl2-ZLfhSjVh6BIAjm6CUfTcdDUls1Fhw/edit?usp=sharing

  • Academic Poster

https://docs.google.com/presentation/d/1sS2uvE9UVEM8LEIy41J68LChZfBXQS-A/edit?usp=sharing&ouid=104555442836944404727&rtpof=true&sd=true

https://gitlab.orbit-lab.org/dhruv-ramaswamy/remotelypilotedvehicles


Future Goals:

  • Reduce latency further for remote connections over the VLAN
    • Look into different methods of sending the camera feed
  • Improve Driver Usability
    • Easier keyboard shortcuts
    • Smart Driving Features
Last modified on Aug 6, 2024, 2:41:42 PM