Wednesday, October 25, 2023

NEW Project: Autonomous Underwater Vehicle!

 

I have been more and more interested in Autonomous Underwater Vehicles (AUVs) and ocean exploration. I live on the coast with direct access to the Atlantic Ocean, so it’s always been tempting to see what’s out there. In September I attended a two-week summer course on ocean technology to learn more about the problem space and investigate the industry.

To prepare for the course and showcase my talent, I spent all of August 2023 building an AUV.

I started by reading scientific papers on AUV design. Essentially, most AUVs on the market are torpedo-shaped, with fins for steering and a thruster for propulsion. I didn’t find the mechanical design part of the problem interesting, so I based my project on an open source design called Amethyst AUV [https://beobachtung3d.com/]. This is an excellent design that allowed me to progress quickly.

Since I had one month to build it before the start of the course, the biggest problem for me was getting all the components in time. I had to buy all of the electronics from Amazon rather than AliExpress, since the latter would take too long to arrive. This made the AUV much more expensive. It also limited the design, since I could only use parts that would arrive quickly, rather than picking the best-suited component and just waiting for it.

Another problem I encountered was the actual 3D printing. Since I wanted the chassis to be strong, I switched from PLA to PETG. This presented a couple of problems: PETG absorbs moisture at a much faster rate, and I live in a very humid city due to the proximity to the water, which greatly affected the printing results. The pieces came out very droopy, and I had to bake the filament in the oven to dry it. I also had to dramatically reduce the print speed and put blue tape on the print bed to combat adhesion issues. All of this increased the time it took to complete the project.

In the end I wasn’t able to finish the whole thing, but I made enough progress to show it to marine engineers and scientists and get some feedback. I gained a lot of useful insights:

  • To make the PETG fully waterproof I will need to coat it.
  • I should put brake lube on the O-ring that seals the waterproof compartment.
  • Ballasting is super important and presents a lot of difficulties.
  • Underwater positioning is a huge challenge and usually relies on acoustic systems that act as a kind of underwater GPS.
  • It is possible to model currents in Gazebo using the lattice Boltzmann method.
  • Most AUVs cost thousands, if not hundreds of thousands, of dollars.
  • You can use stereo camera vision with some algorithms to estimate how far an object is from the robot (see the sketch below).
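
On that last point, here's a minimal sketch of the idea in Python using OpenCV's block matcher. The image files and calibration numbers are placeholders, not values from a real rig:

# Minimal stereo depth sketch: disparity from block matching, then
# depth from similar triangles. Filenames and calibration are placeholders.
import cv2
import numpy as np

left = cv2.imread("left.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right.png", cv2.IMREAD_GRAYSCALE)

# Block matching gives a disparity map: how many pixels each feature
# shifts between the two views.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed point -> pixels

# Depth via similar triangles: Z = f * B / d, where f is the focal
# length in pixels and B is the baseline between the cameras in meters.
f, B = 700.0, 0.06  # placeholder calibration values
valid = disparity > 0
depth = np.zeros_like(disparity)
depth[valid] = f * B / disparity[valid]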

 

 

I’ll probably add more information later. I have yet to fully capture everything I did during the two weeks of the course, and the AUV project is ongoing.

Sunday, June 18, 2023

Extended Kalman filter

I'm trying to implement the Extended Kalman filter from the robot_localization package, and I finally configured the launch file and the YAML file that holds all of the parameters. It turns out that all I had to do was actually list the sensor topics in the YAML file and voila! I have a readout of the pose estimate with covariance. However, it seems that when I run the EKF node along with everything else, the serial connection has a problem; at least, the robot doesn't respond to move commands anymore.
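
By default the filter publishes the fused estimate on /odometry/filtered as a nav_msgs/Odometry message. A minimal sketch of a node that reads it out (the node name is just a placeholder):

#!/usr/bin/env python
# Sketch: read the fused pose estimate published by robot_localization
# (by default on /odometry/filtered as nav_msgs/Odometry).
import rospy
from nav_msgs.msg import Odometry

def callback(msg):
    p = msg.pose.pose.position
    # The 6x6 pose covariance is flattened row-major; index 0 is the x variance.
    rospy.loginfo("x=%.2f y=%.2f var_x=%.4f", p.x, p.y, msg.pose.covariance[0])

rospy.init_node("ekf_readout")
rospy.Subscriber("/odometry/filtered", Odometry, callback)
rospy.spin()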

I'm honestly really tired of the serial problems, as they're preventing me from working on the more interesting stuff like SLAM.

Here's a list of stuff I would like to work better:

PID. Currently I'm just using proportional control, and it's very smooth. Whenever I try to tune a full PID controller it ends up kind of jittery. If the tuning were better, the error between the commanded velocity and the actual velocity would be smaller, which would also benefit the EKF, because then I could use the commanded velocity as a control input and enable that parameter.
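
For reference, the controller I keep trying to tune is nothing exotic. A bare-bones discrete PID sketch in Python (gains and limits made up):

# Bare-bones discrete PID controller sketch; gains are made up.
class PID:
    def __init__(self, kp, ki, kd, out_limit):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_limit = out_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        # The derivative term is what gets jittery on noisy encoder readings.
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Clamp to the actuator range; a real version should also
        # limit the integral term (anti-windup).
        return max(-self.out_limit, min(self.out_limit, out))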


EKF. For some reason running this makes the robot unresponsive.

SLAM. I tried making a map of my house by driving the robot around, but something errored out and the map never showed up in RViz the way I wanted.

Kinect. I'm having trouble with the depth camera and the stream in general. It seems that there isn't enough bandwidth on the USB bus, if that makes any sense. It works fine on the PC, but the Raspberry Pi really struggles for some reason. Am I actually running into a resource limitation, or am I just doing this in a really stupid way?

In any case, I finally finished my course on Data Structures and Algorithms and can now get back to working on this. uggghhhhhhhh. just do the thing!

On a happier note, I want to try loading my URDF model into Gazebo. I think that could be fun, because I could assume that, in a perfect world, I wouldn't have hardware problems and could just drive the robot around.

Another thing I could start doing is contributing to open source communities.


Tuesday, March 28, 2023

Added the Kinect

I finally soldered and installed the Kinect on the front of the robot.

The Kinect boasts an impressive array of sensors, including a depth camera, an RGB camera, and microphones. My intention is to use the Kinect for SLAM and maybe teleoperation; the microphones would be able to listen for commands such as "move forward". Technically it can also be used for a bunch of fancy stuff like gesture recognition, since it was originally a gaming accessory that tracked players' movements.

Naturally, I've had to elevate the lidar so that the Kinect isn't blocking the laser. This is not ideal, because now the lidar can't detect objects below a height of 25 cm, so it's going to have a hard time detecting the small stuff I might have on my floor. This limitation makes it worthwhile for me to learn computer vision and use the Kinect camera for object detection.



Monday, March 27, 2023

Adding URDF

So now that I have a whole bunch of sensors on the robot, it's time to talk about transforms.

Transforms are the relationships between different frames of reference on the robot. For example, the center of the robot could be 5 cm off the ground while the laser scanner is 10 cm off the ground.

So the laser's transform relative to the robot center would be (0 0 5), indicating that it is 5 cm up in the z direction (the convention is x y z).

However, as you can imagine, it gets messy trying to keep track of all of the transforms as the robot grows in complexity. What can end up happening is that you have transforms tucked away in different files, and it's not obvious where to find them when you need to change one.

Enter URDF!

URDF stands for Unified Robot Description Format, and it is used to create a visual model of the robot as well as describe the relationships between links. Essentially it is an XML file with some dimensions and other values. The great thing about it is that you can use macros (xacro) to make it really efficient to specify the details, with one central spot to edit the robot's characteristics.

Recently I created my own URDF file to describe my robot. What I need to do next is connect it to the robot_state_publisher, which will publish the transforms for me.

Thursday, February 16, 2023

Current state of the robot

So currently the robot can be teleoperated via keyboard and can scan the room using the lidar. I have also connected the IMU and four ultrasonic sensors.

Driving the robot around is really fun, but because the wheels are all controlled individually, it doesn't go straight. Surprisingly, strafing works a lot better than just driving forward.

I did some digging online and found out that one way to solve this problem is to command the linear and angular velocities of the whole robot and compute the individual wheel speeds from them.

Below is a diagram of my current setup on the robot versus what I want to have.


At first I didn't want to do this, because I didn't want the kinematics calculations to be on the Arduino. I'm going to have to find a linear algebra library or figure something else out. 
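
That said, the standard mecanum inverse kinematics (assuming that's what my strafing wheels are) boil down to four linear equations, so a full linear algebra library might be overkill. A Python sketch with placeholder dimensions:

# Standard mecanum-wheel inverse kinematics sketch. The wheel radius and
# body dimensions below are placeholders, not my robot's real numbers.
def mecanum_wheel_speeds(vx, vy, wz, r=0.03, lx=0.10, ly=0.10):
    # vx, vy: linear velocity in m/s; wz: angular velocity in rad/s.
    # r: wheel radius; lx, ly: half the wheelbase and half the track width.
    k = lx + ly
    fl = (vx - vy - k * wz) / r  # front left
    fr = (vx + vy + k * wz) / r  # front right
    rl = (vx + vy - k * wz) / r  # rear left
    rr = (vx - vy + k * wz) / r  # rear right
    return fl, fr, rl, rr  # wheel angular speeds in rad/s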

One thing that's kind of weird is that Twist commands use float64 fields, but that doesn't work with rosserial on the Arduino: it automatically converts them to float32, and a loss of precision occurs.
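
The precision loss is easy to see with a quick round-trip through 32-bit packing in Python (just an illustration, not rosserial code):

# Illustration of float64 -> float32 precision loss; not rosserial itself.
import struct

x = 0.123456789123456789  # Python floats are 64-bit
x32 = struct.unpack("f", struct.pack("f", x))[0]
print(x)    # 0.12345678912345678
print(x32)  # ~0.123456791 -- only about 7 significant digits survive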

One thing I'm really excited for is to implement an Extended Kalman filter. I learnt about it in school during my Bachelor's degree and it seems like an advanced concept with a lot of cool applications.

Lidar experiments

  

Working on the lidar, I discovered that you first need to create the build directory before you cd into it:

$ git clone https://github.com/YDLIDAR/YDLidar-SDK.git
$ mkdir YDLidar-SDK/build
$ cd YDLidar-SDK/build
$ cmake ..
$ make
$ sudo make install

 

The lidar sits at -60 mm in x, 0 in y, and half the robot's height in z, relative to the robot's center.


It can be launched with:

roslaunch ydlidar_ros_driver X4.launch

roslaunch ydlidar_ros_driver lidar_view.launch

Then you can view the results in RViz.

The next step is for me to establish the transform from the lidar to the center of the robot. I can do that by defining a static tf broadcast or by specifying it in the URDF (Unified Robot Description Format) file. Then I can make the visualization better.
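
The static broadcast option might look something like this sketch. The frame names and the 0.10 m z value are placeholders; the x offset is the -60 mm from my notes above:

#!/usr/bin/env python
# Sketch of a static transform broadcaster for the lidar. Frame names
# and the z value are placeholders; x uses the -60 mm offset noted above.
import rospy
import tf2_ros
from geometry_msgs.msg import TransformStamped

rospy.init_node("lidar_tf_broadcaster")
br = tf2_ros.StaticTransformBroadcaster()

t = TransformStamped()
t.header.stamp = rospy.Time.now()
t.header.frame_id = "base_link"   # robot center
t.child_frame_id = "laser_frame"  # lidar
t.transform.translation.x = -0.06
t.transform.translation.y = 0.0
t.transform.translation.z = 0.10  # placeholder: half the robot's height
t.transform.rotation.w = 1.0      # identity rotation

br.sendTransform(t)
rospy.spin()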




Sunday, January 8, 2023

Odometry implemented!

Hey guys, I haven't been keeping up with the blog because I've been progressing too fast to write down what I've done. I have finally completed the odometry node on the robot and implemented a tf broadcast on it.
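
The broadcast itself is simple once the pose has been integrated from the encoders. A rough sketch of the tf side (the pose variables stand in for the real integration):

#!/usr/bin/env python
# Rough sketch of the odom -> base_link broadcast. x, y, theta stand in
# for the pose actually integrated from the wheel encoders.
import rospy
import tf

rospy.init_node("odom_broadcaster")
br = tf.TransformBroadcaster()
rate = rospy.Rate(20)

x, y, theta = 0.0, 0.0, 0.0  # placeholder pose
while not rospy.is_shutdown():
    q = tf.transformations.quaternion_from_euler(0, 0, theta)
    br.sendTransform((x, y, 0.0), q, rospy.Time.now(), "base_link", "odom")
    rate.sleep()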

I know that odometry alone isn't a reliable method for localization, so I've also bought an IMU. After some tests I was able to connect it to the Arduino Mega using the SDA/SCL (I2C) pins. I haven't calibrated it yet, so I don't know how accurate it will be, but I can fuse the readings from the IMU and the odometry with an Extended Kalman filter, which should make the pose estimation a lot better.

I'm very worried about the serial communication between the Mega and the Pi; there are a lot of parameters here. It seems to be working so far though.

ALSO! I finally bought a lidar, which sits on the robot like a little hat! This will let me map the environment and navigate obstacles. I'm really excited to use it!
