
attempting to use sonar to make maps

34 Posts
3 Users
10 Reactions
1,626 Views
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

https://forum.dronebotworkshop.com/sensors-modules/characterizing-a-vl53l1x-tof-based-ranging-sensor/#post-51509

Continuation of the above discussion and hopefully future success using sonar data to make maps.

@tfmccarthy

I've pulled together various sketches and I'm testing them. I have to add serial port support to connect to the laptop. I've ordered the Adafruit "pan and tilt" that should arrive next week, but in the meantime I'm cobbling together a simple pan that should replicate the radar sketch. This is to show how to get a "floorplan" map.

When I have the pan and tilt working, I plan to demonstrate producing a mesh to give a 2D result and then connect that to a solid modeling library to get a 3D result. Then use SDL to add imagery to align with the model.

I first played with sonar back in Sep 2019!!

https://forum.dronebotworkshop.com/sensors-modules/object-detection-and-avoidance-using-ultrasonics/paged/3/



   
PeterTrad and Lee G reacted
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

Posted by: @robotbuilder

I first played with sonar back in Sep 2019!!

Yes, I saw that one. I'm not sure if I added it to my test set, but I will.


The one who has the most fun, wins!


   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@tfmccarthy

I'm focused on getting a high degree of confidence in data collection rather than how that data will be used.

Observing readings can give you an idea of how the software might best deal with erratic or variable data. When I was writing code that read values from a touch pad for the C64, I noticed big glitches in the values, which wouldn't give a very good reading if all you did was average the readings. Instead I found that taking only three readings and averaging the two closest removed the glitches.

I have yet to set up my sonar hardware to get more readings in different areas of the house, but in the meantime these are the old readings in text form instead of an image. The last two columns are from the servo motor's sweep back, and I see there are some anomalies between the two sweeps.

 

  10            103.53         103.72         103.77         103.79
  15            103.55         103.72         104.29         103.95
  20            104.08         104.19         105.63         103.95
  25            104.55         105.08         104.86         104.92
  30            103.83         105.18         104.94         112.57
  35            103.93         103.02         112.47         136.58
  40            107.46         105.30         136.65         105.76
  45            112.30         105.56         136.79         136.62
  50            131.40         139.24         138.50         138.95
  55             93.93         138.78          83.30          83.30
  60             87.72          87.17          22.47          22.91
  65             83.26          84.12          21.49          21.49
  70             22.45          22.43          20.97          20.94
  75             21.25          21.51          20.94          20.22
  80             20.92          20.97          20.03          20.48
  85             20.46          20.48          20.46          20.49
  90             20.46          19.89          20.96          20.99
  95             20.46          20.46          21.47          22.33
 100             20.96          20.92         113.21         113.58
 105             21.42          21.45         112.13         111.92
 110            110.63         110.86         112.18         116.67
 115            111.35         111.42         115.11         123.24
 120            123.48         112.69         144.69         144.45
 125            144.47         113.62         145.35         145.79
 130            145.07         150.27         147.16         143.89
 135            144.66         145.29         142.29         143.46
 140            143.53         144.16          17.97          17.53
 145            144.08         141.49          16.21          16.21
 150             18.38          17.73          16.21          16.31

 

sonarScan

This post was modified 5 months ago by robotBuilder

   
Lee G reacted
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

The image below shows our LIDAR returning distance values from four points, one in each quadrant. As the lidar rotates through 360 degrees it takes a reading at every degree. These polar coordinates can be converted into cartesian coordinates to plot the points on a map. The point values are relative to the 2D coordinates of the center of the lidar. Getting the absolute positions of the points on the map requires adding the absolute coordinate values of the robot to them. It also requires rotating them around the robot by the angle (direction) in which it is pointing. This shows the robot pointing right along the x axis at zero degrees, so the rotation would be zero.

sonarBeams

The map on the left below shows the absolute positions of the walls and the position and direction in which the robot is pointing. The map on the right below shows the assumed position and direction of the robot: it assumes it is in the center of the map and pointing in the direction of zero degrees, because unless you give it the correct starting values it wouldn't know.

 

lidarBot1

When the lidar rotates it can plot the return values on a clear map (on the right). Because in this case the assumed position and direction of the robot on the absolute map are "incorrect", the result is that the points are rotated and displaced on the created map compared with the absolute map.

lidarBot2

Now as the robot moves about it can build up a complete map. You cannot see the complete map on the right because I didn't make it big enough to show the rotated and shifted result.

lidarBot3

Now all this presumes perfect odometry (always knowing exactly how far the robot has moved and in what direction). Dealing with accumulated odometry errors is the next part of the problem to be solved.

Also this mapping has to be autonomous. So for example after scanning one room it might use the data to find the door and move to another room.

 


This post was modified 5 months ago by robotBuilder

   
Lee G reacted
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

Posted by: @robotbuilder

The image below shows our LIDAR

Did I miss the LIDAR part number you're using? I'm assuming it's 5v compatible (or thereabouts). I'm trying to gauge component size. You guys tend to build furniture size robots whereas I'm restricted to my desktop. :p

nitpick: the LIDAR data has a typo; should be "distance" not "direction", yes?

Any old North will do

sigh, the polar map image had me twisting in my chair. It's been a while since I played with polar coordinates, but here in the U.S., back when I learned it, the polar graph was the mathematical one: zero angle pointing right, angles increasing counterclockwise.

Polar graph paper

Ah, but then there's the navigation orientation (bearing, heading), with the zero angle pointing up and angles increasing clockwise. When I looked it up, it isn't fixed. I've just never seen it pointing down. So, I have a crick in my neck.

An observation about a 360-degree LIDAR is that it's a sort of "one-shot" range finder. The example below is after taking readings from room A.

LIDAR ex1

After taking the initial readings, further readings from room A provide no additional information. Rooms B and C remain unknown regions, and moving about in room A doesn't change the layout unless there's a serious resolution issue.

The same thing will occur once you completely enter room B. A single reading completes the picture. This works for a simple layout, as shown; the space is open, with no mazes or objects.

I think an important issue is the ability to detect doorways, for example,

LIDAR ex2

Here the measured distance is for point C. We can see the point is outside of the room layout, but how is that detected? If the previously measured point is A, then the line segment A-B is the height of the right triangle and determines the threshold width of an opening that the robot can pass through. If it's below the threshold, then line segment A-C is a wall (for all intents and purposes); otherwise it's an opening to an unknown region the robot can explore.

The further I get into this, the more issues there seem to be, and I'm concerned about the number of calculations needed. (The worst one I have thought of so far is how to detect "Have I been here before?". I think I need a North Star.) Also, the amount of memory used to store the layout needs serious consideration. A 360 LIDAR scans a single level, and that's a limitation. Bill covered a smaller one that could be used as a replacement for the sonar sensor I'm using, but that's getting way ahead of the cart.

These are really unorganized thoughts so don't look too deeply.


The one who has the most fun, wins!


   
Lee G reacted
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@tfmccarthy 

Did I miss the LIDAR part number you're using? I'm assuming it's 5v compatible (or thereabouts). I'm trying to gauge component size.

It is a simulation....

I am trying to get the code and math working. I don't actually have a lidar.

Of course the real fun is implementing it in real hardware, where theory meets practical application, which can throw up issues that aren't part of a simulation.

nitpick: the LIDAR data has a typo; should be "distance" not "direction", yes?

Brain glitch! Thank you for pointing it out.

Been a while since I played with polar coordinates but here in the U.S., back when I learned it, the polar graph was the mathematical one, zero angle pointing right, angles increase counterclockwise

I am using screen coordinates: (0,0) is the top left corner. The y axis goes down, not up, a result of the fact that that's how we type text: left to right, top to bottom. You can flip the image before displaying it. I am just used to leaving it that way.

Not sure about the door room issue you are talking about.  I will code the simulated robot to move about autonomously to show how it can find doors and so on ...

(The worst one I have thought of so far is how to detect "Have I been here before?". I think I need a North Star.)

Indeed, that is the whole issue: odometry that tracks the position and direction of the robot accumulates errors. In the simulation the "odometry" is perfect, and I will have to introduce errors to work out how to deal with them, which is much of what SLAM is about.

Time of flight laser measurements give you distance for any direction but you can fuse that with vision as well. Trying to do all this is just a mental challenge for me. The boffins are well ahead of the game.

This example uses an inertial measurement unit (IMU).

The main ingredient of SLAM is loop closure.

To accurately know where we are, we also have to fuse the IMU data to tell us where we should be.


This post was modified 5 months ago 2 times by robotBuilder

   
Lee G reacted
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

Component Mount Problem/Rant

I can't figure out the intended method of mounting this device.

mount 1

It appears that you should use screws since there are mounting holes for them. The screws would be 1.2 mm and, believe-you-me, that's small. I can't see how to insert them from the back without interference from the transmitter/receiver. If you insert them from the front, you discover there's no room on the PCB edge for purchase other than directly below the mounting hole. So you need some sort of spacer to cover the 1.2 mm hole, or the 1.2 mm corner of a piece of wood. Both of those are a big ask from my POV. I haven't had much luck with the screws, as they keep stripping the wood thread since there's so little material to work with. And disassembly is problematic.

I can't figure out how this is intended to be done so that you end up with a firmly secured sensor. The only example I have is from Elegoo that uses a custom plastic mold that has a clip for the edge. Even that has some movement flexibility. (My suspicion is that millimeters count with this device.) The example from Elegoo uses a plate with holes for the transmitter and receiver which I think serve to stabilize the sensor. I've used a form of that for my mount, but I'm not fully satisfied with it.

I'm hoping someone has a reliable way to do this that doesn't require printing a plastic part.


The one who has the most fun, wins!


   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@tfmccarthy 

(My suspicion is that millimeters count with this device.)

Maybe not. Why not just try it and see? In the online examples they just glue it to the servo motor. I used small breadboards. Maybe not all that firm, but I don't think in practice they move. In another case I just drilled holes in wood to push the cylindrical parts into, and it seemed to hold firm.

robotBase 1
597 sonarScanner2

This post was modified 5 months ago by robotBuilder

   
huckOhio
(@huckohio)
Member
Joined: 6 years ago
Posts: 323
 

@robotbuilder 

Here are a couple of mounts for the HC-SR04 devices.  Maybe they'll work.



   
huckOhio
(@huckohio)
Member
Joined: 6 years ago
Posts: 323
 

@robotbuilder 

I found another mount...

 



   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@huckohio 

I think those are 3D printer files?

Although I guess you could always get someone who has a 3D printer to print them out for you.

 



   
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

@huckohio 

Thanks. These will be useful to experiment with. I don't have a printer, but I hope I can replicate them using other material.

I guess the mounting holes are decorative. 🙃 


The one who has the most fun, wins!


   
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

Model #1: Evaluation Build

The first build is for the purpose of evaluating the model structure and sensor operation. Model #1 (which I dub "The Sonic Scanner") looks like

Sonic Scanner2

The model consists of

  • HC SR04 ultrasonic
  • ESP32 WROOM-32
  • Adafruit Mini Pan-Tilt kit
  • 5v breadboard power supply
  • Excessive expectations
  • Limited brain power

There are 2 important measurements needed for the evaluation tests: the sensor elevation and horizontal origin offset.

dimension65

These measurements are accurate to within about 0.5 cm. The elevation is used to position the target within the sensor's field of vision, and the horizontal origin offset is used to set the "true distance" of the target. Neither of these measurements is very precise, but they're accurate enough for the purpose of evaluation.

Similarly, the sensor orientation isn't very precise either. Setting the sensor to be perpendicular to the target depends on the servo precision (very low) and the sensor mount precision (imaginary). So they're relative. But again, they shouldn't affect the evaluation tests.  (sez me.)

The development software is

  • Arduino IDE v 2.3.6
  • Arduino CLI v 1.3.0
  • Visual Studio 2022 v 17.13.7
  • ESP32 board manager v 3.3.0

The evaluation sketches are:

RandomNerds Tutorial

DBWS demos

    • HC-SR04-Basic-Demo.ino
    • HC-SR04-NewPing.ino
    • HC-SR04-NewPing-Duration.ino
    • HC-SR04-NewPing-Iteration.ino
    • irobotbuilder's 2017 sketch

How To Mechatronics

Evaluation Framework

Rather than use separate sketches, I combined them into a single sketch with the imaginative name HC_SR04_Ultrasonic.ino. Each sketch is wrapped in a namespace within the master namespace dbws::hc_sr04:

namespace dbws {
    namespace hc_sr04 {
        namespace esp32 { ... }
        namespace basic { ... }
        namespace newping {
            namespace basic { ... }
            namespace ns_duration { ... }
            namespace iteration { ... }
        }
        namespace irobotbuilder { ... }
        namespace radar { ... }
    }
}

Each sketch can then be placed in a separate module, simplifying the choice of which sketch is used for a particular run. The sketch is built using the Arduino CLI command-line tool to compile, upload and monitor the sketch, mimicking the Arduino IDE.

Other than setting common constants (SOUND_SPEED, TRIGGER_PIN, ECHO_PIN), adding a template function to simplify output to the Serial device, and outputting an identification string for the sketch, the sketches are unmodified.

Sketch Output

The purpose is to produce a set of sample output data for each sketch. Each sketch starts with the sensor unpowered. Then power is applied to the sensor to get readings. The target is a piece of cardboard paper approximately 20cm away.

RandomNerds Tutorial

...
dbws::hc_sr04::esp32::setup
Distance (cm): 0.00
Distance (inch): 0.00
...
Distance (inch): 0.00
Distance (cm): 21.88
Distance (inch): 8.61
Distance (cm): 18.38
Distance (inch): 7.24
Distance (cm): 17.61
Distance (inch): 6.93
Distance (cm): 17.90
Distance (inch): 7.05
Distance (cm): 18.21
Distance (inch): 7.17
...

DBWS Basic

...
dbws::hc_sr04::basic::setup
Distance = Out of range
...
Distance = Out of range
Distance = 18.56 cm
Distance = 18.56 cm
Distance = 17.77 cm
Distance = 17.82 cm
Distance = 17.53 cm
...

DBWS Newping Basic

...
dbws::hc_sr04::newping::basic::setup
Distance = Out of range
...
Distance = Out of range
Distance = 17.00 cm
Distance = 57.00 cm
Distance = 20.00 cm
Distance = 25.00 cm
Distance = 30.00 cm
...

DBWS Newping duration

...
dbws::hc_sr04::newping::duration::setup
Distance = Out of range
...
Distance = Out of range
Distance = 19.12 cm
Distance = 68.22 cm
Distance = 19.60 cm
Distance = 68.22 cm
Distance = 19.59 cm
...

DBWS Newping iteration

...
dbws::hc_sr04::newping::iteration::setup
Distance = Out of range
...
Distance = Out of range
Distance = 15.59 cm
Distance = 23.02 cm
Distance = 17.89 cm
Distance = 17.82 cm
Distance = 17.82 cm
...

irobotbuilder

...
dbws::irobotbuilder::setup
Distance = 17.84 cm   10
Distance = 33.84 cm   15
Distance = 35.35 cm   20
Distance = 36.12 cm   25
Distance = 35.71 cm   30
...
Distance = Out of range
...
Distance = Out of range

How To Mechatronics

...
dbws::hc_sr04::radar::setup
15,0.00.
...
20,0.00.
21,36.00.
22,36.00.
23,36.00.
24,35.00.
25,36.00.
...

commentary

Disturbia. Nobody reports the sensor reading, i.e., time duration in microseconds.

All but 2 algorithms produce fractional values. In the case of Newping::basic, the method used to read the sensor is Sonar::ping_cm, which returns a whole number of centimeters. In the Mechatronics case, it's a calculation error, e.g.,

distance = (duration * SOUND_SPEED) / 2;

The integer division truncates the fractional part.

This framework can serve as the basis for further experimentation and analysis.


The one who has the most fun, wins!


   
Will, Lee G and robotBuilder reacted
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@tfmccarthy 

The target is a piece of cardboard paper approximately 20cm away.

You have been a busy boy 🙂

Ultimately, of course, we need to collect distance data regardless of the surface material and things like chair legs. A nice flat wall is ideal, and maybe if the sensor is high enough it will miss the clutter and just reflect off the four walls of a room.

A single sound pulse can return pulses of data from more than one source over a time frame.

Bats use sonar and create a 3D view from one reflection into their funny-shaped ears. They can scoop a flying insect out of the air.

I have seen that some people who are blind have taught themselves to use clicks to "see" the world around them.

Reading about the use of sonar (ideal underwater, apparently), it is computationally heavy to massage the data into something useful.

It is common for those who like to fish to have a sonar device to display the fish under the boat.

 


This post was modified 5 months ago 2 times by robotBuilder

   
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

Posted by: @robotbuilder

Ultimately of course we need to collect distance data regardless of the surface material and things like chair legs. A nice flat wall is ideal and maybe if the sensor is high enough it will miss the clutter and just reflect the four walls of a room.

Yes, irregular and hidden surfaces are issues relating to resolution and surface coverage. Resolution will be device dependent and limits the performance of map building, e.g., for low res, build the map in small, iterative passes, similar to the "fog of war" mode in WarCraft 1 & 2. Hidden surfaces are, I think, similar, but I haven't thought it through. Of course, a single elevation of range finding is sub-optimal, which is why I chose the mini Pan-Tilt kit; I want "wall-to-wall, floor-to-ceiling" scanning.

A single sound pulse can return pulses of data from more than one source over a time frame.

I'm not sure what this means. A single pulse won't produce a second source, but the echo will come back over time. What do you mean here? You don't mean background 40kHz noise? (I have thought about a test for what "silence" means.)


The one who has the most fun, wins!


   
Page 1 / 3