
Object detection and avoidance using ultrasonics

44 Posts
5 Users
2 Likes
8,108 Views
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
 

After fiddling with this for a while I finally got a visual scan.   This isn't working properly though.  So I have a lot of troubleshooting to do yet.  Both on the scanning software and on the HC-SR04 wiring.  I either have a bad sensor or some bad wiring.

But at least I got something that looks promising.

First Scan

It's definitely not working right yet.  So it's a good thing I'm looking at this data visually; otherwise I would have no idea that I'm getting erroneous data.  There's no way that this represents anything meaningful.  So until I get some nice pictures like the fellow in the video got, there's no point in trying to work with this data.

Being able to visually see the data you're getting can be really helpful.

So I'll see if I can get this all worked out and running smoothly so that I get some dependable data that makes sense.

I'm also not getting the full screen of the radar so I'll need to look into that as well.  That's probably just a matter of changing the screen size on the Processing software code.  I'll get it all squared away I'm sure.

DroneBot Workshop Robotics Engineer
James


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
 

All fixed up already!

Second Scan

Much better!  I got the whole radar screen on the monitor and the distance data is making perfect sense now.  I'm a little disappointed in the short range of the HC-SR04 sensors though.   I thought they were capable of reaching out quite a bit further than this.

In fact, I'm not even sure if this is a limitation of the sensors, or a limitation caused by the fellow who wrote this Radar program.  I hope it's the latter.

In fact, I just looked at the data sheet for the HC-SR04 sensor.   These are supposed to detect distances out to about 13 feet or 4 meters.   This fellow's radar screen only shows distances out to about 1 foot or 30cm.

I'm looking to use these to map distances out to about 5 or 8 feet (1.5 to 2.5 meters).  Clearly this is well within the sensor's range capabilities.  So it looks like I'll need to modify this Radar program to display what this thing is actually seeing at those distances.
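I suspect the change is just a scaling constant somewhere. Roughly, in Python terms (the 250 cm maximum and 400-pixel radius here are made-up numbers, not values from the actual Processing sketch):

```python
# Rough sketch of how a radar display maps distance to screen radius.
# The 250 cm max range and 400 px radius are illustrative placeholders,
# not numbers taken from the actual Processing sketch.
def distance_to_radius(distance_cm, max_range_cm=250.0, screen_radius_px=400.0):
    """Scale a distance reading to a pixel radius, clipping at max range."""
    clipped = min(distance_cm, max_range_cm)
    return clipped / max_range_cm * screen_radius_px

print(distance_to_radius(125))  # halfway out -> 200.0
print(distance_to_radius(999))  # beyond max range, clipped to the edge -> 400.0
```

If the original sketch hard-codes a small maximum range like this, then extending the display out to 2.5 meters should just be a matter of changing that one constant.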

In the end I'll write my own graphics program for this in C# anyway.   But I thought I'd use this radar screen for now simply because it's already written.

So that's my next adventure.  Get this thing displaying what it actually sees out to about 8 feet or 2.5 meters.  That's the range I'd like to be able to detect. 

I mean, I'll still have the ability to see things closer, but I want that far range mapping too.

So lots of work to do yet.  But at least I got it up and running.

Thanks Steve (@pugwash) for inspiring me to finally sit down and do this.  I've been putting it off way too long.

It's pretty cool.   I figured it would need to be modified though.  But I'll do it.   The first plan is to see if I can easily modify this current radar plotting sketch to plot the ranges I'm interested in.  I'd like to visually see what kind of resolution and repeatability I get at those longer ranges.  If it's pointing at distant objects that are sitting still it should produce a nice consistent pattern on the radar.

DroneBot Workshop Robotics Engineer
James


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 
Posted by: @robo-pi

In fact, I just looked at the data sheet for the HC-SR04 sensor.   These are supposed to detect distances out to about 13 feet or 4 meters.   This fellow's radar screen only shows distances out to about 1 foot or 30cm.

I turned the sonar toward the back wall and it measured 255 cm, which was spot on when I checked with a tape measure.
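For reference, the HC-SR04 reports range as an echo pulse width, and the usual conversion uses the speed of sound, about 0.0343 cm/µs at room temperature. A quick sanity check of that 255 cm reading in Python:

```python
# The HC-SR04 reports range as an echo pulse width.  Distance is half
# the round trip times the speed of sound (about 0.0343 cm/us at room
# temperature); the common Arduino shortcut is microseconds / 58.
def pulse_to_cm(pulse_us, speed_cm_per_us=0.0343):
    return pulse_us * speed_cm_per_us / 2.0  # halve: sound goes out and back

# A wall at 255 cm should return an echo of roughly 14,870 us.
print(round(pulse_to_cm(14870)))  # -> 255
```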

I haven't bothered with the Processing code yet.  I will see if I can write a slow scan and use the Arduino IDE's Serial Plotter option to see what kind of data it returns for different surfaces.

This is my current setup.

sonarScanner

   
(@pugwash)
Sorcerers' Apprentice
Joined: 5 years ago
Posts: 923
Topic starter  

@casey

Are you using the original sketch on the Arduino?

I have the problem that the turning speed is incredibly slow. I have tried to alter some parameters in the sketch, but to no avail. Furthermore, as I mentioned above, when turning through 60° the servo needs about 700 ms, but the spec I found shows that it should only need 120 ms for a 60° turn.

The values I measured are for turning directly, without stepping by 1°.

I am not sure whether I have a bad batch of SG90s or not!


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 
Posted by: @pugwash

@casey

Are you using the original sketch on the Arduino?

I tested the servo motor with this sketch,
https://www.arduino.cc/en/Tutorial/Sweep
which was about 3 seconds a scan each way.
I tested the sonar with Bill's first example,
https://dronebotworkshop.com/servo-motors-with-arduino/
I have yet to combine the two into a proper sonar scanner program.
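That roughly 3 seconds per sweep is what the Sweep example should give, since (if I remember the sketch right) it steps 1° at a time with a 15 ms delay per step rather than commanding one full-speed move:

```python
# The Sweep example steps the servo 1 degree at a time with (if I
# remember the sketch right) a 15 ms delay per step, so a full sweep
# works out to about 2.7 s, close to the ~3 s observed.
step_delay_ms = 15   # delay(15) between steps in the Sweep sketch
degrees = 180        # full sweep, 0 to 180
sweep_time_s = degrees * step_delay_ms / 1000.0
print(sweep_time_s)  # -> 2.7
```

So the slow sweep is the stepping delay, not the servo's own speed limit.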


   
(@pugwash)
Sorcerers' Apprentice
Joined: 5 years ago
Posts: 923
Topic starter  

@casey

Thanks, I'll give it a try!


   
(@pugwash)
Sorcerers' Apprentice
Joined: 5 years ago
Posts: 923
Topic starter  

@casey, @Robo-Pi

Tried it out and the angular velocity increased to something usable.

But these cheapo servos at 1€ each are not the best. If I set the low value to 45 and the high value to 135, I am only getting an angular movement of about 45°. So I guess I will have to source some better-quality servos after I have finished just playing with these.

Let me explain my line of thinking.

With a movable SR04 on the front and static SR04s on each side, I was thinking of dividing the scan in front of the robot into two zones, for want of better wording: umbra and penumbra. If an object was found in the umbra zone, the robot would turn until the object was in the penumbra zone and then move forward. By keeping the edge of the object more or less on the zone boundary, the robot would then follow the contours of the obstacle at a constant distance from it, a distance which can also be confirmed by the statically mounted SR04 on the side facing the obstacle.

I was thinking about using variables like

boolean inUmbra, inPenumbra

Let me know if you think this sounds absolutely crazy!
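Roughly what I have in mind, as a Python pseudocode sketch (the zone thresholds are placeholder numbers I haven't measured yet):

```python
# Rough pseudocode for the umbra/penumbra idea.  The zone thresholds
# are placeholder numbers, not measured values.
UMBRA_CM = 20     # inner zone: too close, turn away
PENUMBRA_CM = 35  # outer zone: follow the obstacle edge

def steer(front_cm):
    in_umbra = front_cm <= UMBRA_CM
    in_penumbra = UMBRA_CM < front_cm <= PENUMBRA_CM
    if in_umbra:
        return "turn"         # turn until the object falls into the penumbra
    if in_penumbra:
        return "follow_edge"  # keep the edge on the boundary, move forward
    return "cruise"           # nothing ahead, drive on

print(steer(10))   # -> turn
print(steer(30))   # -> follow_edge
print(steer(100))  # -> cruise
```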


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
 
Posted by: @pugwash

Let me explain my line of thinking.

With a movable SR04 on the front and static SR04s on each side, I was thinking of dividing the scan in front of the robot into two zones, for want of better wording: umbra and penumbra. If an object was found in the umbra zone, the robot would turn until the object was in the penumbra zone and then move forward. By keeping the edge of the object more or less on the zone boundary, the robot would then follow the contours of the obstacle at a constant distance from it, a distance which can also be confirmed by the statically mounted SR04 on the side facing the obstacle.

I was thinking about using variables like

boolean inUmbra, inPenumbra

Let me know if you think this sounds absolutely crazy

I'll be interested to see the coding techniques others are using.  I've noticed on my scans that the readings aren't perfect.  Nothing new there; real systems are seldom perfect.  But what I'm noticing is that every once in a while I get totally bogus readings that I know aren't right.  On the radar screen this shows up as a solid green line where it should have turned red to indicate a solid object.

For this reason individual readings cannot be trusted.  To make it work dependably, several readings will need to be taken over a short span of the scan and then averaged, with the average value taken to be the actual distance to the object.  In fact, it's not even that simple.  Averaging alone could still give erroneous results, because the bogus readings are dramatically outside the actual distance, and including those extreme false readings in the average will distort the final calculation.  So it will most likely require a far more sophisticated averaging technique.

The idea I have is to average over a small set of measurements, say maybe 6 readings.  If those six readings contain one or two that are dramatically different, then just drop the extreme readings and average over what's left.
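Something along these lines, sketched in Python (the 6-sample batch size and the 30 cm outlier threshold are just guesses to be tuned against real data):

```python
# Sketch of the trimmed-average idea.  The 30 cm outlier threshold is
# a guess to be tuned against real data.
import statistics

def robust_distance(readings, outlier_cm=30.0):
    """Drop readings far from the batch median, then average the rest."""
    med = statistics.median(readings)
    kept = [r for r in readings if abs(r - med) <= outlier_cm]
    return sum(kept) / len(kept)

# One wild echo among six readings barely moves the result.
print(robust_distance([102, 100, 98, 101, 400, 99]))  # -> 100.0
```

Comparing against the median rather than the mean is what keeps one extreme reading from dragging the cut-off along with it.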

This averaging process will, of course, round over any sharp corners that might actually exist.  But I think that overall it will give a more dependable distance measurement.  I'm glad you got me started on this, Steve.  I can see that I'll need to do quite a bit of work before actually mounting this on a robot.  Using this little Mega scanner as a prototype, I can iron out all these problems so that when it comes time to install it on the robot I'll have all the bugs worked out.

DroneBot Workshop Robotics Engineer
James


   
(@pugwash)
Sorcerers' Apprentice
Joined: 5 years ago
Posts: 923
Topic starter  

@robo-pi

I think that for collision avoidance or navigating around an obstacle, reliable readings in the 20 cm to 30 cm range would be critical, and I would be looking at repeatability rather than averaging, i.e. two matching results from three readings.
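In Python terms, roughly (the 2 cm agreement tolerance is just a guess):

```python
# The two-out-of-three repeatability idea.  The 2 cm tolerance is a
# guess, not a measured figure.
def agreed_distance(r1, r2, r3, tol_cm=2.0):
    """Accept a distance only when two of three readings agree."""
    for a, b in ((r1, r2), (r1, r3), (r2, r3)):
        if abs(a - b) <= tol_cm:
            return (a + b) / 2.0  # two matching readings: trust their mean
    return None                   # no agreement: discard and rescan

print(agreed_distance(25.0, 25.5, 90.0))  # -> 25.25
print(agreed_distance(10.0, 40.0, 70.0))  # -> None
```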

It seems that you are attempting to map out a position in terms of what is on the bot's "visual" horizon. Am I correct in that assumption?

My progress is slow.

The problem with just getting the servo up to speed held me up for a day, but that is fixed now.

At the moment I am trying to install the .NET package on a Mac. Once I get that up and running I will try to get the Processing Program to work. I have reluctantly gone down this road as C# is totally foreign to me. Initially, I thought about throwing a Python script together, to handle the graphics.

If that is then successful, I will start sorting out the Arduino sketch to handle the Ultrasonic readings.

And the final part will be a row of LEDs, mounted on a curved PCB, in sync with the ultrasonic module's motion, as I wrote ages ago. Suggestion by @Etinkerer.

No, I won't, I am not that mad yet!


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
 
Posted by: @pugwash

It seems that you are attempting to map out a position in terms of what is on the bot's "visual" horizon. Am I correct in that assumption?

It is true that we are all approaching the use of this sonar sensor from vastly different perspectives and for different purposes as well.   It's all good.

My purpose is related to a specific navigation system where my robot will hopefully know exactly where it is at all times. Because of this, the sensors will only be required to verify the surrounding environment and make small updates and corrections to the robot's precise location relative to the main floor plan.

As an example, let's say that my robot starts out in my robot lab and the mission is to go through a specific doorway, down a hallway past a couple of other doorways and through a doorway into the kitchen.

To begin its journey, it already knows that it's in the robot lab.  It already has a blueprint of the floor plan of the lab and knows which door leads to the kitchen.  All this distance sensor needs to do is provide detailed distance measurements so the robot can verify its precise orientation in the lab, recognize the doorway that leads to the kitchen, and then navigate through that doorway without banging into any walls.  So this distance sensor will be working in harmony with a floor plan the robot already has.  It's simply going to be used to update the precise location of the robot within the lab until the robot finally passes through the correct doorway.

Once it does this, it will then know that it's now in the hallway and will then change to using the floor plan of the hallway.  From there it will use the sensor data to figure out when it has reached the proper doorway leading to the kitchen.  Then it will use the sensor data to navigate through that doorway and so on.

So at every step of the process the robot will have a good understanding of where it is within its known world.

The distance sensors are only used to verify its precise location within that known world.

So that's my ultimate goal.

There's actually far more to it than this.  Before I get this far I'm going to have it map out the entire house and build this floor plan itself.   I'll have that process displayed on my computer screen so I can see the floor plan that the robot builds for my house.  So this is where I'm starting with my robots.

I know a lot of people are trying to jump right into building a fully autonomous robot that can deal with never-before-seen terrain.   And that's fine.   When my robot builds the initial floor plan for my house it will basically be starting from scratch in uncharted territory, but as it builds the floor plan it will get to know the territory until it finally knows exactly where it can go throughout the entire house.

So my navigation system will include this floor plan.  And all of the sensors will really only need to be used to verify its location and orientation.   And of course this will eventually include detecting unpredictable obstacles.   But the robot first needs to know what the pristine floor plan looks like.  Only then can it recognize when something is out of place.

So I have a larger scheme that I will be incorporating this scanning sensor into.   It's not going to try to figure out what's going on via the distance sensor input alone.   That could be a nightmare.  Unless your sole goal is obstacle avoidance.  But that's not navigation.   That's just obstacle avoidance.   If all you're doing is avoiding obstacles, then how does your robot know where it's at, or where it's trying to get to?

So yes, I'm working on a complete navigation system so the robot knows where it is at all times.

DroneBot Workshop Robotics Engineer
James


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 
Posted by: @robo-pi
The idea I have is to average over a small set of measurements, say maybe 6 readings.  If those six readings contain one or two that are dramatically different, then just drop the extreme readings and average over what's left.

Study the data. You might find that simply comparing a reading with the one before it and the one after it will reveal a blip, which you can simply replace with an average of the two on either side of it. Also check whether there is any electronic noise that could be eliminated with a capacitor or a separate power supply.
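Something like this in Python (the 50 cm jump threshold is a placeholder to tune against real data):

```python
# Single-blip repair: if a reading jumps far from BOTH neighbours,
# replace it with their average.  The 50 cm jump threshold is a
# placeholder to tune against real data.
def despike(scan, jump_cm=50.0):
    out = list(scan)
    for i in range(1, len(scan) - 1):
        prev, cur, nxt = scan[i - 1], scan[i], scan[i + 1]
        if abs(cur - prev) > jump_cm and abs(cur - nxt) > jump_cm:
            out[i] = (prev + nxt) / 2.0  # isolated blip: interpolate over it
    return out

print(despike([100, 101, 400, 103, 104]))  # -> [100, 101, 102.0, 103, 104]
```

Requiring the jump on both sides means a genuine step change (a wall edge, say) is left alone, while a one-sample echo glitch gets smoothed out.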

You might be able to use distance data to recognise not only where you are in a room but which room it is, providing the rooms are of different dimensions, and also be able to work out the orientation of the robot relative to those walls by the nature of the return signals. A sonar scanner set up higher than any furniture might be required.

I really need to make a portable version of a sonar measuring tape using an LCD display so I can walk around, turn around, and see how the readings change.

The bottom line, as I know you are aware, is to study the data to work out the algorithms required.

 


   
(@pugwash)
Sorcerers' Apprentice
Joined: 5 years ago
Posts: 923
Topic starter  

@robo-pi

Trans Atlantic cooperation is always good news,

Sharing and discussing alternative views,

Now you've come around,

To working with sound,

I am honoured to be your inspiring muse.

 

Good Night!


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
 
Posted by: @casey

Study the data. You might find that simply comparing a reading with the one before it and the one after it will reveal a blip, which you can simply replace with an average of the two on either side of it.

Yes, that's a good idea.  A single blip that's way off from the surrounding blips is going to be an erroneous reading, because it's not likely that the actual distance would have changed that dramatically in such a short span of the scan, unless we're scanning a chain-link fence.

Posted by: @casey

Also check whether there is any electronic noise that could be eliminated with a capacitor or a separate power supply.

Yes, that too.  In fact, I think there might still be some bad connections between the sensor and the Mega.  I'll be looking into that.  On the final robot I'll probably be using an STM32 board anyway, since they run at 72 MHz.  And then I can solder the connections to the sensors directly, eliminating any possible bad connections.

Posted by: @casey

The bottom line, as I know you are aware, is to study the data to work out the algorithms required.

Agreed.  I'll definitely be analyzing the actual numerical data as well; I just liked the idea of this visual radar screen to get a feel for how well it's doing overall.  In fact, I'm thinking about eventually taking the final data into a Raspberry Pi.  Then I can analyze it there using Numpy and all manner of fancy mathematics.  Numpy handles large arrays of data very quickly.

Posted by: @casey

You might be able to use distance data to recognise not only where you are in a room but which room it is, providing the rooms are of different dimensions, and also be able to work out the orientation of the robot relative to those walls by the nature of the return signals. A sonar scanner set up higher than any furniture might be required.

That's an interesting thought too.   Especially using Numpy arrays and matrix algebra.    You could save a matrix scan of a room and then compare that scan to a scan taken at a later time.   Then you could calculate the probability that you are in the same room.
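As a very rough Numpy sketch of what I mean (a "scan" here is just a vector of distances, one per scan angle; the RMS score and the 5 cm threshold are illustrative, nothing worked out yet):

```python
# Compare a saved room scan against a fresh one.  The RMS error score
# and the 5 cm "same room" threshold are illustrative placeholders.
import numpy as np

def scan_rms_error(saved, current):
    """RMS difference (cm) between two equal-length distance scans."""
    saved = np.asarray(saved, dtype=float)
    current = np.asarray(current, dtype=float)
    return float(np.sqrt(np.mean((saved - current) ** 2)))

saved = [120, 118, 250, 255, 90]  # distances at five scan angles
fresh = [121, 117, 248, 256, 91]
print(scan_rms_error(saved, fresh) < 5.0)  # -> True: likely the same room
```

A real comparison would also have to handle the robot's unknown orientation, e.g. by scoring circular shifts of the scan and taking the best match.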

The possibilities are endless.   This also makes it difficult for the programmer: there are so many methods that could be used that the hardest part is often deciding which one to choose.   I'd like to try them all, but life is short.

Since I'm already using Numpy and Linear Algebra for Neural Networking I'll probably be tempted to go that route for analyzing room scans as well.    Room scans will only be 2D by the way since we're typically just scanning a horizon of data points on a single plane.  But that could be enhanced as well by adding either another servo or stepper motor to take a few different horizontal scans at slightly different vertical angles.

It's going to be fun.  I've got a very long way to go before I get that involved with analyzing the data.  In fact, I think I'd like to set this up on an STM32 Blue Pill first thing.  That will give me both more processing speed and more memory to work with.  Plus I won't mind soldering the HC-SR04 directly to the I/O pins.  So that's probably the next thing to do.

In the meantime, I'm still wrestling the tree that fell on my house.  I was just up on the roof cutting off branches.  Gotta get back out there and cut some more.

DroneBot Workshop Robotics Engineer
James


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
 
Posted by: @pugwash

@robo-pi

Trans Atlantic cooperation is always good news,

Sharing and discussing alternative views,

Now you've come around,

To working with sound,

I am honoured to be your inspiring muse.

 

Good Night!

@dronebot-workshop

Look out Bill.  The poets are multiplying!

It won't be long now before this becomes the DroneBot Workshop Poetry Consortium.

DroneBot Workshop Robotics Engineer
James


   
(@pugwash)
Sorcerers' Apprentice
Joined: 5 years ago
Posts: 923
Topic starter  

@robo-pi

You obviously know more about this than I do!

I downloaded the Processing code from the website, but it does not have a main() loop, so I surmise that it runs inside another framework. I also notice that the code references Java libraries, and I was wondering why it doesn't use C# libraries.

I think I am going to give up on the C# stuff as I had an extremely frustrating day yesterday, just trying to get the "Hello World" app to run!


   