
Inqling Junior - Robot Mapping, Vision, Autonomy

240 Posts
10 Users
87 Likes
18.7 K Views
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @robotbuilder

@inq 

It will be interesting to see what kind of data you get from the TOF.

In the meantime I have been reading up on SLAM algorithms and wonder what kind of computing power must be in a robot vacuum cleaner to perform the task. I have updated the simulation to test out different algorithms to navigate and map a house. Although the "lidar" data is clean at the moment, I will add realistic noise to test those algorithms. The simulated robot also has perfect odometry, which means I will have to add error to that as well, to make sure the algorithms can work even if the robot is not exactly where it should be, or even if it is physically moved by a human.
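
A minimal sketch of that noise injection (Gaussian range noise plus accumulating odometry drift; the sigma values are illustrative placeholders, not measured figures):

#include <cmath>
#include <random>

std::default_random_engine rng;

// Corrupt a perfect simulated lidar range with Gaussian noise.
double noisyRange(double trueRange)
{
    std::normal_distribution<double> noise(0.0, 0.02);   // 2 cm std-dev
    return trueRange + noise(rng);
}

// Corrupt perfect odometry: small per-step errors accumulate into
// exactly the drift a SLAM algorithm has to survive.
void noisyOdometryStep(double& x, double& y, double& heading,
                       double stepDist, double stepTurn)
{
    std::normal_distribution<double> distErr(0.0, 0.01 * stepDist + 1e-6);
    std::normal_distribution<double> turnErr(0.0, 0.01); // ~0.6 deg std-dev
    heading += stepTurn + turnErr(rng);
    double d = stepDist + distErr(rng);
    x += d * std::cos(heading);
    y += d * std::sin(heading);
}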

In the snapshot below, the actual simulated world is on the left and the "lidar" data seen by the robot is on the right. When the robot turns, the data on the right will rotate. The simulated lidar code I posted previously actually returns the x,y position of the hit point, when in fact the only data you have is the direction and distance of the hit point, so the x,y position has to be calculated before it is plotted.
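
That calculation is just polar-to-Cartesian in the robot's frame, shifted by the robot's pose; a minimal sketch:

#include <cmath>

// Convert one "lidar" hit (bearing relative to the robot, range) into a
// world x,y point, given the robot's pose. Bearing and range are all the
// sensor gives, so this is the step needed before plotting.
void hitToWorld(double robotX, double robotY, double robotHeading,
                double bearing, double range,
                double& hitX, double& hitY)
{
    double worldAngle = robotHeading + bearing;
    hitX = robotX + range * std::cos(worldAngle);
    hitY = robotY + range * std::sin(worldAngle);
}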


[Image: simBot8 - simulated world on the left, the robot's "lidar" view on the right]

Your post requires a LOT more thought... that I haven't gotten to yet.  Let me see if I understand what you have here...

  • First off... is the simBot8 a program you've already written?!  
  • The left side is a hand-generated set of lines.
  • I see the robot's position.
  • The right side is the computer program, with ray traces being sent out to intersect walls.
  • It even looks like the data points have a consistent angular increment.
  • As you described earlier with your ray-trace game.

If I'm reading the above right... this is 90% of the problem.

 

You're making me do a rethink.  Me designing a 30-minute tail-dragger versus another week of self-balancing (might work) logic may need to get back-burner'd.

I'll be glad to share my bone with you.


 

Posted by: @robotbuilder

It will be interesting to see what kind of data you get from the TOF.

Let me see about getting some data out of this sensor.  Would this be of interest to you?

  • Multiple samples perpendicular to a plain white wall at different ranges - Gives baseline sensitivity and noise content under best conditions.
  • Some oblique to wall.
  • Some with furniture and obstructions.  With photos to correlate.

Let me know if you're interested.
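
If so, the capture could be as simple as streaming each frame out as a CSV row. A sketch, assuming the SparkFun VL53L5CX Arduino library (my assumption; the names are worth double-checking):

#include <Wire.h>
#include <SparkFun_VL53L5CX_Library.h>

SparkFun_VL53L5CX tof;              // assumed SparkFun driver class
VL53L5CX_ResultsData frame;         // holds distance_mm[64] per frame

void setup()
{
  Serial.begin(115200);
  Wire.begin();
  tof.begin();
  tof.setResolution(8 * 8);         // full 8x8 mode (default is 4x4)
  tof.startRanging();
}

void loop()
{
  if (tof.isDataReady())            // new 8x8 frame available?
  {
    tof.getRangingData(&frame);
    for (int i = 0; i < 64; i++)    // one CSV row per frame
    {
      Serial.print(frame.distance_mm[i]);
      Serial.print(i < 63 ? "," : "\n");
    }
  }
}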

VBR,

Inq

 

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
byron
(@byron)
No Title
Joined: 5 years ago
Posts: 1121
 
Posted by: @inq

I finally thought...  I'll be able to add the orientation data to the Math and just let it shotgun the whole area.  😋  Make sense?

For me... this is the good part.  The previous 8 pages, I file under the "No pain... no gain!" category.

I guess it makes sense.  I'm good for a bit of trigonometry when I can look up a known formula, like compass bearing calcs from GPS data etc., but having never used advanced maths since I left school (and I can't really remember doing much back then either), the sort of maths required to adjust your bot's scan data is beyond me.

I do have some doubts about whether the resolution you will get from your sensors will be enough to enable mapping, as opposed to the simpler obstacle-avoidance data.  But I presume (as the presumer has been slated as adequate 😎 ) your early experimental data will settle this.

So for me it's a case of Inq's pain is my gain 😀 (maybe), and thanks for putting us all in the front seat of your robot arena.

 


   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

@byron - That I have an audience of what... maybe two people interested. 

  • $3 ESP8266
  • $7 stepper motor
  • $1.50 driver
  • $2 plastic

... knowing someone else in the whole world is interested... priceless!

Thanks!

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 

@inq 

Honestly I never really appreciated what this VL53L5CX was!! I was confusing it with the TF Mini LIDAR. What I was doing really isn't relevant to its use to map or navigate a room. I don't know enough about it yet to make some kind of simulation version for a simulated robot base.

I am going to have to learn more about it before I make any further comment.

https://forum.digikey.com/t/getting-started-with-the-vl53l5cx-tof-sensor/20900

 


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 
Posted by: @inq

@byron - That I have an audience of what... maybe two people interested. 

When you have this sensor and robot up and running and on YouTube, I'm sure you will get some interest.

Have you any sources showing its use for mapping and navigation?

It appears to be a very low-resolution camera that returns a depth map out to 4 m.
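
If that is right, a frame is little more than 64 distances you can index as an 8x8 grid. A sketch of that mental model (the zone ordering is something to verify against the datasheet):

#include <cstdint>

// One frame as an 8x8 grid of distances, roughly 40 mm out to 4000 mm.
struct DepthFrame
{
    uint16_t mm[64];
    uint16_t at(int row, int col) const { return mm[row * 8 + col]; }
};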

 


   
ReplyQuote
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 

@inq 

Shows I didn't read your post very carefully. I went back to re-read the post with the video of your experiments. I didn't really understand at the time what you were showing.  I will give it some more thought.

https://forum.dronebotworkshop.com/user-robot-projects/inqling-junior-robot-mapping-vision-autonomy/paged/3/

 


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 
Posted by: @inq
Posted by: @robotbuilder

It will be interesting to see what kind of data you get from the TOF.

Let me see about getting some data out of this sensor.  Would this be of interest to you?

  • Multiple samples perpendicular to a plain white wall at different ranges - Gives baseline sensitivity and noise content under best conditions.
  • Some oblique to wall.
  • Some with furniture and obstructions.  With photos to correlate.

Let me know if you're interested.

VBR,

Inq

Difficult to know what sort of data I would really need.

Although it says the VL53L5CX can be used for things like scene understanding, complex scene analysis and 3D room mapping, I can't find any examples of that use on the internet.

If you moved your hand closer, would you get more squares covering your hand (higher resolution), and if you moved your hand away from the sensor, would you end up with only one square covering the hand?

 


   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @robotbuilder

@inq 

Shows I didn't read your post very carefully. I went back to re-read the post with the video of your experiments. I didn't really understand at the time what you were showing.  I will give it some more thought.

https://forum.dronebotworkshop.com/user-robot-projects/inqling-junior-robot-mapping-vision-autonomy/paged/3/

 

In that very limited test, I came away with an impression.  I don't have any experience with Lidar, so I can't compare the technologies, but from your simulations above, I get the impression that these rotating Lidar units are using a laser beam like a laser pointer...

I was originally expecting this VL53L5CX unit to have 64 beams and if I moved it close to a wall, I'd see the grid's 64 points...

One of my phone cameras sees into infrared.  I used it on the https://inqonthat.com/inq-speed-racer/ project and could see the infrared emitters.  Looking directly at the VL53L5CX, I can see the purplish hue as it is the same frequency as the race car emitters.

But this thing, I think, works a little differently.  I don't think the infrared is focused.  I think it just blasts out the light pulses over the whole FOV area, and the ToF receiver is what segments out the 64 regions it is looking at.  That would explain the limited range of 4 meters... it gets too dim by the square of the distance.
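
If that guess is right, the geometry also bears on your hand question. Assuming the commonly quoted 45 degree square FOV split into 8 zones per side, about 5.6 degrees per zone (an assumption worth checking against the datasheet), each zone's footprint grows with distance:

#include <cmath>

// Rough width of one zone's footprint at a given distance.
double zoneWidthMeters(double distance)
{
    const double zoneAngleRad = (45.0 / 8.0) * 3.141592653589793 / 180.0;
    return 2.0 * distance * std::tan(zoneAngleRad / 2.0);
}
// At 0.5 m a zone is ~5 cm wide, so a hand spans several zones; by ~2 m a
// zone is ~20 cm wide and the whole hand fits inside one.

So yes... closer should mean more squares on the hand, and farther should collapse it into one square.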

It might also explain some of the other data that comes back that I mentioned in the test post.  I'm thinking that a partial blockage of a square returns both the ToF distance and a strength that can estimate the percentage of the square's area being reflected back.
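
As a toy model of that theory (my own sketch, not ST's documented behaviour):

// Return strength scales with target reflectance and the fraction of the
// zone's area the target fills, and falls off with the square of distance.
double relativeSignal(double distance, double areaFraction, double reflectance)
{
    return reflectance * areaFraction / (distance * distance);
}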

Anyway... I think I'll pause the high-wire balancing act, test this more thoroughly, and get you some data while I'm at it.  I'll also read the links in your next post.  😉 

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

@robotbuilder - I've looked through some of those links on the VL53L5CX sensor.  Looks like there are many things that can be done.  The gesture aspects might be something for later on.  I liked the business card following robot... and it's on the shorter-term list of things after the mapping.  I think having it follow me will be a hoot when I take it to schools for class demonstrations, trying to get more kids involved in my area.
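
A minimal sketch of how that following behaviour might fall out of the 8x8 data... pick the nearest valid zone each frame and steer toward its column (assuming row-major zone order and 0 meaning no target; the motor side is left out):

#include <cstdint>

// Steering command in -1..+1 (negative = steer left) toward the nearest
// valid zone of one frame; 0.0 when nothing is in view.
double steerTowardNearest(const uint16_t mm[64])
{
    int best = -1;
    for (int i = 0; i < 64; i++)
        if (mm[i] > 0 && (best < 0 || mm[i] < mm[best]))
            best = i;
    if (best < 0) return 0.0;

    int col = best % 8;              // zone column across the field of view
    return (col - 3.5) / 3.5;        // 0 when the target is dead ahead
}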

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
frogandtoad
(@frogandtoad)
Member
Joined: 5 years ago
Posts: 1458
 

@robotbuilder

Posted by: @robotbuilder

@frogandtoad 

Don't get me started on the economic politics of today and our foolish reliance on imports instead of being self-sufficient, even though that might be more costly in the short term!

As for computing power I have the laptop to play with until a RPi become affordable or even available.

 

Mate... our $dollar is currently sitting at ~67 cents USD ... we have been slammed for far too long, and it's time we all revolt against our pathetic globalist NON LEADERS - throw the #$%^& out!

 


   
Ron
(@zander)
Father of a miniature Wookie
Joined: 3 years ago
Posts: 6895
 

@robotbuilder  I can't find any examples of that use on the internet.

Is the example 3D depth map not what you are looking for?

 

 

 

 

First computer 1959. Retired from my own computer company 2004.
Hardware - Expert in 1401 and 360, fairly knowledgeable in PC, plus numerous MPUs and MCUs
Major Languages - Machine language, 360 Macro Assembler, Intel Assembler, PL/I and PL1, Pascal, Basic, C plus numerous job control and scripting languages.
Sure you can learn to be a programmer, it will take the same amount of time for me to learn to be a Doctor.


   
byron
(@byron)
No Title
Joined: 5 years ago
Posts: 1121
 
Posted by: @inq

I liked the business card following robot... and it's on the shorter-term list of things after the mapping.

Another good example of a robot following something it recognises is the DroneBot video on the Pixy2.  This is based on camera-image technology rather than your sensors, but it's another cool way to do a class demo to interest kids.

https://forum.dronebotworkshop.com/user-robot-projects/inqling-junior-robot-mapping-vision-autonomy/paged/9/#post-31434

I especially like this demo: [embedded video]


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 
Posted by: @zander

@robotbuilder  I can't find any examples of that use on the internet.

Is the example 3D depth map not what you are looking for?

Which one, where? All I see are the 8x8 depth maps.

 


   
Ron
(@zander)
Father of a miniature Wookie
Joined: 3 years ago
Posts: 6895
 

@robotbuilder I don't know if it's 8x8 or what, I just saw 3D and thought you might be interested.

[Image: Screen Shot 2022 07 12 at 10.54.37 - 3D depth map example]

First computer 1959. Retired from my own computer company 2004.
Hardware - Expert in 1401 and 360, fairly knowledgeable in PC, plus numerous MPUs and MCUs
Major Languages - Machine language, 360 Macro Assembler, Intel Assembler, PL/I and PL1, Pascal, Basic, C plus numerous job control and scripting languages.
Sure you can learn to be a programmer, it will take the same amount of time for me to learn to be a Doctor.


   
Page 9 / 16