
Inqling Junior - Robot Mapping, Vision, Autonomy

240 Posts
10 Users
87 Likes
18.7 K Views
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 

@inq 

After a bit of a google search:

It is not a commercial product yet.

It also comes with computer vision tech that allows it to identify boxes without extensive training.

(Identifying boxes is not all that complex)

...putting robots to work in the real world has been far more challenging than a lot of incredibly intelligent people thought.

It uses an onboard vision system to track which objects go where, and to judge how to grasp and place each box. It uses a robotic technique called “force control” to nestle each box up against its neighbors.

A caveat: Although the technology is impressive, we’re still a long way from it being deployed in an actual warehouse, especially around humans. That would involve a level of complexity that robots haven’t yet mastered. And while the Boston Dynamics videos are always fun, they’re not quite as effortless as they seem. Each one is created with carefully pre-programmed movements and will take many, many takes to get right before it’s shared.

 


   
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @robotbuilder

@inq 

After a bit of a google search:

It is not a commercial product yet.

It looks like they're giving it a shot (trying to sell one).  But... it appears to be more of the style you were suggesting.  Big, stable base!  No Mickey Mouse balancing. 😉 

https://www.bostondynamics.com/products/stretch

 

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
frogandtoad
(@frogandtoad)
Member
Joined: 5 years ago
Posts: 1458
 

@robotbuilder

Posted by: @robotbuilder

The reason I wouldn't bother with a self balancing robot is because it complicates things and my interest was AI not the ability to balance. A wobbling robot for example adds to the complexity of grabbing images (by camera or LIDAR) or trying to pick things up. Solvable but expensive and adds nothing to the AI of the machine. In a sense you are also simplifying things by using stepper motors to avoid the complexity of using encoder input and modifying pwm output to achieve a goal like going in a straight line.

To be fair, the OP has no issue with such challenge / complexity... there is nothing wrong with having goals, ambitious as you may think they are:

Powered by: Raspberry Pi 3:

Self Balancing Robot

Cheers


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 

@frogandtoad 

Unfortunately I am hopeless with acronyms so I don't know what OP stands for.

I am not sure why you think I have an issue with having goals no matter how ambitious they are?

Self balancing robots are really fascinating even though the engineering is beyond me and I would have to have followed someone else's build to make one myself. I am no James Bruton who is young and able to build these wonderful machines which are beyond my pay packet and engineering skills.

How about using just one wheel?

 

 


   
Inq and frogandtoad reacted
frogandtoad
(@frogandtoad)
Member
Joined: 5 years ago
Posts: 1458
 

@robotbuilder

Posted by: @robotbuilder

@frogandtoad 

Unfortunately I am hopeless with acronyms so I don't know what OP stands for.

I am not sure why you think I have an issue with having goals no matter how ambitious they are?

Self balancing robots are really fascinating even though the engineering is beyond me and I would have to have followed someone else's build to make one myself. I am no James Bruton who is young and able to build these wonderful machines which are beyond my pay packet and engineering skills.

How about using just one wheel?

 

 

OP stands for (in most cases) Original Post.

It's not that I think you really have an issue with it, it's just that your response sounded a little discouraging for the OP to pursue such a goal (negative vs positive).

As far as I'm concerned, pushing the limits is a good thing, that's how progress is made, and I support all and any who push the limits for the betterment of all (humanity of course) 😉

Cheers


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 

@frogandtoad 

The title of the thread was robot mapping, vision and autonomy, subject matter for any robot. Realising it with a self balancing robot is fine and indeed I think a Segway robot base would fit well in a domestic environment. But it was a side issue to the title of the thread, and what I thought the thread was about, which could apply to any robot base.

Self balancing and other sophisticated motor programs exist in humans, but they are reflexes, not higher-level functions like mapping and vision or autonomous high-level behaviours.

 

 


   
Inq reacted
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @robotbuilder

The title of the thread was robot mapping, vision and autonomy, subject matter for any robot.

You presume I can remember what I wrote yesterday...  🤣 

You have successfully stabbed any pretense I had at following my own OP (Original Post).  I diverged from that premise and should be flogged.  🙄 ... 😆 

I frankly don't think I have anything to offer the betterment of all humanity and at my age, I think humanity is totally... bleeped.

I am simply flying to my own tune (apologies for mixing metaphors) and dragging you along in my own Magic Carpet Ride.  

VBR (and truly meant)

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

@robotbuilder - VBR = "Very Best Regards" - and if I add it, it is sincerely meant.

P.S. - I'll let someone on drugs tackle the one-wheel version.

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

UPDATE

I've been testing sub-systems on breadboards.  I've played around with the primary driving stepper motors and the micro motors that will be used to turn the range-finding sensor.  I've also decided to simplify slightly and give that sensor just 1 DOF, turning it around.  The VL53L5CX sensor has enough up/down range digitally that I see no need to add a pitch axis.  Looking at the floor or ceiling will be for a later robot.

This update is primarily about experimenting with the sensors that will be used in Inqling Jr.  Recapping:

  • GY-89 - Hosts three sensors on one PCB breakout, handling 11 DOF:
    • L3GD20H - 3 DOF Gyroscopic Sensor.  In the following video, it is running at 50 Hz.
    • LSM303DLH - 3 DOF Accelerometer and 3 DOF Magnetometer.  In the following video, it is also running at 50 Hz.
    • BMP180 - 2 DOF Barometric Pressure and Temperature.  This is not running and I have no expectations of enabling it for Inqling Jr.
  • VL53L5CX - ToF Infrared Range Finding Sensor.  It has 64 zones in an 8 x 8 square 45 x 45 degree FOV.  In the following video, it is running at 15 Hz.
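
For the curious, juggling those rates on one ESP8266 loop is just millis() bookkeeping.  A minimal sketch of the idea, with placeholder read functions standing in for the real GY-89 and VL53L5CX driver calls:

```cpp
// Minimal multi-rate polling loop. readImuSample() and readTofFrame() are
// placeholders for the real GY-89 / VL53L5CX driver calls, not library APIs.
#include <Arduino.h>

const uint32_t IMU_PERIOD_MS = 20;   // 50 Hz gyro + accelerometer/magnetometer
const uint32_t TOF_PERIOD_MS = 67;   // ~15 Hz ToF frames

uint32_t lastImuMs = 0, lastTofMs = 0;

void readImuSample() { /* wrap the L3GD20H + LSM303DLH reads here */ }
void readTofFrame()  { /* wrap one VL53L5CX 8 x 8 ranging read here */ }

void setup() { Serial.begin(115200); }

void loop() {
  uint32_t now = millis();
  if (now - lastImuMs >= IMU_PERIOD_MS) { lastImuMs = now; readImuSample(); }
  if (now - lastTofMs >= TOF_PERIOD_MS) { lastTofMs = now; readTofFrame(); }
}
```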

I'd like to mention some pros/cons/TBDs about my experience so far with the VL53L5CX sensor.  The video is grainy because it was shot to show the sensor's low-light performance, as I'll explain in a moment.  

  • Pros
    • Being a ToF sensor using light rather than sound, the data is essentially instant.  No firing and waiting for a single range number like with an HC-SR04.
    • Having an 8 x 8 grid requires only one sensor and one "snapshot" to catch all forward obstacles at once.
    • It has an up/down range that will keep it from bumping into low-hanging items, yet allow it to travel under, say... dining room chairs.
    • Placed at an appropriate angle to the direction of travel, I'm hoping I can have it do wall following while still watching for forward obstructions... without needing some kind of back-and-forth scanning motion.
  • Cons
    • Being infrared, it doesn't like incandescent or halogen-type lighting.  Likewise, I'd expect it to be nearly worthless outside.
  • TBD
    • There is a lot more data that comes back with every sample.  So far, the data sheet and other Internet research aren't really telling me what all the data is or what use it may be.  I can make a few guesses, but... they're just that until I mess with it some more in different conditions.  These fields were found in the header file, but don't seem to have any reference in documentation I have found so far.  Some examples of returned data (these are 8x8 = 64 values for each of the following; a reading sketch follows this list):
      • Ambient noise per spad - I'm guessing a spad is one of the 8x8 "pixels".
      • Number of valid targets detected for 1 zone. - I don't know if a "zone" is also a pixel or the whole 8x8 region.
      • Enabled spads???
      • Signal returned to the sensor - Maybe this is simply the strength, like the percentage of one of the "pixels" that an object fills.
      • Sigma of the current distance - estimate of potential error in the distance value???
      • Distance!
      • Estimated reflectance?
      • Status measurement validity?
    • This section seems optional and sounds like the sensor's onboard processor can detect motion of objects. 
      • global_indicator_1?
      • global_indicator_2?
      • status?
      • nb_of_detected_aggregates?
      • nb_of_aggregates?
      • spare?
      • motion[32]?
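
To show where those names live, here is a rough sketch of pulling one frame and printing a few of the fields.  It assumes the SparkFun VL53L5CX library (a wrapper around ST's ULD driver); the member names below come from its results header, so check them against the copy you have:

```cpp
// Rough sketch of pulling one frame, assuming the SparkFun VL53L5CX library.
// Field names are taken from its VL53L5CX_ResultsData header and should be
// verified against the header version you actually have.
#include <Wire.h>
#include <SparkFun_VL53L5CX_Library.h>

SparkFun_VL53L5CX tof;
VL53L5CX_ResultsData frame;          // holds the 64-zone results

void setup() {
  Serial.begin(115200);
  Wire.begin();
  tof.begin();
  tof.setResolution(8 * 8);          // 8 x 8 = 64 zones
  tof.startRanging();
}

void loop() {
  if (!tof.isDataReady()) return;
  tof.getRangingData(&frame);

  for (int zone = 0; zone < 64; zone++) {
    Serial.print(frame.distance_mm[zone]);        // the distance itself
    Serial.print(" mm, status ");
    Serial.print(frame.target_status[zone]);      // measurement validity
    Serial.print(", signal ");
    Serial.println(frame.signal_per_spad[zone]);  // returned signal strength
  }
}
```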

In other words, there is a lot of data to dig into.  In the following video, I have all the sensors running on an ESP8266 and pumping data out to the client-side webpage.  I wave a stick over the sensor to show the readings change (numbers and color ranges).  At about the 30 second mark, I turn on one of those indirect, ceiling-facing, halogen-bulb floor lamps.  The sensor cannot see the bare bulb directly.  Note how the readings seem to be retained even though the obstruction is no longer in the way.  Turning off the halogen light, the sensor quickly updates.  As expected, LED lights don't seem to bother it.
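
For anyone wanting to reproduce the streaming part without InqPortal, the generic shape of it looks something like this, using the common arduinoWebSockets library on an ESP8266 (this is only an illustration, not InqPortal's API):

```cpp
// Illustration only (not InqPortal): pushing a 64-zone distance frame to a
// browser over a WebSocket, using the arduinoWebSockets library on an ESP8266.
#include <ESP8266WiFi.h>
#include <WebSocketsServer.h>

WebSocketsServer ws(81);                 // the page connects to ws://<ip>:81/

void sendFrame(const uint16_t dist[64]) {
  String msg;
  for (int i = 0; i < 64; i++) {
    msg += dist[i];
    if (i < 63) msg += ',';
  }
  ws.broadcastTXT(msg);                  // every connected client gets the CSV row
}

void setup() {
  WiFi.begin("ssid", "password");        // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(100);
  ws.begin();
}

void loop() {
  ws.loop();
  // sendFrame(latestDistances);         // call whenever a new ToF frame arrives
}
```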

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
Inst-Tech reacted
byron
(@byron)
No Title
Joined: 5 years ago
Posts: 1121
 

@inq

A couple of belated comments on your bot project.  

I find PETG is better than ABS, just as strong, but easier to print and less of a pong, though if you already have a reel of ABS then I guess you would want to use it.

Using PID to go in a straight line is useless for outdoor bots, as they get tossed about the place and the wheels will not always be in good contact with the ground.  RTK GPS (a base station combined with the bot's mobile GPS) together with a compass is good for navigation (1 cm precision), but RTK is relatively expensive, and beware of trees and buildings obscuring the GPS reception.  For indoor bots, bumping over carpets and the like can also easily get the wheels to spin and throw the encoder count out.  Fine for a nice flat floor, but beware of driving over breadcrumbs.
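
For anyone new to the idea, "PID to go in a straight line" mostly boils down to trimming each wheel's PWM from the encoder-count difference.  A bare-bones sketch, proportional term only, with stubs standing in for the encoder and motor-driver code:

```cpp
// What "PID to go in a straight line" boils down to (proportional term only):
// trim each wheel's PWM by the encoder-count difference. The encoder counts
// and motor call are stubs standing in for real driver/interrupt code.
long leftTicks = 0, rightTicks = 0;        // updated by encoder interrupts on a real bot

void setPwm(int left, int right) { /* write to the motor driver here */ }

void driveStraightStep(int basePwm) {
  const float Kp = 0.5f;                   // gain to tune on the actual bot
  long error = leftTicks - rightTicks;     // > 0 means the left wheel has gone further
  int trim = (int)(Kp * error);
  setPwm(basePwm - trim, basePwm + trim);  // slow the leading side, speed up the lagging one
}

void setup() {}
void loop()  { driveStraightStep(150); }
```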

One way of navigating a bot could be to use the likes of OpenCV, utilising static cameras that can sense a coloured blob on the bot, or a camera on the bot that sees coloured blobs, shapes, etc. cunningly placed about your house.  I put up a brief example of my experiment on this way back when, as I wanted a test bed to track a bot's bearing to check on my bearing calcs.  The camera was placed on the ceiling looking down and the bot had to travel within the camera's vision below.  I think @robotbuilder had a brief play with my example OpenCV code too and could identify his bottle of sauce, but I don't think he took it any further (and his sauce bottle probably got taken back by the house cook for a tasty meal).  But my bit of code was in Python so it would be of no use for you.  I mention this as just some food for thought.
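
If Python is the sticking point, the blob-tracking part translates directly to OpenCV's C++ API.  A rough sketch of the overhead-camera version; the HSV range is just a placeholder for whatever marker colour sits on the bot:

```cpp
// Rough C++ equivalent of the overhead-camera coloured-blob idea using
// OpenCV. The HSV range below is a placeholder, not a calibrated value.
#include <iostream>
#include <opencv2/opencv.hpp>

int main() {
  cv::VideoCapture cap(0);                        // ceiling camera
  cv::Mat frame, hsv, mask;
  while (cap.read(frame)) {
    cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);
    cv::inRange(hsv, cv::Scalar(100, 120, 70),    // placeholder blue-ish range
                cv::Scalar(130, 255, 255), mask);
    cv::Moments m = cv::moments(mask, true);
    if (m.m00 > 0) {                              // blob found: centroid = bot position
      double x = m.m10 / m.m00, y = m.m01 / m.m00;
      std::cout << "bot at " << x << ", " << y << std::endl;
    }
  }
  return 0;
}
```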

Unless one has a good plan for getting the bot to navigate from where it is to where you may want it to go, it will end up just being an aimlessly wandering bot, running scared of unforeseen objects it may find in its path.  Give the bot a voice and will it sing of a 'wandering, wandering star', or 'I know where I'm going, I know who's going with me'? 

So a few peanuts from the gallery, but throw some back, don't eat them all. 😎 


   
Inq reacted
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 

@byron 

I think @robotbuilder had a brief play with my example OpenCV code too and could identify his bottle of sauce, but I don't think he took it any further.

That was on page 2 of this thread,
https://forum.dronebotworkshop.com/user-robot-projects/robot-navigation/#post-17766

However I don't think this thread is about vision or mapping anymore.

 


   
Inq reacted
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
 
@byron, @robotbuilder - Vision being in the title, I have always used it very loosely to mean this Time of Flight sensor.  I usually put "vision" in quotes above to indicate that loose association.  I tried to define "vision" this broadly in the opening salvo:
Posted by: @inq

Inqling Junior 

The major goals for Inqling Jr. will be to study a form of "vision" with mapping, memory and navigating in a learned environment. 

Now, I'm finding that might have been too loose. 😆 In general, I was using the term to mean the robot "senses" its environment.  In this context, it would include ToF/Lidar and even ultrasonic sonar sensors. 

I very much want to get into OpenCV and I thank you for the links, but that will be for a more ambitious robot down the road.  I have only read the introductory pages for OpenCV and knowing that it requires much more CPU power, I'm deferring that until I have completely swamped one or more ESP8266/ESP32's.  

Continuing the singing metaphor, my robot will be singing a different tune!

For exactly the reasons you (@byron) mentioned... I don't see encoders as a useful addition... as long as speed and torque (from an earlier @robotbuilder post) don't become a limitation.  We live in a far more complex world than a smooth linoleum floor.  Even my nicely laid tile floor totally disorients robots at the first seam.  Until the first slip or road bump, encoders and stepper motors perform the same "localized" function of, "My 'legs' are locomoting me how I want."  The only difference is that in my slow/weak/tiled environment, encoding requires constant monitoring and PID code to straighten (or curve) the path, and steppers do not!

My "science project" is to handle the environment and navigation more like a human.  Yes, we typically use vision (OpenCV) to see and identify items in our environment... but only after they are in our vision.  Identifying a blue block, it must be within our vision.  In my Inqling series, I'm want to tackle the "getting the object to my FOV".  IOW, My beer isn't in the floor in the middle of the room where Inqling starts.  I want Inqling to know that my beer is in the fridge and to get to the fridge in the kitchen no matter which room it starts.

IMO, this requires Robot Mapping, Vision, Autonomy as in the Inqling Junior title.

... and the tune is Rock and Roll! 😉 🤩 

VBR,

Inq

 

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
byron
(@byron)
No Title
Joined: 5 years ago
Posts: 1121
 
Posted by: @inq

My beer isn't on the floor in the middle of the room where Inqling starts.  I want Inqling to know that my beer is in the fridge and to get to the fridge in the kitchen no matter which room it starts in.

Posted by: @inq

... and the tune is Rock and Roll! 😉 🤩 

And this may be what happens when the bot gets to the beer fridge 😀 

 


   
Inq reacted
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2042
 

@inq 

My "science project" is to handle the environment and navigation more like a human ...
In my Inqling series, I want to tackle "getting the object into my FOV". IOW, my beer isn't on the floor in the middle of the room where Inqling starts. I want Inqling to know that my beer is in the fridge and to get to the fridge in the kitchen no matter which room it starts in.

First it has to recognize which room it is in. It would handle the environment more like a human if it could look and see what room it is in, then look and see where the door is (and whether it is open or closed), and then navigate to that door. It might look and see that it is already in the kitchen because it can see the fridge! Then it has to look and see if the beer is in the fridge and not already taken or moved to another part of the fridge.

Although the VL53L5CX may be a very useful sensor for many things, I think using it to recognize a can of beer or any other object will be difficult.  I don't know if it can be used like sonar to map a room, and even use the returned distances to recognize the room?
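
One naive way the "recognize the room from the returned distances" idea could be tried is to store a reference 64-zone frame per room and pick the closest match.  A toy sketch only; real rooms would need many reference frames and some tolerance for furniture that moves:

```cpp
// Toy room matcher: compare a live 64-zone distance frame against one stored
// reference frame per room and pick the smallest summed absolute difference.
#include <cstdint>
#include <cstdlib>
#include <iostream>
#include <map>
#include <string>
#include <vector>

std::string matchRoom(const std::map<std::string, std::vector<uint16_t>>& refs,
                      const std::vector<uint16_t>& frame) {
  std::string best;
  long bestScore = -1;
  for (const auto& kv : refs) {
    long score = 0;
    for (size_t i = 0; i < frame.size(); i++)
      score += std::labs((long)kv.second[i] - (long)frame[i]);   // distance mismatch
    if (bestScore < 0 || score < bestScore) { bestScore = score; best = kv.first; }
  }
  return best;
}

int main() {
  std::map<std::string, std::vector<uint16_t>> refs = {
    {"kitchen", std::vector<uint16_t>(64, 1800)},    // fake flat reference frames
    {"hallway", std::vector<uint16_t>(64, 900)}};
  std::vector<uint16_t> frame(64, 950);              // fake live frame
  std::cout << matchRoom(refs, frame) << std::endl;  // prints "hallway"
  return 0;
}
```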

 


   
Inq reacted
(@davee)
Member
Joined: 3 years ago
Posts: 1653
 

Hi @Inq,

   I have no experience with this question ... so I'm just tossing out a few random thoughts for you to mull over.

Indoors and outdoors are quite different ... trying to find a solution for both is probably 'too hard' at present (on ordinary people's budgets). So I suggest you pick one and see if you can make some progress.

------------

If you pick indoors .... on the assumption your (beer) fridge is more likely to be indoors ...

People navigate indoors by recognising objects ... like doors, windows, etc. (Hence large hotels with hundreds of identical doors are a nightmare for humans, only mitigated by numbers on the doors!)

However, people know that a room door is 'fixed', a table or trolley is not, and even a bookcase can be moved ... I suspect 'simple' AI systems will struggle to realise which things are 'solid' navigation points.

So maybe, some kind of 'help' is needed. Here you may need to be inventive, or tolerant of 'invading' objects, or both.

2nd World War aircraft position finding used the angles of two radio transmitters ... GPS, which does height as well, uses timing (i.e. distance) from 3 or more radio transmitters. Neither is immediately suitable indoors, but maybe variations on measuring distance or time from known points are applicable, at least to get a 'rough' position ... then use other sensors and sensor-fusion techniques to 'tune' the position more accurately.

In principle, the recent (5.3?) Bluetooth can provide 'centimetre' distances from, and (by switching multiple antennas) angles to, 'known position' BT nodes ... so two fixed nodes in a room should provide a basis for triangulation. Companies like ST are promoting low-cost chips, but convenient boards at $5 each have yet to appear ...

https://www.eenewsembedded.com/en/next-generation-bluetooth-soc-with-with-latest-positioning-capabilities/  
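
To make the two-node idea concrete: if the robot can measure a global-frame bearing to each of two beacons at known positions (say, angle-of-arrival plus the compass heading), its position is the intersection of the two bearing lines. A small sketch with made-up numbers:

```cpp
// Two-beacon triangulation sketch: beacon positions and bearings (in the same
// global frame) are made-up test values, not measurements from any real system.
#include <cmath>
#include <iostream>

struct Pt { double x, y; };

// bearing1/bearing2 are angles (radians) from the robot toward each beacon.
Pt triangulate(Pt b1, double bearing1, Pt b2, double bearing2) {
  double c1 = std::cos(bearing1), s1 = std::sin(bearing1);
  double c2 = std::cos(bearing2), s2 = std::sin(bearing2);
  double dx = b2.x - b1.x, dy = b2.y - b1.y;
  double det = -c1 * s2 + c2 * s1;           // ~0 means the two bearings are (anti)parallel
  double t1  = (dx * s2 - c2 * dy) / det;    // distance from robot to beacon 1
  return { b1.x - t1 * c1, b1.y - t1 * s1 };
}

int main() {
  Pt b1{0, 0}, b2{4, 0};                          // two fixed nodes
  double brg1 = std::atan2(-2, -2);               // bearings as seen from (2, 2)
  double brg2 = std::atan2(-2,  2);
  Pt robot = triangulate(b1, brg1, b2, brg2);
  std::cout << robot.x << ", " << robot.y << std::endl;   // prints ~2, 2
  return 0;
}
```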

Alternatively, a vision system could be helped by strategically placed 'targets', probably near the corners of the room at near-ceiling height. I am imagining something like a circle with a 'bar code'-like pattern inside to indicate which one it is.

The problem with this is that the vision system needs to be fairly high resolution ... or the targets will need to be massive!

As an alternative, maybe the targets could be replaced by infra-red transmitters in a similar position, which are readily available thanks to remote-control applications. Then the robot could detect the light, but would need to determine the angle. Could you build a rotary scanner, possibly with a parabolic dish like the classical radar dishes, and thereby determine the angle? Definitely scope for some smart 3D printing!

--------

All of the above is totally the stuff of dreams or nightmares .... take care.

Best wishes, Dave


   
Inq and Inst-Tech reacted
Page 3 / 16