
Robot Base - new build

126 Posts
7 Users
18 Reactions
21.4 K Views
byron
(@byron)
No Title
Joined: 7 years ago
Posts: 1217
 

@robotbuilder

Thanks for your thoughts on how to be the scourge of the bunnies.

The original gem of an idea was to have a bot to chase the bunnies, but it never got to that stage.  The bot started off being controlled by an Arduino R3 with a motor controller attached, and then mutated into being controlled by an RPi, still linked to the Arduino and motor controller.  GPS was used for guidance, but it was just at an experimental stage, not a proper autonomous bot, and rather wayward in its navigation abilities.   Way back, Bill was going to do an article on GPS RTK and I put things on hold, but various events of course meant that article never got done and my bot sat forlornly on the shelf.

The bunnies then had a bumper year, it seems, and my wife got rather annoyed with them chomping her flowers and scraping their little holes in our garden, so I had to cull them a bit by shooting some.   I did get a DJI Mini drone that is good both for surveying the property surrounds and indeed for giving the bunnies a scare, but they come back, and I keep them in check with my air rifle.

But there on the shelf sat the old bot, and whilst it would not really be of much use for bunny chasing, I thought I would have a go at getting an autonomous bot working, just for the challenge of it and to finish off what I started.   So I got a nice new motor controller, thought I might have a play with one of the new AI cameras mounted on it, and started out anew.   But I found myself dabbling at this task here and there, and thought that unless I found a decent amount of time to spend on it, it would not get anywhere very fast.  So I've temporarily put it to one side until I can dedicate a good two or three weeks to making a decent start on this project.

But until then I do a bit of planning in idle moments on how the project should go.  The first thing is not to get too ambitious at first, and to do the project in easy (!?) steps.   As I mentioned, I've decided it's back to GPS RTK for finding out where in the world the bot currently is and for setting its next target position.  The plan is then to proceed with the first small step: aim to make the current skid chassis bot do some accurate 'patrol' routes.  Just that; it won't be a proper bot to start with.  Then I'll probably start over again with a bot chassis that has Ackermann steering, adapted from one of those remote-controlled model cars.  After that, augment the bot with obstacle avoidance and communication abilities etc. and progress along the path of producing a proper autonomous bot. I'm not expecting all that will be done in 2 or 3 weeks, of course.
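For the GPS patrol step, each leg of a route boils down to "how far away is the next waypoint, and on what heading". A sketch of that calculation in Python (the function name is mine, purely illustrative; RTK just makes the input fixes centimetre-accurate):

```python
import math

def distance_and_bearing(lat1, lon1, lat2, lon2):
    """Great-circle distance (metres) and initial bearing (degrees, 0 = north)
    from fix (lat1, lon1) to waypoint (lat2, lon2), both in decimal degrees."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    # Haversine formula for the distance
    a = math.sin(dlat / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2
    dist = 2 * R * math.asin(math.sqrt(a))
    # Initial bearing, normalised to 0..360 degrees
    y = math.sin(dlon) * math.cos(p2)
    x = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dlon)
    bearing = (math.degrees(math.atan2(y, x)) + 360.0) % 360.0
    return dist, bearing
```

The patrol loop would then just compare the bearing against the bot's current heading and steer to close the difference.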

So there you are, probably more than you wanted to know about the future plans for an ex bunny-chasing bot; it will become just another purposeless bot.  Oh, now I remember, I also want to give the bot the ability to fire a squirt of water.  Why?  Because it gives me a childlike delight. 😎



   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@byron 

In Australia they build rabbit proof fences.



   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@thrandell  FYI @byron

Hi Tom,

The robot base is back on the bench and I am ready to mess about with it again.

You asked in another thread about sharing code for going straight. I did have some ad hoc experiments but not a robust working solution worth sharing.

The idea was to use odometry to monitor the position and direction of the robot and apply corrections to bring it to some goal position, which could be a point along a straight line.

https://forum.dronebotworkshop.com/user-robot-projects/encoder-odometry-test/#post-49622
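A minimal version of that odometry update for a differential-drive base could look like this (the wheel_base parameter and the midpoint-heading approximation are my assumptions, not code from the linked post):

```python
import math

def update_pose(x, y, theta, d_left, d_right, wheel_base):
    """Dead-reckon a differential-drive pose from per-wheel travel since the
    last update (all lengths in the same units, theta in radians)."""
    d = (d_left + d_right) / 2.0               # distance moved by the centre point
    dtheta = (d_right - d_left) / wheel_base   # change in heading
    # Use the heading at the middle of the step for a better small-arc estimate
    x += d * math.cos(theta + dtheta / 2.0)
    y += d * math.sin(theta + dtheta / 2.0)
    theta = (theta + dtheta) % (2.0 * math.pi)
    return x, y, theta
```

Called every encoder-sampling interval, this gives the position and heading needed to steer back onto the goal line.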

It was actually your posts earlier that piqued my interest in playing with odometry code.

https://forum.dronebotworkshop.com/user-robot-projects/hide-and-seek-robots-request-for-comments/#post-21093

However as I have oft written I am interested in using visual navigation and will be concentrating on that for a while. A robot base can visually align with edges between the floor and walls and look for clear areas in front of the robot.

[image: floorView]
[image: floorWall]


   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@thrandell  FYI @byron

Looking back through this thread I see I did post code on going straight, where you start with the same PWM value for both motors and, using the encoders, one PWM value is adjusted automatically until both motors run at the same speed. In my case I set PWM1 at 200 and the other PWM adjusted itself to a value of 210. Thus with PWM1 set to 200, PWM2 ends up at 210 to maintain a straight path.

https://forum.dronebotworkshop.com/user-robot-projects/robot-base-new-build/paged/2/

However the logical method would be a speed control function using PID. That way you can simply set the speed of each motor to be the same and the PID control will make it so. There are examples on the internet on how to do this, including, I think, in this forum. Maybe, as you suggested, there is a PID library for the Arduino. In general I avoid libraries if I can write my own, but one may work for you if you need it in your swarm bots. It is something I need to find time to use myself 🙂
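The PID speed loop described above can be sketched in a few lines of Python (the class name, gains, and limits are illustrative, not from any post in this thread; on an Arduino the same logic would run in the encoder-sampling loop):

```python
class SpeedPID:
    """Minimal PID speed controller: encoder speed in, PWM out.

    No anti-windup is implemented, so keep the gains modest."""

    def __init__(self, kp, ki, kd=0.0, out_min=0.0, out_max=255.0):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_min, self.out_max = out_min, out_max
        self.integral = 0.0
        self.prev_err = None

    def update(self, target_speed, measured_speed, dt):
        """Return the PWM value to apply for this control interval."""
        err = target_speed - measured_speed
        self.integral += err * dt
        deriv = 0.0 if self.prev_err is None else (err - self.prev_err) / dt
        self.prev_err = err
        out = self.kp * err + self.ki * self.integral + self.kd * deriv
        # Clamp to the valid PWM range
        return max(self.out_min, min(self.out_max, out))
```

Run one instance per motor with the same target speed; the controller driving the weaker motor simply settles at a higher PWM (the 200-vs-210 effect), with no manual matching needed.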

 



   
THRandell
(@thrandell)
Brain Donor
Joined: 5 years ago
Posts: 302
 

Hi @robotbuilder

I’m glad to hear that your project is back on the workbench.  Due to the carelessness of a neighbor my garage ceiling was flooded, so I’ve temporarily lost the use of my workspace…

I guess that I misunderstood your idea of using encoder output to steer a robot. I had a hard time trying to figure out how you planned to do it.  At first I thought that it might work along the lines of a proportional controller, with the difference in the encoder outputs used to steer the motors in a straight line.  Then I started to think about how you could use that controller to turn, and I couldn't quite puzzle out how to do it.

I’ve been using odometry with my robots to control turns and distance travelled.  When I want to turn I determine the direction and the rotation in radians (the amount is fixed) and then monitor the change in theta.  After the turn is made I read the sensors again to see if the robot has cleared the obstacle, and if need be do it all over again.  I guess in your case, if you can determine via the camera your target's offset from your robot's centerline, then you could do the same thing.  Do you count pixels to determine where your target lies on the picture frame?
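The fixed-amount in-place turn above maps directly to wheel travel on a differential-drive base. A small sketch of that geometry (the function and parameter names are mine, purely illustrative):

```python
import math

def wheel_travel_for_turn(delta_theta, wheel_base):
    """Per-wheel travel for an in-place turn of delta_theta radians:
    left wheel backwards, right wheel forwards for a CCW (positive) turn.
    Both wheels trace an arc of radius wheel_base / 2 about the centre."""
    d = (wheel_base / 2.0) * delta_theta
    return -d, d
```

Commanding those distances (or monitoring theta until the encoders have accumulated them) gives the turn; the same relation run backwards is how odometry recovers the change in theta.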

With my swarm bots I’m also trying to get my robots to identify an object and move toward it.  It’s a whole different ball game from obstacle avoidance.  To get good turning precision I’d need a ton of sensors or maybe just two that move via servos, but at this point I’m happy with what I’ve got to work with.

 

Tom


To err is human.
To really foul up, use a computer.


   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@thrandell

Do you count pixels to determine where your target lies on the picture frame?

Not really. Perhaps I will start a thread showing how vision can be used to locate and navigate a robot. Encoders are not required because the robot can see where it is and the direction it is pointing. If it was a swarm bot it could see where other bots are located and their orientation. Vision is rich in information and you can pick and choose what bits to use.

 



   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@byron

@thrandell

Had time this weekend to modify the robot base so it now needs a new name 🙂

To enlarge an image right click on image and choose Open link in new window.

[image: botBase1]
[image: botBase2]

In the above I was testing code for one of my possible navigation methods using the target image which I posted about some time ago. The target size in the image gives an estimate of distance to the target, and the height-to-width ratio is a measure of the angle to the target. If the target is higher or lower than the camera, then that angle would also give a distance measure.
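Both estimates can be sketched with a simple pinhole-camera model (the focal-length-in-pixels parameter and the cosine-foreshortening assumption are mine; they are not necessarily the exact method being tested here):

```python
import math

def estimate_distance(real_width, pixel_width, focal_px):
    """Pinhole model: distance = f * W / w, where f is the camera focal
    length in pixels, W the real target width, w its width in the image."""
    return focal_px * real_width / pixel_width

def estimate_view_angle(pixel_height, pixel_width, true_ratio=1.0):
    """A flat circular or square target viewed off-axis is foreshortened
    sideways: apparent width / height ~= cos(angle), so angle = acos(ratio).
    true_ratio is the head-on width/height ratio of the target."""
    ratio = (pixel_width / pixel_height) / true_ratio
    return math.acos(max(-1.0, min(1.0, ratio)))
```

For example, a 0.2 m target that spans 100 px with a 500 px focal length comes out at 1.0 m; a square target appearing half as wide as it is tall implies roughly a 60-degree viewing angle.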

[image: target1]

 



   
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

Posted by: @robotbuilder

Had time this weekend to modify the robot base so it now needs a new name

Message from the Robot Devil: Call it "VanityBot"


The one who has the most fun, wins!


   
byron
(@byron)
No Title
Joined: 7 years ago
Posts: 1217
 

@robotbuilder  stall-e has shrunk, now it's a neel-e.  No place to rest a weary bum. You may be better off putting your laptop on a desk and communicating with a nice Raspberry Pi (running Python of course) on your bot via wifi.  Bending down that low in old age is likely to cause some damage 😎   Good progress, hope to see more, but I recommend you find a good masseur, just in case...



   
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@byron 

I had thought about using my other laptop to connect to the onboard computer for remote control, but am not sure how to go about it. I know there is software that allows someone anywhere in the world to take control of your computer via the internet.

I noticed they also talk about a "headless" RPi controlled from your own computer, but again I haven't got around to finding out how to go about it; I might do so if I get back to using the RPi.

The RPi I have now is probably too slow to run my software.

 



   
THRandell
(@thrandell)
Brain Donor
Joined: 5 years ago
Posts: 302
 

Posted by: @robotbuilder

Had time this weekend to modify the robot base so it now needs a new name 🙂

Well, this is a bit tougher to name than Stool-e.  Still unused are Oakenshield and Evergreen?!

It reminds me of a Suit Valet my Dad used when I was a kid.

[image: vintage suit valet]

Since it has but one eye here are my suggestions:

Valet

Cyclops

Odin


To err is human.
To really foul up, use a computer.


   
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

Posted by: @thrandell

It reminds me of a Suit Valet my Dad used when I was a kid.

"Oh look, dear! Mr. Randell is getting dressed in the driveway again."

 


The one who has the most fun, wins!


   
THRandell reacted
robotBuilder
(@robotbuilder)
Member
Joined: 7 years ago
Posts: 2392
Topic starter  

@byron 

This is not the final robot base; I change it to better serve its purpose as I play with motor control and sensor inputs, so giving it a name based on appearance is probably a bit premature.

 

@byron

In Bill's thread on Two Raspberry Pi AI Cameras you wrote:

I stuck just my finger into the camera view and it recognised it as a person

Shouldn't it have recognized a finger?

https://sandeep-bhuiya01.medium.com/disadvantages-of-cnn-models-95395fe9ae40

https://www.engati.com/glossary/convolutional-neural-network

I did 3D print an orange disk to make a start on seeing how I could get this simple object to be picked out by the camera

You could probably do that with OpenCV, no teaching of a neural net required.

 



   
byron
(@byron)
No Title
Joined: 7 years ago
Posts: 1217
 

Posted by: @robotbuilder

I change it to better serve its purpose

And there was I thinking you must have chopped a lump off for a bit of fire kindling.  

Posted by: @robotbuilder

Shouldn't it have recognized a finger?

The recognition of a person from a finger is a good example of how the 'models' are built up and can be made to recognise the whole from just a small part.  But I see no problem if you want to train your detection models to include a Robo finger.  The Robo finger writes, and having writ, moves on.

Posted by: @robotbuilder

You could probably do that with OpenCV, no teaching of a neural net required.

These object detection algorithms and OpenCV work hand in hand.  OpenCV on its tod can certainly recognise a disk, though it seems a bit more of a convoluted process to get it to recognise the disk at varying distances and angles etc.  My understanding is that this goal is better served by the latest object detection algorithms using a learning or teaching process.  But regardless, it's best to start off with a very simple object to see what the process involves.  I think lots of pictures have to be taken, and a flavour of what's involved can be seen in:

There are also good examples to be found of using just a normal camera with OpenCV and the likes of the Ultralytics YOLO object detection modules.  But the processor needs a bit of grunt just to do the object detection (and I'm not talking about the training process, which will need the likes of a proper PC). It seems the likes of an RPi 4 would struggle a bit, and an RPi 5 would be best.  But that's where the RPi AI camera comes into the picture, as it does all the grunt work and you can just use the Pi to sieve through the feedback of what's been detected, as per the example programs that @DronebotWorkshop showed in his video.  If the process of getting the object recognised on the AI camera needs a neural network algorithm, then teaching the object recognition via a neural network is what must be done.



   
robotBuilder reacted
TFMcCarthy
(@tfmccarthy)
Member
Joined: 1 year ago
Posts: 435
 

@robotbuilder, @byron

I'm coming late to this project, and I've been reading through the thread to try and catch up. I gotta say, tracking this thread is a real workout! I thought I was no "slouch" to AI, but you fellers cover a whole bunch of ground and jump off in different directions at the drop of a hat! It's been difficult to keep up; I'm winded. And I'm still 45 posts behind!

@byron

re: "Raspberry Pi AI Camera - Deep Dive" link

Do you know if he's using "Visual Studio" or "Visual Studio Code"? He says it's "Visual Studio", but I think it's "Visual Studio Code" because he can upload to the camera. If it's the former, I want to find out how he did it.


The one who has the most fun, wins!


   
Page 5 / 9