FYI @byron
It looks as if your "wooden stool" is re-purposing the motorized wheels and encoders from a robot vacuum cleaner. Did you ever figure out your problem with getting 800 clicks per wheel rotation?
Tom
Not really. I gave up on those motors as I couldn't get consistent encoder readings, but the encoders on the motorized wheels I am now using seem to be working perfectly. The motor speeds of both motors (number of encoder readings per unit of time) for a given PWM value are also close.
I wrote too soon! When I tried to use the encoder counts to get the robot to travel some desired distance I found problems.
When I turned the wheel around once by hand the software encoder count was 564.
When I turned the encoder disc by hand it took 47 turns of the disc for the wheel to make one rotation. I don't know how that turns into 564 apart from 47 x 12 = 564.
If I wanted the wheel to rotate a certain number of times I assumed the target would be some fixed encoder count multiplied by the number of rotations. So I experimented with the count value needed to get the wheel to rotate once. Using the code below it turned once with a value of 520, not the 564 I expected. This was consistent in that it always turned once with that value; it was not erratic in its counts per single rotation.
// if button1 being pressed start motor
if (digitalRead(btn1) == 0){
while (digitalRead(btn1) == 0){} // wait for press release
counter1 = 0; // zero encoder count
startMotor(240); // calls function to start motor with PWM=240
while (counter1 < 520 ) {} // rotate until counter1 = 520
stopMotor();
Serial.println(counter1);
}
So I thought, ok, I can turn it a given number of times by using a variable and multiplying by the 520.
int ROTATIONS = 5;
while (counter1 < (520 * ROTATIONS)) {} // rotate until counter1 = 520 * ROTATIONS
However that was not the case. I had to increase the count value for each number of desired rotations!
520 * 1 = 1 rotation
542 * 2 = 2 rotations
548 * 3 = 3 rotations
552 * 4 = 4 rotations
554 * 5 = 5 rotations
555 * 6 = 6 rotations
556 * 7 = 7 rotations
557 * 8 = 8 rotations
557 * 9 = 9 rotations
I could see a pattern and this equation seemed to compute the above required value for a given number of rotations.
( (560 * rotationCount) - 40/rotationCount) / rotationCount
What I can say is that the results are consistent: for any given encoder count the wheel always turns the same amount. Comparing the encoder counts can still be used to keep the robot travelling straight, but this could be a problem for encoder-based odometry.
The encoders use two hall effect sensors and thus can provide information as to the direction the wheel is rotating. I just use the one encoder signal from each motor as I know the direction.
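If I ever need the direction as well, a minimal sketch of reading the second hall sensor would be something like this (the pin numbers are just placeholders, not my actual wiring):

// Sketch only: on each rising edge of channel A the level of channel B tells
// you which way the wheel is turning, so the count can go up or down.
const int ENC1_A = 3; // interrupt-capable pin (placeholder)
const int ENC1_B = 4; // second hall sensor (placeholder)
volatile long counter1 = 0;

void encoder1() {
  if (digitalRead(ENC1_B) == HIGH) counter1++; // turning one way
  else counter1--;                             // turning the other way
}

void setup() {
  pinMode(ENC1_A, INPUT_PULLUP);
  pinMode(ENC1_B, INPUT_PULLUP);
  attachInterrupt(digitalPinToInterrupt(ENC1_A), encoder1, RISING);
}

void loop() {}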
Maybe it has something to do with a delay of some type in counting pulses after starting the motors.
However, with a workaround the encoders seem to give consistent if strange results, and I can use them for speed control of the motors.
Hope to work on the robot base some more this weekend.
Well, that’s an interesting result. I’m assuming that your testing is done with no load on the motors. I’m not sure what to make of it, and it’s the same for both motors…strange.
For what it's worth my approach to odometry has always been to use the robot's wheel base, the distance per click and the number of clicks per wheel revolution. With that you can figure out the x and y location and the pose. To figure out distance per click I need the wheel diameter, the motor's gear ratio, and the encoder clicks per revolution. Here's the formula for distance per click: pi * wheel diameter / (gear ratio * encoder clicks per revolution), with 12 clicks per revolution in this case. If your wheel diameter is in millimeters then the result is in millimeters per click. Your gear ratio seems to be 47:1, so you could try the same approach if you wanted to go down that rabbit hole.
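As a rough sketch in code (the 65 mm wheel diameter is just a placeholder, and the 47:1 and 12 clicks are the figures guessed at above):

// Rough sketch of distance-per-click; every number here is an assumption to be measured.
const float WHEEL_DIAMETER_MM    = 65.0; // placeholder, measure your wheel
const float GEAR_RATIO           = 47.0; // guessed from the 47 turns of the encoder disc
const float CLICKS_PER_MOTOR_REV = 12.0; // encoder clicks per motor revolution
const float CLICKS_PER_WHEEL_REV = GEAR_RATIO * CLICKS_PER_MOTOR_REV;       // = 564
const float MM_PER_CLICK = (PI * WHEEL_DIAMETER_MM) / CLICKS_PER_WHEEL_REV; // millimeters per click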
As far as plotting a straight path for a robot I got started by mapping each motor’s speed to the same PWM value instead of counting wheel rotations and I haven’t tried anything else. Does your equation work the same for each motor at different PWM values? Good luck this weekend.
Tom
To err is human.
To really foul up, use a computer.
Those encoder readings, with the disparity between each wheel revolution, don't seem logical. I expect you've probably already thought of and tried some of the suggestions I make below, but just in case. 🙂
I've been using encoder readings myself over the past few weeks. Two of the 4 motors on my bot had encoders and I was reading the encoder outputs to see how those two motors, for the same PWM input, would vary in their respective encoder counts (which I think is to be expected from this sort of cheapish hobby motor) and then having a go at adjusting the PWM to one or other of the motors to synchronise their encoder readings. So my slant on playing with encoders is not quite what you are currently doing, but nevertheless encoders are fresh in my mind.
The first thing I would check is to record the signals from both encoder sensors on the motor to see if they give the same results. Also, are the results going forward and in reverse consistent with your forward-only results?
When you stop the motors in your code, whilst the motor appears to come to an immediate halt, maybe it 'coasts' on for a tick or two. I expect you have marked your wheel to see if it's starting and stopping on the mark, but are your marking arrangements sufficient to detect if the wheel settles just one tick out (if you see what I mean)? Likewise, when starting the motors, are they always consistent, to the last encoder tick, in how quickly they get turning for a given voltage input? How reliable and consistent is your voltage power source? Does it make a difference if you immediately try to drive the motors at full voltage compared to driving with a low voltage?
Also, what happens if the motor, at its stationary position at startup, just happens to be directly against one of the hall effect sensors? Is there the possibility of a bunch of readings, or does the count only start when the first transition from high to low (or vice versa) is obtained? How would the encoder reading in that scenario differ from a starting position in between one of those encoder 'ticks'? It should just be one tick different at the most (logically).
Anyway, just some random thoughts, but when I try to emulate intelligence by donning some false Spock ears, your readings don't seem logical. Surely something's up, Doc. I'll be interested to see what your further tests and trials may reveal.
I’m assuming that your testing is done with no load on the motors.
Yes, I had the robot base lying on its back on the desk. However I just tried the same settings with the robot base running on the floor, with completely different results! I need the robot base to self-adjust until the wheel counters are at the desired ratio, which for driving straight is 1:1. I have had some success at doing that in the past and will post the code when I am happy with it. The actual speed (PWM) is not relevant to distance travelled; the encoder counts per distance (wheel revolution) should be the same no matter how fast the wheels turn.
The item on the right of the robot base is the stool I sit on to adjust the software just so you can see the difference 🙂
Out of curiosity I used the compass on my mobile phone to watch the changing of direction. I need to try a compass module as the mobile phone compass worked very well.
As I think I might have mentioned before: the goal is to write low level functions for the Arduino board to read sensors and turn on motors and then call them from an easy-to-edit higher level language on a PC (or RPI). This is kind of essential when processing more complex visual data.
John
Coasting was an issue I thought was the problem last time I was trying to get accurate encoder values. I imagined that for a given speed (PWM value) the coasting would be the same, meaning I could just subtract a value from the final count, although I guess the faster it is travelling the longer it would take to slow down. I think you can brake the motors; I used to do that on my old robot simply by shorting out the motors via a high wattage resistor. Maybe it should slow down as it approaches the final count. More experimenting to do.
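Something like this is what I have in mind for slowing down near the target, using the startMotor/stopMotor functions from earlier (just a sketch, the PWM values and the 100-count margin are guesses):

// Sketch only: drop to a low PWM for the last part of the move so there is
// less coasting when the motor is finally switched off.
void rotateToCount(long targetCount) {
  counter1 = 0;
  startMotor(240);                        // full speed for most of the move
  while (counter1 < targetCount - 100) {} // until roughly 100 counts from the target
  startMotor(120);                        // then crawl the rest of the way
  while (counter1 < targetCount) {}
  stopMotor();
  Serial.println(counter1);               // see how far it actually coasted past
}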
Coasting might be it; it seems close if 40-odd pulses are added after the motors are turned off.
560 - 40 = 520, or 520 counts per one turn (520/1)
1120 - 40 = 1080, or 540 counts per two turns (1080/2)
...
5600 - 40 = 5560, or 556 counts per ten turns (5560/10)
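If that is what's happening, the work around could be as simple as stopping the motor early by the coasting allowance (a sketch only; the 560 and 40 are just the figures above and would no doubt change with speed):

// Sketch of the coasting-allowance idea: aim for 560 counts per wheel turn but
// switch the motor off about 40 counts early and let it coast onto the target.
const long COUNTS_PER_REV  = 560; // assumed true counts per wheel revolution
const long COAST_ALLOWANCE = 40;  // pulses that keep arriving after stopMotor()

void rotateWheel(int rotations) {
  counter1 = 0;
  startMotor(240);
  while (counter1 < (COUNTS_PER_REV * (long)rotations) - COAST_ALLOWANCE) {}
  stopMotor();                    // the wheel should coast the last 40 or so counts
  Serial.println(counter1);
}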
Yes, I marked the wheels and can judge fairly well if they keep returning to the same position for a given number of rotations, which is what I did to get the above figures. Rotate a number of times, and if the assumed encoder count was wrong the stop position would slowly drift away from the starting position with each rotation. It appears to be spot on.
The voltage (PWM) is not relevant to how far the wheels turn (encoder counts), only how fast they turn. What matters is the relative count ratio per unit of time between the two differential drive wheels.
Thanks for all the suggestions I will take them into consideration.
The item on the right of the robot base is the stool I sit on to adjust the software just so you can see the difference
LOL
To err is human.
To really foul up, use a computer.
What I did was set PWM1 in turn from 100 to 250 and then adjusted the PWM2 for each of those values until both wheels turned at the same rate.
Motor 2 ran faster going forward than motor 1 for the same PWM value.
Motor 1 ran faster going in reverse than motor 2 for the same PWM value!
So for example:
forward if PWM1 = 230 then PWM2 = 223 to get them turning at about the same rate.
reverse if PWM1 = 230 then PWM2 = 252 to get them turning at about the same rate.
The relationship between the PWM1 value and the PWM2 values for any given speed wasn't completely linear.
In the code I used to find those relationships, I set PWM1 and PWM2 to the same value and while the motors were running the code would auto adjust PWM2 until the changes in counter1 and counter2 values were the same.
This actually resulted in the robot base eventually driving fairly straight although it would turn at the beginning because the PWM values were the same but the wheel speeds were not.
I used my mobile phone to determine how straight it was travelling, to within one degree of accuracy. It was also good for seeing how many degrees the robot base rotated. I will have to add a compass reading sensor for sure!!
This is the actual code I used to get the relative PWM values. I would give the PWM1 and PWM2 the same values from 250 to 100 and run each in turn.
// Motor A
const int enA = 11;
const int in1 = 10;
const int in2 = 9;
// Motor B
const int in3 = 7;
const int in4 = 6;
const int enB = 5;

volatile long counter1 = 0;
volatile long counter2 = 0;
int PWM1 = 100;
int PWM2 = 100;

// button pins assignment
const int btn1 = 24; // press button
const int btn2 = 26;

// encoder pins assignment
const int ENCA = 3; // pin of encoder1
const int ENCB = 2; // pin of encoder2

void forwardA(int rate){
  digitalWrite(in1, HIGH);
  digitalWrite(in2, LOW);
  analogWrite(enA, rate);
}

void forwardB(int rate){
  digitalWrite(in3, HIGH);
  digitalWrite(in4, LOW);
  analogWrite(enB, rate);
}

void reverseA(int rate){
  digitalWrite(in1, LOW);
  digitalWrite(in2, HIGH);
  analogWrite(enA, rate);
}

void reverseB(int rate){
  digitalWrite(in3, LOW);
  digitalWrite(in4, HIGH);
  analogWrite(enB, rate);
}

void turnOffMotorA(){
  digitalWrite(in1, LOW);
  digitalWrite(in2, LOW);
}

void turnOffMotorB(){
  digitalWrite(in3, LOW);
  digitalWrite(in4, LOW);
}

void turnOffMotors(){
  digitalWrite(in1, LOW);
  digitalWrite(in2, LOW);
  digitalWrite(in3, LOW);
  digitalWrite(in4, LOW);
}

void setup() {
  // Set all the motor control pins to outputs
  pinMode(enA, OUTPUT);
  pinMode(enB, OUTPUT);
  pinMode(in1, OUTPUT);
  pinMode(in2, OUTPUT);
  pinMode(in3, OUTPUT);
  pinMode(in4, OUTPUT);

  // buttons pinMode
  pinMode(btn1, INPUT_PULLUP);
  pinMode(btn2, INPUT_PULLUP);

  pinMode(ENCA, INPUT);     // set as input
  digitalWrite(ENCA, HIGH); // enable internal pull-up resistor
  attachInterrupt(digitalPinToInterrupt(ENCA), encoder1, RISING); // interrupt initialization
  pinMode(ENCB, INPUT);     // set as input
  digitalWrite(ENCB, HIGH); // enable internal pull-up resistor
  attachInterrupt(digitalPinToInterrupt(ENCB), encoder2, RISING); // interrupt initialization

  counter1 = 0;
  counter2 = 0;

  Serial.begin(9600);
  Serial.println("Starting up");
}

// interrupt service routines for reading the encoders
void encoder1() {
  counter1 = counter1 + 1;
}

void encoder2() {
  counter2 = counter2 + 1;
}

int cycleCount;
int totalCounter1;
int totalCounter2;

void loop() {
  // if button1 being held down run motors
  if (digitalRead(btn1) == 0){
    totalCounter1 = 0;
    totalCounter2 = 0;
    cycleCount = 0; // run for x cycles
    counter1 = 0;
    counter2 = 0;
    while (cycleCount < 300){
      if (counter2 > counter1 + 1){
        PWM2 = PWM2 - 1;
        totalCounter1 = totalCounter1 + counter1;
        totalCounter2 = totalCounter2 + counter2;
        counter1 = 0;
        counter2 = 0;
      }
      if (counter2 < counter1 - 1){
        PWM2 = PWM2 + 1;
        totalCounter1 = totalCounter1 + counter1;
        totalCounter2 = totalCounter2 + counter2;
        counter1 = 0;
        counter2 = 0;
      }
      reverseA(PWM1);
      reverseB(PWM2);
      Serial.print(counter1);
      Serial.print(" ");
      Serial.print(counter2);
      Serial.print(" DIFF = ");
      Serial.print(counter2 - counter1);
      Serial.print(" PWM1 = ");
      Serial.print(PWM1);
      Serial.print(" PWM2 = ");
      Serial.println(PWM2);
      cycleCount = cycleCount + 1;
    }
    Serial.println("=========");
    Serial.print("totalCounter1 = ");
    Serial.print(totalCounter1);
    Serial.print(" totalCounter2 = ");
    Serial.println(totalCounter2);
    Serial.println();
    turnOffMotorA();
    turnOffMotorB();
  }
}
You've made a good start with Stool-e. 👍
The encoder reading variations for the same PWM for two different motors are what I have also noticed. A few years ago when I last dabbled with getting my bot to wiz around the outdoors I found that the bumpy surface conspired to make any attempt to get the bot to run straight via encoder input insufficient for the task, and the use of IMU, compass bearing and GPS readings got it to go in the right direction, albeit in a still rather wobbly path. I also had some experiments with using a camera with OpenCV to pick out objects to aid navigation, but did not progress that far with it. That project then got shoved into a project box to await a time when I could devote more to it.
A couple of months ago the bot came out of the box for another dabble. My initial test of the PWM and encoder readings was to see if there was a consistent and predictable variation between the motors, to then apply the noted bias when driving the motors, albeit that this bias may have to be adjusted for the actual speed at which it's currently being driven. To send the bot along a desired course I will be looking for other navigation inputs as before. To start with I'm looking at what I may be able to do with an attached AI camera. The AI stuff is all new to me and I've only dipped a little pinky toe into that soup. Create a neural network, quantise and compress, convert to camera format, package into firmware for camera. Yes well, a good few months before I even understand what all that means. Eventually I may get it to recognise my 3d printed round orange disk, maybe. 😲
I rather expect that you will find similar challenges with the usefulness of encoder feedback as you progress, though I expect it will be much more consistent indoors on smooth surfaces. Challenges will arise when driving the motors both from encoder feedback and from IMU and compass bearing signals. I think you may well find that whilst initial runs seem to indicate the bot can be sent correctly along a desired path, the cumulative effect of continually running the bot with many turns and direction changes will eventually mean the bot loses its sense of where it is in its universe. Some other form of identifying the position of the bot within its operational space may be required for accurate autonomous navigation. But that's all part of the fun.
It will be interesting to see how Stool-e transmutes to an autonomous bot. You should have a behaviour goal in mind for Stool-e, like getting it to recognise what chair you're sitting in and then to move and park itself by the appropriate chair. Without a specific end goal I fear the robot mojo may desert you again and I'm hoping to see how your project develops to maturity. 😍
Encoder feedback is more about controlling and monitoring the motorized wheels not about navigating the real world.
The important thing is that the software can auto adjust things like the PWM values as a result of error signals from other sensors. In biological systems it is called proprioception.
Proprioception is the use of joint position sense and joint motion sense to respond to stresses placed upon the body by alteration of posture and movement.
To start with I'm looking at what I may be able to do with an attached AI camera. The AI stuff is all new to me and I've only dipped a little pinky toe into that soup. Create a neural network, quantise and compress, convert to camera format, package into firmware for camera.
Vision as you probably remember is my main interest as a robot sensor.
Yes well, a good few months before I even understand what all that means.
What it means is you have to take lots of images of the thing you want the AI (neural network) to recognize and send them to a service that will train a neural network to detect the probability that the thing and its position exists in any image given to it.
As you use Python there are lots of examples of how to do it, including using pretrained classifiers.
https://www.geeksforgeeks.org/detect-an-object-with-opencv-python/
I still think in terms of working out a coded solution rather than using a trained neural network. You can probably remember the target example?
In the above link they show recognizing a stop sign.
I would do that using the same techniques as I used in the target example. Blob the image. Extract blobs. Recognize the outline shape and read out the text STOP. As with the target example, knowing the dimensions of the stop sign you can get an estimate of the distance and angle to that sign.
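For the distance and angle part it is just the usual pinhole camera relation, something like the sketch below (the sign width, focal length in pixels and field of view are all placeholder numbers that would have to be calibrated):

// Sketch only: with a known real-world width and a calibrated focal length (in
// pixels), the apparent width of the detected blob gives a range estimate.
float distanceToSign(float pixelWidth) {
  const float SIGN_WIDTH_MM       = 600.0; // placeholder real width of the sign
  const float FOCAL_LENGTH_PIXELS = 800.0; // placeholder, found by calibration
  return (SIGN_WIDTH_MM * FOCAL_LENGTH_PIXELS) / pixelWidth; // distance in mm
}

// Rough angle off the camera axis from the blob centre, again with a placeholder field of view.
float angleToSign(float pixelX, float imageWidth) {
  const float HFOV_DEGREES = 60.0; // placeholder horizontal field of view
  return ((pixelX - imageWidth / 2.0) / imageWidth) * HFOV_DEGREES;
}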
But once the target is recognized action must be taken and that requires motor control.
Without a specific end goal I fear the robot mojo may desert you again and I'm hoping to see how your project develops to maturity. 😍
Past behavior is the best predictor of future behavior so ...
I will drop the encoder stuff for a while and see if I can get the robot base to orientate to a visual target and move toward it as a first task. In theory navigating is moving from target to target. I don't see any project developing to maturity; it is just having fun trying to get simple things to work.
Sometimes you get surprise results. Today I tried to write some code to get the robot base to keep on track by adjusting the PWM values when the encoder counts were different. The robot base kept overshooting the target path and ended up doing a complete roundabout and heading off in another direction!
I did some experimenting with OpenCV for my bot project a few years ago. It's good at recognising particular shapes in the current camera image from a pre-made comparison image that it's programmed to search the current image for. I think you followed my code to recognise a sauce bottle.
But, on the likes of an older RPi, it's not so good at giving a speedy response when searching for lots of different images of an object taken from a bunch of different angles. This can be mitigated by keeping the image something simple like a sphere or even your sauce bottle. Also I think OpenCV was too slow with colour recognition and its comparisons were all done by first putting the images into black and white.
In my experiments on this, which were on a small indoor bot, I used a circular disk placed on the bot and an overhead camera. This worked very well but of course is not much use on an outdoor bot, and the overhead camera was a temporary test rig affair. I was thinking of taking this a bit further and, whilst limiting the indoor bot tracking to one room, have three cameras identify an object on the bot that looked similar from any direction and then calculate the bot's location via triangulation, but did not get around to doing much with that idea at the time.
I'm hoping the AI accelerated camera will prove a step up from the RPi with just OpenCV. Speed of object recognition is of course paramount and I will see what it may be capable of. I'm hoping colour recognition will be easily achievable. For my outdoor bot a nice blob of orange should be easily identifiable, and together with the known sector and the direction of the bot its location can be calculated quite precisely.
But as I said, I've a way to go with the steps to follow for the particular AI camera for the RPi I recently obtained. It does come with some examples ready to test that will recognise objects like people and chairs. In fact when I first tried it I was not in the camera's picture frame, but when I simply waved just my hand in view of the camera, it immediately bounded my hand with a box and said 'person' on the screen. So instead of a spherical orange object for the AI to spot I could probably place a glove on it to look like a hand and make a quick start, although dear wife will probably wander into the picture and get identified as a bot.
This is the AI camera I'm playing with.
https://www.raspberrypi.com/documentation/accessories/ai-camera.html
I hope your visual target tracking goes well, keep us updated on the progress.
I think you followed my code to recognise a sauce bottle.
Yes I remember your sauce bottle. I think that was the last time I played with Python
https://forum.dronebotworkshop.com/user-robot-projects/robot-navigation/paged/3/
I'm hoping colour recognition will be easily achievable.
Also I think OpenCV was too slow with colour recognition and its comparisons where all done by first putting the images into black and white.
Color segmentation is easy. I don't know why OpenCV would be so slow. I use a video capture dll and process the images in my own code and it works in real time to segment images by color.
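By color segmentation I just mean a per-pixel test, something like the sketch below (written for a raw RGB buffer; the thresholds are only guesses for a bright orange):

// Sketch of brute-force color segmentation on a raw RGB buffer: mark every
// pixel that is "orange enough". The thresholds are guesses and would need tuning.
#include <cstdint>
#include <vector>

std::vector<uint8_t> segmentOrange(const uint8_t* rgb, int width, int height) {
  std::vector<uint8_t> mask(width * height, 0);
  for (int i = 0; i < width * height; ++i) {
    uint8_t r = rgb[3 * i], g = rgb[3 * i + 1], b = rgb[3 * i + 2];
    if (r > 150 && g > 60 && g < 140 && b < 80) { // crude orange test
      mask[i] = 255;
    }
  }
  return mask;
}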
One of Rene Bakx's robots uses vision to find its charger.
https://www.youtube.com/channel/UCp0v7urX-f5PYcoTyy33t5Q/videos
This one uses AI and provides more than just a video demo.
https://www.instructables.com/Object-Finding-Personal-Assistant-Robot-Ft-Raspber/
Still working on encoder code to get the motors to work together.
You remark you have not done any Python programming since your short delve into OpenCV some years back, but if vision is your main interest for a robot sensor then maybe a brush up would be to your advantage. I remember RoboPi was learning Python specifically as he was interested in vision and sound for his bot.
I have not done anything with OpenCV since then either, and not really done much programming at all until quite recently, so my remarks about OpenCV were a bit off the top of my head. Undoubtedly OpenCV can process colour but the OCV routines for template matching that I used required grey scale (not black and white as I put).
In the link to an example of homing the robot where 3 coloured circles are used, the code may well have simply converted the target images to grey scale, as indeed I did in my experiments where I used a coloured disk on the bot, but the colour had nowt to do with the image recognition. But in the video the process of homing the bot was rather painfully slow, so who knows what the code was up to.
In the other link you posted the rpi was linked to a 'coral USB Accelerator' which I looked up and the info on this dongle states:
"The on-board Edge TPU coprocessor is capable of performing 4 trillion operations (tera-operations) per second (TOPS), using 0.5 watts for each TOPS (2 TOPS per watt). For example, it can execute state-of-the-art mobile vision models such as MobileNet v2 at almost 400 FPS, in a power efficient manner."
So the AI in that link needed the assistance of an external AI processor, so I think I'm on the right tack in going with an AI type of camera for the vision processing on my bot.
On the link to the homing bot, I did rather like the idea of his charging station. That's an idea to take on. 👍
When adjusting the PWM for the motors on my bot I found I could get close, but there was always some jitter in the readouts. I took the motor with the highest rpm count as the target and adjusted the PWM sent to the other motor up or down.
In the snippet I show below the encoder readings relate to revolutions per second of the wheel, extrapolated from the revolutions per second of the motors which incorporate the encoders. I'm not sure I have the numbers in this extrapolation correct, but they are consistent so serve the purpose. As you see the algorithm is continually adjusting the PWM of the left motor to try to keep all in sync. The encoder readings and PWM adjustments are made once per second in this example.
PWM R:0.25 PWM L:0.25 Encoder: Right 0.0| Left 0.0 | Difference 0.0
PWM R:0.25 PWM L:0.25 Encoder: Right 0.0328905| Left 0.02471981 | Difference 0.008170685
PWM R:0.25 PWM L:0.26 Encoder: Right 0.1366357| Left 0.1116394 | Difference 0.02499625
PWM R:0.25 PWM L:0.27 Encoder: Right 0.1674848| Left 0.143921 | Difference 0.02356377
PWM R:0.25 PWM L:0.28 Encoder: Right 0.1802505| Left 0.1555037 | Difference 0.02474676
PWM R:0.25 PWM L:0.29 Encoder: Right 0.1869164| Left 0.1765871 | Difference 0.01032932
PWM R:0.25 PWM L:0.3 Encoder: Right 0.1957561| Left 0.1986513 | Difference -0.002895132
PWM R:0.25 PWM L:0.29 Encoder: Right 0.1918724| Left 0.1900508 | Difference 0.001821592
PWM R:0.25 PWM L:0.3 Encoder: Right 0.1883004| Left 0.2010686 | Difference -0.01276822
PWM R:0.25 PWM L:0.29 Encoder: Right 0.1935491| Left 0.1986914 | Difference -0.005142257
PWM R:0.25 PWM L:0.28 Encoder: Right 0.1926192| Left 0.1918466 | Difference 0.000772655
PWM R:0.25 PWM L:0.29 Encoder: Right 0.1930293| Left 0.1932627 | Difference -0.0002333671
PWM R:0.25 PWM L:0.28 Encoder: Right 0.1892724| Left 0.1961884 | Difference -0.006916031
As you see the PWM of the left motor has to be constantly varied, by between about 4% and 20% each second, to keep the wheels running in a rough straight line. Of course this only relates to my motors, but this sort of variation could mean a bit of a meandering track.
As I did not think I would be using encoder data to keep my bot on its trajectory I've not done much with this data yet, but I will do some more tests, both to get an average PWM difference that I could simply apply when driving the motors, and to see if all those + and - differences get to somewhere near zero or if there is still some bias to sort out. Whatever PWM signal is applied there will undoubtedly be some variation in the actual voltage the electronic circuits deliver for any given PWM level, and that may not be as linear as expected. That's another thing to check out. I guess an oscilloscope may well come in handy. I was going to get one, but a new computer is my current Christmas present. 😀
But in video the process of homing the bot was rather painfully slow so who knows what the code was up to.
Yes, I couldn't find any reference to the code used. It appears to detect circular areas (I noticed it circled his camera lens).
You do not of course need to learn or use Python to use OpenCV as it can be used with many languages. Also simple detection of high contrast images does not require loading a bloated computer vision library. For example I don't need OpenCV to detect and recognize circles or coloured discs.
I think I'm on the right tack in going with an AI type of camera for the vision processing on my bot.
It will save you worrying about how to code your own solution to detect some visual target.
Still testing and experimenting. Will add this little module to see if it works as well as my mobile phone in determining direction with the advantage of the data being accessible by the software.
FYI @byron
In order to drive and steer a robot base with two motorized wheels with encoders, I have used the encoder values to adjust the PWM value for each motor, applying the right amount of power to control the speed of each motor.
But the encoders can also be used to estimate where the robot base is on an x,y axis and the direction theta it is pointing. It needs a starting x,y value and a direction theta. In this case it is x=0, y=0 and theta = 0 degrees to start.
If the robot base is moving in a straight line the x value should change with distance travelled, the y value and theta value would remain constant. In practice unless the motors are synchronized to the same speed the robot base will veer left or right shown by the y value increasing or decreasing and the direction theta changing as well.
So in the code below the function updatePOSE() uses the change in encoder readings to compute an estimation of where the robot should be and in what direction it is pointing.
Of course over time the POSE of the robot will drift away from the estimated position and some external means will be needed to reset it.
A point to make is that I use the encoder counts to measure the distance the wheels have rotated. These can be converted into inches or centimeters if required, which I will do to compare the estimated POSE with the actual POSE of the robot base.
// Motor A
const int enA = 11;
const int in1 = 10;
const int in2 = 9;
// Motor B
const int in3 = 7;
const int in4 = 6;
const int enB = 5;

volatile long counter1 = 0;
volatile long counter2 = 0;
int PWM1 = 100;
int PWM2 = 102;

// button pins assignment
const int btn1 = 24; // press button
const int btn2 = 26;

// encoder pins assignment
const int ENCA = 3; // pin of encoder1
const int ENCB = 2; // pin of encoder2

void forwardA(int rate){
  digitalWrite(in1, HIGH);
  digitalWrite(in2, LOW);
  analogWrite(enA, rate);
}

void forwardB(int rate){
  digitalWrite(in3, HIGH);
  digitalWrite(in4, LOW);
  analogWrite(enB, rate);
}

void reverseA(int rate){
  digitalWrite(in1, LOW);
  digitalWrite(in2, HIGH);
  analogWrite(enA, rate);
}

void reverseB(int rate){
  digitalWrite(in3, LOW);
  digitalWrite(in4, HIGH);
  analogWrite(enB, rate);
}

void turnOffMotorA(){
  digitalWrite(in1, LOW);
  digitalWrite(in2, LOW);
}

void turnOffMotorB(){
  digitalWrite(in3, LOW);
  digitalWrite(in4, LOW);
}

void turnOffMotors(){
  digitalWrite(in1, LOW);
  digitalWrite(in2, LOW);
  digitalWrite(in3, LOW);
  digitalWrite(in4, LOW);
}

void setup() {
  // Set all the motor control pins to outputs
  pinMode(enA, OUTPUT);
  pinMode(enB, OUTPUT);
  pinMode(in1, OUTPUT);
  pinMode(in2, OUTPUT);
  pinMode(in3, OUTPUT);
  pinMode(in4, OUTPUT);

  // buttons pinMode
  pinMode(btn1, INPUT_PULLUP);
  pinMode(btn2, INPUT_PULLUP);

  pinMode(ENCA, INPUT);     // set as input
  digitalWrite(ENCA, HIGH); // enable internal pull-up resistor
  attachInterrupt(digitalPinToInterrupt(ENCA), encoder1, RISING); // interrupt initialization
  pinMode(ENCB, INPUT);     // set as input
  digitalWrite(ENCB, HIGH); // enable internal pull-up resistor
  attachInterrupt(digitalPinToInterrupt(ENCB), encoder2, RISING); // interrupt initialization

  counter1 = 0;
  counter2 = 0;

  Serial.begin(9600);
  Serial.println("Starting up");
}

// interrupt service routines for reading the encoders
void encoder1() {
  counter1 = counter1 + 1;
}

void encoder2() {
  counter2 = counter2 + 1;
}

int prevCounter1;
int prevCounter2;

// POSE
float theta;
float x;
float y;

void updatePOSE(){
  float SR = counter1 - prevCounter1;
  float SL = counter2 - prevCounter2;
  prevCounter1 = counter1;
  prevCounter2 = counter2;
  x += SR * cos(theta);
  y += SL * sin(theta);
  theta += (SR - SL)/100;
  if(theta > 2*PI) theta -= 2*PI;
  else if(theta < -2*PI) theta += 2*PI;
}

void loop() {
  updatePOSE();
  Serial.print("x = ");
  Serial.print(x);
  Serial.print(" y = ");
  Serial.print(y);
  Serial.print(" theta = ");
  Serial.println(theta);
  Serial.println();
}
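For comparison, the textbook differential-drive update in real units would look something like the sketch below; the counts per revolution, wheel diameter and wheel base are placeholder figures I would still have to measure, and it uses the same counters and POSE variables as the code above.

// Sketch only: the usual differential-drive odometry update in centimeters.
// All three constants are placeholders that have to be measured on the robot base.
const float COUNTS_PER_REV = 560.0; // encoder counts per wheel revolution (assumed)
const float WHEEL_DIAM_CM  = 6.5;   // wheel diameter in cm (assumed)
const float WHEEL_BASE_CM  = 20.0;  // distance between the two wheels in cm (assumed)
const float CM_PER_COUNT   = (PI * WHEEL_DIAM_CM) / COUNTS_PER_REV;

void updatePoseCM() {
  float dR = (counter1 - prevCounter1) * CM_PER_COUNT; // right wheel travel, cm
  float dL = (counter2 - prevCounter2) * CM_PER_COUNT; // left wheel travel, cm
  prevCounter1 = counter1;
  prevCounter2 = counter2;
  float dCentre = (dR + dL) / 2.0;                     // travel of the robot centre
  x     += dCentre * cos(theta);
  y     += dCentre * sin(theta);
  theta += (dR - dL) / WHEEL_BASE_CM;                  // change of heading in radians
}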
Thanks for that info, though on my outdoor bot attempting such steering does not give the necessary accuracy due to the lumps and bumps encountered along the way; checking the compass bearing against the GPS desired bearing and feeding that back into the skid steering proved more beneficial.
The magnetometer I used was in an MPU-9250 board, but that had I2C comms so I did not need to figure out what the various pins were for, apart from identifying the SDA and SCK pins of course.
I did come across a cheap magnetometer board that also used i2c that you may be interested in if your play with your Chinese boards comes to nowt. Here it is:
https://thepihut.com/products/adafruit-triple-axis-magnetometer-mmc5603-stemma-qt-qwiic
At least a peruse may give you some hints on using a magnetometer.
My bot is currently back in its box and I may put the blame on you. 😐
I followed a link in a previous post of yours to some earlier discussion on using some example code of mine to find your sauce bottle. As I was going to make some remark about using that code, I looked at the link only to find the post I made was blank and my example had 'disappeared'. 😵
So I had a trawl through my computer to find my examples from a few years back, and it hit home to me what a labyrinthine and horrible mess a lot of the stuff on my computer is. In prep for my new computer I thought it was high time I sorted out just what I want to take over to the new machine, which also led me to think about what to do with the existing iMac. It could make an ideal second monitor as the screen is lovely, but this was found to be not possible without surgery or some dubious techniques of some sort.
Well, I do not wish to risk damaging a good working computer, but then I remembered I had put aside my wife's old iMac that packed up a couple of years ago; there wasn't anything wrong with the screen and I kept it in case I might be able to do something with it.
Bottom line is, an iFixit kit arrived today, a converter board to enable the screen to be used as a monitor has been ordered, the desk has been cleared ready for surgery, and I've started the long trawl through my computer files.
The robotbuilder butterfly fluttered his wings, which led to consequences half a world away. You have to shoulder the blame. And, probably after some magic smoke signalling the final demise of my wife's old computer whilst under the knife, hopefully it will be back to the robot stuff sometime in the next year. 😀