@robotbuilder I think you missed something. There are close to 80 samples of the line intersection code in 80 different languages. Here is the C++ version from Rosetta. I just compiled it, and the asserts and one cout needed to be changed to serial prints.
#include <iostream>
#include <cmath>
#include <cassert>
using namespace std;

/** Calculate determinant of matrix:
    [a b]
    [c d] */
inline double Det(double a, double b, double c, double d)
{
    return a*d - b*c;
}

/// Calculate intersection of two lines.
///\return true if found, false if not found or error
bool LineLineIntersect(double x1, double y1, // Line 1 start
                       double x2, double y2, // Line 1 end
                       double x3, double y3, // Line 2 start
                       double x4, double y4, // Line 2 end
                       double &ixOut, double &iyOut) // Output
{
    double detL1 = Det(x1, y1, x2, y2);
    double detL2 = Det(x3, y3, x4, y4);
    double x1mx2 = x1 - x2;
    double x3mx4 = x3 - x4;
    double y1my2 = y1 - y2;
    double y3my4 = y3 - y4;

    double denom = Det(x1mx2, y1my2, x3mx4, y3my4);
    if(denom == 0.0) // Lines don't seem to cross
    {
        ixOut = NAN;
        iyOut = NAN;
        return false;
    }

    double xnom = Det(detL1, x1mx2, detL2, x3mx4);
    double ynom = Det(detL1, y1my2, detL2, y3my4);
    ixOut = xnom / denom;
    iyOut = ynom / denom;
    if(!isfinite(ixOut) || !isfinite(iyOut)) // Probably a numerical issue
        return false;

    return true; // All OK
}

int main()
{
    // Simple crossing diagonal lines

    // Line 1
    double x1 = 4.0, y1 = 0.0;
    double x2 = 6.0, y2 = 10.0;

    // Line 2
    double x3 = 0.0, y3 = 3.0;
    double x4 = 10.0, y4 = 7.0;

    double ix = -1.0, iy = -1.0;
    bool result = LineLineIntersect(x1, y1, x2, y2, x3, y3, x4, y4, ix, iy);
    cout << "result " << result << "," << ix << "," << iy << endl;

    double eps = 1e-6;
    assert(result == true);
    assert(fabs(ix - 5.0) < eps);
    assert(fabs(iy - 5.0) < eps);
    return 0;
}
First computer 1959. Retired from my own computer company 2004.
Hardware - Expert in 1401 and 360, fairly knowledgeable in PCs plus numerous MPUs and MCUs
Major Languages - Machine language, 360 Macro Assembler, Intel Assembler, PL/I and PL1, Pascal, Basic, C plus numerous job control and scripting languages.
My personal scorecard is now 1 PC hardware fix (circa 1982), 1 open source fix (at age 82), and 2 zero day bugs in a major OS.
@robotbuilder I think you missed something. There are close to 80 samples of the line intersection code in 80 different languages. Here is the C++ version from Rosetta.
I simply went to the FreeBASIC version as I am using that language in my vision project. Converting to C++ or Python was an afterthought. I realise that to share code I have to also write a Python version.
You can not only say the same thing in different languages, you can say the same thing differently using the same language!
@robotbuilder That is a very educational site.
Thanks for your nice Python program as created by a C++ to Python converter. Just don't ever get tempted to use any of the AI programs like Copilot or you will soon get into a sea of troubles and end up with the most horrible Python programs 😎
I think your remark about using tuples was in connection with your BASIC code, but as you can see from the code snippet on compass bearing calcs I posted, using tuples in Python is a very usual thing to do.
I have been getting back up to speed with Python. I was going to translate a FreeBASIC graphics program giving a visual take on locating an observer given the known positions of two beacons and their angle to the robot. Out of curiosity I thought I might see if AI could do it and it DID!!
import pygame
import math

# Define constants
Pi = 4 * math.atan(1)
TwoPi = 8 * math.atan(1)
RtoD = 180 / Pi  # radians * RtoD = degrees
DtoR = Pi / 180  # degrees * DtoR = radians

# Define a point structure
class Point:
    def __init__(self, x=0, y=0):
        self.x = x
        self.y = y

# Initialize Pygame
pygame.init()
screen = pygame.display.set_mode((1000, 600))
pygame.display.set_caption("Beacon Angle Detection")
clock = pygame.time.Clock()

# Positions of two beacons
x1, y1 = 150, 150
x2, y2 = 500, 300

running = True
while running:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            running = False

    mx, my = pygame.mouse.get_pos()

    # Calculate angles
    a1 = math.atan2(my - y1, mx - x1) * RtoD
    a2 = math.atan2(my - y2, mx - x2) * RtoD
    if a1 < 0:
        a1 += 360
    if a2 < 0:
        a2 += 360

    # Clear the screen
    screen.fill((0, 0, 0))

    # Draw Beacon 1
    pygame.draw.circle(screen, (255, 0, 0), (x1, y1), 2)
    pygame.draw.circle(screen, (255, 0, 0), (x1, y1), 5)
    font = pygame.font.Font(None, 36)
    text = font.render(f"A1 = ({x1}, {y1}) angle = {int(a1)}", True, (255, 255, 255))
    screen.blit(text, (x1 + 5, y1))

    # Draw Beacon 2
    pygame.draw.circle(screen, (255, 255, 0), (x2, y2), 2)
    pygame.draw.circle(screen, (255, 255, 0), (x2, y2), 5)
    text = font.render(f"A2 = ({x2}, {y2}) angle = {int(a2)}", True, (255, 255, 255))
    screen.blit(text, (x2 + 5, y2))

    # Draw mouse position
    pygame.draw.circle(screen, (0, 0, 255), (mx, my), 2)
    pygame.draw.circle(screen, (100, 100, 255), (mx, my), 5)

    # Calculate end points for lines from beacons
    xd1 = math.cos(a1 * DtoR)
    yd1 = math.sin(a1 * DtoR)
    s1 = Point(x1, y1)
    e1 = Point(x1 + xd1, y1 + yd1)

    xd2 = math.cos(a2 * DtoR)
    yd2 = math.sin(a2 * DtoR)
    s2 = Point(x2, y2)
    e2 = Point(x2 + xd2, y2 + yd2)

    # Calculate intersection point
    A1 = e1.y - s1.y
    B1 = s1.x - e1.x
    C1 = A1 * s1.x + B1 * s1.y
    A2 = e2.y - s2.y
    B2 = s2.x - e2.x
    C2 = A2 * s2.x + B2 * s2.y
    det = A1 * B2 - A2 * B1
    if det != 0:  # Check for parallel lines
        x3 = (B2 * C1 - B1 * C2) / det
        y3 = (A1 * C2 - A2 * C1) / det

        # Draw intersection point
        pygame.draw.circle(screen, (200, 0, 200), (int(x3), int(y3)), 8)

        # Display coordinates
        text = font.render(f"mx = {mx} my = {my}", True, (255, 255, 255))
        screen.blit(text, (4, 2))
        text = font.render(f"x3 = {int(x3 + 0.5)} answer.y = {int(y3 + 0.5)}", True, (255, 255, 255))
        screen.blit(text, (4, 30))

    pygame.display.flip()
    clock.tick(60)  # Run at 60 FPS
That code does look OK, but did you ask for a screen display visualisation or just a calculation example?
If you were just wanting the Python code to do the calculation to feed into your other functions, then the code relating to pygame may not be of much use, supposing you even have that library loaded. And if you were running MicroPython on one of your ESP32s then pygame will not work on that.
And that's often the problem with this AI stuff, especially for beginners. But as an old hack you may well find good use for it, though for myself I would use a search engine so as to better evaluate whence the suggested code originates.
But, regardless, that's a nice example of 'generated' code. 😀
I think you missed the point and the purpose of using a graphics display instead of a series of print and input statements to test an algorithm. You can select the input with the mouse and view the output in the graphics display.
Did you try the code?
It is not simply 'generated' code, it is translated code, just as you might translate one human language to another human language, something Google does on web pages with the click of a button.
With your AI camera you might get the angles of some visual beacons at known locations. You feed those into a getRobotPosition function which uses the algorithm to compute the robot's position.
positionOfRobot = getRobotPosition(beacon1angle, beacon2angle, beacon1Position, beacon2Position)
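In case it is useful, here is a minimal sketch of what such a function might look like in Python. The name getRobotPosition matches my pseudo call above but is otherwise my own invention; it assumes beacon positions as (x, y) tuples and angles in degrees in the same atan2 convention as the demo, and it uses the same line-line intersection algorithm:

import math

def getRobotPosition(beacon1angle, beacon2angle, beacon1Position, beacon2Position):
    # Each beacon plus its measured bearing to the robot defines a ray;
    # the robot sits where the two rays intersect.
    x1, y1 = beacon1Position
    x2, y2 = beacon2Position
    # A second point on each ray, one unit along the measured angle
    ex1 = x1 + math.cos(math.radians(beacon1angle))
    ey1 = y1 + math.sin(math.radians(beacon1angle))
    ex2 = x2 + math.cos(math.radians(beacon2angle))
    ey2 = y2 + math.sin(math.radians(beacon2angle))
    # Same line-line intersection as in the demo code
    A1 = ey1 - y1
    B1 = x1 - ex1
    C1 = A1 * x1 + B1 * y1
    A2 = ey2 - y2
    B2 = x2 - ex2
    C2 = A2 * x2 + B2 * y2
    det = A1 * B2 - A2 * B1
    if det == 0:  # rays are parallel, no position fix possible
        return None
    return ((B2 * C1 - B1 * C2) / det, (A1 * C2 - A2 * C1) / det)

# Example: beacons at (150, 150) and (500, 300) seen at 45 and 135 degrees
print(getRobotPosition(45, 135, (150, 150), (500, 300)))  # ~ (400.0, 400.0)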
What I gave the AI was a FreeBASIC program that tested the code and gave a visual display with a mouse input. Instead of typing in different robot positions and reading printed output coordinates to test the algorithm I could simply use the mouse pointer.
The algorithm itself is independent of the demo code and can be used in your AI robot.
This is the actual FreeBASIC code it translated. I was astonished how good it was at the translation, which I had intended to do myself.
#Define NaN 0 / 0 ' FreeBASIC returns -1.#IND

'some useful defines
Const Pi = 4 * Atn(1)
Dim Shared As single TwoPi = 8 * Atn(1)
Dim Shared As single RtoD = 180 / Pi ' radians * RtoD = degrees
Dim Shared As single DtoR = Pi / 180 ' degrees * DtoR = radians

Type _point_
    As single x, y
End Type

Dim As _point_ s1, e1, s2, e2

screenres 800,600,32

dim as integer mx,my,mb
dim as single x1,y1,x2,y2,x3,y3,a1,a2

'positions of two beacons
x1 = 150
y1 = 150
x2 = 500
y2 = 300

do
    getmouse mx,my,,mb
    screenlock
    cls
    locate 2,2

    a1 = atan2(my-y1,mx-x1)*RtoD
    a2 = atan2(my-y2,mx-x2)*RtoD
    if a1<0 then a1 = a1 + 360
    if a2<0 then a2 = a2 + 360

    circle (x1,y1),2,rgb(255,0,0),,,,f
    circle (x1,y1),5,rgb(255,0,0)
    draw string (x1+5,y1),"A1 = (" & x1 & ","& y1 & ") angle =" & int(a1)

    circle (x2,y2),2,rgb(255,255,0),,,,f
    circle (x2,y2),5,rgb(255,255,0)
    draw string (x2+5,y2),"A2 = (" & x2 & ","& y2 & ") angle =" & int(a2)

    circle (mx,my),2,rgb(0,0,255),,,,f
    circle (mx,my),5,rgb(100,100,255)

    dim as single xd,yd
    xd = Cos(a1*DtoR)
    yd = Sin(a1*DtoR)
    s1.x = x1
    e1.x = x1+xd
    s1.y = y1
    e1.y = y1+yd

    xd = Cos(a2*DtoR)
    yd = Sin(a2*DtoR)
    s2.x = x2
    e2.x = x2+xd
    s2.y = y2
    e2.y = y2+yd

    Dim As single a1 = e1.y - s1.y
    Dim As single b1 = s1.x - e1.x
    Dim As single c1 = a1 * s1.x + b1 * s1.y

    Dim As single a2 = e2.y - s2.y
    Dim As single b2 = s2.x - e2.x
    Dim As single c2 = a2 * s2.x + b2 * s2.y

    Dim As single det = a1 * b2 - a2 * b1
    x3 = (b2 * c1 - b1 * c2) / det
    y3 = (a1 * c2 - a2 * c1) / det

    circle (x3, y3),8,rgb(200,0,200)

    locate 4,2
    print "mx =";mx;" my =";my
    print
    print "x3 =";int(x3+0.5);" answer.y =";int(y3+0.5)

    screenunlock
    sleep 2
loop until multikey(&H01)
Oh I see, it was generated from your BASIC code, and I too am surprised it did it so well.
Yes I did try the program and it works well. It's sort of similar to what I had done with OpenCV. For that I took the middle top as being North and calculated a compass bearing between the two points from the screen position of the base point and the angle between the base and destination points. When I have time I will augment that pygame display to represent that compass bearing as well, as it does make a nice visual check on what the calculation is producing. I think I may have to adjust the pygame grid as it appears to be slightly oblong rather than square, but that should not be a problem.
Thanks for the example, it will be fun to have a play with it.
You've been a bit quite on the robot front, I hope Stool-e is progressing and terrorising the living room inhabitants. 😆
I've not got back to my bot yet, but I did have a quick play with the pygame angle creation example you posted. Actually I've decided not to navigate via a camera angle calc algorithm, but now that I see there are some quite cheap GPS RTK boards to be had, I will eventually go with that for my outdoor bot.
But however one calculates the angle from the current position to the target, it's necessary to know the direction the bot is currently pointing so as to manoeuvre it to the correct heading (or angle) for it to proceed along the correct path. To that end I will be using a magnetometer sensor, and I think you alluded to the fact you may be using the same for your indoor bot?
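In code terms that manoeuvring boils down to the signed difference between the bearing you want and the heading the magnetometer reports. A minimal sketch (the function name is just my own invention):

def heading_error(target_bearing, current_heading):
    # Signed smallest turn in degrees (-180..180) from the current
    # magnetometer heading to the desired bearing; positive = turn clockwise.
    return (target_bearing - current_heading + 180) % 360 - 180

# e.g. heading 350 wanting 10 -> 20 (turn right); heading 10 wanting 350 -> -20
print(heading_error(10, 350), heading_error(350, 10))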
Anyway, in an idle moment I had a quick play with the pygame angle calc to get the compass bearing from the bot's current position to a target position for use with a magnetometer sensor (the target position being set by clicking the mouse at the desired screen location). I'm only good for some schoolboy geometry calcs and could only think to do these calcs by drawing right angle triangles from source to target and then translating these to a compass bearing, taking the top middle of the screen as North. If my mathematical genius Niece happens to visit I will give her the challenge to come up with some proper mathematical stuff. 😎
Anyway it's a pretty display, and I could modify its use by having it feed current and target positions from the GPS RTK sensors for a visual display of the bot's meandering progress. It could also be modified to represent a room with blobs for furniture, to both control and visually show a room-bot's wanderings if such a bot was using a magnetometer for navigation positional calcs.
So for what it's worth, here is the program.
import pygame
from math import atan, degrees, hypot
import time

# to calculate the compass quadrant
def quadrant(x, y):
    if x > 0 and y > 0:
        return 'NW'
    elif x > 0 and y < 0:
        return 'NE'
    elif x < 0 and y > 0:
        return 'SW'
    elif x < 0 and y < 0:
        return 'SE'
    elif x > 0 and y == 0:
        return 'N'
    elif x < 0 and y == 0:
        return 'S'
    elif x == 0 and y < 0:
        return 'E'
    elif x == 0 and y > 0:
        return 'W'
    else:
        return 'ERROR'

# to calculate a compass bearing based on target to current position
# triangle angle and target compass quadrant position.
def compass_bearing(quad, angle):
    if quad == 'N':
        return 0
    elif quad == 'E':
        return 90
    elif quad == 'S':
        return 180
    elif quad == 'W':
        return 270
    elif quad == 'NW':
        return 360 - angle
    elif quad == 'NE':
        return angle * -1  # angle comes out negative in this quadrant
    elif quad == 'SE':
        return 180 - angle
    elif quad == 'SW':
        return 180 + (angle * -1)

# Initialize Pygame
pygame.init()
screen = pygame.display.set_mode((1000, 1000))
pygame.display.set_caption("Compass Bearing Calc - set target with mouse click")
clock = pygame.time.Clock()

# set initial target coordinates (to save a divide by zero as mouse defaults to 0,0)
x1, y1 = 500, 250
mx, my = x1, y1  # initialise mouse coords before the first event poll

# loop to calculate screen target as selected by a mouse click
run = True
while run:
    for event in pygame.event.get():
        if event.type == pygame.QUIT:
            run = False
        if event.type == pygame.MOUSEBUTTONUP:
            x1 = mx
            y1 = my

    screen.fill((0, 0, 0))

    # compass grid
    pygame.draw.line(screen, (0, 0, 255), (500, 0), (500, 1000), 5)
    pygame.draw.line(screen, (0, 0, 255), (0, 500), (1000, 500), 5)
    fontH = pygame.font.Font(None, 40)
    text = fontH.render("N", True, (255, 0, 0))
    screen.blit(text, (490, 0))
    text = fontH.render("S", True, (255, 0, 0))
    screen.blit(text, (490, 975))
    text = fontH.render("W", True, (255, 0, 0))
    screen.blit(text, (0, 490))
    text = fontH.render("E", True, (255, 0, 0))
    screen.blit(text, (980, 490))

    # target location (a grey dot)
    pygame.draw.circle(screen, (150, 150, 250), (x1, y1), 6)

    # triangle from target to current position (i.e. centre of cross hairs)
    pygame.draw.line(screen, (0, 150, 0), (x1, y1), (500, 500), 5)
    pygame.draw.line(screen, (150, 0, 0), (x1, y1), (500, y1), 5)

    # Adjacent, Opposite and Hypotenuse triangle measurements in screen 1000x1000 units
    Aj = 500 - y1
    Op = 500 - x1
    Hy = round(hypot(Aj, Op), 1)
    try:
        Angle = degrees(atan(Op / Aj))
    except ZeroDivisionError:
        Angle = 0
    quad = quadrant(Aj, Op)
    if Hy != 0:
        bearing = round(compass_bearing(quad, Angle), 1)
    else:
        bearing = 0

    # Text font
    font = pygame.font.Font(None, 36)
    # Triangle sides
    text = font.render(f"Adjacent={Aj} Opposite={Op}", True, (255, 255, 255))
    screen.blit(text, (10, 10))
    # Angle calc for bearing
    text2 = font.render(f"Angle={round(Angle, 1)}", True, (255, 255, 255))
    screen.blit(text2, (10, 40))
    # compass quadrant of target
    text3 = font.render(f"Quadrant={quadrant(Aj, Op)}", True, (255, 255, 255))
    screen.blit(text3, (10, 70))
    # compass bearing to proceed along
    text4 = font.render(f"Bearing={bearing} degrees", True, (255, 255, 255))
    screen.blit(text4, (10, 100))
    # distance to target (in screen 1000x1000 measurement terms)
    text5 = font.render(f"Distance={Hy} screen 'points'", True, (255, 255, 255))
    screen.blit(text5, (10, 130))

    # on mouse click record a target position on the screen
    mx, my = pygame.mouse.get_pos()
    pygame.draw.circle(screen, (0, 255, 0), (mx, my), 6)
    pygame.draw.circle(screen, (255, 0, 0), (mx, my), 2)

    pygame.display.flip()
    clock.tick(60)  # Run at 60 FPS

pygame.quit()
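Should my Niece ever take up the challenge, I suspect she would point out that the whole quadrant and triangle business collapses into a single atan2 call. A minimal sketch of that shortcut (names my own; bot at the screen centre (500, 500), North up, y growing downwards as in pygame):

from math import atan2, degrees, hypot

def bearing_and_distance(tx, ty, cx=500, cy=500):
    # Compass bearing (0 = North, clockwise) and straight-line distance
    # from the current position (cx, cy) to the target (tx, ty), in
    # screen coordinates where y grows downwards so North is -y.
    bearing = degrees(atan2(tx - cx, cy - ty)) % 360
    return round(bearing, 1), round(hypot(tx - cx, ty - cy), 1)

# Example: a target due east of the centre
print(bearing_and_distance(800, 500))  # (90.0, 300.0)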
@byron Is GPS RTK the acronym for the centimetre accurate version? How well does it work indoors?
I am surprised to hear that anyone would consider a magnetometer as a guidance tool; magnetic field lines are not straight, and they can wander a lot due to both natural features like metal deposits and human-made features like steel beams in buildings etc. I will be watching with curiosity how that works out.
Afterthought: What about gyroscopes, and sat nav, and (sorry, senior moment, but there is another popular method used that I can't remember)
I am optimistic my health issues will soon be resolved so I can then start building some vehicles and learn from those who have gone on ahead of me.
@byron Is GPS RTK the acronym for the centimetre accurate version? How well does it work indoors?
Yes that's correct, but GPS RTK won't work indoors, a clear view of the sky is needed. I will only be using that for my outdoor bot. With GPS you can find your current position, give the bot target Lat and Lon GPS coordinates for the target position, and then calculate a compass bearing from current to target position. Then the problem of making the bot proceed to the target by following the compass bearing has to be tackled. The bot needs a compass. 😀 A magnetometer sensor! Very common for both aerial and ground rover bots I think, along with some dead reckoning calcs.
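For the record, the bearing calc I have in mind is the standard 'initial bearing' (forward azimuth) formula found in most GPS navigation write-ups. A minimal Python sketch, assuming lat/lon in decimal degrees (the function name is my own):

from math import radians, degrees, sin, cos, atan2

def initial_bearing(lat1, lon1, lat2, lon2):
    # Initial compass bearing in degrees (0 = North, clockwise) from the
    # current position (lat1, lon1) to the target (lat2, lon2).
    phi1, phi2 = radians(lat1), radians(lat2)
    dlon = radians(lon2 - lon1)
    x = sin(dlon) * cos(phi2)
    y = cos(phi1) * sin(phi2) - sin(phi1) * cos(phi2) * cos(dlon)
    return degrees(atan2(x, y)) % 360

# Example: a target a touch north-east of the current position
print(round(initial_bearing(51.0000, -1.0000, 51.0010, -0.9990), 1))
# ~32.2 degrees (east of north is squashed by cos(latitude))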
As to how well a magnetometer will work indoors, well, they do give readings but they may well be wayward I guess, though they should work as well as a handheld compass does indoors. How well does your mobile phone's compass work indoors? I guess it would all depend. I've actually got one on my desk for calibration and I've 3D printed a square mounting frame for it so as to handily ensure I can accurately tip it through 90 degrees etc. during the calibration process. I will see how well that goes. I have previously used one on my outdoor bot and it was quite wayward, though that could very well have been down to my feeble program. I think @robotbuilder got himself a magnetometer sensor so maybe he can tell you how well it works indoors.
Good to hear your health is likely on the mend and I look forward to future RonBots.
You've been a bit quite on the robot front, I hope Stool-e is progressing and terrorising the living room inhabitants. 😆
With the xmas break it has been put in the shed to collect dust as I had many other family things to attend to.
The robot base was intended to act as a platform to test my visual programs with respect to controlling the robot via its motors. It is more than just navigation. GPS cannot recognize objects. Although vision is computationally difficult it has a lot more potential than LIDAR when it comes to navigating and recognizing where you are and what you are looking at.
You do not need a compass to navigate. I can navigate fine when bush walking using visual landmarks. You can move from one visual landmark to the next. In a house the walls give you direction, as can the visual orientation of benches or tables and any fixed object. Indeed outdoors the orientation of buildings, roads and other fixed things give you direction. In the bush the stars or the sun give you direction. In the mallee country where I live the sandy hills run north/south, all easily seen visually. We get around fine without the need to know exactly where we are in some xy coordinate system.
A video camera is a very cheap sensor considering the amount of data it can collect. Even if the processing of that data is difficult, the payoff is such that I think it will be a game changer when it comes to expanding the capabilities of robots, both indoors and outdoors on a farm.
There is a place in the scheme of things where GPS or LIDAR provide a simple but accurate way to measure the world when that is required, but in a general purpose robot vision is the solution IMHO, and GPS or LIDAR will just be tools it can use if required.
You do not need a compass to navigate
Indeed not 😀, but from your remarks on your ability to navigate your goodself by gazing at those far-off sandy hills, I think maybe you missed my point. As I said, "if such a bot was using a magnetometer for navigation positional calcs" (and I thought you may be, as you posted about using one after placing your mobile phone on stall-e), then you need to use a magnetic bearing for that part (maybe a small part) of your navigation process.
Calculating this magnetic bearing from the calculated angle from destination to source was what I augmented your angle calculation example to do. Nowt more. I don't doubt your bot will be using vision in all sorts of exciting ways. Maybe we will see what you achieve in due course.
Blimey, my old eyes; now that you quote a bit of my post I see I typed quite instead of quiet. Ah well, maybe stall-e will get retrieved and dusted down now yet another Christmas passes by and the last of the sherry bottle has been drained. 😎
On my bot, the GPS RTK and magnetometer sensors are just a part of the whole system in which cameras are intended to be utilised, though they may not be mounted on the bot. But it's early days and I'm not currently working on it apart from some little bits of test code here and there.
Yes, I was playing with the idea of using a compass, gyro or both, but then over xmas thought it wasn't needed. It is not a robot boat avoiding rocks using two beacons. For your outdoor robot GPS seems a logical solution for navigation. It was meant to scare bunnies away from the vegetables?
The past, they say, is a good predictor of the future, and it has been, what, 6 years since we started on our projects? Actually I started playing with visual recognition decades ago! So I wouldn't hold your breath waiting for anything to happen.
An ideal outdoor robot for surveillance is a drone!
That will scare the bunnies 🙂
Here is a cheap example. Look at all its robot features.
https://www.kogan.com/au/buy/kogan-4k-camera-drone-with-obstacle-avoidance-2-batteries-kogan/