
Inqling Junior - Robot Mapping, Vision, Autonomy

240 Posts
10 Users
87 Likes
8,258 Views
Inq
(@inq)
Noble Member
Joined: 8 months ago
Posts: 963
Topic starter  

UPDATE

My least favorite part!  Building the Motherboard.

The Rat's Nest

RatsNest

Populated


3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, Access Point Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
robotBuilder reacted
ReplyQuote
Ron
(@zander)
Famed Member
Joined: 2 years ago
Posts: 3456
 

@inq We share that, I am horrible at the physical construction.

"Don't tell people how to do things. Tell them what to do and let them surprise you with their results.” - G.S. Patton, Gen. USA
"Never wrestle with a pig....the pig loves it and you end up covered in mud..." anon


   
ReplyQuote
Inq
(@inq)
Noble Member
Joined: 8 months ago
Posts: 963
Topic starter  
Posted by: @zander

@inq We share that, I am horrible at the physical construction.

I think it's the age and the eyes.  I used to be able to read the fine print on top of these sensor chips without glasses.  Now, I can barely see the sensor chip.  Got to use Coke bottle bottoms to see where to put the tiny wires.  I have to triple check continuity and make sure I don't have any shorts.  Can't see them... got to stick a meter in the holes!  

I blame it on working in front of computer monitors in both of my careers.  My Dad still had uncorrected 20/18 vision at sixty years of age.

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, Access Point Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
Ron
(@zander)
Famed Member
Joined: 2 years ago
Posts: 3456
 

@inq I relate to all of that, but I think my biggest problem is making the leap from a logical circuit to a physical one, as well as parts placement and connectors. My lack of knowledge in this area is overwhelming. The reason I don't show my projects is that I am afraid of killing someone from laughing to death. Yours looks like a pro job to me. I only recently learned the name of those neat Grove connectors, but so far no luck finding a source. Yes, I can find them, but are they the right size, gender, and number of pins? Surely there must be 'standard' connectors, like 2-pin male and female power. So on top of being visually impaired, I am possibly Google-search impaired too.

"Don't tell people how to do things. Tell them what to do and let them surprise you with their results.” - G.S. Patton, Gen. USA
"Never wrestle with a pig....the pig loves it and you end up covered in mud..." anon


   
ReplyQuote
Inq
(@inq)
Noble Member
Joined: 8 months ago
Posts: 963
Topic starter  
Posted by: @robotbuilder

Are you going to program this yourself at the low level or are you going to use libraries like SLAM to do the magic for you?

Historically, I'm bad about wanting to write every aspect of a problem and I can't see myself changing now. 🙄  I wrote a Ray Tracing program back in college, so messing with 3D geometry, vectors, and matrices gives me much of the basic building blocks I'll need to do this.  I've gone through several of the videos @Inst-Tech recommended with "Coding Coach" and his Henry IX.  He hit on several aspects of SLAM and several others and showed some pictures, but I have not seen any details about those libraries. 

I haven't quite decided whether to try to go into it with a blank sheet or researching those libraries to get a good basis for the state of the art.  Thinking about it right now... I might make a stab just using first principles and ignore those libraries.  I can always come back to them if/when I hit bottom. 😣  

Either way, I'll write it in C++ on the ESP8266.  For receiving the accumulated data on the PC, I'll probably just use JavaScript so that it fits in with my InqPortal library and allows me to run it on any device... phone, tablet, laptop, or even on multiple computers simultaneously to do parallel processing if need be.  I could break out my Visual Studio and do the PC side in C++ or C#, but I'd rather not while tinkering around with it.  I tend to write JavaScript like I write C++, even with OOP, class-oriented structuring, so it will easily port back to C++ if/when I attempt to move it back to the ESP8266 or Raspberry Pi.
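
Just to make that hand-off concrete, the payload I have in mind looks something like the sketch below.  To be clear, this is NOT InqPortal's actual API... just an illustration of the kind of frame I'd accumulate on the ESP8266 before pushing it over the web socket, and every name in it is made up.

// Sketch only - illustrative shape of one scan frame accumulated on the
// ESP8266 before it is shipped to the browser/PC side.  Not InqPortal code.
#include <cstdint>
#include <vector>

struct ScanSample
{
    float angle;     // beam angle relative to the robot heading (radians)
    float range;     // measured distance (mm); negative could flag "no echo"
};

struct ScanFrame
{
    float    x, y, heading;            // dead-reckoned pose when the scan ran
    uint32_t ms;                       // millis() timestamp on the MPU
    std::vector<ScanSample> samples;   // one entry per beam
};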

I need to re-read the rest of your post.  I have several questions already, but I want to work it out in my prime time... I'm a morning person.  I can only do pixels on bitmaps when it's this late.  Deep logic... only in the morning. 😴 

VBR,

Inq

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, Access Point Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
Inq
(@inq)
Noble Member
Joined: 8 months ago
Posts: 963
Topic starter  
Posted by: @robotbuilder

Now on the left of the image below is the simulation. The robot scans 360 degrees and plots the distance to the wall for each angle. I coloured the walls to make it easier to see although I guess returning the color as well as distance is an option. On the right is a plot of the distance and relative position of the obstacle (wall in this case) to the robot.

Things are clearer in the morning. 😉 

The questions I had last night were answered in the light (actually none here yet) of the morning.  They've devolved more into questions about your images and mundane-sounding issues. 

The mundane - When you say simulation... Did you simulate this dual image with say MS Paint for discussion purposes... or is the right image plotted data from your robot and the left a simulated conversion of that data to a vector space?  I think the latter, based on how you wrote it, but I want to clarify.  

The non-mundane - I understand the process of taking the Cartesian points (or was it Polar points from a rotating Lidar) and converting to vectors... Oops, tangent... 

My immediate gut reaction was to convert Polar to Cartesian at the get-go.  Admittedly, most human structures are rectangular.  I'm now wondering if there is benefit to be gained by keeping the Math in Polar as the "perspective" of the Robot.  Requires more thought.
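
For anyone following along, the conversion itself is the easy part.  Here's a rough sketch (made-up names, nothing from the actual robot code) of mapping one polar sample into world coordinates given the robot's pose:

// Sketch only - one polar sample (beam angle, range) to world Cartesian,
// given the robot's current pose.  Names are purely illustrative.
#include <cmath>

struct Pose  { float x, y, heading; };   // heading in radians
struct Point { float x, y; };

Point polarToWorld(const Pose& robot, float beamAngle, float range)
{
    float a = robot.heading + beamAngle;      // beam direction in world frame
    return { robot.x + range * std::cos(a),
             robot.y + range * std::sin(a) };
}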

... back from the Dark Side.  What appeared to be a mundane question: is the data in your right image actual data?  My limited use of any range finding has been the HC-SR04.  Looking at a perfect, clean sheet-rock wall, it would return data more like...

image
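
In case anyone wants to play with that kind of mess without the hardware, here is a throwaway way to fake it... just ideal ranges plus noise.  The noise level is a pure guess on my part, not a measured HC-SR04 spec.

// Toy sketch - fake the scatter a cheap range sensor might give on a flat
// wall by adding Gaussian noise to ideal ranges (units here assumed cm).
// The 2% + 1 cm noise figure is a guess, not a datasheet number.
#include <random>
#include <vector>

std::vector<float> noisyScan(const std::vector<float>& idealRanges)
{
    std::mt19937 rng{42};                     // fixed seed for repeatability
    std::vector<float> out;
    out.reserve(idealRanges.size());
    for (float r : idealRanges) {
        std::normal_distribution<float> jitter(0.0f, 0.02f * r + 1.0f);
        out.push_back(r + jitter(rng));
    }
    return out;
}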

My Engineering background involved more "exact" approximations (sounds like an Oxymoron) like Finite Element Analysis, Continuum Mechanics, Laminated Plate Theory...  I'm sorry to say, I didn't go in for much of the black arts of Statistical Analysis.  I knew how to take material data and create an A-basis but have no core understanding (feel) for it.

How do you take this shotgun data of a perfect, clean wall, using the three-points-make-a-line method, and generate "a wall" in the Oasis?  (from Ready Player One)

Then... what happens as it's walking in a real house and sees sofas, chairs, chair legs, cabinetry, cloth surfaces (the HC-SR04's Achilles' heel), dark versus light transitions (infrared's Achilles' heel)???

Then... I need to add the 3rd dimension and use 4 points to define planes.

I'm not shying away.  In fact this is Cat-nip to me!  Although, I think my earlier reference to a couple of months was way, way... optimistic. 😆 

Sure wish you had a robot project going on to run in parallel.  Sure I can't talk you into it??? 🤩 🤔 😎 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, Access Point Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
robotBuilder
(@robotbuilder)
Noble Member
Joined: 3 years ago
Posts: 1563
 

@inq 

Did you simulate this dual image with say MS Paint for discussion purposes...

The left image of the room was drawn in MS Paint and saved as room.bmp
The right image was generated by a program.

Here is the actual program. It is, however, written in FreeBASIC, although given time to get up to speed with JavaScript I might be able to translate it. I suspect that with your background you can read it OK and write a cleaner version.  Remember, I am a self-taught programmer and do not have your math background or expertise in writing code. I copied it to the Arduino IDE to save as HTML so it looks OK in the post.

The shoot-ray code is actually what I used when playing with writing a retro-style game like Doom.

snapShot

 

'some useful defines
Const Pi = 4 * Atn(1)
Dim Shared As single TwoPi = 8 * Atn(1)
Dim Shared As single RtoD = 180 / Pi   ' radians * RtoD = degrees
Dim Shared As single DtoR = Pi / 180   ' degrees * DtoR = radians

Dim Shared As single rAngle            ' angle of ray

const WORLDX = 480
const WORLDY = 480

Screenres 960,480,32
color rgb(0,0,0),rgb(255,255,255):cls

dim shared as any ptr image
image = imagecreate(480,480)
dim shared as any ptr memory
memory = imagecreate(480,480)
line memory,(0,0)-(479,479),rgb(20,20,20),bf


bload "room.bmp",image

dim as integer ox
dim as integer oy
dim as single  oAngle
dim as single  mv



ox = 30
oy = 20
mv = 0    'not moving

Dim Shared As single dx,dy             ' some working variables

dim shared as single x,y            'hit position

Dim Shared As single w


sub drawWorld()

    put (0,0),image,trans
    put (480,0),memory,trans

end sub


''============================================================
'' This shoots ray from observer and returns type of tile
'' of any target hit. It also draws a ray on the ground
''============================================================
function shootRay(x1 As Integer,y1 As Integer,angle As single) as ulong

    Dim As single dx,dy,change,aa 'x,y
    dim as ulong c   'color hit

    dx = Cos(angle)
    dy = Sin(angle)
    y = y1
    x = x1

    If Abs(dx)>abs(dy) Then
        change = Abs(dy)/Abs(dx)
        If dx<0 Then
            aa = -1
        Else
            aa =  1
        End If
        If dy<0 Then
            change = - change
        End If
        'flag = 0
        c = point(x,y,image)
        While x>=0 And x<WORLDX And y>=0 And y<WORLDY and c = rgb(255,255,255)
            x = x + aa
            y = y + change
            pset (x,y),rgb(255,0,0)
            c = point(x,y,image)
        Wend
    Else
        change = Abs(dx)/Abs(dy)
        If dy<0 Then
            aa = -1
        Else
            aa = 1
        End If
        If dx<0 Then
            change = -change
        End If
        c = point(x,y,image)
        While x>=0 And x<WORLDX And y>=0 And y<WORLDY and c = rgb(255,255,255)
            y = y + aa
            x = x + change
            c = point(x,y,image)
            pset (x,y),rgb(255,0,0)
        Wend        
    End If 
    
    circle memory,(x,y),2,c,,,,f
    
    return c
End function
'=================================================




'==========   MAIN LOOP =====================
dim as ulong item
dim as single distance

ox = 240
oy = 240

'Do


    drawWorld()

    '=========   USER INPUT =========================
    'Check arrow keys and update position accordingly
    mv = 0 'not moving unless an arrow key is pressed below
    If MULTIKEY(&h4B) Then oAngle = oAngle - 4 * DtoR  'angles in radians
    If MULTIKEY(&h4D) Then oAngle = oAngle + 4 * DtoR
    If oAngle > TwoPi Then oAngle = oAngle - TwoPi
    If oAngle < 0 Then oAngle = oAngle + TwoPi
    If MULTIKEY(&h48) Then mv =  10  'move forward
    If MULTIKEY(&h50) Then mv = -10  'move back
    dx = Cos(oAngle) * mv
    dy = Sin(oAngle) * mv

    'within bounds?
    If (ox+dx) > 0  And (ox+dx) < 479 And (oy+dy) > 0 And (oy+dy) < 479  Then
        ox = ox + dx
        oy = oy + dy
    End If

    '=======================================

    'Scan a full 360 degrees: shoot one ray per degree and plot the colour of
    'whatever it hits at (angle, distance) in the memory image (right-hand view)
    for oAngle = 0 to 359
        screenlock
        cls
        drawWorld()
        item = shootRay(ox,oy,oAngle*DtoR)
        distance = sqr((ox-x)^2 + (oy-y)^2)
        screenunlock
        pset memory,(oAngle,distance),item 
        sleep    'no argument: wait for a keypress before shooting the next ray
    next oAngle

    
'Loop While Not MULTIKEY(&h1)

 

 

How do you take this shotgun data of a perfect, clean wall, using the three-points-make-a-line method, and generate "a wall" in the Oasis? Then... what happens as it's walking in a real house and sees sofas, chairs, chair legs, cabinetry, cloth surfaces (the HC-SR04's Achilles' heel), dark versus light transitions (infrared's Achilles' heel)???

Yes, it is clean data, and real data would be a bit (a lot) messier. That is why I wrote that I would need to look at your actual data to figure out any kind of solution, although I realise now you are probably more capable than I am of massaging the data into actual line data. I am just a self-taught hobby programmer with limited experience, and math is not my strong point, which is why programming remained a hobby.

Sure wish you had a robot project going on to run in parallel. Sure I can't talk you into it???

Well, I can't duplicate your robot as I don't have a 3D printer for the wheels. I can get an ESP8266 for $24.95, and I have already played with the HC-SR04 ultrasonic distance module as a scanner if you want to use that. I can't find my posts on that subject, but it was like this, except I only sent data to the Serial Monitor and Plotter to see how accurate the distance measurements were.

 

However, my real interest is vision, leaving the other sensors to detect an object or avoid a physical collision.

 


   
Inq reacted
ReplyQuote
Will
(@will)
Noble Member
Joined: 1 year ago
Posts: 2134
 
Posted by: @inq

My immediate gut reaction was to convert Polar to Cartesian at the get-go.  Admittedly, most human structures are rectangular.  I'm now wondering if there is benefit to be gained by keeping the Math in Polar as the "perspective" of the Robot.  Requires more thought.

But then you'd have a problem comparing data because you wouldn't have a common frame of reference.

... back from the Dark Side.  What appeared to be a mundane question: is the data in your right image actual data?  My limited use of any range finding has been the HC-SR04.  Looking at a perfect, clean sheet-rock wall, it would return data more like...

image

That looks like a job for least squares regression to find the equation of the wall ...

https://www.statisticshowto.com/probability-and-statistics/statistics-definitions/least-squares-regression-line/
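
A minimal sketch of that fit, in case it helps (my own names, untested, and it assumes the wall isn't close to vertical in your chosen frame):

// Sketch only - ordinary least-squares fit of y = a + b*x through noisy
// (x, y) wall samples.  For a nearly vertical wall, fit x = a + b*y instead,
// since the denominator below heads toward zero.
#include <vector>

struct LineFit { double a, b; };   // intercept and slope

LineFit fitWall(const std::vector<double>& xs, const std::vector<double>& ys)
{
    const double n = static_cast<double>(xs.size());
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (size_t i = 0; i < xs.size(); ++i) {
        sx  += xs[i];
        sy  += ys[i];
        sxx += xs[i] * xs[i];
        sxy += xs[i] * ys[i];
    }
    const double denom = n * sxx - sx * sx;
    return { (sy * sxx - sx * sxy) / denom,    // intercept a
             (n * sxy - sx * sy) / denom };    // slope b
}

The residuals from the fit also tell you how "wall-like" a cluster of points really is.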

Experience is what you get when you don't get what you want.


   
Inq reacted
ReplyQuote
b
(@b)
Prominent Member
Joined: 3 years ago
Posts: 912
 
Posted by: @inq

I'm not shying away.  In fact this is Cat-nip to me!  Although, I think my earlier reference to a couple of months was way, way... optimistic. 😆 

I have to say that was my thought too.  But what do I know? And I have to say, not very much in this aspect of robot development.  

A couple of years (or more) ago I got a 360 degree lidar sensor the same as shown in a dronebot video.  After a little playing with it I soon realised its main limitation was that it only works in one narrow horizontal plane.  Of course one could mount it on a tilt platform to try to improve its field of view, but then it all starts to get too complicated, for me anyway.  So that meant it was not going to be much use for navigation on an indoor bot, except for some limited testing.  Shame you are not down the road from me, as I would gift you the sensor to see what you could do with it.  Other sensors I have looked at, the typical hobby lidar or other distance sensors, did not appear to have the necessary resolution to be of much use for anything apart from immediate obstacle avoidance.  Hence I then started to look at indoor positioning systems that act like an indoor GPS.  Some expensive stuff out there, but I have not looked at this in detail for some years and there may be some newer sensors at reasonable prices now.  If not, I expect there will be in the future.  In the meanwhile I follow the Inq bot progress and hope you have more success than me.  

 


   
ReplyQuote
b
(@b)
Prominent Member
Joined: 3 years ago
Posts: 912
 
Posted by: @inq

My least favorite part!  Building the Motherboard.

Advice to a new hobbyist without programming experience who chances upon a Dronebot video and decides to have a go would be to take a few days out and work through one of the many programming intro courses before tackling the project.

And advice to a dabbler in wiring up those perf boards and the like is to take just a few days out to get to a beginner stage with PCB design.  Once you find how easy it is to knock up a small PCB design you probably will not bother with perf boards again.

Here is a good beginner's guide for anyone interested in doing this.  It got me going without too much effort.  Of course, a small investment in time will be necessary before you can get dangerous.


   
ReplyQuote
Inq
(@inq)
Noble Member
Joined: 8 months ago
Posts: 963
Topic starter  
Posted by: @robotbuilder

Remember, I am a self-taught programmer and do not have your math background or expertise in writing code. I copied it to the Arduino IDE to save as HTML so it looks OK in the post.

I'm right there with you.  I've only had two formal programming classes: BASIC in high school and Fortran for Engineers in college.  IOW, the Fortran was about solving problems, not teaching Comp-Sci philosophy.  When I had to change careers... software seemed to be the most lucrative thing I could pick up quickly.  I swear by "Learn C++ for Windows Programming in 21 Days".  🤣  Learned it over the summer (60 days) of unemployment, went in for my first job interview and nailed it.  I got the same starting salary that I had left my old career at.  I thought I was being ballsy asking for that amount.  Apparently, I was young and stupid... and left money on the table.

Posted by: @robotbuilder

The shoot-ray code is actually what I used when playing with writing a retro-style game like Doom.

There was just something about Doom in that time frame.  When every other game didn't even know what a frame rate was, Doom was usable on even the slowest PC.  Your code is just fine.  Very clean!  If we find situations where you have a great solution already, I won't have any trouble converting.  With retro all the rage, have you published?

Posted by: @robotbuilder

Yes, it is clean data, and real data would be a bit (a lot) messier. That is why I wrote that I would need to look at your actual data to figure out any kind of solution, although I realise now you are probably more capable than I am of massaging the data into actual line data.

Does this mean supplying data in this thread might be of interest?  It sounds like you've been in the trenches with this problem already.  I was wondering if this thread was going to die a boring death once I got to the real point where I am going to have questions and problems.  I was doing all the pretty pictures of the progress to hopefully hook a couple of you.  I thought looking at code and reams of data would stack up there with watching paint dry.  😆 

Posted by: @robotbuilder

Well, I can't duplicate your robot as I don't have a 3D printer for the wheels. I can get an ESP8266 for $24.95

OMG!  I went to your profile... BLANK.  Where do you live?  Amazon delivers to the ISS for cheaper than that! 😆   The last batch I got...

image
Posted by: @robotbuilder

However, my real interest is vision, leaving the other sensors to detect an object or avoid a physical collision.

I hopefully will catch up to you some day.  I'm still just hoping to stop running into this wall!

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, Access Point Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
Inq
(@inq)
Noble Member
Joined: 8 months ago
Posts: 963
Topic starter  
Posted by: @will

But then you'd have a problem comparing data because you wouldn't have a common frame of reference.

You're probably right!  It was just a tangent.  But, sometimes coming at a problem from a different coordinate frame simplifies things.  In this case... I think looking at tabular data of a room in Cartesian will be recognizable, whereas polar might as well be Greek.  Polar - can that stupid idea!

Posted by: @will

That looks like a job for least squares regression to find the equation of the wall ...

https://www.statisticshowto.com/probability-and-statistics/statistics-definitions/least-squares-regression-line/

Thank you @will... I don't know what year Statistics is normally taught in school... but I missed it, slept through it or chased girls through it.  I'm sure this will be in my future!

VBR,

Inq

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, Access Point Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
Inq
(@inq)
Noble Member
Joined: 8 months ago
Posts: 963
Topic starter  
Posted by: @byron

And advice to a dabbler in wiring up those perf boards and the like is to take just a few days out to get to a beginner stage with PCB design.  Once you find how easy it is to knock up a small PCB design you probably will not bother with perf boards again.

I am very interested, as is evidenced by the Google-placed ads I get every day about using those kinds of services...  "Build your custom PCB for just a few dollars."

My hardware skills are still something closer to Musk's mindset... build it and predict a 25% chance of success.  I will always need another. 

I've already found changes I'd make in this perf board project.  I will also want to try my other stepper drivers (TMC2209), and they require some wiring changes.  Also, I think I recall I need to add a capacitor to the servo... so the MPU won't brown out.

I'll get there some day.

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, Access Point Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
Inq
(@inq)
Noble Member
Joined: 8 months ago
Posts: 963
Topic starter  
Posted by: @byron

A couple of years (or more) ago I got a 360 degree lidar sensor the same as shown in a dronebot video.  After a little playing with it I soon realised its main limitation was that it only works in one narrow horizontal plane.

Those are pretty cool, but expensive.  I also think it would overload any kind of processing ability of the MPU.  

Besides...

I'm hoping this 8x8 pixel sensor will live up to expectations.  Getting horizontal and vertical data at one time... supposedly even detecting hand-waving gestures.  Maybe... flipping Inqling Jr. off will make it go in the corner and sulk.  🤣 

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, Access Point Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
robotBuilder
(@robotbuilder)
Noble Member
Joined: 3 years ago
Posts: 1563
 

@byron 

A couple of years (or more) ago I got a 360 degree lidar sensor the same as shown in a dronebot video. After a little playing with it I soon realised its main limitation was that it only works in one narrow horizontal plane.

Yet it seems to be practical enough to be used by commercial robot vacuum cleaners?

 


   
ReplyQuote
Page 7 / 16