
Time of Flight (ToF) VL53L5CX - 8x8 pixel sensor

251 Posts · 5 Users · 74 Likes · 12.3 K Views
Will
(@will)
Member
Joined: 3 years ago
Posts: 2504
 
Posted by: @inq

I have re-oriented the sketch so all the angles are on the diagonal of the 8x8 grid and also shown all four quadrants.  Hopefully, this helps to explain.  

[Image: Perspective2]

So my contention is that theory suggests the ratio (for this configuration) should be 1.1180.

However, given one of the examples from above, the ratio is only around 1.0130.

[Image: d1i16s0]

Now, if you think I'm still wrong, keep trying.  I'm pretty dense, but a baseball bat usually works... eventually.

I don't know which of us is wrong, but here's my version of this ...

Working with the assumptions:

- sensor is exactly 100 cm away from the wall

- sensor is perfectly perpendicular to the wall

- the point directly across from the sensor is the XY plane origin [0,0]

- sensor senses from 22.5 degrees left to 22.5 degrees right

- sensor senses from 22.5 degrees down to 22.5 degrees up

- sensor has 64 equal angle pixels at 7.59 degrees each

- each pixel spans a square with angular change of 7.59 degrees

 

From this we can derive ...

the middle of the first block on the right/top is at angle 0.5*[7.59,7.59] = [3.795,3.795] degrees.

This corresponds to a point on the XY plane at x = y = 100*tan(3.795) = [6.6,6.6] cm.

This, in turn represents a distance of 9.334 cm diagonally out from the origin.

 

Similarly, the top rightmost block's centre is at 3.5*[7.59,7.59] = [26.545,26.545] degrees.

This corresponds to a point on the XY plane at x = y = 100*tan(26.545) = [50,50] cm.

This, in turn represents a distance of 70.71 cm diagonally out from the origin.

 

Using Pythagoras, the distance to the inner point is sqrt(100*100+9.334*9.334) = 100.43 cm.

Using Pythagoras, the distance to the outer point is sqrt(100*100+70.71*70.71) = 122.47 cm.

So the ratio of the outer to inner points is 122.47/100.43 = 1.219

 

Which seems to violate everything we've seen. Where did I go wrong ? Am I calculating the wrong ratio ?
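For anyone who wants to poke at the arithmetic, here is a minimal C++ sketch of that same tangent-then-Pythagoras route (the 100 cm wall distance and the 7.59 degrees per pixel are the assumptions listed above, and the ones under test):

// Tangent to get the offset on the wall, then Pythagoras for the hypotenuse.
#include <cmath>
#include <cstdio>

int main() {
    const double D = 100.0;                          // perpendicular distance to wall, cm
    const double pitchDeg = 7.59;                    // assumed per-pixel angle, degrees
    const double deg2rad = std::acos(-1.0) / 180.0;

    auto hyp = [&](double halfPixels) {
        double a = halfPixels * pitchDeg * deg2rad;  // angle off-axis (same horiz and vert)
        double xy = D * std::tan(a);                 // offset on the wall along each axis
        double diag = std::sqrt(2.0) * xy;           // diagonal offset from the origin
        return std::sqrt(D * D + diag * diag);       // straight-line distance to the point
    };

    double inner = hyp(0.5), outer = hyp(3.5);
    std::printf("%.2f %.2f %.4f\n", inner, outer, outer / inner);  // ~100.4  ~122.5  ~1.219
}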

 

Anything seems possible when you don't know what you're talking about.


   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
You did one minor thing wrong below, but you also did a refinement that I didn't.  So it balances out. 😆 
Posted by: @will

I don't know which of us is wrong, but here's my version of this ...

Working with the assumptions:

- sensor is exactly 100 cm away from the wall

- sensor is perfectly perpendicular to the wall

- the point directly across from the sensor is the XY plane origin [0,0]

- sensor senses from 22.5 degrees left to 22.5 degrees right

- sensor senses from 22.5 degrees down to 22.5 degrees up

The part above I followed: what you're calling the XY plane is the wall, and all those numbers are based on the horizontal and vertical FoV (+/- 22.5°).  The part where you parted from reality is this next line.  I think you used my 7.59° number for the pixel FoV from my green/blue diagram.  Note on that diagram that the blue FoV triangle is canted across the green XY plane diagonally.  IOW, it has already incorporated Pythagoras.  Since you are working in the horizontal/vertical directions first and leaving Pythagoras till last, I've made the changes to your numbers below.  Really, the key is...

The total FoV = 45°

The horz/vert Pixel FoV = 45 / 8 = 5.625°

- sensor has 64 equal angle pixels at 7.59 → 5.625 degrees each

- each pixel spans a square with angular change of 7.59 → 5.625 degrees

From this we can derive ...

the middle of the first block on the right/top is at angle 0.5*[7.59 → 5.625, 7.59 → 5.625] = [3.795 → 2.8125, 3.795 → 2.8125] degrees.

This corresponds to a point on the XY plane at x = y = 100*tan(3.795 → 2.8125) = [6.6 → 4.913, 6.6 → 4.913] cm.

This, in turn represents a distance of 9.334 → 6.948 cm diagonally out from the origin.

 

Similarly, the top rightmost block's centre is at 3.5*[7.59 → 5.625, 7.59 → 5.625] = [26.545 → 19.6875, 26.545 → 19.6875] degrees.

This corresponds to a point on the XY plane at x = y = 100*tan(26.545 → 19.6875) = [50 → 35.781, 50 → 35.781] cm.

This, in turn represents a distance of 70.71 → 50.601 cm diagonally out from the origin.

Using Pythagoras, the distance to the inner point is sqrt(100*100 + (9.334 → 6.948)*(9.334 → 6.948)) = 100.43 → 100.241 cm.

Using Pythagoras, the distance to the outer point is sqrt(100*100 + (70.71 → 50.601)*(70.71 → 50.601)) = 122.47 → 112.07 cm.

So the ratio of the outer to inner points is (122.47 → 112.07)/(100.43 → 100.241) = 1.219 → 1.118, which is what I have on that blue/green sketch.

Which seems to violate everything we've seen. Where did I go wrong ? Am I calculating the wrong ratio ?

The part you did (extra credit) that I bypassed... you used the more accurate distance to the 0.5-pixel point near the center (100.241), while I just used the distance at the center (100.0).

Fundamentally, the only difference in the way we approached the problem is that you used tangent to get the distance on the wall and then used Pythagoras to solve for the hypotenuse, while I used cosine and solved for the hypotenuse directly.  The answers are the same.
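A quick numeric check of that equivalence, done both ways with the 5.625° pitch (just a sketch, not sensor code):

// Two routes to the same hypotenuse ratio, using 45/8 = 5.625 degrees per pixel.
#include <cmath>
#include <cstdio>

int main() {
    const double D = 100.0;                                   // perpendicular distance to wall, cm
    const double pitch = 5.625 * std::acos(-1.0) / 180.0;     // per-pixel angle, radians

    auto viaTan = [&](double halfPixels) {                    // tangent first, Pythagoras last
        double xy = D * std::tan(halfPixels * pitch);         // offset on the wall, each axis
        return std::sqrt(D * D + 2.0 * xy * xy);              // diagonal offset = sqrt(2) * xy
    };
    auto viaCos = [&](double halfPixels) {                    // cosine of the true off-axis angle
        double diag = std::sqrt(2.0) * D * std::tan(halfPixels * pitch);
        double theta = std::atan2(diag, D);                   // full 3D angle off the perpendicular
        return D / std::cos(theta);                           // hypotenuse directly
    };

    std::printf("tan route: %.4f\n", viaTan(3.5) / viaTan(0.5));  // ~1.1180
    std::printf("cos route: %.4f\n", viaCos(3.5) / viaCos(0.5));  // ~1.1180
}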

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
Will
(@will)
Member
Joined: 3 years ago
Posts: 2504
 

@inq 

Thanks, I'll be able to sleep tonight after all 🙂

Anything seems possible when you don't know what you're talking about.


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 

@inq 

The bot can be programmed to either enter the doorway, say... using some "always turn right" algorithm or stay in the same hallway and track and map all the doors.

You wouldn't need to program the robot to specifically go through a door or turn in any particular direction during the map-making phase. You start with an empty global map, which you can think of as a grid of points or squares or cells. Each cell in the grid can be empty, full or unknown. When a distance value is received, all the cells along the path are filled in with the "empty" value and the cell at the returned distance is filled in with the "full" value. So the robot's map-making behaviours can be guided by the locations of "unknown" values in the current global map.

In the example below the robot has been wandering about using its lidar to map the cells as filled (black), empty (red) or unknown (white). It will seek out white cells to fill in. Now in the example below it would not simply turn left toward a white area as it is blocked by filled in cells. There is however a clear way through the door. So what might happen is a white cell is chosen and a path planning algorithm is called which plots a path through the empty (red) cells to some unknown valued (white) goal cell. It is the path planning algorithm that will direct the robot through the door. Also while the robot moves along the computed path the lidar would still be happily filling in any cells through which the laser light travels with the "empty" value (red) and any hit cells with "full" value (black).
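If you want to experiment with that, the cell-marking step only takes a few lines. A rough C++ sketch (the grid size, cell values and ray stepping here are my own simplifications, not taken from any particular mapping library):

// Occupancy-grid update for one range reading: everything along the beam up to
// the hit is marked EMPTY, the cell at the returned distance is marked FULL.
#include <cmath>
#include <vector>

enum Cell { UNKNOWN, EMPTY, FULL };

struct Grid {
    int w, h;
    double cellSize;                                   // metres per cell
    std::vector<Cell> cells;
    Grid(int w_, int h_, double cs) : w(w_), h(h_), cellSize(cs), cells(w_ * h_, UNKNOWN) {}
    void set(double x, double y, Cell v) {             // world coordinates -> cell
        int cx = int(x / cellSize), cy = int(y / cellSize);
        if (cx >= 0 && cx < w && cy >= 0 && cy < h) cells[cy * w + cx] = v;
    }
};

// Robot at (rx, ry), beam direction 'angle' in radians, 'range' in metres.
void applyReading(Grid& g, double rx, double ry, double angle, double range) {
    const double step = g.cellSize * 0.5;              // sample finer than one cell
    for (double d = 0.0; d < range; d += step)         // free space along the beam
        g.set(rx + d * std::cos(angle), ry + d * std::sin(angle), EMPTY);
    g.set(rx + range * std::cos(angle), ry + range * std::sin(angle), FULL);  // the obstacle
}

The path planner then just searches through EMPTY cells for a route to the nearest UNKNOWN cell, much as described above.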

One experiment I would try with your robot is to use the TOF sensor as an obstacle-avoidance sensor, essentially doing the same thing many have done using sonar: the little bot comes across an obstacle, its little servo-controlled sonar "eyes" look left and right, and then the robot heads off in another direction.

As the TOF sensor doesn't scan just horizontally but also up and down, does it see the floor?


[Image: mapMaker — example map with filled (black), empty (red) and unknown (white) cells]

 

 


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 

@inq 

In another post you seemed to feel I was down on the VL53L5CX because I thought it had certain limitations in certain applications.  But I do encourage you to explore its potential, seeing as you actually have the sensor. Here it seems ROS2 allows you to use it?

 


   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

Mere Words Fail Me

I finally got an answer from ST (the company).  I think I have to totally re-wire my brain around it and I've yet to stumble on a reason they thought this made anyone's life easier.  I see what they're saying and all the data above IS within tolerance.  I will just give you their words:

Here is the thread I opened on their forum:  https://community.st.com/s/question/0D73W000001PX6aSAG/detail?fromEmail=1&s1oid=00Db0000000YtG6&s1nid=0DB0X000000DYbd&s1uid=0053W000002oubV&s1ext=0&emkind=chatterCommentNotification&emtm=1664195021977&emvtk=UEBKNbK4.FQN0nSuwA7yIJm9CuGviI.4EZIocXJ25jY%3D

But here are the responses from two different employees:

  • Anne BIGOT (Employee)

    8 hours ago

    Hello

    Our supposition is that you are expecting a radial distance while the sensor returns the perpendicular distance. According to the datasheet, you should have +/-5% accuracy for all zones (meaning 5cm at 1 meter). Your measurements are within the datasheet limitations.

    Hope this answers your question

    Anne

  • John E KVAM (Employee)

    5 hours ago

    The answer is simple. We know the problem and we correct for it inside the sensor. When you are perpendicular to the wall at 1M, you should get 64 zones all saying about 1M.

    (There is a term for the correction, but I just cannot think of it at the moment.)

    • john
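If I'm reading John's answer correctly, the correction amounts to projecting each zone's raw radial range back onto the sensor's boresight (reported = radial × cos θ). This is my guess at the math, not ST's actual firmware, using the 45°/8 zone pitch:

// Why a flat wall at 1 m reads ~1 m in every zone once the radial range is
// projected onto the sensor axis (one row of the 8x8 grid shown).
#include <cmath>
#include <cstdio>

int main() {
    const double pitchDeg = 45.0 / 8.0;                       // 5.625 degrees per zone
    const double deg2rad  = std::acos(-1.0) / 180.0;
    const double wall     = 1000.0;                           // perpendicular wall distance, mm

    for (int zx = 0; zx < 8; ++zx) {                          // top row of zones
        double ax = (zx - 3.5) * pitchDeg * deg2rad;          // horizontal zone angle
        double ay = 3.5 * pitchDeg * deg2rad;                 // vertical zone angle (top row)
        double radial = wall * std::sqrt(1.0 + std::tan(ax) * std::tan(ax)
                                             + std::tan(ay) * std::tan(ay));
        double theta    = std::acos(wall / radial);           // angle off the boresight
        double reported = radial * std::cos(theta);           // back to the perpendicular distance
        std::printf("zone %d: radial %.0f mm -> reported %.0f mm\n", zx, radial, reported);
    }
}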

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 

@inq 

It appears to be like the issue I had with the 3D stuff I wrote for the dungeons "game".  The straight walls were curved on the display because I was computing their position on the screen based on the distance from the camera.  I can explain the trig later if you like.

 


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 

@inq 

Here is the actual correction code

' (ox, oy) = observer position, (sx, sy) = point where the ray hit a wall
' TwoPi = 2 * pi

distance = Sqr( Abs(sx-ox)^2 + Abs(sy-oy)^2) 'actual distance from observer
sAngle = atan2(sy-oy, sx-ox)                 'direction from the observer to the hit point
If sAngle > TwoPi Then sAngle = sAngle - TwoPi
If sAngle < 0 Then sAngle = sAngle + TwoPi

distance = distance * Cos(sAngle-oAngle) 'adjusted (perpendicular) distance for 3D display
w = range * tan(sAngle - oAngle)

w is the position along the x-axis of the display

oAngle is the direction the ray was sent along the floor until it hit a wall.

 


   
Will
(@will)
Member
Joined: 3 years ago
Posts: 2504
 
Posted by: @inq

Mere Words Fail Me

  • John E KVAM(Employee)

    5 hours ago

    The answer is simple. We know the problem and we correct for it inside the sensor. When you are perpendicular to the wall at 1M, you should get 64 zones all saying about 1M.

In other words, we give you a sensor with 64 points of interest but don't worry, they're corrected to give you a solitary distance regardless of direction.

The term for the correction appears to be BOGUS 

Anything seems possible when you don't know what you're talking about.


   
Ron
(@zander)
Father of a miniature Wookie
Joined: 3 years ago
Posts: 6661
 

@will He did say perpendicular, so that's not so bad; it's the off-perpendicular angles up to about 45 degrees that matter. Still, very strange support.

First computer 1959. Retired from my own computer company 2004.
Hardware - Expert in 1401, and 360, fairly knowledgeable in PC plus numerous MPU's and MCU's
Major Languages - Machine language, 360 Macro Assembler, Intel Assembler, PL/I and PL1, Pascal, Basic, C plus numerous job control and scripting languages.
Sure you can learn to be a programmer, it will take the same amount of time for me to learn to be a Doctor.


   
Will
(@will)
Member
Joined: 3 years ago
Posts: 2504
 

@zander 

Yeah, but Inq didn't get that until after all the work that he did trying to rationalize the results he was getting from the device.

I don't know him personally, but he doesn't seem to be the type to have "overlooked" that explanation in the description of the part, nor to have continued beating his head against the same wall if he had previously understood the complete nature of the component.

So, I consider the manufacturer guilty of very bad product information.

The good news is that it should be able to find a m$%^&*(r-f@#$%^&g door way !

Anything seems possible when you don't know what you're talking about.


   
Ron
(@zander)
Father of a miniature Wookie
Joined: 3 years ago
Posts: 6661
 

@will Yes, all true. I was somewhat tongue-in-cheek pointing out that there was one case where it was correct, like a broken clock. For sure, Dennis @Inq spent many hours trying to get that device to make sense. The question I now have is whether it can/will ever be of use to him; I am not sure. I am puzzled, though, by the comment from the company that seems to say this is the first they have heard of a problem. Perhaps I misunderstood that part; I certainly hope so for Dennis's sake.

First computer 1959. Retired from my own computer company 2004.
Hardware - Expert in 1401, and 360, fairly knowledgeable in PC plus numerous MPU's and MCU's
Major Languages - Machine language, 360 Macro Assembler, Intel Assembler, PL/I and PL1, Pascal, Basic, C plus numerous job control and scripting languages.
Sure you can learn to be a programmer, it will take the same amount of time for me to learn to be a Doctor.


   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

I appreciate you all for the vote of confidence.  My brain doesn't do this kind of work after about... say noon.  So now that it is 4:00AM, I can tackle this head-on.  My first task is to re-read every... single... word I have about this thing.  If I missed that critical piece of information, I'll be the first person (unless someone beats me to it) to air my stupid mistake.  

Posted by: @will

In other words, we give you a sensor with 64 points of interest but don't worry, they're corrected to give you a solitary distance regardless of direction.

Posted by: @will

The good news is that it should be able to find a m$%^&*(r-f@#$%^&g door way !

I knew I'd be able to draw out that keen @will mind, grasping the concept and wrapping it in a one-liner.  

Two hours later...

I've found their full document page for this sensor - https://www.st.com/en/imaging-and-photonics-solutions/vl53l5cx.html?rt=ds&id=DS13754#documentation

... and found I did not have the latest revisions.  I've now re-read the documentation for this sensor with a fine-tooth comb.  This sentence describing optimum conditions (and it is no mere coincidence that I was using this configuration in my tests above) does use the word perpendicular.  Maybe I was supposed to infer from this one reference all of what they are describing.... Ummm... NO!  

Note: The detection volume of Table 2. FoV angles has been measured with a white 88 % reflectance perpendicular target in full FoV, located at 1 m from the sensor, without ambient light (dark conditions), with an 8x8 resolution and 14 % sharpener (default value), in Continuous mode at 15 Hz.

IOW - I can confidently say there is no reference to this underlying trigonometry going on.  Since I have the diplomacy of a sledgehammer, I'll be taking them to task on this subject.  I'll do my best not to burn the bridge.  

... stay tuned.

VBR,

Inq

 

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
(@davee)
Member
Joined: 3 years ago
Posts: 1606
 

Hi @inq and @robotbuilder,

Re: The answer is simple. We know the problem and we correct for it inside the sensor. When you are perpendicular to the wall at 1M, you should get 64 zones all saying about 1M.

What this person is saying is actually the same as the table from the ST app note that I quoted on page 1 of this forum thread. I feel a little embarrassed to admit I didn't interpret it that way. Did I miss the author's explanation or did they think it was obvious? 🙄 

[Image: table from the ST app note]

Of course, the data looks 'fake' because all the values are 'perfect', and as sensors usually report the quantity they measure directly (leaving the host system to convert into 'useful' data), it seemed counterintuitive that the sensor would 'correct' the distances to a perpendicular distance form. Also, the accuracy (5% -ish) type of specification left a fair bit of wiggle room, so I assumed (wrongly it appears) that the data values shown were oversimplified to concentrate on describing the 'sharpener' function. It just didn't occur to me the sensor was 'trying' to be helpful.

Oversimplistically, I guess, if you are building a simple robot that moves around in a pseudo random fashion, but always moving in a 'fairly forward' direction, with respect to the sensor, and trying not to collide with other surfaces, maybe all you need to know is the distance to the nearest surface?

I confess I am still unsure where the conversions take place, in part because I haven't looked at the accompanying software.

I noted @robotbuilder recently said:

Here is the actual correction code

distance = Sqr( Abs(sx-ox)^2 + Abs(sy-oy)^2) 'actual distance from observer
sAngle = atan2(sy-oy, sx-ox)
If sAngle > TwoPi Then sAngle = sAngle - TwoPi
If sAngle < 0 Then sAngle = sAngle + TwoPi

distance = distance * Cos(sAngle-oAngle) 'adjusted distance for 3D display
w = range * tan(sAngle - oAngle)

w is the angle position along the x-axis

oAngle is the direction the ray was sent along the floor until it hit a wall.

I confess I haven't worked through that yet, but I am curious to know the origin of this information and where/how is it implemented in the system?

Best wishes, Dave


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 

@davee

I confess I haven't worked through that yet, but I am curious to know the origin of this information and where/how is it implemented in the system?

As I wrote in my post, it is implemented in my code for a 3d view in a computer game.

https://forum.dronebotworkshop.com/user-robot-projects/inqling-junior-robot-mapping-vision-autonomy/paged/7/#post-31321

As @inq didn't comment, I can only assume it is not implemented in the VL53L5CX to correct for the different distances the light travels from the sensor to points along the wall, such that the reported distance would be the same all along the wall when looking perpendicular to it. If you want a complete explanation I can give it to you in a private post, so as not to hijack this thread with material irrelevant to the workings of the VL53L5CX sensor.

 


   
Page 14 / 17