Inqling Junior - Robot Mapping, Vision, Autonomy

240 Posts
10 Users
87 Likes
19.7 K Views
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @zander

@inq Don't forget the Pi comes in some pretty small form factors like the Zero2, now with WiFi as the PiZero2W.  Lots of capability, size about the same as a NODEMCU, but power consumption may be an issue unless expensive wake-up boards are employed.

Although I feel it is a ways off in my plans, I do see a RasPi on the robot on the horizon.  As discussed in this thread, the first cuts will send the ToF data back to a computer for any real work.  But I don't see why that can't be moved on-board the robot at some point, to a RasPi.  In a nutshell, I don't see why a robot couldn't just go into a room, do all the scanning, and store the data in real time... and THEN pause in the room and let a RasPi do the work of mapping out the room in 3D.  I don't see any reason why significant pauses would be an issue.  A real person needing to memorize a room would easily spend more time.  After the analysis, it could even feed back and have the Arduino-level system do some more re-scans... or closer-up, detailed scans.

I have a bunch of the 1st-gen Pi-Zero-Ws.  Although slow, they sip power, and again... they may take 10x more time, but I can't really see that taking even multiple minutes.  The geometry just isn't that bad.

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

UPDATE

Ok... I think I've fulfilled my CAD-designing itch.  This Mk2 version of Inqling Jr looks good enough, robust enough, and configurable enough for me to move on to the sensor phase.

Inqling Jr. Mk2

As discussed above, I've dedicated a single ESP8266 to handle the drive system independently.  This all now sits below the white battery case.  It contains...

  • ESP8266 WeMos
  • (2) A4988 Stepper Motor Drivers - Common 3D Printer drivers.
  • (2) NEMA-17 23mm Stepper Motors - The smallest ones in the NEMA-17 line.
  • (1) GY-89 10DOF Sensor - Accelerometer, Gyroscope, Magnetometer, Barometer
  • (1) DC-to-DC converter to change 16.8V to 5V.
  • (1) Toggle Switch

Learning and Making Changes

During the learning process, I've made a few design changes that are incorporated into the Mk2.

  • @ronbentley1 - Ron Bentley's thread https://forum.dronebotworkshop.com/show-tell/esp-32-timer-interrupts-unlimited-programmatic-user-definable-timers/ on interrupts convinced me that I needed to re-look at and re-think how I drive the stepper motors.  I was getting a periodic glitch in Inqling Jr.'s get-a-long.  It suddenly hit me that it coincided with the ESP8266 sending out the periodic telemetry data.  Being single-core, with WiFi having precedence over the loop() method, it was momentarily starving the code sending out step events to the two A4988s.  So I wrote a two-stepper-motor driver class using interrupts to send out the pulses at a higher priority than even the WiFi (see the sketch after this list).  As you'll see in the video, the drive is now as smooth as butter and actually has lower CPU overhead than the old version.  Instead of polling to see if it is time to send out a step, it now calculates the exact microseconds to the next step and sets the callback precisely.  No wasted effort, no wasted time-slice.
  • The two-speed transmission switches the stepper drivers from 1/16 microstepping in first gear to no microstepping in second gear.  In essence, a gear-change ratio of 16:1.  This is now handled automatically.  The gear change for each motor is handled independently, so one may be in 1st gear while the other is in 2nd due to turning input.
  • Somewhere along the way, there were some comments about stepper motors being slower and having less torque than regular motors with encoders.  I think the video will allay those concerns completely!  The torque is enough to break the wheels loose and pop wheelies.  The speed is also staggering.  It will be quite the challenge to both self-balance and utilize that speed in an autonomous manner.  And these... are the smallest NEMA-17 stepper motors.  As stated earlier... using stepper motors relieves me from having to write and tune encoder code, and unlike encoders, no CPU is spent tracking position.
  • One downside: stepper motors are active even when not moving.  In fact, they seem to take more energy when still.  They get hot, the drivers get hot and the batteries get warm.  Although I didn't do a drain-down test with the drivers continuously on, I would estimate the drive time would be less than 30 minutes even if Inqling Jr wasn't driving.  In Mk2, I also drive the A4988 drivers' Enable pins with logic (also shown in the sketch below).  If both motors are at zero speed, I turn off all power to the motors.  Since much of Inqling Jr's time will be standing still and scanning, I felt this might help the power usage.  I think several hours of runtime is no problem.  In the 15-minute video, you'll note at the end that the gas gauge is barely off the Full peg.
  • In another thread, I discussed centering the head by using the magnetometer in the body and one in the head to sync the head's zero position.  That's now been proven to be a non-starter.  As guessed... the stepper motors' proximity to the magnetometer totally washes it out... useless.
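For the curious, here is the shape of that interrupt-driven approach boiled down to one motor.  This is NOT the actual driver class, just a minimal sketch on the ESP8266 Arduino core; the pins and step rate are placeholders.

#include <Arduino.h>

const uint8_t STEP_PIN = D1;        // A4988 STEP (placeholder pin)
const uint8_t EN_PIN   = D5;        // A4988 ENABLE, active LOW (placeholder pin)
volatile uint32_t usPerStep = 800;  // written by the motion code

void IRAM_ATTR onStepTimer() {
  digitalWrite(STEP_PIN, HIGH);     // A4988 wants >= 1 us high
  delayMicroseconds(2);
  digitalWrite(STEP_PIN, LOW);
  timer1_write(usPerStep * 5);      // TIM_DIV16 -> 5 ticks per microsecond
}

void setMotorPower(bool on) {
  digitalWrite(EN_PIN, on ? LOW : HIGH);  // zero speed on both motors -> cut current
}

void setup() {
  pinMode(STEP_PIN, OUTPUT);
  pinMode(EN_PIN, OUTPUT);
  setMotorPower(true);
  timer1_isr_init();
  timer1_attachInterrupt(onStepTimer);
  timer1_enable(TIM_DIV16, TIM_EDGE, TIM_SINGLE);  // one-shot; the ISR re-arms itself
  timer1_write(usPerStep * 5);
}

void loop() {
  // WiFi/telemetry can stall loop() all it wants -- the timer ISR outranks it.
  // The 16:1 "gear change" is just toggling the A4988 MS pins between
  // 1/16 microstepping and full step (and scaling usPerStep to match).
}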

Next Up...

I've started the design work for the head.  From the lower unit, four wires come up to be used by any future additions.  Two are power (16.8V) and are switched by the main switch under Inqling Jr.  The other two are for serial communications if/when the two units need to talk to each other.  The upper unit is planned (at the moment) to contain the second ESP8266 and the VL53L5CX sensor head.

I think that about covers it, except for the proof.  We all know... if no video... IT DIDN'T HAPPEN.

VBR,

Inq

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
robotBuilder and Ron reacted
ReplyQuote
Ron
 Ron
(@zander)
Father of a miniature Wookie
Joined: 3 years ago
Posts: 7074
 

@inq Speedy Gonzales!

First computer 1959. Retired from my own computer company 2004.
Hardware - Expert in 1401 and 360, fairly knowledgeable in PC plus numerous MPU's and MCU's
Major Languages - Machine language, 360 Macro Assembler, Intel Assembler, PL/I and PL1, Pascal, Basic, C plus numerous job control and scripting languages.
My personal scorecard is now 1 PC hardware fix (circa 1982), 1 open source fix (at age 82), and 2 zero day bugs in a major OS.


   
ReplyQuote
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @zander

@inq Speedy Gonzales!

I was thinking Taz... 


I've been driving it around the room at full speed.  Using the laptop and mouse, finer adjustments can be made, and it's almost like NASCAR... turn left.  Using the Admin, I can just set the two wheel speeds slightly different and it'll happily go around in circles by itself.  I'll have to run a drain-down test sometime using that.
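(For anyone wanting the math behind those circles: it's just standard differential-drive kinematics, nothing specific to Jr's firmware.  With wheel track W and wheel surface speeds vL and vR, the turning radius is

    R = (W / 2) * (vR + vL) / (vR - vL)

so holding a small constant difference between the two commanded speeds fixes R, and the bot orbits on its own.)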

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
Ron reacted
ReplyQuote
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @robotbuilder

Ok. No serious AI discussion going on here. I will move on. Although I still have an academic interest in what you can learn about your VL53L5CX sensor.

As your comment is not pertinent to the VL53L5CX sensor, I have moved it back to where it is pertinent.

That's fine.  I already know your opinions on what you consider AI to be, and I have no intention of engaging in a debate that will never satisfy your definitions.  So, the following is NOT AI.

In the extreme, final case it is a multi-part test.  The test will be conducted in the old 1930s school where I demonstrated Jr.  It has one major hallway with about thirty classrooms off of it.  The gym is centrally located.  It is nearly an obstacle course.  It'll have open room doors and closed room doors.  Obviously, it should map only the open ones.  It has tables, desks, bookcases, heavy equipment and many obstacles.  Some, the bot will fit under; others, not so much.  It will have to decide for itself whether to go under or not.  The main hall is an ancient, abused hardwood floor.  There are gaps you can stick a pencil in, and the boards are cupped with raised edges.  Any rolling bot could in no way use a counter-based methodology for tracking distance or angle.  Driving what the bot thinks is a straight line across those boards would have it looking like a drunken sailor.

A successful test (which I doubt it will ever fully pass):

  1. The Jr bot is to go around the walls, say always turning right, until it has mapped the entire space, battery life permitting.
  2. Any room that it can't see across, it will have to map the internal contents from the center of the room.
  3. It is a fire-and-forget test.  No human commands after (a) power-on, (b) the command: Map.
  4. It will scan and map using the full 4-meter range of the VL53L5CX sensor.
  5. If flat walls can be interpreted as such, the data will be reduced at runtime.
  6. If noise in the data indicates there might be more than just a flat wall, the bot will move in closer to get higher-resolution scanning of its surroundings.  IOW, to discern legitimate chair/table legs from random noise points.
  7. Test 1 is finished after returning to the start point.
  8. I will label rooms and major features in the rooms.
  9. Test 2 - Placing the bot anywhere in the space and powering on, the bot will be told to go to a specific labeled item within a designated room.
  10. It will have to figure out for itself where it is in the building.  It will then have to go to the specified item.  Point to it and alert me.
  11. Extra credit - Do the same tests using it in self-balancing mode.

Although technically fire-and-forget, I will follow it around, mainly to keep WiFi in range so it can upload runtime data to be processed and to feed back any commands for direction, speed, etc.  My sole duty will be taking video of it and giving it a failing grade at any point during the test.

I seriously believe that IF I can make it succeed at this test AT ALL, it will not happen in this calendar year.  I plan to be very thorough, as I have been with the separate VL53L5CX thread.  I'd rather know the component limitations early and adjust the design and/or components.  I have already done this:

  • Overheating - Mk2 may appear more enclosed, but it uses internal vertical pathways to promote convective airflow through the chassis.
  • Overloaded MPU - Mk2 uses two ESP8266s so the VL53L5CX can be processed independently of the interrupt-driven stepping required to keep motor control smooth.
  • Wheels falling off - Mk2 uses bolts on the wheels to tighten them onto the shaft.
  • Speed - Mk2 uses a figurative two-speed transmission utilizing the A4988 microstepping feature.  Low speed for precision, high speed (4.1 mph) for traveling.  Independent / automatic "gear change" in case one wheel needs to be in high while the other is in low in turning situations.
  • Head - Mk2 has bounced back and forth between stepper- and servo-driven.  A stepper can do 360 degrees, but has logistics troubles I don't want to mess with at the moment.  A servo only does +/- 60 degrees.  This is very limiting, but the bot can be turned in finer angles than the servo anyway.

I'd rather do it this way than throw some random bot together and spend time till hell freezes over trying to figure out and fix an all-up configuration.  More patience will be required.

VBR,

Inq

 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2043
 

@inq 

Any rolling bot could in no way use a counter-based methodology for tracking distance or angle.

Ok, but to relate what the bot sees to the position and orientation of what it sees, you need some way to give it a position and orientation to stitch it all together into a global map. An inertial navigation system might be a solution, as all the bumps and jumps would be accounted for, unlike with simple encoder readings.

https://vetco.net/products/9-axis-inertial-navigation-module-for-arduino-d65
https://hackaday.com/2013/07/31/an-introduction-to-inertial-navigation-systems/

The sensor might need to be stabilized using such systems as well. Have you seen the shake taken out of a movie camera? Hold a chook and move it about and note how its head tries to remain fixed in space.

Otherwise you would need to make use of two beacons on adjacent walls, which might involve selecting a natural feature like lights on the wall or light coming through a window. That would involve a 360-degree view using a light sensor and a bit of trigonometry.

 

 


   
ReplyQuote
Will
 Will
(@will)
Member
Joined: 3 years ago
Posts: 2535
 
Posted by: @inq
  • Head - Mk2 has bounced back and forth between stepper- and servo-driven.  A stepper can do 360 degrees, but has logistics troubles I don't want to mess with at the moment.  A servo only does +/- 60 degrees.  This is very limiting, but the bot can be turned in finer angles than the servo anyway.

One servo can do 0-180, but two servos stacked vertically can do 360 (one does the entire left side, the other the entire right side).
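A minimal sketch of that stacked arrangement, assuming hypothetical pins, the stock Servo library, and servos that genuinely travel 180 degrees:

#include <Servo.h>

Servo lower;  // base servo
Servo upper;  // mounted on the lower servo's horn, carries the head

void setup() {
  lower.attach(D3);  // placeholder pins
  upper.attach(D4);
}

// bearing = 0..360 degrees of total head rotation
void pointHead(int bearing) {
  bearing = constrain(bearing, 0, 360);
  lower.write(constrain(bearing, 0, 180));        // covers the first half-turn
  upper.write(constrain(bearing - 180, 0, 180));  // adds the second half-turn
}

void loop() {
  for (int b = 0; b <= 360; b += 10) {  // demo sweep
    pointHead(b);
    delay(200);
  }
}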

Anything seems possible when you don't know what you're talking about.


   
ReplyQuote
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @robotbuilder

@inq 

Any rolling bot could in no way use a counter-based methodology for tracking distance or angle.

Ok, but to relate what the bot sees to the position and orientation of what it sees, you need some way to give it a position and orientation to stitch it all together into a global map. An inertial navigation system might be a solution, as all the bumps and jumps would be accounted for, unlike with simple encoder readings.

https://vetco.net/products/9-axis-inertial-navigation-module-for-arduino-d65
https://hackaday.com/2013/07/31/an-introduction-to-inertial-navigation-systems/

The sensor might need to be stabilized using such systems as well. Have you seen the shake taken out of a movie camera? Hold a chook and move it about and note how its head tries to remain fixed in space.

Otherwise you would need to make use of two beacons on adjacent walls, which might involve selecting a natural feature like lights on the wall or light coming through a window. That would involve a 360-degree view using a light sensor and a bit of trigonometry.

The sensor in Inqling Jr is a GY-89, the predecessor to this GY-91.  As far as I can tell, the GY-91 has a newer barometer chip in it.

Inertial Navigation Systems

You all probably understand Inertial Navigation Systems (INS), but I'd like to share something for those who might not.  My father was an airline pilot for Pan American Airways.  I used to go with him occasionally for his 6-month safety refreshers.  They put him (us) in this big hydraulic-ram-driven simulator and threw emergency conditions at him.  After five minutes, you couldn't tell the difference... you could feel the tires going over the expansion joints and even the G's of acceleration down the runway.  I got to fly it once.  I crashed on landing. 🙄 Anyway...

I remember asking him about GPS... and how simple/cheap it was.  Surely you use this in these big planes?  He had one for the car.  But this is how I learned about INS... apparently, for the longest time, GPS was not allowed (by the FAA) to be used in passenger aircraft even though it was readily available.  He didn't say why.  But he explained how INS worked.  When the plane is first powered up, they have to "start up" the INS.  Back in those days it was literally an analog, mechanical system of gyroscopes and big, weighted accelerometers.  The gyroscope was a pretty good-sized wheel that was spun up to very high speeds within a vacuum chamber on air bearings.  Powered off, it'd spin for days!  Once its speed stabilized, you would plug in the location and direction the plane was pointing while sitting at the gate.

The INS would measure all the incremental rotations of the plane with respect to the stabilized gyroscope.  Understand... gyroscopes don't change orientation even if their housings do.  So... they measure the difference between the housing and the gyro.  This could be validated against the compass for reality checks.  The accelerometers would spit out all the little accelerations... bumps, turbulence, take-off, climb, etc.  Summing up all these little accelerations multiplied by their timing intervals gives you the integral of acceleration, which is speed.  This could be validated against airspeed within reason.  Summing all the speed changes with their timing intervals gives you the integral of speed, which is distance.  Since you have the current direction you're flying while those incremental measurements are going on, you can tell where you are on the Earth.  It sounds incredibly geometry-intensive.  This made sense to me as I was going through physics in high school at the time.  Thanks, Dad, nice real-world physics lesson.  It always sounded incredible to me that they could/would rely on this system for world navigation.  Dad said he'd seen flights where, set at the gate in New York, the INS was within TEN FEET of the gate in Japan after a 14-hour flight!  Totally blew my mind.
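The arithmetic in that story, boiled down to its bare bones (a rectangular-rule integration; a real INS wraps serious filtering around this):

// One axis of dead reckoning: acceleration -> velocity -> position.
float velocity = 0.0f;  // m/s
float position = 0.0f;  // m

void integrate(float accel /* m/s^2 */, float dt /* s */) {
  velocity += accel * dt;      // summing a*dt: the integral of acceleration is speed
  position += velocity * dt;   // summing v*dt: the integral of speed is distance
}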

Back to the problem at hand...

For self-balancing you have to do part of the same kind of calculations as an INS, primarily using the gyroscope, but using the accelerometer as a reset to take out the mathematical round-off error that accumulates on the gyro.  The problem is that these digital devices are incredibly tiny and have no real inertia to work with/against.  Frankly, although I intuitively understand the spinny thing, I can't see how silicon gyros work.
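The textbook name for that gyro-plus-accelerometer-reset trick is a complementary filter.  A bare-bones version (ALPHA is a typical starting value, not something tuned for the GY-89; PI and atan2f come from the Arduino core):

// Blend the fast-but-drifting gyro with the slow-but-absolute accelerometer.
const float ALPHA = 0.98f;
float pitch = 0.0f;  // degrees

void updatePitch(float gyroRate /* deg/s */, float accX, float accZ, float dt /* s */) {
  float accPitch = atan2f(accX, accZ) * 180.0f / PI;  // tilt angle from gravity
  pitch = ALPHA * (pitch + gyroRate * dt)   // integrate the gyro (responsive, drifts)
        + (1.0f - ALPHA) * accPitch;        // lean toward the accel (noisy, drift-free)
}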

I initially discounted using them as you (@robotbuilder) suggest.  The fastest rate of accessing the gyro is 190 Hz, the accelerometer 50 Hz, and the magnetometer only 6.25 Hz.  I just picture in my mind that a little real-world, millisecond bump might be totally missed by the sampling rate even though that little jolt steered the bot a fraction of a degree.  And we all know how much "digital" noise is in these sensors.  A totally still accelerometer shows a constant "digital" randomness about zero.  I just don't see how, if I'm summing up all this digital noise, it'll integrate out into a valid speed and then distance.  But I'm not really seeing any alternative except trying to follow that wall and hope I can mesh the points of one sampling and recognize that the same bunch of points are on the same wall as the next bunch of points.

Neither method seems reliable.

It'll be a very interesting Physics/Mathematics problem and set of routines to work on.  A good juicy challenge!

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @will

One servo can do 0-180, but two servos stacked vertically can do 360 (one does all left side, other does all right side)

You know... putting two together like that never crossed my mind. 

Question... these cheap SG90 servos... even though I plug in values between 0 and 180, they only move through 120 degrees.  Is it just that these cheap things only do 120 degrees, or am I using them wrong?  If I could get a real 180 degrees, I could settle for that on this bot.

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
Valerio
(@valerio)
Member
Joined: 2 years ago
Posts: 69
 

@inq 

I tested the SG90s with a servo tester during the four-legged robot project, and they could not move farther than 120 degrees.

I think a servo tester can output a pulse from 500 µs to 2500 µs, so it's probably a characteristic of the SG90.
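For anyone who'd rather test from a sketch than a servo tester, the stock Servo library can drive that same range directly (placeholder pin; on an SG90 the mechanical stop will win no matter what pulses you feed it):

#include <Servo.h>

Servo s;

void setup() {
  s.attach(D4, 500, 2500);  // widen the default 544-2400 us pulse range
}

void loop() {
  s.writeMicroseconds(500);   delay(1000);  // one extreme
  s.writeMicroseconds(1500);  delay(1000);  // nominal center
  s.writeMicroseconds(2500);  delay(1000);  // other extreme
}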


   
Inq reacted
ReplyQuote
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2043
 

@inq

Inertial Navigation Systems

You all probably understand Inertial Navigation Systems (INS), but I'd like to share something for those who might not.

Ok, I will have to bow down to your superior mathematical/physics knowledge on this one. I had read they were used in some robot vacuums, but maybe those don't jump around too much.

https://robotvacuums.com/blogs/blogs/a-complete-overview-of-robot-vacuum-navigation

"Gyroscopes have been used for many years as a navigation tool in aircraft, ships, and cell phones. They are also highly effective when it comes to your robot vacuum. A great robot with gyroscope navigation, like the Tesvor x500, can get the job done well for under $200. They typically work as rotation sensors as well, enabling them to better understand where the robot is in relation to obstacles in your home.

Gyroscopes can keep robots from bumping into things and even form a basic map of your home. Gyroscope navigation may not be quite as effective as systems that use lasers, such as Lidar and SLAM, but they can do the job pretty well at a significantly lower cost."

But I'm not really seeing any alternative except trying to follow that wall and hope I can mesh the points of one sampling and recognize that the same bunch of points are on the same wall as the next bunch of points.

I would say that, as you travel along a wall, it would look pretty much all the same from point to point, particularly with such low resolution. Maybe when it hits a corner or some other obstacle and has to "go around" to keep following the wall, you will have some shape to recognize.

Have you actually tested how accurately distance travelled over this rough terrain correlates with the number of pulses sent to the stepper motors?

https://www.kudan.io/blog/3d-lidar-slam-the-basics/

Your project is far too complex for me and will, I suspect, require some heavy-duty software and math.


   
ReplyQuote
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @robotbuilder

https://robotvacuums.com/blogs/blogs/a-complete-overview-of-robot-vacuum-navigation

"Gyroscopes have been used for many years as a navigation tool in aircraft, ships, and cell phones. They are also highly effective when it comes to your robot vacuum.

I think this problem (navigating a complex building) is a very interesting challenge.  You've rekindled my interest in approaching it from this gyro sensor's standpoint.  The article is interesting.  Every section of it made sense and was good information EXCEPT the "Gyroscope Nav..." one.  A digital gyroscope sensor (like the one on this GY-89/91 breakout board) ONLY gives you the angular velocity of the sensor... nothing more.  These "gyroscopes" will not "calculate distances", do not contain a "quickly spinning wheel" or "a beam of light that circulates".

Posted by: @robotbuilder

Have you actually tested how accurately distance travelled over this rough terrain correlates with the number of pulses sent to the stepper motors?

No, I have not.  And you are right... it is a vital test, easy to do, and it might totally displace my pessimistic drunken-sailor expectations.  I'll check to see when I might be able to drive a still-blind Inqling Jr in the school building.

Posted by: @robotbuilder

https://www.kudan.io/blog/3d-lidar-slam-the-basics/

Excellent, concise article!  I'll definitely bookmark and reference this in our future discussions.  Do you know if their naming conventions are universally accepted, or are they just their own?  We've pretty much talked about all of these aspects, but haven't been using their names.

  • And I thought 2D spinning Lidar was expensive.  3D! 😳 
  • Undistortion - Another nail in the coffin for me using spinning ones.  Great if the bot isn't moving, but otherwise you have to know exactly where the sensor is pointing, where your bot is pointing, and where your bot is, all at the same time, to undistort the geometry.
  • Voxelization - This concept is new to me.  I hadn't thought about doing this, but it sounds exactly like what we do with the Finite Element Method in structural analysis (see the toy example after this list).
  • Scan Matching - Yep... that's the problem I see on movement... do I trust the stepper motor counts, or do I trust the scanner results telling me I'm looking at the same wall?  Probably some combination of both.
  • Loop Closure - Where our summing of stepper positioning has round-off and slippage error, and things don't match up once we've come full circle.
  • Re-localization - Yep... step 9, Test 2 above.
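Since voxelization was the new one for me, here it is boiled down to a toy (my own illustration, not the article's code): snap each point to a coarse 3D grid and keep the first point that lands in each cell.

#include <cmath>
#include <cstdint>
#include <unordered_set>
#include <vector>

struct Pt { float x, y, z; };  // meters

// Thin a point cloud to at most one point per cell-sized box.
std::vector<Pt> voxelize(const std::vector<Pt>& cloud, float cell) {
  std::unordered_set<uint64_t> seen;
  std::vector<Pt> out;
  for (const Pt& p : cloud) {
    // 16-bit cell indices are plenty at the VL53L5CX's 4 m range
    uint16_t ix = (uint16_t)(int32_t)std::floor(p.x / cell);
    uint16_t iy = (uint16_t)(int32_t)std::floor(p.y / cell);
    uint16_t iz = (uint16_t)(int32_t)std::floor(p.z / cell);
    uint64_t key = ((uint64_t)ix << 32) | ((uint64_t)iy << 16) | iz;
    if (seen.insert(key).second) out.push_back(p);  // first point claims the cell
  }
  return out;
}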

I have to admit, I haven't looked at SLAM before.  I was falsely assuming it was some big open-source code project that I didn't really want any part of.  Seeing now that it is an acronym and a concept, and that we've been discussing SLAM all along, is heartening.

Posted by: @robotbuilder

Your project is far too complex for me and will, I suspect, require some heavy-duty software and math.

I know from studying our forum, you have been doing this far longer than I and have tried out and researched things in depth that I'm just speculating on.  I also know this "vision" awareness of the environment is of interest to you...  am I making this more difficult than it is?  Do you know of a different path than software and Math that I should be looking into?  Seriously, it won't be the first time I've thrown lots of Math at something and someone comes along and says, "Have you considered <blank>?"  I may get all red-faced and cuss a blue streak, but I'll admit when I'm wrong and use a better can-opener.

VBR,

Inq

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
DaveE reacted
ReplyQuote
Inq
 Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  
Posted by: @valerio

so it's probably a characteristic of the SG90.

Well crap!  I was really hoping for a silver bullet.

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
ReplyQuote
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2043
 

@inq 

I notice Bill wrote an article on the MPU6050:

https://dronebotworkshop.com/mpu-6050-level/

I know from studying our forum, you have been doing this far longer than I and have tried out and researched things in depth that I'm just speculating on. I also know this "vision" awareness of the environment is of interest to you... am I making this more difficult than it is?

If you are making it more difficult than it is, it would be because you are not thinking in terms of goals.

Vision has been my primary interest for as long as I can remember.  My first foray into vision was with a security camera that used a vidicon tube.  I used frame-grabber hardware to transfer the data to an MS-DOS PC to experiment with visual processing.  However, I will not hijack your thread with the subject 🙂

 

 

 


   
ReplyQuote
Ron
 Ron
(@zander)
Father of a miniature Wookie
Joined: 3 years ago
Posts: 7074
 

@robotbuilder @Inq FYI I am 99% sure I helped somebody with some MPU6050 problems. There were some library issues and something else. If you go down that path, I will search the forum to refresh my memory.

First computer 1959. Retired from my own computer company 2004.
Hardware - Expert in 1401 and 360, fairly knowledgeable in PC plus numerous MPU's and MCU's
Major Languages - Machine language, 360 Macro Assembler, Intel Assembler, PL/I and PL1, Pascal, Basic, C plus numerous job control and scripting languages.
My personal scorecard is now 1 PC hardware fix (circa 1982), 1 open source fix (at age 82), and 2 zero day bugs in a major OS.


   
ReplyQuote
Page 12 / 16