InqEgg - One step forward, two steps back...

207 Posts
6 Users
72 Reactions
9,617 Views
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

@zander,

The whole CF of the US trying to convert between Imperial/US units and Metric.  I've preferred Metric since the early 80's (in college).  Remember the Hubble Telescope fiasco, when it turned out near sighted?  Someone missed a decimal place because he was forced to use metric... and also forgot to account for gravity during the grinding of the main mirror.

As a side note - I was part of the team that did the corrective package for that near sightedness.  See what pretty pictures we've gotten since? 😉 😊 

3 lines of code = InqPortal = Complete IoT, App, Web Server w/ GUI Admin Client, WiFi Manager, Drag & Drop File Manager, OTA, Performance Metrics, Web Socket Comms, Easy App API, All running on ESP8266...
Even usable on ESP-01S - Quickest Start Guide


   
DaveE reacted
ReplyQuote
Ron
(@zander)
Father of a miniature Wookie
Joined: 4 years ago
Posts: 7283
 

@will Yes, recipes are a big problem unless they are very clear about which measurement system they are using. For a while the grocery stores reversed the price-per-weight units (one larger than the other) before eventually going back to metric, which is where we are today. BUT a 2x4 stud or a sheet of 4x8 plywood will never be converted to metric.

First computer 1959. Retired from my own computer company 2004.
Hardware - Expert in 1401 and 360, fairly knowledgeable in PCs, plus numerous MPUs and MCUs
Major Languages - Machine language, 360 Macro Assembler, Intel Assembler, PL/I and PL1, Pascal, Basic, C plus numerous job control and scripting languages.
My personal scorecard is now 1 PC hardware fix (circa 1982), 1 open source fix (at age 82), and 2 zero day bugs in a major OS.


   
Ron
(@zander)
Father of a miniature Wookie
Joined: 4 years ago
Posts: 7283
 

@inq Near sighted? Yes, and we had the Gimli airport near-disaster when an airplane was fuelled in liters instead of Imperial gallons and had to make an emergency landing when it ran out of fuel. The airport was a WWII training center, long since abandoned, except that a bunch of folks happened to be there that day for some reason I don't remember. They got a big surprise, both on the ground and in the cockpit.



   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

Posted by: @zander

BUT a 2x4 stud or sheet of 4x8 plywood will never be converted to metric.

Are you kidding???  Most of the plywood we get now is some weird dimension... since it was actually made somewhere else in the world.  We NEVER used fractions smaller than 1/8 inch for house construction... now we get...

image

... of course that's because if you tell some good-old-boy around here to use metric... he'll just show you the finger he measures with.

😘



   
Will
(@will)
Member
Joined: 3 years ago
Posts: 2548
 

@inq 

So what, even your 2x4s aren't really 2 by 4 any more 🙂

Anything seems possible when you don't know what you're talking about.


   
Ron
(@zander)
Father of a miniature Wookie
Joined: 4 years ago
Posts: 7283
 

@inq Plywood thickness is the exception. The weird plywood like 5 ft x 5 ft is the highest-quality 3/4 ply, used mostly for jigs. The best is called Baltic Birch, but it often comes from Russia, which is lower quality.



   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

Messing with DRV8825 Stepper Drivers

I got these DRV8825 stepper drivers in a while back.  I haven't put them in until now because my motherboard, using the A4988 drivers, was working fine and I didn't want to be debugging my software and any issues with the DRV8825 drivers at the same time.

Before installing them into the InqEgg motherboard, I wanted to try them out on a breadboard and do a little research.  As @thephilnewman reported that his burned out and also took out his MPU, I thought I'd err on the side of caution.  I found an outstanding YT on the subject by Pololu.  I'm glad I did, as there were some decided differences compared to using A4988 stepper drivers.  The YT also highlighted some of the things @davee taught us about the relative current applied to the coils depending on stepping versus micro-stepping settings.  It also went into why a bench power supply shows far less current than the limit we might be setting with the driver's potentiometer.  Here are the differences I noted:

  1. The DRV8825 IS pin-compatible with the A4988 and other such boards, as they are meant to be replacements in 3D printers and other stepper-based products.  However, one must be careful identifying the pins.  The labels for the pins are underneath.  I also noted that on my clone DRV8825 boards the potentiometer is on the opposite end of the board from where it is on the A4988.  Carelessly assuming the pot was some identifier of pin locations would clearly destroy components.  
  2. The Vref calculation is different.  It is simply Vref (in Volts) = Desired Current Limit (in Amps) / 2.0.
  3. The DRV8825 actually has its own buck converter deriving its logic voltage from the stepper supply voltage.  I confirmed this by setting up the VDD/GND coming from an ESP8266 WeMos as I would on the A4988.  No Vref voltage could be seen until I also hooked up the stepper motor supply (8.2V to 45V is allowed).  The Vref could then be observed.
  4. This is a disadvantage, as the stepper power must be applied during tuning; the steppers must be disconnected or damage can occur.  The A4988 didn't need stepper voltage and thus could be tuned with steppers attached.
  5. Tuning the Vref was critical!  At least on my boards, straight out of the wrappers the Vref read above 1.6V.  IOW, if I had hooked up a stepper motor, the driver would have been trying to push over 3.2A!!!  It would likely have damaged the stepper motor AND certainly blown the DRV8825.  It seems under-handed that a manufacturer would intentionally ship damaging defaults.  At the very least it assures return customers. 🤨 
  6. For InqEgg, I have a custom stepper driver that shifts gears by changing from micro-stepping to full-step mode while driving.  I have all three mode pins activating together at this gear change.  On the A4988, this equates to 1/16th micro-stepping.  On the DRV8825, it equates to 1/32nd micro-stepping.  Changing a constant in the software compensates for this.
  7. Benefits over the A4988:
    1. Supposed to handle up to 2.2A per coil versus 1.0A
    2. The Vref potentiometer was far easier to calibrate than on the A4988.
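The Vref relation in item 2 is easy to sanity-check before touching the pot.  A minimal sketch (plain C++; the function name is mine, not from the Pololu docs):

```cpp
// DRV8825: Current Limit (A) = Vref (V) * 2, so Vref = I / 2.
// (For comparison, A4988 boards use Vref = I * 8 * Rsense, where
// the sense-resistor value varies between clones -- check yours.)
double drv8825_vref(double desiredCurrentA)
{
    return desiredCurrentA / 2.0;
}
```

So for a stepper rated 1.5A per coil, tune the pot until the Vref test point reads 0.75V.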

Although I don't need high current on InqEgg, in later projects I may need that extra oomph.  I do like this last advantage though.  Basically there is a chain: Vref => Current Limit => Torque of the stepper motor.  I only need so much torque to accelerate this little bot.  Too little Vref (torque) and the stepper motors will miss steps under higher accelerations.  Too much Vref and I'm just burning lots of current for no reason! 

Thus, I'll be carefully tuning these Vref values to get just enough torque to avoid missed steps while minimizing my current demands, thus:

  1. Reducing heat
  2. Minimizing the wire gauge I need for the umbilical cord while training
  3. Prolonging battery run life after training is complete!

 

Tuning on separate breadboard

PXL 20230911 151734164

InqEgg with new drivers.  Like souping up your modern car with tuner chips instead of the Holley 4-barrel.

PXL 20230911 152632734

 



   
DaveE reacted
Ron
(@zander)
Father of a miniature Wookie
Joined: 4 years ago
Posts: 7283
 

@inq Argh, I printed 3 pieces but didn't catch the fact that I need 2 of the middle part. Now I have to print one more middle part.



   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

InqEgg - "If I only had a brain" - Scarecrow, Wizard of Oz

Another case of one step forward, two steps back.  

I started working with what I'd hoped would be the first brain for InqEgg.  I wanted to use a Back Propagation version of Artificial Neural Networks (ANN).  

A little background - There appear to be two main contenders for a learning-type ANN, at least that I've found so far.  The Back Propagation (BP) version initially seemed to me a lot easier/faster to implement than the alternative, the Genetic Algorithm (GA) version like @thrandell is working on.  Back Propagation uses Calculus to determine how to adjust the huge array that represents the Neural Network.  This is mathematically defined.  GA, on the other hand, uses a technique more reminiscent of evolution and Darwin's theories.  It seemed to me a little too willy-nilly.

I've studied several different articles and YTs.  Two that contributed most to my understanding of BP are:

This is a series: 

This is an 8 minute YT that gives the basics AND source: 

Both of these vids use the "Hello World" problem for ANNs: character recognition on a database of hand-written numerals.  I've taken the last one's code and got it running on a Windows PC.  It runs through the learning database and tests against a second database of digits written by different people.  Both vids achieve around 96% reliability against the databases.  

Next... I moved the code base to the Arduino IDE.  I made some changes just to get it to work.  The basic engine is unchanged; I removed the pieces that weren't really applicable to Arduino projects like robot brains. 

Most of the limitations were in the size of the problem a microprocessor can handle.  I'm using an ESP8266.  Arduinos need not apply; it's just not going to work on one.  I doubt even an RP2040 has enough RAM and Flash space.  An ESP32 would do much better.

  1. The mnist database for learning is 107MB.  The test one is almost 18MB.  I used these in their entirety on the PC, but on the micro I got them down to 2.7MB and 180KB respectively, so that they could fit in the 3MB of Flash disk space.
  2. The numeric digits are 28x28 pixels; pretty coarse as it is.  On the PC this expands out to an array of 784x300 floating-point values (~1MB).  During the calculations, total usage more than doubles this.  The ESP8266 only has about 50KB of RAM to work with.  I took every other pixel in the two databases for a paltry image size of 14x14 pixels. 
  3. The hidden layer is 300 elements on the PC.  I was only able to use 12!  Anything larger and the memory is used up and it crashes.
  4. The good news in all of this... a training session for one digit only required 18ms.  A robot motion / obstacle-avoidance brain would need fewer than 20 input elements instead of the ~200 I used here.  At 20 elements, one training pass takes only 2ms.  This is well within the time slices needed to learn while the bot is moving and experiencing its environment.  IOW, there is no need to send data to a PC to offload the grunt work.
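The every-other-pixel reduction in item 2 is trivial to implement.  A sketch (plain C++, not the actual code from the zip); averaging 2x2 blocks would preserve more detail at the same size:

```cpp
#include <cstdint>
#include <vector>

// Shrink a w x h grayscale image (e.g. a 28x28 mnist digit) to
// (w/2) x (h/2) by keeping every other pixel in each direction --
// the trick used to squeeze the databases into the ESP8266's RAM.
std::vector<uint8_t> downsample(const std::vector<uint8_t>& src,
                                int w, int h)
{
    std::vector<uint8_t> dst;
    dst.reserve((w / 2) * (h / 2));
    for (int y = 0; y < h; y += 2)
        for (int x = 0; x < w; x += 2)
            dst.push_back(src[y * w + x]);
    return dst;
}
```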

Even with all these limitations, I was able to get it to run.  And lo and behold, it got up to a remarkable 85% success rate.

Now for the bad news! - There is one fundamental difference between the BP and GA versions that I didn't discover until I started molding the above engine to InqEgg.  BP requires you to give it the answer during training.  This is next to impossible for a robot reading, say, 20 sensors with only two outputs... one for each motor.  To give it an answer for its BP learning, I'd have to supply the exact speed each motor needs.  

GA evolves a solution; all you have to tell it is how well it's doing.  Per the ACAA study, they just encourage it to go fast, not twirl around, and stay away from stuff.  Time to dig into GA!
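That "go fast, don't twirl, stay away from stuff" encouragement can be written as a per-time-slice fitness.  The specific form below, V·(1−√Δv)·(1−i), is borrowed from the Floreano/Mondada evolutionary-robotics experiments and is my assumption about what the ACAA paper uses:

```cpp
#include <algorithm>
#include <cmath>

// One-time-slice GA fitness: reward average wheel speed (V),
// penalize spinning (sqrt of the wheel-speed difference) and
// proximity to obstacles (i = highest sensor activation, 0..1).
// All inputs are assumed normalized to -1..1 (speeds) and 0..1 (i).
double fitness(double leftSpeed, double rightSpeed, double maxSensor)
{
    double V  = (std::fabs(leftSpeed) + std::fabs(rightSpeed)) / 2.0;
    double dv = std::min(std::fabs(leftSpeed - rightSpeed), 1.0);
    double i  = std::clamp(maxSensor, 0.0, 1.0);
    return V * (1.0 - std::sqrt(dv)) * (1.0 - i);
}
```

Summing this over a trial run gives one genome's score; the GA then breeds the top scorers into the next generation.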

But someone here might be interested in a BP solution if their problem is more like reading digits, where you have the answers you want the brain to learn.  So here is a working Sketch with all the ANN libraries included.  I've also included the pared-down mnist database needed.  As mentioned, it works on an ESP8266 and will probably work with minor changes on an ESP32.  Arduinos simply don't have the power or flash space.  But I'd be interested if anyone gets it running on anything else.  Here is the Sketch/Library/Database I've modified.  Again... the library started from the second vid above.



   
robotBuilder and DaveE reacted
Ron
(@zander)
Father of a miniature Wookie
Joined: 4 years ago
Posts: 7283
 

@inq I am afraid to ask these questions and if you tell me to go away or similar I understand. 

1. Why can't you use the free character recognition libraries I ASSUME are available?

2. If that much resource is needed just for character recognition, how in blue blazes is your 'brain' going to know what to do when a life form enters the path of your 4-ton vehicle travelling at 50 km/hr?

Sorry, I had to ask even at the risk of being totally ridiculed.



   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

Posted by: @zander

@inq I am afraid to ask these questions and if you tell me to go away or similar I understand. 

1. Why can't you use the free character recognition libraries I ASSUME are available?

2. If that much resource is needed just for character recognition, how in blue blazes is your 'brain' going to know what to do when a life form enters the path of your 4-ton vehicle travelling at 50 km/hr?

Sorry, I had to ask even at the risk of being totally ridiculed.

No ridicule.  

  1. It's been a long time since you or I did a "Hello World" program.  The digit recognition is just a well-characterized test for an ANN.  It's mainly to check that the algorithm is actually working.  In this case, the algorithm that the kid in the vid created works very well and is really quite simple.  The end point isn't to recognize characters... it's to create the generalized algorithm.  The same engine can be used for many other things.  The code I supplied in the zip file is pretty full-featured: it sets up the file system, loads the databases, and saves the learned ANN.  The pseudo-code below uses the same library, but configures it to run an obstacle-avoidance robot.  It doesn't include the learning side, but this is all that would be needed for the runtime side.  Simply... set up all the pins for the hardware and, for every time slice, read the sensors, run them through the ANN predictor, and apply the result to the motors.  
#include "nn.h"
#include "matrix.h"

const u16 SENSORS = 8;
const u16 HIDDEN = 6;
const u16 MOTORS = 2;
    
NeuralNetwork* _ann;

void setup() 
{
    // The Neural Network.  The weights matrices are generated
    // with random data.  
    _ann = network_create(SENSORS, HIDDEN, MOTORS, 0.1);

    // Set up all the pins to control 2 PWM motors and to
    // access, say, 8 infrared sensors (IRA[] and motor[] below
    // are placeholders for that hardware).
        
}

void loop() 
{
    Matrix* input = matrix_create(SENSORS, 1);
    for (int i=0; i<SENSORS; i++)
        input->entries[i][0] = IRA[i].distance_reading();
    Matrix* prediction = network_predict(_ann, input);
    motor[0].speed = prediction->entries[0][0];
    motor[1].speed = prediction->entries[1][0];
    matrix_free(prediction);        
    matrix_free(input);                
}

 

As for your second question, I won't be doing any large vehicles.  I'll leave that homework assignment for Elon.  My point was that even a lowly ESP8266 can easily handle something on the order of a hundred inputs to control various things.  Driving InqEgg around avoiding obstacles should eventually be a cakewalk for the engine once I figure out how to train it.

Remember, this project is to document my learning process... I don't know all the answers yet.

 

 

 



   
Inst-Tech reacted
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2045
 

@inq 

BP requires you to give it the answer during the training of the engine. This is next to impossible for a robot reading say 20 sensors but it only has two outputs... one for each motor. To give it an answer for its BP learning, I'd have to give the exact speed each motor needs.

One way might be to provide an answer (output pattern) for each sensor input pattern by manually driving it around; it can then generate an error signal between your selected motor action and its ANN-selected motor action for a particular sensor input pattern. Learning by example.
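That learning-by-example scheme boils down to logging (sensor pattern, operator motor command) pairs during the manual drive; each pair later becomes an (input, target) example for the BP trainer.  A sketch, with sizes and names invented for illustration:

```cpp
#include <array>
#include <vector>

const int SENSORS = 8;  // illustrative; the bot may have up to ~20

// One supervised example: what the bot sensed, and what the human
// operator commanded the motors to do at that instant.
struct TrainingExample {
    std::array<double, SENSORS> sensors;  // normalized 0..1 readings
    std::array<double, 2> motors;         // operator's L/R speed commands
};

std::vector<TrainingExample> log_;

// Called once per time slice while the operator drives by RC.
void recordTimeSlice(const std::array<double, SENSORS>& s,
                     double leftCmd, double rightCmd)
{
    log_.push_back({s, {leftCmd, rightCmd}});
}
```

After the drive, each logged pair feeds the BP trainer as (input, target), exactly as the mnist (image, digit) pairs do.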

In my simulated robot I used a line-following algorithm to replace the manual control; that might work for a physical robot as well.

 


   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

Posted by: @robotbuilder

One way might be to provide an answer (output pattern) for each sensor input pattern by manually driving it around; it can then generate an error signal between your selected motor action and its ANN-selected motor action for a particular sensor input pattern. Learning by example.

I mentioned doing that in @thrandell thread: 

Posted by: @inq

However... I'm still playing around with the Back Propagation (BP) solution.  I'm thinking that I can set up BP on the bot and use it in real time.  I can manually drive the bot around by RC while the bot fetches both the inputs from the sensors and the settings I am manually applying to the motors in real time, feed them through BP, and potentially pre-learn the weights/biases.  This might get it over the initial awkward spinning-in-circles and head-banging phases.  I'll obviously compare it to the randomly initialized GA version.

The above sample Sketch/Library/Database is far more complex than what's needed for an 8-sensor (or even 20-sensor) bot, which can learn via BP at each time slice.  The ACAA paper used 300ms time slices, and I'll probably use the 100ms time slices I can achieve with the ToF sensor.  That gives plenty of time to do the 2ms Back Propagation.

Although the Sketch proves it can be done, and I may still try it out, I believe it will only become a recording of my driving skills.  My RC thumb skills couldn't hold a candle to a Gen-X or Z video game junkie's.  If anything, I think my driving may confuse the issue.  A simple example will illustrate my concern.  A bot moving straight with nothing in its vision range should continue straight and only veer to the right when it sees the box, yet I (as the RC driver) may veer left before it even sees the box.  The BP wouldn't know why I did what I did.  

image

I think for academic purity... I really need to start from the totally random genome.  It's time to start coding a Genetic Algorithm version.  Who knows... maybe some hybrid of GA and BP can be realized. 



   
Inst-Tech reacted
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2045
 

@inq 

I mentioned doing that in @thrandell thread:

Oops I missed that, sorry.

Your reasoning for not manually controlling the robot is sound. The I/O relationship is fixed: it is a simple reflex machine determined by the fixed weights. For any possible current input there can be only one possible output.

If the robot is to run about making a map, it will need a memory and some way to represent that map in the neural net. In other words, to make it more than a rigid, fixed, and limited reflex machine, it would require inputs from past events and changing goal states.

netWithMemory

   
Inq
(@inq)
Member
Joined: 2 years ago
Posts: 1900
Topic starter  

Posted by: @robotbuilder

Oops I missed that, sorry.

No need to be... it just means we're getting on the same page some of the time.  Except...

Posted by: @robotbuilder

If the robot is to run about making a map it will need a memory and some way to represent that map in the neural net. In other words to make it more than a rigid fixed and limited reflex machine it would require inputs from past events and changing goal states.

I'm not sure we're on the same page about this part.  From what I can see right now, the ANN's weights/biases (the Genome) take approximately (# of sensors)^2 * four bytes (give or take).  So if I throw a whole slew of sensors at the thing... say 20, I get 16KB.  That's easy enough to keep in core memory during trial runs of each Genome or after training.  During training, I'm taking a wild guess it might take a second or two to generate a new generation of 80 genomes.  Using the same numbers as @thrandell and the ACAA paper, that'd be 80 x 16KB = 1.3MB.  Plenty of space in the 3MB of flash memory.  However, I could probably generate one Genome at a time and then run it.  Just to see trends, I'd probably want to send all the Genomes to a base station to keep a record and chart how it's doing.  But technically it wouldn't be necessary, as it reproduces toward a solution.

Now... if you're using "map" the same way I mean, mapping a house/building... I see that as coming entirely after it has learned how to get around.  The learned ANN wouldn't know, or need to know, anything about the building.  It merely knows how to move around, avoid obstacles, and follow walls to gather data.  

Although things are likely to change as I learn more, I currently think that setting goals, storing the data, and 3D-mapping the building (so it knows when it has come full circle and returned to its starting place) needs far more memory and compute power than a micro has.  If the rooms are bigger than the ToF sensor range, then it'll need to aim the ANN bot brain toward the interior spaces of rooms to map obstructions within them.  That is likely up in the multi-gigabyte range.  At first that may merely mean sending data to a PC, or it might involve, say, a Raspberry Pi Zero W with a large SD card if I wanted to make the whole thing self-contained.  Which I would want to do eventually.  

But then again, maybe you have some other meaning or understanding that I'm not at yet.  Please elaborate if you see something I'm missing.  I'm a little dumbfounded that I missed the fundamental requirement of Back Propagation needing a hard answer that I won't have a way of creating.



   
DaveE reacted