
Resources for Programming the Me Arm?

40 Posts
5 Users
3 Likes
14.6 K Views
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
Topic starter  

I'm working with a Me Arm. I'm just starting to write a program to control it with an Arduino, and I'll also be using an SD card to store various positions that I can then easily recall.

I just thought that before I get too involved with starting from scratch it might be wise to ask if anyone else has already written a Me Arm position control program, or knows of any resources for this?

I don't want to control it with pots. I want to be able to control it by just giving it x, y, z coordinates and having my program convert those coordinates into the proper servo angles, then step from the current position to the new position in nice smooth steps.

As I was starting to work on this program last night I quickly realized that it's going to be somewhat of a monster to write.  I'll probably do well to brainstorm this for a while first and write up a general block diagram or flow chart for the program, and only then move on to start writing code.

Has anyone programmed the Me Arm? 

No point in me reinventing the wheel if someone already has a well established program.

DroneBot Workshop Robotics Engineer
James


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
Topic starter  

I did a quick brainstorming block diagram/flowchart. I figure the best place to start is with a bird's-eye overview of what I hope to accomplish. By the way, while I'm writing this for the Me Arm, I actually intend it to be used with other arms I build, possibly even some driven by stepper motors instead of servos.

Anyway here's my crude flowchart.  I'll explain below:

[Image: Me Arm flow chart]

To begin with, my idea is to have previously designed arm motions stored on an SD card. Each motion will consist of several positions that are executed sequentially. The motions will be stored as x, y, z coordinates relative to the center of the arm base. The program will then need to convert from x, y, z into either servo angles or stepper motor positions. For the first version of the program I'm going to focus solely on the servo angles; I included the stepper motor positions on the chart just to show that the program can grow into that capability as it evolves.

After converting the x, y, z coordinates into servo angles, the program then moves the arm from its current position to the new x, y, z position. This will also need to be done in a "smooth" manner. In other words, the program will need to calculate the best stepping sequence to move through the angles smoothly with tiny delays. I don't want the arm just jumping around as fast as the servos can go.

This "smooth" routine will also need to figure out how to move x, y, and z simultaneously. I've been thinking about this for some time, and it's going to require a bit of clever math, because the differences between the current and new coordinates won't all be the same. In other words, x may need to move through a very large angle while, say, z may not move much at all. So the actual calculation for how to move from one position to another will require some thought. I could move just one axis at a time, which would be far easier to program, but it might make for a more jerky motion.
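
Just to make that idea concrete, here's a rough little sketch (plain desktop C++, not my actual arm code) of what I mean by stepping all three coordinates together. The step count and the writeArm() stub are placeholders I made up; on the real thing that's where the angle conversion and servo writes would go.

// Minimal sketch of "smooth" simultaneous stepping between two x,y,z points.
// Plain C++ so it can be tried on a desktop first; the step count is an
// arbitrary assumption and writeArm() just prints where the real servo
// (or stepper) output would go.
#include <cstdio>

struct Point { double x, y, z; };

// Placeholder for the real output stage (angle conversion + servo writes).
void writeArm(const Point& p) {
    std::printf("move to  x=%.2f  y=%.2f  z=%.2f\n", p.x, p.y, p.z);
}

// Step all three coordinates together so they arrive at the same time,
// instead of moving one axis at a time.
void smoothMove(const Point& from, const Point& to, int steps) {
    for (int i = 1; i <= steps; ++i) {
        double t = static_cast<double>(i) / steps;   // fraction of the move, 0..1
        Point p = { from.x + (to.x - from.x) * t,
                    from.y + (to.y - from.y) * t,
                    from.z + (to.z - from.z) * t };
        writeArm(p);
        // on the Arduino a small delay() here would set the speed
    }
}

int main() {
    smoothMove({0, 100, 50}, {80, 60, 120}, 20);   // made-up coordinates
}

Because every axis is scaled by the same fraction, an axis with a big difference takes bigger steps while one that barely changes takes tiny ones, so they all finish together.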

Finally, after the arm has moved to the new position, the program loops back to see if the current SD file has more x, y, z data. If so, it keeps looping until it reaches the end of the file, at which point the arm is in its final position.

Note: I realize that this seems like overkill for a tiny Me Arm. But I'm only using the Me Arm as a prototype. I'm actually writing this program now for use later on a much larger and sturdier arm. In fact, in the end I hope to have two arms on the final robot that can interact with each other.

So this program is just a prototype "seed" for something much larger to evolve.

DroneBot Workshop Robotics Engineer
James


   
jscottbee
(@jscottbee)
Member
Joined: 5 years ago
Posts: 107
 

You seek Yoda...  Well, maybe not Yoda but inverse kinematics.

A bunch of "fun" math.


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
Topic starter  

Getting Serious with the Me Arm

I needed to organize the physical aspect of things before dealing with the programming. So today I cut a nice piece of plywood as a base for the Me Arm. It's 16" x 17", which covers the area the Me Arm can reach and also leaves room to mount a breadboard and an Arduino Mega. I'm using a Mega just because I have a bunch of them and I might need a lot of memory during prototyping.

Here's the front view of the setup. I have the breadboard and Mega just sitting in place for now. On the breadboard I have a PCA9685 I2C 16-channel servo controller. I'll only be controlling 4 servos, but again, I have a lot of these boards so I may as well use them; the final robot will be using them and may have even more servos. There is also the SD card reader on the breadboard. I'll need to wire all this up, but that should go quickly. Hopefully by tomorrow night I'll have it ready for programming.

Front view

[Image: Platform (1)]

Side view

[Image: Platform (4)]

This may seem like overkill for a Me Arm, but I want to get this program down pat. I'll be adding some little shelves at different heights within the reach of this arm. Then I'd like to see if I can program it to move little objects around, setting them in different places, etc.

I was playing with this a bit before I built this base and I noticed that sometimes the servos conflict with each other because of the mechanical design of the Me Arm.  So I'll want to study that as well and try to avoid having servos fighting each other.

So hopefully this will become a "serious" Me Arm programming project.

 

DroneBot Workshop Robotics Engineer
James


   
jscottbee
(@jscottbee)
Member
Joined: 5 years ago
Posts: 107
 

Nice layout.


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
Topic starter  

Progress Report:

Not a lot to report, but I put a lot of work into it. I got everything wired up, and I wrote some test code for both the SD card reader and the PCA9685 servo controller board. Everything is working. I still need to tweak the code for the servo controller, since it isn't calibrated properly just yet, but that's just a matter of adjustment. So far it's up and running and looking promising.
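
To give an idea of the sort of test code I mean (this is not the actual sketch I'm running, just a rough off-the-cuff illustration), something like the following would sweep one servo on the PCA9685 and list the files on the SD card. It assumes the Adafruit PWM Servo Driver library and the standard SD library, the card's chip select on pin 53 of the Mega, and uncalibrated pulse limits of 150 to 600 counts:

// Hypothetical combined test sketch (not my project code): sweeps channel 0
// on the PCA9685 and lists the SD card's root directory.  Assumes the
// Adafruit PWM Servo Driver library, the standard SD library, CS on pin 53
// (Mega), and rough, uncalibrated pulse limits of 150-600 counts at 50 Hz.
#include <Wire.h>
#include <SD.h>
#include <Adafruit_PWMServoDriver.h>

Adafruit_PWMServoDriver pwm;          // default I2C address 0x40
const int SD_CS = 53;                 // hardware SS pin on the Mega (assumed wiring)
const int SERVO_MIN = 150;            // counts near 0 degrees (needs calibration)
const int SERVO_MAX = 600;            // counts near 180 degrees (needs calibration)

void setup() {
  Serial.begin(9600);

  pwm.begin();
  pwm.setPWMFreq(50);                 // standard analog-servo update rate

  if (!SD.begin(SD_CS)) {
    Serial.println("SD card init failed");
  } else {
    File root = SD.open("/");
    File entry = root.openNextFile();
    while (entry) {                   // list files so we know the reader works
      Serial.println(entry.name());
      entry.close();
      entry = root.openNextFile();
    }
    root.close();
  }
}

void loop() {
  // sweep channel 0 back and forth as a basic servo test
  for (int pos = SERVO_MIN; pos <= SERVO_MAX; pos += 5) {
    pwm.setPWM(0, 0, pos);
    delay(20);
  }
  for (int pos = SERVO_MAX; pos >= SERVO_MIN; pos -= 5) {
    pwm.setPWM(0, 0, pos);
    delay(20);
  }
}

I'll post my real, calibrated sketches once they're ready.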

Here's another photo with everything labeled.

[Image: Me Arm (2), labeled]

And here's a close-up of the wiring and breadboard.

[Image: Me Arm (1)]

Request:

If anyone is interested in building this project, or something similar to it, that would be great. I'll be posting the wiring diagrams later along with all the code I develop. The idea is to see if I can program this Me Arm to do any serious work and use the SD card to store various tasks, so that I can just point to a file on the SD card to get the Me Arm to do things it has previously been programmed to do.

I plan on starting with extremely simple examples.   And I'll be posting those as I proceed.   I'll wait until I get the code calibrated before I post it.

DroneBot Workshop Robotics Engineer
James


   
Recycled Roadkill
(@recycled-roadkill)
Member
Joined: 5 years ago
Posts: 75
 

Here's an arm that's been printed up for about six months. It's mostly printed from files on Thingiverse; however, the claw rotates 360 degrees using a 28BYJ-48 stepper, and the jaw is a really nice one made by Makeblock, run by a DC motor and reversible with an H-bridge. I plan to run it using an Arduino Uno or Nano once I learn a bit more about creating sketches.

I'd planned on no real use for it, other than maybe lifting a marble to a ramp and picking it back up once the marble has run its course. Y'know, automation at its best.

I'm really not feeling too great about the design and would like to print one of those pups that look like pipes with elbows. I've got the time; I just need to apply it.

This message was approved by Recycled.Roadkill. May it find you in good health and humor.


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
Topic starter  
Posted by: Recycled Roadkill

Here's an arm that's been printed up for about six months.

 
Did you mean to include a photo or link and forget?   I didn't see the arm.
 
I'm hoping to build a larger, more versatile arm using the 28BYJ-48 stepper motors myself. That's another dream project that's waiting its turn to become reality. I'm just focusing on this Me Arm right now simply because I have one.
 
The Me Arm does have the major limitation of no wrist action.   In fact, I have some chemistry equipment here like a test-tube rack and some flasks etc.   I'll be using those in some of my experiments with this Me Arm.   And then I realized there would be no way to pour a liquid from one flask into another because of the lack of wrist action.  So yeah, the Me Arm will have big limitations in that regard.
 
Still, I think it will be a worthwhile experience to see how accurately it can be programmed. For example, just lifting a test tube out of one rack and placing it into another will be quite the challenge. The reason is that the test tube will need to be raised straight up to clear the rack before moving it anywhere. This will require coordinating two servos on the Me Arm: the two servos associated with the upper and lower arm movement. Getting that motion down pat, lifting something straight up rather than in the arc you'd get from moving only one servo, will take some work. And of course, learning how to solve this problem will carry over to other arms as well, probably including arms driven by stepper motors too.
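
To show what I mean by coordinating those two servos, here's a rough desktop C++ sketch (not my project code; the link lengths are made-up numbers and the angles would still need mapping to real servo positions). It holds the horizontal reach constant, steps the height, and solves the two-link triangle with the law of cosines at each waypoint:

// Hypothetical sketch of a straight-up lift with a two-link arm (not the
// project code).  Horizontal reach r stays fixed while height z increases;
// each waypoint is converted to shoulder/elbow angles with the law of
// cosines.  Link lengths L1, L2 are made-up numbers; angles are in degrees.
#include <cmath>
#include <cstdio>

const double L1 = 80.0;                    // shoulder-to-elbow length (assumed, mm)
const double L2 = 80.0;                    // elbow-to-wrist length (assumed, mm)
const double PI = std::acos(-1.0);
const double RAD2DEG = 180.0 / PI;

// Solve for the shoulder angle (from horizontal) and the elbow's interior
// angle so the wrist lands at (r, z).  Returns false if out of reach.
bool solve2Link(double r, double z, double& shoulderDeg, double& elbowDeg) {
    double d = std::sqrt(r * r + z * z);   // shoulder-to-wrist distance
    if (d > L1 + L2 || d < std::fabs(L1 - L2)) return false;

    double elbow    = std::acos((L1 * L1 + L2 * L2 - d * d) / (2 * L1 * L2));
    double alpha    = std::acos((L1 * L1 + d * d - L2 * L2) / (2 * L1 * d));
    double shoulder = std::atan2(z, r) + alpha;   // elevation plus triangle angle

    shoulderDeg = shoulder * RAD2DEG;
    elbowDeg    = elbow * RAD2DEG;
    return true;
}

int main() {
    double r = 100.0;                              // keep the reach constant
    for (double z = 20.0; z <= 80.0; z += 10.0) {  // raise the gripper straight up
        double s, e;
        if (solve2Link(r, z, s, e))
            std::printf("z=%5.1f  shoulder=%6.1f deg  elbow=%6.1f deg\n", z, s, e);
        else
            std::printf("z=%5.1f  out of reach\n", z);
    }
}

The point is that both angles change on every step of the lift, which is exactly why one servo alone can only give you an arc.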
 
In fact, this will be the first problem I'll be looking to solve as I get this thing up and running.
 
But yeah, it would be nice to have the wrist action so I could try pouring a fluid from one container into another.
 
Another experiment I thought of with the Me Arm is to see if I can get it to pick up a felt-tipped pen and draw letters on a piece of paper. That's within the capability of the Me Arm, and it will be interesting to see just how well (or poorly) it can perform that task.
 
In this thread I'll be working with extreme "beginner's" examples.  So this will be a good project to join for anyone who wants to get their feet wet with learning some basic ideas of how a robot arm might be programmed.
 
It will be a "serious" study of the Me Arm. But not necessarily a very serious study of robotic arms in general.
 
Hopefully, I'll have a better arm to work with in the future.
 
 

DroneBot Workshop Robotics Engineer
James


   
Recycled Roadkill
(@recycled-roadkill)
Member
Joined: 5 years ago
Posts: 75
 
[Images: 3, 4, RA1]

Yep, I thought I did, but this forum is somewhat different from the ones I've been on previously. Let's see if the pics come through this time.

Well, the pic with the Makeblock gripper wasn't that good: it shows everything but the gripper.

I'll get another one up later. The Makeblock gripper is much better than the one I'd printed out.


This message was approved by Recycled.Roadkill. May it find you in good health and humor.


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
Topic starter  

That arm appears to be a step up from the Me Arm.   I imagine that it would still be programmed in a very similar fashion.   You might be able to use some of the ideas I'll be posting for my Me Arm project.  You'll just need to change the actual output code to accommodate the drive requirements of your motors.

I'm going to begin by using servo angles from 0 to 180 degrees, though, and I'm not sure how that would be converted for driving a stepper motor. I guess you need to know how many steps your motor requires to turn one degree, and then just convert the degrees into the correct number of steps.
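
Something like this untested snippet is all I have in mind; it assumes roughly 2048 steps per output revolution for a 28BYJ-48 with its usual gearbox, although the real gearing varies a little from batch to batch:

// Rough sketch of converting a target angle to stepper steps (not project
// code).  Assumes ~2048 full steps per output revolution for a 28BYJ-48
// with its usual gearbox; real units vary slightly, so calibrate.
#include <cmath>
#include <cstdio>

const double STEPS_PER_REV = 2048.0;                // assumed for a 28BYJ-48
const double STEPS_PER_DEG = STEPS_PER_REV / 360.0; // about 5.69 steps per degree

// How many steps (and in which direction) to go from one angle to another.
long stepsBetween(double currentDeg, double targetDeg) {
    return std::lround((targetDeg - currentDeg) * STEPS_PER_DEG);
}

int main() {
    std::printf("0 -> 90 deg : %ld steps\n", stepsBetween(0.0, 90.0));   // ~512
    std::printf("90 -> 45 deg: %ld steps\n", stepsBetween(90.0, 45.0));  // ~-256
}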

DroneBot Workshop Robotics Engineer
James


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 

I had thought about buying the OWI arm to try out some proof-of-concept ideas, and I wrote a bit of experimental code to compute the shoulder and elbow rotation required to move the wrist to some point in 2D space. The third dimension, I imagined, would be the rotation of the base: just work out the amount of rotation needed to point at the object.

The lengths of the limbs are known (red lines). The distance from the shoulder joint to the wrist joint (blue line) is calculated from the mouse coordinates, giving an angle w. Using the known lengths of the sides of the triangle, the angles can be calculated: a1 = shoulder angle of rotation, a2 = elbow angle of rotation.
https://www.mathsisfun.com/algebra/trig-solving-sss-triangles.html

As I move the mouse around, its position (mx, my) is the starting point to compute the required angles of rotation to move the wrist to the mouse location (the object).

The demo code is written in FreeBASIC because it has easy-to-use graphics commands.

'some useful defines
Const Pi = 4 * Atn(1)
Dim Shared As Double TwoPi = 8 * Atn(1)
Dim Shared As Double RtoD = 180 / Pi   ' radians * RtoD = degrees
Dim Shared As Double DtoR = Pi / 180   ' degrees * DtoR = radians

const SCRW = 1280
const SCRH = 600
screenres SCRW,SCRH,32

dim as integer mx,my

dim as single a,b,c    'sides
dim as single a1,b1,c1 'opposite angles

dim as single px0,py0  'position of shoulder
dim as single px1,py1  'position of elbow
dim as single px2,py2  'position of wrist

dim as single ww    'angle from shoulder to object
dim as single L     'distance from shoulder to object

dim as single nx,ny  'working variables
dim as single x,y    'position of wrist 

dim as integer cx,cy   'displacement on screen
cx = SCRW\2
cy = SCRH\2


a=300  'length from shoulder to elbow
b=340  'length from elbow to wrist
c=1200

px0 = 0
py0 = 0

do
    getmouse mx,my    'position of object

    mx = mx - SCRW\2
    my = my - SCRH\2
    x = mx
    y = my

    c = sqr(x^2+y^2)


    'get angles a1,c1
    a1 = (b^2 + c^2 - a^2) / (2*b*c)
    b1 = (c^2 + a^2 - b^2) / (2*c*a)
    c1 = (a^2 + b^2 - c^2) / (2*a*b)
    a1 = acos(a1)*RtoD
    b1 = acos(b1)*RtoD
    c1 = acos(c1)*RtoD
    ww = atan2(my-py0,mx-px0)*RtoD
    L = sqr(mx^2+my^2)
    
    a1 = a1 + ww

    nx = px0 + cos(a1 * DtoR) * b
    ny = py0 + sin(a1 * DtoR) * b
    px1 = nx
    py1 = ny

    nx = px1 + cos((c1+a1-180) * DtoR) * a
    ny = py1 + sin((c1+a1-180) * DtoR) * a
    px2 = nx
    py2 = ny
    
    
    screenlock
    cls
    'draw xy coordinates
    line (0,SCRH\2)-(SCRW-1,SCRH\2),rgb(200,200,255)
    line (SCRW\2,0)-(SCRW\2,SCRH-1),rgb(200,200,255)

    line (px0+cx,py0+cy)-(px2+cx,py2 +cy),rgb(0,0,255) 'shoulder to wrist
    line (px0+cx,py0+cy)-(px1+cx,py1+cy),rgb(255,0,0)  'shoulder to elbow
    line (px1+cx,py1+cy)-(px2+cx,py2+cy),rgb(255,0,0)  'elbow to wrist
    screenunlock
    
    sleep 2
loop until multikey(&H01)

 
[Image: computeAngles]

   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
Topic starter  
@casey
 
I went and downloaded and installed FreeBASIC. I then copied the source code you posted and compiled it. The resulting program appears quite different from the graphic you displayed in your post.
 
I just see a black screen with a few lines on it. They do follow my mouse around, but there's no text on the screen or any numbers indicating angles. Are you sure you posted the correct version of your source code?
 
Here's what I see when I run your program.
 
[Image: Angles]

DroneBot Workshop Robotics Engineer
James


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 

I had trouble finding this post again in the forum; I had to go to my Google history.

The posted image was a modified version of the program's output captured and redone using Paint to make it clear what the lines represented. Perhaps I shouldn't post any code until I can get up to speed with Python and code everything in that.

Regardless of the language you use, the important part is how it works. First you compute the length and angle of the line between the shoulder and the wrist of the robot arm, and from that you use the "Law of Cosines", explained in the link given, to calculate the angles.

Over the weekend I purchased a Pi NoIR Camera V2 with the idea of using it with Raspberry Pi based projects. It might turn out to be all too hard. With FreeBASIC or C++ I can write code to capture images from multiple webcams or any built-in webcam; however, that relies on escapi.dll, which requires the Windows operating system and will not work on my Linux Raspberry Pi.

https://sol.gfxile.net/escapi/index.html

The reason I mention this is that you have the option of using visual feedback to guide your robotic arm to its target (which might have moved or be in any orientation). I tried that with the old version of the toy OWI arm and a webcam on its elbow looking down at the gripper. I had a set of coloured wooden blocks just the right size for the gripper to grab and hold. However, there were issues with using a toy robot arm, and the newer OWI arm looks even weaker and less capable.

 

 


   
Robo Pi
(@robo-pi)
Robotics Engineer
Joined: 5 years ago
Posts: 1669
Topic starter  

Well, I must confess that you had me all excited. You had that tiny bit of FreeBASIC code and such a nice graphic that I quickly downloaded FreeBASIC thinking it must have some powerful GUI code! I thought that was too good to be true. Still, even the graphics it does produce aren't too shabby considering the small source code footprint.

In any case, getting back to programming a robotic arm: my goal with this project is extremely elementary, although perhaps not all that simple. I'm not concerned with sensory feedback of any kind in this project at this time. The goal is simply to master control of the Me Arm and be able to save motion sequences on the SD card for recall later.

I'm also looking at being able to save complex movements with an extremely small data set, maybe as few as a dozen text entries stored on the SD card as a text file. For example, a typical movement might be a text file that looks like this:

x0,y0,z0,cn
x1,y1,z1,cn
x2,y2,z2,cn
x3,y3,z3,cn
x4,y4,z4,c45
x5,y5,z5,cn
x6,y6,z6,c0
x7,y7,z7,cn
x8,y8,z8,cn
x9,y9,z9,cn
x10,y10,z10,c45

The x, y, z coordinates are various points along the path the arm is expected to follow. It's not necessary to call out every single step between those points, as the program will take care of providing "smooth motion" between these key points. The key points are required to make sure the arm moves in a very specific way. We can't have it just going straight from point A to point B; it might knock something over or hit something between the start and destination points. We might also need a specific motion, like lifting something straight up rather than just lifting it in an arc due to the arm's natural motion.

The cn entry means the claw does nothing during that step. c45 opens the claw to 45 degrees, and c0 closes it all the way. Of course, other values for the claw angle are also available.
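
Since each entry is just comma-separated text, parsing a line should be simple. Here's a quick desktop C++ sketch of one way a line could be read; this isn't my actual parser, just an illustration of the format above with made-up numbers:

// Quick sketch of parsing one "x,y,z,claw" line in the format described
// above (not the project's actual parser).  "cn" means leave the claw
// alone; "c45", "c0", etc. give a claw angle in degrees.
#include <cstdio>
#include <cstdlib>
#include <cstring>

struct Waypoint {
    double x, y, z;
    bool   moveClaw;    // false when the entry is "cn"
    double clawDeg;     // valid only when moveClaw is true
};

bool parseLine(const char* line, Waypoint& wp) {
    char claw[16] = {0};
    if (std::sscanf(line, "%lf,%lf,%lf,%15s", &wp.x, &wp.y, &wp.z, claw) != 4)
        return false;
    if (std::strcmp(claw, "cn") == 0) {
        wp.moveClaw = false;
        wp.clawDeg  = 0.0;
    } else if (claw[0] == 'c') {
        wp.moveClaw = true;
        wp.clawDeg  = std::atof(claw + 1);
    } else {
        return false;
    }
    return true;
}

int main() {
    Waypoint wp;
    if (parseLine("120.5,40.0,75.0,c45", wp))   // made-up example entry
        std::printf("x=%.1f y=%.1f z=%.1f claw=%s %.0f\n",
                    wp.x, wp.y, wp.z,
                    wp.moveClaw ? "open to" : "unchanged", wp.clawDeg);
}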

In my above example here's what happens.

From x0 through x4 the arm moves from the last known position, or possibly a "home" position, to the position described by the data at x4. At this position the claw is opened with c45 in preparation to grab something.

Coordinates x5 and x6 move the arm to put the claw in position to grasp the object. Then c0 closes the claw to grab it.

Coordinates x7 through x10 then guide the arm to the final destination, and c45 releases the object.

After this the arm may be instructed to return to a "home" position, or something.

Anyway, this is the idea I currently have for this. For now I'll be figuring out all of these important intermediate coordinates by hand, typing them into a text file using Notepad, and then having the robot carry out the task to see how well the arm can repeat it.

Only after I have these basics down will I be prepared to move on to trying to have sensors provide the robot with a list of important intermediate coordinates to navigate the arm through a complex path.

So at this point you could say that I'm just working at the stage of getting the mechanical motion of the arm down pat.  I'm not worried about having an intelligent arm at this point.  That will come only after this first stage of controlling the arm is completed.  

And yes trying to figure out how to get sensors or a camera to provide the important intermediate coordinates of a path will be quite the challenge.   And I would love to be working at that level.  But I need to have a good method for moving the arm before I move on to that stage of development.  So that's what this project is focused on right now.

 

 

 

DroneBot Workshop Robotics Engineer
James


   
robotBuilder
(@robotbuilder)
Member
Joined: 5 years ago
Posts: 2037
 

I was responding to your earlier statement:
"I want to be able to control it by just giving it x,y z coordinates and having my program convert those x, y, z, coordinates into the proper servo angles."

I took that to mean you want a function that takes x, y, z as its inputs and outputs the joint angles the arm would have to have for the end of the arm to be at that x, y, z?

In the FreeBASIC example the desired x, y position was chosen with the mouse, and the program was proof that the computed shoulder and elbow angles would put the end of the arm at that point using the math given.

https://github.com/R2D2-2017/R2D2-2017/wiki/%5BROBOARM%5D-Forward-and--Inverse-kinematics

Usually a data sequence of joint positions is obtained by holding the end of the arm and moving it with your hand. You could build a duplicate arm with pots instead of motors and move that about while the changes in the pot values are recorded.
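
The recording side of that could be very simple. Something like this untested Arduino-style sketch would do it: one pot per joint, sampled and printed as CSV over serial so the poses can be captured to a file (the pin choices and the half-second interval are arbitrary).

// Untested sketch of the "teach arm" idea above: a duplicate arm fitted with
// pots instead of servos, one pot per joint, sampled and printed as CSV so
// the values can be captured over serial.  Pin numbers and the 500 ms
// sample interval are arbitrary choices.
const int potPins[4] = {A0, A1, A2, A3};   // base, shoulder, elbow, claw

void setup() {
  Serial.begin(9600);
}

void loop() {
  for (int i = 0; i < 4; i++) {
    int raw = analogRead(potPins[i]);        // 0..1023
    int deg = map(raw, 0, 1023, 0, 180);     // rough angle estimate
    Serial.print(deg);
    Serial.print(i < 3 ? "," : "\n");
  }
  delay(500);                                // one recorded pose every half second
}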

I noticed the Me Arm appears to keep the claw parallel to the ground, so it should be able to pick up a test tube.

 

 


   