ROS on Raspberry PI 3B+ progress report.

Page 5 / 7

Trusted Member
Joined: 10 months ago
Posts: 64
2019-09-26 4:36 pm  


Do you want me to send you a ready-installed image on a 128 GB SD card? I can do that by post. Not an issue.

Honorable Member
Joined: 10 months ago
Posts: 668
2019-09-26 7:49 pm  


Thank You !

I may actually take you up on that... 

As soon as I fail, which is one of the very few things I feel exceedingly qualified to do where ROS is concerned.

However, I'm not at that point yet (yet... wait for it, I'm sure it's coming). Right now I'm at the point where I'm skipping the pages with the software for the $500 camera, or the much cheaper $450 camera.

Obviously the fact that I was able to blow $100 on the Jetson means to these people that I can easily part with another $500 for a camera

I do have one of those Xbox 360 3D scanner things (Kinect) that I bought on eBay for $20 so I could scan images of people to make statues to print on my 3D printer, and I also bought the PiCam v2.0. I'm already using the PiCam on the JetBot, and it sorta works (except for the part about "free" and "blocked", which I'm having a bit of an issue with)

What are you using for vision ?

I'm about to start looking for motor controller packages (if that's even a thing)

Honorable Member
Joined: 10 months ago
Posts: 668
2019-09-27 2:04 am  


Okay. It didn't fail, but the PiCam doesn't come on unless I issue an instruction from the desktop, and the OLED doesn't come on at all

Obviously I need to make something happen at boot time

I found a script for it, but it looks like it's written for the JetBot

import argparse
import getpass
import os

# The unit-file text got mangled when I pasted it here, so the template
# below is my reconstruction of the obvious parts (the [Unit]/[Service]
# framing and the User/HOME placeholders are assumed from the two %s
# substitutions in get_stats_service below)
STATS_SERVICE_TEMPLATE = """
[Unit]
Description=JetBot stats display service

[Service]
User=%s
Environment=HOME=%s
ExecStart=/bin/sh -c "python3 -m jetbot.apps.stats"

[Install]
WantedBy=multi-user.target
"""

STATS_SERVICE_NAME = 'jetbot_stats'

def get_stats_service():
    return STATS_SERVICE_TEMPLATE % (getpass.getuser(), os.environ['HOME'])

if __name__ == '__main__':
    parser = argparse.ArgumentParser()
    parser.add_argument('--output', default='jetbot_stats.service')
    args = parser.parse_args()

    # presumably the script just writes the generated unit text out
    with open(args.output, 'w') as f:
        f.write(get_stats_service())

So it's just a naked ROS right now I guess. I could install various packages I suppose, like deep learning, jetbot, inference... ?

What package did you install on yours to make it move ? And how do I tell it what motor controller I have ?

Although, I guess if I install the jetbot part that should already be set

Trusted Member
Joined: 10 months ago
Posts: 64
2019-10-03 6:49 pm  


Hi spyder

Packages I used were:

gmapping for map generation

move_base

For vision I used OpenNI for the Xbox 360 Kinect; you can search for the OpenNI ROS launch packages, including depth image to laser scan

rplidar_ros for the navigation laser

DWA planner for path planning

rosserial_arduino for the Arduino Mega communication with ROS

MPU-6050 for IMU data, with I2C for communication with the Arduino Mega

All are on ros.org; you have to go through the build process gradually

BTS7960 for motor control

I have done two online courses on Udemy to learn ROS and ROS navigation. They helped me make the packages step by step, right from installing ROS to building packages.

Honorable Member
Joined: 10 months ago
Posts: 668
2019-10-04 12:37 am  


Which course did you take at udemy ?

It looks like there's a sale going on: $11.99 for each ROS course

Trusted Member
Joined: 10 months ago
Posts: 64
2019-10-04 9:05 am  



Check out the ROS courses on Udemy by Professor Anis Koubaa


1. Basics , Motion, Open CV

2. ROS Navigation

ROS for Beginners: Basics, Motion, and OpenCV

Become an expert and Learn Robotics with Robot Operating System (ROS) in little time and don't be lost in broad docs
4.4 (974 ratings)
4,193 students enrolled
Created by Anis Koubaa
Last updated 7/2019
 English [Auto-generated], Italian [Auto-generated]

Honorable Member
Joined: 10 months ago
Posts: 668
2019-11-13 7:24 am  

I've only got one 3B+ and it's on my printer, so I'm doing this on a 3B. I'm going to share as much as I can. The class is running their ROS on a PC with ROS loaded on Ubuntu in a virtual machine, while I'm... not, which means that I have extra hurdles that the course doesn't explain

I also started with the Ubiquity ROS image rather than loading from scratch like the class did, but, there doesn't seem to be too much difference in the software, other than the preloaded packages, the software used to edit the packages (which the course doesn't make very clear at all), and a few other things that I can't remember right now, but I'm sure will become glaringly obvious during the course of the course

First things first if yer using Ubiquity...

username is ubuntu

password is ubuntu

Some things you're going to need...

sudo apt-get install libi2c-dev

sudo apt-get install nano

sudo apt-get install git

Now you update everything...

sudo apt-get update

sudo apt-get upgrade

I think I remember a key server issue. I took notes on what I did, but, I did a lot of things. This might be a solution to the key issue, if there is one

sudo apt-key del 421C365BD9FF1F717815A3895523BAEEB01FA116

sudo -E apt-key adv --keyserver 'hkp://' --recv-key C1CF6E31E6BADE8868B172B4F42ED6FBAB17C654

(Or I could be wrong and it might work without any problems)

He talks about Eclipse, and how great it is, and how he uses it to do this and that, while I struggle to figure out how to install Eclipse, only to discover that the only version of Eclipse built for ARM is an older one

Eclipse is an IDE, and the teacher gave no instructions on exactly how to install it, so I had to figure it out. There are many instructables available on how to do this, but none of them worked for me, so I just typed in...

sudo apt-get install eclipse

And that seemed to have a positive effect. Now don't bother actually running Eclipse until you've gone thru the build process of the ROS packages, because if you wait to set up Eclipse, then you can tie it directly into your ROS packages, and edit them sorta live-ish

He also recommends MS Visual Studio Code, which won't install on RPi, so I found something called CODE-OSS which is supposed to be an RPi replacement, so...

git clone 
wget  -O - | sudo apt-key add -
curl -L  | sudo bash

At which point you'll have a lovely new program listed under "programming" (if you're using the desktop, which you kind of have to, and, sorry @robo-pi, but, I can't seem to get the RDP working on the Ubiquity) which opens a new window that I found to be completely useless

His next option is to install Sublime text editor, which also doesn't work on ARM, so I gave up on it and am using Eclipse

There is a bunch of stuff I'm doing remotely from ssh, and while I do have the Pi connected to my monitor, it's the same monitor that my computer is on, so I have to use the remote to switch between them, and since pressing 2 buttons on my remote is such a tedious task, I like to just stick with the ssh until I'm forced to switch over. And I bring this up because it's time to use gedit, which doesn't seem to like being used in ssh, luckily however, we installed nano. So...

sudo nano .bashrc

Scroll to the very bottom, and make these changes...

source /opt/ros/kinetic/setup.bash
source /home/ubuntu/catkin_ws/devel/setup.bash
#source /etc/ubiquity/

Now, every time you open a new shell, .bashrc runs and sets up your workspace. In a few minutes we're going to be compiling the workspace, and since every time I tried to compile the workspace on my little Pi3B the thing hung somewhere between 3-5%, I needed to give it a little more breathing room. So we need to find a USB thumb drive that you aren't using and turn it into swap space. This isn't in the course cuz he's using a desktop with a VM, so his environment has enough juice to do this; mine doesn't. And since swap space chews up flash storage like SD cards, we're going to sacrifice a thumb drive instead of your main SD card

Insert the thumb drive, and type...

sudo blkid

And you'll see something like this...

/dev/sda1: UUID="41b04e9d-a016-4b98-896b-cbd25f5f5e62" TYPE="vfat" PARTUUID="d274b027-01"

Note the device name: probably /dev/sda1.

umount the drive with...

sudo umount /dev/sda1

Then turn the drive into swap with...

sudo mkswap /dev/sda1

(mkswap wipes the old filesystem and prints a new UUID for the swap area)

Copy the new UUID listed next to the device...

no label, UUID=dc7fdebd-37ff-4907-bd8e-7f03e7799ce8

Now add the swap space to the file system by editing the /etc/fstab file...

sudo nano /etc/fstab

On the last line before the comments, type (the fields are: the swap device's UUID, the mount point (none for swap), the filesystem type, the mount options, and the dump/pass flags)...

UUID=dc7fdebd-37ff-4907-bd8e-7f03e7799ce8 none swap sw,pri=5 0 0

Save and exit the file

Then activate the swap space with...

sudo swapon -a
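And you can sanity-check that the swap actually came on by looking at /proc/swaps. If you want to script that check, here's a tiny parser sketch in Python (the sample text below is illustrative, in the /proc/swaps format, not real output from my Pi):

```python
def active_swap_devices(proc_swaps_text):
    """Parse the contents of /proc/swaps into (device, size_kb) pairs."""
    devices = []
    for line in proc_swaps_text.strip().splitlines()[1:]:  # first row is the header
        fields = line.split()
        devices.append((fields[0], int(fields[2])))
    return devices

# Illustrative sample, not real output
sample = """\
Filename                                Type            Size    Used    Priority
/dev/sda1                               partition       15727612        0       5
"""
print(active_swap_devices(sample))  # [('/dev/sda1', 15727612)]
```

If your thumb drive shows up in that list with priority 5, the fstab line worked.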

And you should be good to go for the next step, which is creating the ROS projects. ROS projects are called "packages", and the packages live inside the workspace, which is in catkin_ws. Inside catkin_ws are the folders build, which contains the compiled files, devel, which holds the development files, and src for the source files

The working packages will be in the src folder, so that's where we need to be...

cd ~/catkin_ws/src
To create a ROS package, we're going to do this...

catkin_create_pkg is the command to create the package; the name will be assigned in the next part of the command

ros_basics_tutorials is the name of the package we'll be creating. It can be any name you want, followed by the dependencies, like...

std_msgs which, if I'm understanding this correctly, is needed for the publisher/subscriber thingy 

rospy for python

roscpp for C++

We can add more later, but for now, this is kinda the minimum we need for what we'll be doing. So, the command will look like this...

catkin_create_pkg ros_basics_tutorials std_msgs rospy roscpp

That was easy. Now comes the hard part. Well, hard for my poor little Pi3B anyway. This is where we compile the set into a "package", from the top of the workspace (catkin_ws), with the command...

catkin_make
At this point, if yer a tea drinker, you should go get some tea, earl grey, hot, because this is going to take a while, and if you're trying this on a Pi3B without the thumb drive, it should hang pretty soon, and stop doing anything at all. On the other hand, if you have a computer that can handle this, it's still going to take quite some time so you should go find some light reading to do, like "War and Peace", or put on a nice movie, like "Avengers Endgame" while you watch pretty much nothing happen. The first time I did this, I assumed it was just going slow, so I went to bed, only to wake up and find that nothing had actually happened, which is when I installed the thumb drive and tried it again, only to find that you can't try it again cuz some programmatic changes were made that precluded you from trying it again after a failed attempt, at which point I did a nuke and pave, and started all over again

Now that it's done, we can go to the src folder to make sure that the proper files and folders were created. Under src you should find the ros_basics_tutorials folder, under which you should find the include and src folders, as well as two files called CMakeLists.txt and package.xml

include is where the header files and libraries go, and src is where we put the source code
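A side note that helped me make sense of this: catkin recognizes a package by the presence of a package.xml file. Here's a toy plain-Python sketch (no ROS needed, it builds a throwaway folder to scan) that finds packages the same way:

```python
import os
import tempfile

def find_packages(src_dir):
    """Return folders under src_dir that contain a package.xml
    (that marker file is how catkin recognizes a package)."""
    found = []
    for root, dirs, files in os.walk(src_dir):
        if "package.xml" in files:
            found.append(os.path.basename(root))
            dirs[:] = []  # don't descend into a package looking for nested ones
    return sorted(found)

# Demo with a throwaway folder standing in for catkin_ws/src
with tempfile.TemporaryDirectory() as src:
    pkg = os.path.join(src, "ros_basics_tutorials")
    os.makedirs(os.path.join(pkg, "src"))
    os.makedirs(os.path.join(pkg, "include"))
    open(os.path.join(pkg, "package.xml"), "w").close()
    open(os.path.join(pkg, "CMakeLists.txt"), "w").close()
    print(find_packages(src))  # ['ros_basics_tutorials']
```

So if you ever wonder whether a folder you cloned will actually be treated as a package, look for the package.xml.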





Honorable Member
Joined: 10 months ago
Posts: 668
2019-11-16 7:33 am  

And now that the package has been compiled, it's time to open yer IDE and import the package

In CodeOSS (which I can't get to work) you just drag the folder into the open IDE screen, but with Eclipse, which I also can't get to work with the Pi, you just import it. The only way I've been able to use any of the IDEs that the class describes is to install Visual Studio Code on my PC, then copy the whole folder structure (everything under /src/) from my Pi over, then import the folder structure. I was, at one point, able to use Eclipse on the Pi, but I can't seem to duplicate it, so I'm only going to share that which can be duplicated.

Another issue with this class is that the guy skips around a lot. So he opens a file before he even says anything about how, then compiles it, shows how it works, then shows how to compile it. It's like he can't talk in a straight line

Which brings me to the next package he wants us to install. This one comes from his GitHub repo, so you need to be in catkin_ws/src/ when you clone it so that it goes into the right place. His poor instructions left me with 2 of the things, one of which was in the wrong place

Don't worry about the compile, it takes less time the second time. And there's something that he never even covered in his class: you have to execute...

catkin_make --force-cmake -G"Eclipse CDT4 - Unix Makefiles"

In order for the compile to create the .cproject files needed for Eclipse to use. I had to figure this out myself. But what I didn't quite understand was this part...

catkin_make ros_basics_tutorials --force-cmake -G"Eclipse CDT4 - Unix Makefiles"

Do I need to use the exact project name in the compile ? I still don't know (I'm glad there was a sale going on when I purchased this class. If I had to pay more than the $12 that I paid, I would have been asking for a refund)

So now we execute

roscore

Which failed for me cuz I'm running Ubiquity, and it's already running, so I had to kill it first, so that I'd be able to have a screen in which to execute the subsequent commands

killall -9 rosmaster

And now I can execute roscore and the other commands, like

rosnode list


rostopic list

This tells you what ros things are actually running, and what ones you have available. For the next step you want 4 CLI windows open. This part works better on the actual desktop of the Pi rather than remotely thru ssh, unless you can get VNC running, which, for some reason, doesn't seem to wanna work on the ubiquity version. I'm assuming there's a conflict someplace, but, I got bored playing with it, and just switched over to the desktop

Window 1 is

roscore
window 2 type

rosnode list

just to see what options are available

Window 3 type 

rostopic list

window 4 type

rosrun turtlesim turtlesim_node

(I got an error when I started this, but, the program ran anyway) And I ended up with a pretty blue box with what I suspect is supposed to be a turtle in the middle of it. The picture of his turtle looks different from mine, but, I assume it's a different version, even tho we're both running Kinetic. Now we go back to window 2, and type rosnode list again, and we see that the turtlesim is running, and if we type rostopic list in window 3 again, we will see 3 new topics for the turtlesim

And if we go back to window 2, and type

rosnode info /turtlesim

It will display a kind of a help feature of what yer turtle is doing, and why. You will see a number of things, one of which is "turtlesim_pose", which is the turtle's absolute position (or relative, if you start at 0,0). I think the most relevant part is listed under "subscriptions" where it says "cmd_vel". This is the topic that the turtle is subscribed to where it gets its movement commands

Now if we go to window 3 and type

rostopic info /turtle1/cmd_vel

We see info about cmd_vel, where we see "geometry_msgs/Twist", which tells us that Twist is the message type used for movement. In other words, any published command that orders movement is a "Twist" message. It also shows that there are currently no publishers, even tho the turtle is subscribed to the "cmd_vel" topic

Now, again in window 3, we type

rosmsg show geometry_msgs/Twist

and we can look at the structure of a Twist msg, which lists the linear and angular velocity. Now, the teacher is using the word "velocity", but, I think I would have used the word "location". The x and y describe location on an x/y axis, and the z describes which way it's pointing, which sounds more like location rather than velocity to me

I could be wrong. I usually am.
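For what it's worth, a Twist message is really just two 3-vectors. Here's a plain-Python stand-in (NOT the real geometry_msgs class, just a sketch to make the shape concrete):

```python
from dataclasses import dataclass, field

@dataclass
class Vector3:
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0

@dataclass
class Twist:
    # linear is velocity along each axis, angular is rotation rate about each axis
    linear: Vector3 = field(default_factory=Vector3)
    angular: Vector3 = field(default_factory=Vector3)

# turtlesim effectively only looks at linear.x (forward speed)
# and angular.z (turn rate), since the turtle lives in a 2D plane
cmd = Twist()
cmd.linear.x = 2.0
cmd.angular.z = 1.8
print(cmd.linear.x, cmd.angular.z)  # 2.0 1.8
```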

Now if we go back to window 3 and type

rostopic info /turtle1/pose 

we see that there is a publisher listed, but no subscriber. now, back in window 3 again, we type

rosmsg show turtlesim/Pose

and it lists the structure of the pose data, which includes fields listed as "velocity", which proves once again, that I was wrong, or that this makes no sense, which isn't wrong cuz that's an opinion, and opinions are never wrong, just misinformed

And now we open a 5th CLI window, which is gonna be to make the robot (turtle) move, which is gonna be the Publisher

So, open it up and type

rosrun turtlesim turtle_teleop_key

This window uses arrows to make the bot move. (I'm assuming that there's a teleop_joy in there somewhere)

And back to window 2 again and type

rosnode list 

and you'll see the new node called teleop_turtle, but it won't be in window 3 when you type

rostopic list 

because it's using turtle1/cmd_vel which is already there, but you will see something new in window 3 if you type

rostopic info /turtle1/cmd_vel 

And what you'll see is that there is now both a Publisher and a subscriber. Then in window 2 if you type

 rosnode info teleop_turtle 

you'll see that it publishes /turtle1/cmd_vel under [geometry_msgs/Twist] so now we see what both publisher and subscriber is doing. But for a better look, we can go to window 3 and type

rostopic echo /turtle1/cmd_vel 

which will display the data as it is sent to the bot which will be sent by going to window 5 and pressing one of the arrow keys... So go to window 5 and start pressing arrow keys

This post was modified 5 months ago by Spyder

Honorable Member
Joined: 10 months ago
Posts: 668
2019-11-17 6:01 am  

Okay, so this is interesting. It seems to be velocity, not the direction you're facing. When you press the left arrow, you see angular velocity at 2.0, and when you press the right arrow, angular velocity is at -2.0, and when you stop, no matter which direction you're facing, angular velocity is at 0.0

And the same thing goes for linear velocity. There is only a non zero number while the thing is moving. Once it stops moving, everything goes back to 0.0

So let's go back to window 3 and hit ctrl c to stop echo-ing the velocity, and this time we'll echo the pose of the robot. So in window 3 type

rostopic echo /turtle1/pose

and then go back to window 5 and start hitting arrow keys again. This time what you're seeing is relative position to absolute zero. And the same thing goes for theta, which is angular. So it's x,y for location, and theta for direction faced
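So the turtle is integrating velocity into pose, the way a standard unicycle model does. A toy Euler integrator (my own sketch, not turtlesim's actual code) shows how linear.x and angular.z turn into x, y, and theta:

```python
import math

def step(x, y, theta, v, w, dt):
    """One Euler step of a unicycle: v is linear.x, w is angular.z."""
    x += v * math.cos(theta) * dt
    y += v * math.sin(theta) * dt
    theta += w * dt
    return x, y, theta

# drive straight for 1 second at v=2.0: x advances, theta stays put
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = step(*pose, v=2.0, w=0.0, dt=0.1)
print(pose)  # roughly (2.0, 0.0, 0.0)

# spin in place for 1 second: theta changes, x and y stay put
pose = (0.0, 0.0, 0.0)
for _ in range(10):
    pose = step(*pose, v=0.0, w=1.8, dt=0.1)
print(pose)  # roughly (0.0, 0.0, 1.8)
```

Velocity is only non-zero while moving, and pose is what you get by adding that velocity up over time, which is exactly what the two echo commands showed.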

This is funny. In window 4, when you run out of space, and hit the end of the box, it pops up and says "Oh no! I hit the wall!"

Note to self... : Boundaries are a thing

And now he's going to get complicated. He's going to tell the bot EXACTLY where to go. Remembering, of course, that there aren't any obstacles blocking the path of the bot, cuz it's basically just an empty room. So in window 3, hit ctrl c to stop the echo, and type

rostopic pub -1 /turtle1/cmd_vel geometry_msgs/Twist -- '[2.0, 0.0, 0.0]' '[0.0, 0.0, 1.8]'

and we see the bot execute a turn for 3 seconds. I totally don't understand where it got the 3 seconds from.

The pub says what we're doing, which is publishing, and the /turtle1/cmd_vel says which topic, and the geometry_msgs/Twist says what we're gonna do, and the numbers are the linear [x, y, z] and angular [x, y, z] velocities. And if we make the 2.0 into 0.0 then the bot just turns in place without going anywhere. Again, for 3 seconds. Why 3 seconds tho ?

This teacher is cheating like crazy. He says if we want more info about the turtlesim topic we should go to the wiki page. I could've done that without paying him $12

I'm not gonna stop now. I'm only 7 sections away from the reason I took this course... "adding new hardware"

And now he's copying commands off the wiki page ! AARRGG

rosrun rqt_graph rqt_graph

This is a graphical depiction of how the pieces fit together, and it appears to be a living thing. So if you put your mouse on a piece of it, it lights up and tells you how it communicates

Next section is a test. Why would I take a test when I've got copious notes sitting right in front of me ?

Moving on

Next section is a pretty picture...


Now, heads up here @robo-pi cuz this might be a good reason to use ROS instead of just seeing it as some kind of bloatware. See, the main set of instructions comes from the publisher in the upper left corner. Whether that's a brain, or a joystick doesn't matter. It publishes a request to "go to BLAH". The subscriber then proceeds to head to location BLAH, but, as it's heading there the distance sensor notices an obstacle. The distance sensor then becomes a publisher, and sends out a message to anybody willing to listen that there is an iceberg in front of our Titanic, at which point a decision needs to be made as to what course of action to take

See how it's spreading out the work... If you look at the video from Explaining Computers when he was demo-ing the Jetson Jetbot, you'll notice that whenever it runs into an obstacle, it ALWAYS turns left to avoid it. Now what if your goal was to the right, and it ran into an obstacle at a 45 degree angle on the left ? At that point, turning left would be counter-productive, and turning right would be a more logical and direct solution. If it had a way to think about its solution, it might choose to turn right rather than left, because left is a preprogrammed solution to all its problems due to the fact that it isn't actually thinking about its end goal because it's only thinking about its immediate problem

The Jetson Jetbot is only one computer. It isn't spreading out the work. What's being described here isn't so much "thinking" as it is logical decision making based upon achieving a set goal. Now, admittedly, this isn't your goal, but, if you leave the "brain" free to think, then you can automate the parts that can be left to themselves. The same way you and I don't think about walking, we just want to get from point A to point B, and we can be doing something else while looking where we're going, and instinctively not walk out into traffic rather than having to remember that walking out into traffic would be bad
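The "spreading out the work" idea can be sketched in a few lines of plain Python. This is a toy in-process message bus, nothing ROS-specific, just to show how a sensor can become a publisher without the motor code caring who's talking:

```python
from collections import defaultdict

class Bus:
    """Toy publish/subscribe hub: publishers and subscribers only share a topic name."""
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)

    def publish(self, topic, msg):
        for callback in self.subscribers[topic]:
            callback(msg)

bus = Bus()
log = []
# the motor node doesn't care who publishes; it just listens on cmd_vel
bus.subscribe("cmd_vel", lambda msg: log.append(("motors", msg)))
# the distance sensor becomes a publisher whenever it sees something,
# and a safety node reacts by publishing its own cmd_vel
bus.subscribe("obstacle", lambda msg: bus.publish("cmd_vel", "stop"))

bus.publish("cmd_vel", "go to BLAH")      # the brain (or joystick)
bus.publish("obstacle", "iceberg ahead")  # the distance sensor
print(log)  # [('motors', 'go to BLAH'), ('motors', 'stop')]
```

The motors never knew the obstacle sensor existed; they just got a second message on the topic they subscribe to. That's the decoupling ROS is selling.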

Anyway... moving on with the lesson...

We're going to be writing a publisher, and subscriber node in python and C++, which should be fun. I wonder where I'm going to pay him to copy/paste it from



(Note here that "frequency" does not describe PWM. It describes how often the message should be published)

(Note also that "Twist" is kind of a "go" message, and "spin" is kind of a loop while waiting for instructions)

It's going to be copied from... (See ? I was right)

Now we get to use the IDE. I have no idea if my idea of copying the file structure is going to work, but I guess now is where we find out. And PlatformIO opened up on me. I did not want that

And now he's in a new folder that he didn't tell us about... ubuntu/catkin_ws/src/beginner_tutorials/src

I know I didn't skip a step. Oh. No. There it is. Right at the top of the wiki page. Instructions to create the folders.

Ok, so if yer using VSCode, then click on the "src" folder, then click on "new folder" and name it "beginner_tutorials", then open that one, and click on "new folder" and name it "src", then open that folder, then off to your right, click on add python support. That should take a minute or so, at which point it will reload but bring you back to your new src folder (and if you're me, it will open PlatformIO again, which I will close again). And of course, now he's telling us to use Eclipse again after he's already told us to use Sublime

I'm gonna use VSCode cuz it seems to be working (right up until I try to upload this entire folder structure back into the Pi, and we'll see how THAT goes. I'll probably end up having to take a detour thru GitHub to make it work. I signed up for another class. I hope it makes more sense than this one)

Ok, so he wants us to copy the first script, which is the talker, and name it talker.cpp (and that's how he shows us how to write a program)

(TIP : Click on "toggle line numbers" so you don't end up with 2 sets of line numbers which will totally confuse everything)

And now he wants us to copy the 2nd script and name it listener.cpp

(I had no idea that writing programs was gonna be this easy)

And now he's breaking down the file structure... just like in the wiki. Word for word

Just read the wiki, and I'll tell you what to do next

The only thing different between what he's saying and what's on the wiki page is that he explains that the "while" is like saying "while true", and the "Hz" in the wiki is actually what it sounds like... 10 cycles per second, so 10 msgs per second. Pretty straightforward as far as that goes
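That 10 Hz loop is easy to mimic without ROS. This sketch is roughly what a rospy.Rate-style loop amounts to (my own toy version, not the real API): schedule the next wake-up, then sleep whatever time is left over:

```python
import time

def rate_loop(hz, n_messages, clock=time.monotonic, sleep=time.sleep):
    """Produce n_messages at roughly hz messages per second."""
    period = 1.0 / hz
    next_tick = clock()
    sent = []
    for i in range(n_messages):
        sent.append("hello world %d" % i)
        next_tick += period           # when the next message is due
        delay = next_tick - clock()   # time left in this cycle
        if delay > 0:
            sleep(delay)
    return sent

msgs = rate_loop(10, 3)  # 3 messages at 10 Hz takes about 0.3 seconds
print(msgs)  # ['hello world 0', 'hello world 1', 'hello world 2']
```

Sleeping the remainder of the period, rather than a fixed 0.1 s, is what keeps the rate steady even when the work inside the loop takes a variable amount of time.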

There are some changes between what's in the wiki I see, and what he copied from the wiki, but copying what he's got from a video isn't easy. He's got boxes that keep popping up over the text. I can see differences tho. What's in the wiki now must've been updated since he made the video...

Stay tuned for the next lesson...


Robo Pi
Robotics Engineer Moderator
Joined: 10 months ago
Posts: 1602
2019-11-17 7:15 am  
Posted by: @spyder

Now, heads up here @robo-pi cuz this might be a good reason to use ROS instead of just seeing it as some kind of bloatware. See, the main set of instructions comes from the publisher in the upper left corner. Whether that's a brain, or a joystick doesn't matter. It publishes a request to "go to BLAH". The subscriber then proceeds to head to location BLAH, but, as it's heading there the distance sensor notices an obstacle. The distance sensor then becomes a publisher, and sends out a message to anybody willing to listen that there is an iceberg in front of our Titanic, at which point a decision needs to be made as to what course of action to take

See how it's spreading out the work...

So my question is then, "How does this differ from writing your own programs?"

My robot will have sensors that provide interrupts (i.e. a message to anyone who's willing to react to the interrupt) so the sensors on my robot are already "publishers" without ROS.  And I will have already written my own programs (subscribers) to deal with these interruptions. 

So in this particular situation I don't see where ROS is doing anything that I wouldn't already be doing in my own programming model.

When using ROS don't you need to set up who (or what) the "publishers" are?   And don't the "subscribers" need to react in some specific way to the messages from the "publishers"?

So I still don't see where ROS is doing anything different from how I would normally program my robot.  The only difference I see is in the way it's set up to do it.   My robot is going to be riddled with interrupts.  In fact, I like using the STM32 microcontrollers because they offer more interrupts.  Every I/O pin on an STM32 can serve as an IRQ signal.  There are also other ways of multiplexing IRQ signals and be able to identify who sent them (i.e. who the publisher of the IRQ was).   

ROS may be a nice way of keeping track of these things, but I don't see where it's actually adding anything that I'm not already doing.   It's not my intent to be down on ROS.  It's just that I don't see the advantage of using it when it appears to  be doing what I'm already doing anyway.

CAVEAT: I've never actually used ROS, so for all I know, if I actually used it I might come to like it. Unfortunately I need to curb my enthusiasm to learn about new things. I'm currently studying AI on three different fronts: AI on the Jetson Nano with Paul McWhorter, semantic AI using a speech engine, and perceptrons and ANNs using Python and numpy. This keeps me pretty busy. I'm also taking the 9-axis IMU course by Paul McWhorter too. In addition to this I'm studying and building amplifiers and RF communications and antennas. Oh, and let's not forget the DB1 project! I'm not building a carbon copy of DB1, but I'm building several smaller robots based on a similar architecture to what Bill is using for DB1. Although that appears to be on HOLD. I've been waiting for Bill to publish the motor driver wiring and code for several months now. Hopefully he'll be making more videos on the DB1 project fairly soon.

In any case, taking yet another course on ROS is just out of the question right now. Although if you know of any good free courses on ROS I might consider taking a look at them. I just ordered a SECOND Jetson Nano today actually. I've become a fan of the little bugger. So now I have two Jetsons to play with.

By the way, aren't you running ROS on the Raspberry Pi now?  Wouldn't it work on the Jetson Nano?

The thread title says you're running it on an RPi 3B+. Is that true?

I have an RPi 3B+ sitting here idle. I'm lazy though. I need someone to lead me through the process step-by-step. I get tired of wasting so much time trying to find answers on Google for every little problem I run into.

If I knew of a nice step-by-step course on how to install and run ROS on the RPi 3B+ I might try to squeak that into my schedule. Or on the Jetson Nano for that matter. Now that I have two of them there's lots of room for experimenting.

DroneBot Workshop Robotics Engineer

Honorable Member
Joined: 10 months ago
Posts: 668
2019-11-17 7:33 am  
Posted by: @robo-pi

The thread title says you're running it on a Rpi 3B+ is that true?

I'm using a spare 3B that I had because I'm still waiting for the Jetson to come back from the shop, so, what's in this thread would work on your 3B+

I'm giving the lessons as I take them, so it is step by step

Besides, if I get this one working, I plan to stick it in another robot project that I haven't shown anybody pictures of yet

This post was modified 5 months ago by Spyder

Robo Pi
Robotics Engineer Moderator
Joined: 10 months ago
Posts: 1602
2019-11-17 7:55 am  
Posted by: @spyder

I'm giving the lessons as I take them, so it is step by step

Ok, I haven't been reading this thread in any depth.   Maybe I'll look it over sometime when I have time and see if I can duplicate your results.

DroneBot Workshop Robotics Engineer

Honorable Member
Joined: 10 months ago
Posts: 668
2019-11-19 6:59 am  

Aaaand this is an exercise in futility due to the fact that the talker and listener are both already installed in the ros_essentials_cpp that we already installed, so, unless you want to just have some practice copy/pasting, I think we can safely skip this part

While screwing around with all this, I did learn something extremely helpful tho, which was this... the double tab

The double tab is nothing more than hitting the tab key twice in rapid succession, and what it does is when you have the first part of a command typed, it will list all the options you have for the next part of the command

Here is a command with a short list as an example...

rosrun turtlesim

Leave a space after the "turtlesim" and hit tab tab...


This shows you that you only have 4 options of things you can type after the command rosrun turtlesim, and it shows you what those 4 options are. I figured this out when trying to launch the talker and listener that I created following his instructions which ended up not working, so I googled around to find out why the thing wouldn't work, and the reason was that he had been giving us the wrong commands (probably due to the fact that the file structure had changed since he created the video for the class, probably by uploading updated files with different names)

Don't sweat it, I'll make the mistakes so you won't have to figure out what yer doing wrong when you aren't actually doing anything wrong

For instance, when I tried to run the talker or listener I got an error that said "could not find an executable file at..."

Which led me to one of the interesting things I found, which doesn't actually make any sense: I had originally thought that my talker or listener wouldn't run because I hadn't given them run rights with chmod +x, but, for some reason, that doesn't even seem to be relevant, because once I had the right command, the programs ran WITHOUT the +x, which actually makes no sense, because I'm looking at them right now, and they're not green in SSH, and they don't have an x under rights in WinSCP (my guess is that rosrun launches the compiled binaries that catkin_make put under devel/lib, which do get the executable bit, rather than the source files I'm looking at)

Now, I can't imagine that it would hurt anything if you went and added the +x, but, it doesn't seem to be needed for some reason

Moving on...

So let's spy on the talker node. We'll start with 4 CLI windows again

Window 1 will be for roscore. First we'll have to issue killall -9 roscore and killall -9 rosmaster (in case you're using the Ubiquity image and you've rebooted: Ubiquity autoruns those processes, and since you can only have one instance running, they need to be shut down first)

I like to leave that window alone once I've issued roscore because the cursor doesn't come back after you enter the command, so I'd rather just not touch it

In window 2 we're gonna type

rosrun ros_essentials_cpp talker_node

and in window 3 we're gonna type

rosrun ros_essentials_cpp listener_node

and just for fun, in window 4 we're gonna type

rostopic list

and note the one that says "chatter". So we're gonna spy on it by typing

rostopic echo /chatter

Okay, not very surprising or enlightening. But interesting. Let's try editing the talker to give a different message, so we can get a feel for what steps need to be taken to get that done. Judging by the compiler output further down, it's gonna be in

~/catkin_ws/src/ros_essentials_cpp/src/topic01_basics/talker_listener/talker.cpp

I went down to where it says "Hello World " (The space after World is to leave space before printing the number) and changed the "World" to "DB Workshop"

Then you'll want to recompile with

cd ~/catkin_ws
catkin_make

and when you do, you'll see a few lines change color as they scroll up your screen, such as

Scanning dependencies of target talker_node
[ 5%] Building CXX object ros_essentials_cpp/CMakeFiles/talker_node.dir/src/topic01_basics/talker_listener/talker.cpp.o


[100%] Linking CXX executable /home/ubuntu/catkin_ws/devel/lib/ros_essentials_cpp/talker_node

Those lines show which things got changed, and they're also handy in case you've made an error in your program, because that's where the error will show up during the compile

Now let's have some more fun. Remember that when we originally initialized the catkin workspace, one of the things we typed was rospy. That's so that ROS can also run Python programs. Only this time we DO need to make sure that they're executable. Now, since the talker.py is already in there, let's run that along with the listener. The talker.py is located in a folder called scripts under the talker_listener folder, so...

cd ~/catkin_ws/src/ros_essentials_cpp/src/topic01_basics/talker_listener/scripts
chmod +x *.py
ls

And you'll see that he's already left a few there for us. And if you type

rosrun ros_essentials_cpp (tab tab) 

you'll see that talker.py is listed there as something you can type, so type it and see what happens in the listener window

Yup, we get the correct message. But, did you happen to notice what's happening in the talker window ?

An interesting bit of information is that the number you see in the talker window is a time code, counted in seconds, starting from Jan 1 1970. (This isn't in the class; I just thought now would be a good time to tell a funny story.) So that number is counting up in seconds (and fractions of a second). And most computers keep their dates on the motherboard clock... in 32 bits...

What's the biggest 32 bit number ?

And how many seconds from now is that ?

Never mind, I'll tell you. The end date is Jan 19 2038. (No, this is nothing like the Y2K bug, which didn't break anything except the emergency phones on the NYS Thruway, which nobody used anyway unless you planned very carefully where your car broke down)

And why is any of this relevant? Because either this guy was a beta tester, or his clock was wrong, or he started this video on an earlier version of ROS (which could be why certain things have been updated), because according to HIS clock, the video he made for this class was recorded 4.72 years ago, and Kinetic only came out 3.5 years ago. Again, no big deal. I was just curious
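The numbers in that story are easy to sanity-check in plain Python (the 4.72-years figure is the one from the post; everything else is standard Unix-time arithmetic):

```python
from datetime import datetime, timezone

# Biggest signed 32-bit number of seconds since Jan 1 1970 (UTC)
max_int32 = 2**31 - 1
print(max_int32)  # 2147483647

# The moment a signed 32-bit Unix clock rolls over
rollover = datetime.fromtimestamp(max_int32, tz=timezone.utc)
print(rollover)  # 2038-01-19 03:14:07+00:00

# The "how long ago" trick: divide a difference in seconds
# by the seconds in a Julian year (365.25 days)
seconds_per_year = 365.25 * 24 * 3600
print(round(4.72 * seconds_per_year))  # 148951872 seconds in 4.72 years
```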

Now, at this point, you can suspend the running nodes with ctrl-c, and then we're going to run the listener.py and restart the talker.py so you can see the 2 interact

The important thing to note here is that if you edit the python programs, you no longer have to recompile the catkin workspace. The downside is that you also no longer get feedback from the compiler if there is an error. Both interact in the same way. Both respond to each other. And that appears to be the crux of this lesson
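That crux is worth a tiny illustration: a ROS topic is just a named channel, and any node, C++ or Python, can publish to it or subscribe to it. Here's a toy Python bus as an analogy (this is NOT ROS code; the Bus class and the message text are made up for illustration):

```python
# Toy publish/subscribe bus -- an analogy for ROS topics, NOT real ROS code.
class Bus:
    def __init__(self):
        self.subs = {}  # topic name -> list of callbacks

    def subscribe(self, topic, callback):
        self.subs.setdefault(topic, []).append(callback)

    def publish(self, topic, msg):
        # deliver the message to every subscriber of this topic
        for cb in self.subs.get(topic, []):
            cb(msg)

bus = Bus()
received = []
bus.subscribe("/chatter", received.append)        # the "listener"
bus.publish("/chatter", "Hello DB Workshop 42")   # the "talker"
print(received)  # ['Hello DB Workshop 42']
```

Neither side knows or cares what language the other is written in; they only agree on the topic name and message format, which is exactly why the C++ and Python nodes mix freely.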


Honorable Member
Joined: 10 months ago
Posts: 668
2019-11-25 5:09 pm  


I started running ROS on a new installation of Nvidia's Ubuntu (just the regular desktop, without the Jetbot), and, apart from a few minor changes (which could be either from the difference in source learning material, or from the change in distro, due to ROS on the Nano requiring a newer version of ROS: Melodic on the Nano vs Kinetic on the Pi 3B), things mostly carried over

I've also ordered another Pi 4 (so I won't have to take apart the steampunk laptop), which, according to what I've read, should also work with Kinetic (although using Buster on the Pi 4). I'll be testing that once it arrives

It seems to be working pretty much the same. I just tested the turtlesim on the Nano, and all the commands are the same (still no RDP though). This is obviously a good thing, as now I won't have to learn 2 sets of commands

The goal is still the same... installing the drivers for the motor controllers onto ROS and adding controls from the other buttons on the gamepad for other functions on the bot

I'm still a long way from a practical use case, but this is a positive thing

Robo Pi
Robotics Engineer Moderator
Joined: 10 months ago
Posts: 1602
2019-11-25 6:09 pm  
Posted by: @spyder

I'm still a long way from a practical use case, but this is a positive thing

Before too long you'll be an expert.

I ordered a second Jetson Nano and it just came today. I haven't fired it up yet. I'm going to build it into the plastic case with the fan first. So now, with two of them, I guess I'd better start working on becoming a Nano expert.

DroneBot Workshop Robotics Engineer
