
@Zander, just ordered the Pi5 PoE HAT. I will give it a try. Yes, I have PoE routers for the sec cams. Starting to really like PoE stuff. If this works, meg...
@fvasquez1776 (and all) I have nothing new to add to the Pi5's non-standard USB-C problem except to mention that I am still interested and following t...
Fellas, friends, @Davee and @zander. I think I am just about over trying to figure out what the Pi5 has or wants or expects or negotiates or doesn't n...
The only thing I have not done is use a USB-C breakout board and shove 5 VDC at 5 A directly down the USB-C power wires. Which, as far as I can tell,...
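Before going the breakout-board route, it may be worth trying the bootloader-level workaround: the Pi 5 firmware has a documented setting that tells it to assume a 5 A supply even when USB-PD negotiation fails. A minimal sketch, assuming stock Raspberry Pi OS with the rpi-eeprom tools installed (read the bootloader docs before changing EEPROM settings):

```shell
# Show the current bootloader configuration
sudo rpi-eeprom-config

# Open the config in an editor; add the line below, save, and reboot:
#   PSU_MAX_CURRENT=5000
# This tells the firmware to assume a 5A-capable supply instead of
# throttling USB/peripheral current when PD negotiation doesn't happen.
sudo -E rpi-eeprom-config --edit
```

This doesn't solve the underlying negotiation mystery, but it does stop the low-current warnings on dumb 5 V / 5 A supplies.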
Hi Ron, because there IS a big mystery with the Pi5 power. I'll post a link to the thread where we explored it in depth: I tried many options with ...
I will probably explore this in the future. I currently use RealVNC from my main workstation to connect to multiple Pis. With RealVNC, I can also c...
Results are in and NOT good. :( Turns out the Radxa X2L 8GB is actually slower at local Large Language Model inference than the Raspberry...
@Davee, I'd say this setup is around $250-$300.

Raspberry Pi version:
- $80 Pi5 8GB
- $20 Case/cooling
- $120 NVMe SSD, 2TB (optional)
- $4 Pico
- $20 shipping/...
@davee Here is a better example addressing your request to share the evidence. Hardware: Raspberry Pi 5, 8GB, Raspberry Pi Pico connected to the Pi5 via ...
@zander Yep, agreed. My robotic experiments are organized so that everything has to work independently of the Internet. If the Internet is available, then added...
I just ran llama3 on the Pi5/8GB. It took 20 seconds for the inference to start, then streamed out the following text, which seemed to be complet...
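That 20-second lag before the first token is worth measuring separately from the streaming speed, since they have different causes (model load/prompt processing vs. per-token generation). A minimal sketch of how I'd time it, written against any streaming token iterator; the `fake_stream` below is a stand-in with short sleeps so it runs anywhere, and you'd swap in the real Ollama/llama.cpp stream:

```python
import time

def measure_stream(token_iter):
    """Return (time-to-first-token, tokens/sec) for a token stream."""
    start = time.perf_counter()
    first = None
    count = 0
    for _ in token_iter:
        count += 1
        if first is None:
            first = time.perf_counter() - start
    total = time.perf_counter() - start
    rate = count / total if total > 0 else 0.0
    return first, rate

def fake_stream():
    # Stand-in for a real model stream: a startup delay, then tokens.
    time.sleep(0.2)        # simulates the "inference start" lag
    for _ in range(50):
        time.sleep(0.005)  # simulates per-token generation time
        yield "tok"

ttft, tps = measure_stream(fake_stream())
print(f"time to first token: {ttft:.2f}s, {tps:.0f} tokens/s")
```

On the real model, just pass the generator you already iterate over when printing tokens; a Pi5 vs. Radxa comparison then comes down to two pairs of numbers instead of impressions.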
Hi Dave, llama3 8B refers to the 8 billion parameters in the model. The actual file size when you download the entire model is 4.7GB. To run th...
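The 8 billion parameters vs. a 4.7GB file makes sense once quantization is in the picture: the default llama3 download is roughly 4-bit quantized, not full 16-bit. A back-of-envelope sketch; the ~4.7 effective bits per weight is my assumption, covering the 4-bit weights plus overhead like per-block scales and the embedding layers:

```python
def model_file_gb(n_params, bits_per_weight):
    """Approximate on-disk size of a model in GB."""
    return n_params * bits_per_weight / 8 / 1e9

# llama3 8B at full 16-bit precision vs. ~4.7 effective bits/weight
# (assumed: 4-bit quantization plus scale/embedding overhead).
print(f"fp16:   {model_file_gb(8e9, 16):.1f} GB")
print(f"q4-ish: {model_file_gb(8e9, 4.7):.1f} GB")
```

The same arithmetic explains why the 8GB Pi5 is right at the edge: the whole quantized model plus the KV cache has to fit in RAM alongside the OS.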
I don't plan to run a specialized robotics OS like ROS. Someday, maybe, but that learning curve is beyond my availability. I'll probably run Linux Min...
@zander Thanks, Ron. Nope, the problem with the Pi5 has not been sorted yet. I left off about 4 months ago; my best solution so far uses a small automoti...
@zander The Pi4 has nowhere near enough compute to handle local LLM inference at a reasonable speed.