<?xml version="1.0" encoding="UTF-8"?>        <rss version="2.0"
             xmlns:atom="http://www.w3.org/2005/Atom"
             xmlns:dc="http://purl.org/dc/elements/1.1/"
             xmlns:sy="http://purl.org/rss/1.0/modules/syndication/"
             xmlns:admin="http://webns.net/mvcb/"
             xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
             xmlns:content="http://purl.org/rss/1.0/modules/content/">
        <channel>
            <title>DroneBot Workshop Forums - Recent Topics</title>
            <link>https://forum.dronebotworkshop.com/</link>
            <description>Discussion board for Robotics, Arduino, Raspberry Pi and other DIY electronics and modules. Join us today!</description>
            <language>en-US</language>
            <lastBuildDate>Mon, 20 Apr 2026 03:40:54 +0000</lastBuildDate>
            <generator>wpForo</generator>
            <ttl>60</ttl>
							                    <item>
                        <title>My robot drives like a drunk toddler</title>
                        <link>https://forum.dronebotworkshop.com/help-wanted/my-robot-drives-like-a-drunk-toddler/</link>
                        <pubDate>Sun, 19 Apr 2026 20:14:32 +0000</pubDate>
                        <description><![CDATA[I bought two of these motors from the Pi Hut some time ago and added them to a standard robot chassis similar to the one in this video:
They have a 120:1 gearing ratio and two hall effect...]]></description>
                        <content:encoded><![CDATA[<p>I bought two of these motors: <a href="https://thepihut.com/products/tt-motor-with-encoder-6v-160rpm-120-1" target="_blank" rel="noopener">https://thepihut.com/products/tt-motor-with-encoder-6v-160rpm-120-1</a> from the</p>
<p>Pi Hut some time ago and added them to a standard robot chassis similar to the one in this video: <a href="https://youtu.be/oQQpAACa3ac?si=rj-HZcn39TeYXuJv" target="_blank" rel="noopener">https://youtu.be/oQQpAACa3ac?si=rj-HZcn39TeYXuJv</a></p>
<p>They have a 120:1 gearing ratio and two hall effect sensors on the motor. Should be more than enough to keep a robot in a roughly straight line and turn sometimes. </p>
<p>They're plugged into an Arduino with an L298N dual H-bridge motor driver, as you would expect. It all "works": I can drive, get signals from the encoders, and count them.</p>
<p>&nbsp;</p>
<p>I wanted to create a simple differential drive robot for my niece to program to drive forward, turn right 90, etc. </p>
<p>Simple coding fun for her I thought...</p>
<p>Months later I'm baffled and wondering if it's possible to get a reliable count off these encoders at all. It's not that they're hopeless, but they're inconsistent enough that the robot consistently veers off to one side or turns much too far. The unreliability isn't quite reliable enough to just compensate for, either. </p>
<p><strong>Has anyone out there ever got reliable movement from a robot equipped with these?</strong> I notice that although these robot car chassis with encoders often appear in videos, I've never yet seen one demonstrate controlled movement. </p>
<p><strong>If you have succeeded with encoders of this design (ideally exactly like mine) I'd be very grateful to know how you did it.</strong> Having given myself months to get this prepared I'm now down to weeks.</p>
<p>A great many thanks in advance. </p>
<p>&nbsp;</p>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>Scott</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/help-wanted/my-robot-drives-like-a-drunk-toddler/</guid>
                    </item>
				                    <item>
                        <title>Ethernet with ESP32</title>
                        <link>https://forum.dronebotworkshop.com/2026-videos/ethernet-with-esp32/</link>
                        <pubDate>Sun, 19 Apr 2026 19:16:11 +0000</pubDate>
                        <description><![CDATA[Learn to use Ethernet and Power-over-Ethernet with the ESP32. We’ll build a simple PoE Camera and a Web Server
In this wireless age, having to connect a big, clunky cable to your microcontro...]]></description>
                        <content:encoded><![CDATA[<p>Learn to use Ethernet and Power-over-Ethernet with the ESP32. We’ll build a simple PoE Camera and a Web Server.<br /><br />https://youtu.be/JgwjGeAqW8E<br /><br /></p>
<p>In this wireless age, having to connect a big, clunky cable to your microcontroller to get network access seems almost primitive. But there is nothing primitive about Ethernet; it offers a wealth of performance and security advantages over Wi-Fi.<br /><br />And that’s not to mention Power-over-Ethernet, or PoE, which lets you use an Ethernet cable for both communications and power. Perfect for permanently installed IoT applications, cameras, and VoIP phones.<br /><br />Today, we’ll see how easy it is to use Ethernet with the ESP32. We’ll learn a bit about how Ethernet works, including how to choose the right cable (and which type to avoid). Then we’ll grab an inexpensive Ethernet module based on the W5500 Ethernet chip and connect it to an ESP32.<br /><br />After doing a few basic experiments, we’ll put together a simple web server that communicates via Ethernet for improved speed and reliability. You’ll see just how easy it is to use Ethernet in your projects.<br /><br />Then we will look at a Waveshare board that supports PoE, and we’ll put together a web camera powered by PoE (you can also use it with another power source if you prefer).<br /><br />Here is the Table of Contents for today's video:<br /><br />00:00 - Introduction<br />01:41 - Ethernet<br />08:04 - W5500 Ethernet Module<br />11:22 - DHCP Test<br />16:28 - Ethernet Web Server<br />21:10 - Power-over-Ethernet<br />25:33 - WaveShare ESP32-S3-POE-ETH<br />29:00 - PoE Web Camera<br />35:42 - Conclusion<br /><br />If you haven’t considered Ethernet for your next ESP32 design, I hope this video shows you why you might want to reconsider. Ethernet offers many advantages over Wi-Fi, and with modules like the W5500 and the Waveshare ESP32-S3-POE-ETH, working with Ethernet is very easy.<br /><br />I hope you enjoy the video.<br /><br />Bill</p>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>DroneBot Workshop</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/2026-videos/ethernet-with-esp32/</guid>
                    </item>
				                    <item>
                        <title>Robot Arm</title>
                        <link>https://forum.dronebotworkshop.com/introductions/robot-arm/</link>
                        <pubDate>Sat, 18 Apr 2026 14:01:18 +0000</pubDate>
                        <description><![CDATA[Hello, my name is Roger, and I am trying to learn coding.  I’m not a young man.  In fact the telephone equipment I worked on, well, you can view it in museums now.  But it’s important to me ...]]></description>
                        <content:encoded><![CDATA[<p>Hello, my name is Roger, and I am trying to learn coding.  I’m not a young man.  In fact, the telephone equipment I worked on, well, you can view it in museums now.  But it’s important to me to keep learning.  So I’ve bought myself a MAX ARM, and an ESP32 learning kit. But I am struggling to get the arm working on my Mac running macOS Tahoe.  So, I hope to learn new things from the forum and perhaps get my project running. Looking forward to any help I can get.  </p>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>Hey Rog</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/introductions/robot-arm/</guid>
                    </item>
				                    <item>
                        <title>Real-time Video Focus Control System Based on Eye Squinting Degree</title>
                        <link>https://forum.dronebotworkshop.com/show-tell/real-time-video-focus-control-system-based-on-eye-squinting-degree/</link>
                        <pubDate>Fri, 17 Apr 2026 03:37:53 +0000</pubDate>
                        <description><![CDATA[Introduction
In the previous development of intelligent AI glasses for drone control, I explored the idea: whether operators’ subtle facial expressions can be used to assist in adjusting vi...]]></description>
                        <content:encoded><![CDATA[<h3><strong><b>Introduction</b></strong></h3>
<p>In the previous development of intelligent AI glasses for drone control, I explored whether operators’ subtle facial expressions could be used to assist in adjusting visual focus or zooming the field of view, so as to make manipulation more intuitive and convenient.</p>
<p>As a myopic person, the common behavior of squinting to obtain clearer vision provides intuitive inspiration for this concept. Squinting is essentially a natural visual adjustment. If it can be transformed into a contactless interactive command, the naturalness of human-computer interaction will be greatly improved.</p>
<p>In recent years, with the rapid development of computer vision, real-time face detection and eye keypoint localization have become mature and stable, laying a reliable technical foundation for this idea.</p>
<p>This project designs and implements a real-time video focus control system based on eye opening and closing degree. The system captures frames through a camera, calculates the Eye Aspect Ratio (EAR) to quantify the squinting level, and dynamically adjusts the video clarity accordingly: the image remains blurred when eyes are open (simulating myopia), and becomes sharp when squinting (simulating manual focusing). The system runs entirely on local computing with real-time response, no network required, featuring low latency and strong privacy protection.</p>
<p><strong><b>System Features</b></strong></p>
<ol>
<li>Real-time Response: The system can detect eye status in real time and adjust video clarity immediately.</li>
<li>Physiological Principle: Based on the actual squinting movement of the eyes, which conforms to natural human behavior.</li>
<li>Fully Offline: All calculations are completed locally without the need for network connection.</li>
<li>Customizable Threshold: The response threshold can be adjusted according to the eye characteristics of different users.</li>
<li>Lightweight Implementation: Based on Python and MediaPipe, with concise and efficient code.</li>
<li>Open-Source Code: The complete code is open-source, facilitating learning and improvement.</li>
</ol>
<p><strong><b>Hardware Environment</b></strong></p>
<ol>
<li>Development Board: Raspberry Pi 5 (4GB RAM)</li>
<li>Camera: Ordinary USB camera (supports 640x480 resolution)</li>
<li>Display: Any display for showing the processed video</li>
<li>Power Supply: Standard USB power supply</li>
</ol>
<p style="text-align: left">The hardware connection diagram is drawn using the Digi-Key Scheme-it online design tool, ensuring clear and standardized connection logic, which can be used directly as a reference for project deployment and debugging.</p>
<p><strong><b>Software Environment</b></strong></p>
<ol>
<li>Operating System: Raspberry Pi OS</li>
<li>Python Version: 3.9</li>
<li>Main dependent libraries and versions:
<ul>
<li>OpenCV: 4.12.0.88</li>
<li>MediaPipe: 0.10.14</li>
<li>NumPy: 2.0.2</li>
</ul>
</li>
</ol>
<p><strong>Inference Framework:</strong> MediaPipe Face Mesh (a lightweight facial key point detection framework), which is used to capture eye status in real time, quantify the degree of eye opening, and provide technical support for real-time focus adjustment of the system.</p>
<p><strong>System Workflow:</strong></p>
<ol>
<li>The camera captures real-time video frames.</li>
<li>MediaPipe Face Mesh detects the face and extracts eye key points.</li>
<li>Calculate the Eye Aspect Ratio (EAR) to reflect the degree of eye opening.</li>
<li>Calculate the focus level (ranging from 0.0 to 1.0) based on the EAR value.</li>
<li>Apply the corresponding degree of blur effect according to the focus level.</li>
<li>Display the processed video frames and provide real-time data visualization.</li>
</ol>
<p><span style="font-size: 12pt"><strong>Key Technologies</strong></span></p>
<p><strong>Eye Aspect Ratio (EAR) Calculation</strong></p>
<p>EAR is a key indicator for measuring the degree of eye opening, and its calculation formula is as follows:</p>
<p>EAR = (||p2 − p6|| + ||p3 − p5||) / (2 × ||p1 − p4||)</p>
<p>Here p1-p6 are the positions of the 6 key points around the eye:</p>
<ul>
<li>p1, p4: left and right corners of the eye</li>
<li>p2, p6: central points of the upper eyelid</li>
<li>p3, p5: central points of the lower eyelid</li>
</ul>
<pre contenteditable="false">def calculate_eye_aspect_ratio(self, eye_landmarks):
    if len(eye_landmarks) &lt; 6:
        return 0.0
    
    # Key points p1..p6, in order around the eye
    p1 = eye_landmarks[0]
    p2 = eye_landmarks[1]
    p3 = eye_landmarks[2]
    p4 = eye_landmarks[3]
    p5 = eye_landmarks[4]
    p6 = eye_landmarks[5]
    
    A = np.linalg.norm(p2 - p6)  # vertical distance (upper to lower eyelid)
    B = np.linalg.norm(p3 - p5)  # vertical distance (upper to lower eyelid)
    C = np.linalg.norm(p1 - p4)  # horizontal distance (eye corners)
    
    if C == 0:
        return 0.0
    
    ear = (A + B) / (2.0 * C)
    return ear</pre>
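<p>As a quick sanity check of the formula (not part of the project's code; the points are synthetic, not real landmarks), a stdlib-only version gives the expected value for an idealized wide-open eye:</p>
<pre contenteditable="false">from math import dist

def ear(p1, p2, p3, p4, p5, p6):
    # EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|)
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# An eye 4 units wide whose eyelid points sit 1 unit above and below the axis:
# vertical distances are 2 and 2, horizontal distance is 4, so EAR = 4/8 = 0.5
print(ear((0, 0), (1, 1), (3, 1), (4, 0), (3, -1), (1, -1)))  # 0.5</pre>
<p>Squinting shrinks the vertical distances while the horizontal one stays fixed, so the EAR falls toward 0.</p>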
<p><strong>Focus Response Curve</strong></p>
<p>The system adopts a simple linear response curve to map the EAR value to the focus level:</p>
<pre contenteditable="false">def calculate_focus(self, ear_value):
    # Clamp the EAR value to the predefined range
    clamped_ear = np.clip(ear_value, self.min_ear, self.max_ear)
    ear_range = self.max_ear - self.min_ear
    
    if ear_range &lt;= 0:
        ear_range = 0.15
    
    # Focus level: 0.0 (fully blurred) to 1.0 (fully clear)
    focus_level = (self.max_ear - clamped_ear) / ear_range
    focus_level = np.clip(focus_level, 0, 1)
    
    # Compute the blur radius
    blur_range = self.max_blur_radius - self.min_blur_radius
    target_blur_radius = self.max_blur_radius - (focus_level * blur_range)
    
    return focus_level, int(target_blur_radius)</pre>
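<p>To make the mapping concrete, here is an illustrative pure-Python restatement (not the project's code; the 0.23/0.38 thresholds are the ones suggested in the Notes section of this post):</p>
<pre contenteditable="false">def focus_from_ear(ear, min_ear=0.23, max_ear=0.38):
    # Clamp, then map linearly: low EAR (squinting) gives high focus
    clamped = min(max(ear, min_ear), max_ear)
    return (max_ear - clamped) / (max_ear - min_ear)

print(focus_from_ear(0.38))  # eyes open -> 0.0 (fully blurred)
print(focus_from_ear(0.23))  # squinting -> 1.0 (fully clear)</pre>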
<p><strong>Real-time Blur Processing</strong></p>
<p>According to the calculated blur radius, apply Gaussian blur to the video frame:</p>
<pre contenteditable="false">def apply_blur(self, frame, blur_radius):
    if blur_radius &lt;= 1:
        return frame.copy()
    
    # Make sure the kernel size is odd
    kernel_size = max(1, int(blur_radius * 2 + 1))
    if kernel_size % 2 == 0:
        kernel_size += 1
    
    # Apply Gaussian blur
    return cv2.GaussianBlur(frame, (kernel_size, kernel_size), blur_radius)</pre>
<div> </div>
<div><strong>Translation of the Main Code Structure</strong></div>
<div> </div>
<div>
<div>The system's main loop is responsible for processing each frame in real time:</div>
<div> </div>
<div>
<pre contenteditable="false">def main():
    # Initialize the camera
    camera_index = 0
    cap = cv2.VideoCapture(camera_index)
    
    # Set camera parameters
    cap.set(cv2.CAP_PROP_FRAME_WIDTH, 640)
    cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 480)
    
    # Create the focus controller
    controller = SimpleEyeSquintController()
    
    print("System started: eyes open = blurred, squinting = clear")
    
    while True:
        # Read a frame
        ret, frame = cap.read()
        if not ret:
            break
        
        # Mirror flip
        frame = cv2.flip(frame, 1)
        
        # Process the frame
        processed_frame, focus_info = controller.process_frame(frame)
        
        # Display the processed frame
        cv2.imshow('Eye Squint Focus Control', processed_frame)
        
        # Handle key presses
        key = cv2.waitKey(1) &amp; 0xFF
        if key == ord('q'):
            break
        elif key == ord('+'):
            controller.adjust_max_blur(controller.max_blur_radius + 3)
        elif key == ord('-'):
            controller.adjust_max_blur(controller.max_blur_radius - 3)
    
    # Release resources
    cap.release()
    controller.release()
    cv2.destroyAllWindows()</pre>
</div>
</div>
<p>&nbsp;</p>
<div>
<h3>Data Smoothing Processing</h3>
<p>To avoid video flickering caused by fluctuations in the EAR value, the system uses a moving average for data smoothing processing:</p>
<pre contenteditable="false">def __init__(self):
    # EAR value smoothing
    self.EAR_SMOOTHING = 5
    self.left_ear_history = deque(maxlen=self.EAR_SMOOTHING)
    self.right_ear_history = deque(maxlen=self.EAR_SMOOTHING)
    
    # Blur radius smoothing
    self.focus_smoothing = 0.3
    self.blur_history = deque(maxlen=3)</pre>
</div>
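<p>Applying the moving average is then a one-liner per frame. A minimal stdlib-only sketch (the class name and interface here are illustrative, not the project's actual code):</p>
<pre contenteditable="false">from collections import deque

class EarSmoother:
    # Moving average over the last few raw EAR values; a window of 5
    # matches the EAR_SMOOTHING setting above.
    def __init__(self, window=5):
        self.history = deque(maxlen=window)

    def update(self, raw_ear):
        self.history.append(raw_ear)
        return sum(self.history) / len(self.history)</pre>
<p>Using one smoother per eye keeps the left and right histories independent, matching the two deques above.</p>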
<div> </div>
<h3>Notes</h3>
<ul>
<li>Parameter Tuning: EAR Threshold Setting</li>
<li>Through experiments, the appropriate EAR threshold range for most people has been identified:
<ul>
<li>Eye-open State (Blurred): EAR ≈ 0.38</li>
<li>Squinting State (Clear): EAR ≈ 0.23</li>
</ul>
</li>
</ul>
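<p>Fixed 0.38/0.23 defaults will not suit every user, and per-user thresholds could be derived from two short recordings. A hypothetical sketch (function name, margin value, and interface are all assumptions, not project code):</p>
<pre contenteditable="false">def calibrate_thresholds(open_samples, squint_samples, margin=0.02):
    # Average each recording, then pull the thresholds slightly inward
    # so normal jitter does not pin the focus level at 0 or 1.
    max_ear = sum(open_samples) / len(open_samples) - margin
    min_ear = sum(squint_samples) / len(squint_samples) + margin
    return min_ear, max_ear</pre>
<p>The returned pair would replace <code>self.min_ear</code> and <code>self.max_ear</code> used by <code>calculate_focus</code>.</p>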
<h3>Response Sensitivity Adjustment</h3>
<div>The system response can be changed by adjusting the following parameters:</div>
<ul>
<li>EAR_SMOOTHING: Increasing it makes the response smoother; decreasing it makes the response more sensitive.</li>
<li>focus_smoothing: Controls the smoothness of blur transition.</li>
<li>max_blur_radius: The maximum blur level, simulating different "degrees of myopia".</li>
</ul>
<h3>Effect Demonstration</h3>
<h4>Operation Effect</h4>
<div>When the system is running, the following information will be displayed on the screen:</div>
<ul>
<li>Real-time EAR value (eye opening degree)</li>
<li>Current focus level (0%-100%)</li>
<li>Current blur radius</li>
<li>System status (BLURRY or CLEAR)</li>
<li>Focus level progress bar</li>
</ul>
<h2>Interaction Experience</h2>
<ul>
<li>Normal eye opening: The video is presented in a blurred state to simulate myopia.</li>
<li>Gradually squinting: As the degree of squinting deepens, the video gradually becomes clearer.</li>
<li>Full squinting: The video reaches the clearest state.</li>
<li>Opening eyes again: The video gradually returns to a blurred state.</li>
</ul>
<h2>Control Functions</h2>
<ul>
<li>Press the "q" key: Exit the program.</li>
<li>Press the "+" key: Increase the maximum blur level (simulate more severe myopia).</li>
<li>Press the "-" key: Decrease the maximum blur level (simulate milder myopia).</li>
</ul>
<h2>Performance Optimization</h2>
<h3>1. Computational Efficiency</h3>
<ul>
<li>MediaPipe Face Mesh runs efficiently on the CPU.</li>
<li>Gaussian blur is implemented with OpenCV optimization.</li>
<li>Only necessary eye key points are calculated to reduce computational load.</li>
</ul>
<h3>2. Memory Usage</h3>
<ul>
<li>Use deque to implement a sliding window, ensuring constant memory usage.</li>
<li>Process in real time without storing historical frames.</li>
<li>Lightweight UI overlay, without significant additional overhead.</li>
</ul>
<h3>3. Real-Time Performance</h3>
<ul>
<li>Achieves 15-20 FPS on Raspberry Pi 5.</li>
<li>Response latency &lt; 100ms.</li>
</ul>
<h2>Application Scenarios</h2>
<ol>
<li><strong>Vision Training</strong>: Through the feedback loop of squinting for clarity, help users understand their ability to control their eye muscles.</li>
<li><strong>Interactive Demonstration</strong>: Vividly show how myopic people can improve their vision by squinting, for popular-science education in ophthalmology.</li>
<li><strong>Game Control</strong>: Use squinting as a novel game control method to enhance immersion.</li>
<li><strong>Assistive Technology</strong>: Provide a contactless interaction method for people with special needs.</li>
</ol>
<h2>Improvement Directions</h2>
<ol>
<li><strong>Personalized Calibration</strong>: Add a calibration function to automatically adapt to individual differences between users.</li>
<li><strong>Multi-Level Myopia Simulation</strong>: Provide simulated myopia of different degrees, from mild to severe.</li>
<li><strong>Astigmatism Simulation</strong>: Add directional blur to more realistically simulate astigmatism.</li>
<li><strong>Eye Fatigue Detection</strong>: Warn users about prolonged squinting to prevent eye fatigue.</li>
<li><strong>Cross-Platform Support</strong>: Port to mobile devices and embedded platforms.</li>
</ol>
<h2>Summary</h2>
<div>This project implements a real-time video focus control system based on the degree of eye squinting. By combining computer vision and image processing techniques, the system detects the user's eye state in real time and dynamically adjusts the video clarity according to the degree of squinting. The system is not only simple and efficient in its technical implementation but also has practical application value and educational significance.</div>
<div> </div>
<div>The core values of the project are as follows:</div>
<ul>
<li>Technological Innovation: Convert physiological movements into interactive commands.</li>
<li>Educational Significance: Intuitively show the impact of myopia on vision.</li>
<li>Practical Value: Provide new ideas for contactless interaction.</li>
<li>Open-Source Sharing: The complete code is open-source to promote technical exchange.</li>
</ul>
<div>With the continuous development of computer vision technology, interaction methods based on physiological signals will become more and more common. This project provides a simple and effective example for this direction, demonstrating how technology can combine with natural human behaviors to create novel and practical applications.</div>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>Yassin</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/show-tell/real-time-video-focus-control-system-based-on-eye-squinting-degree/</guid>
                    </item>
				                    <item>
                        <title>It&#039;s C++ not C/C++</title>
                        <link>https://forum.dronebotworkshop.com/c-plus-plus/its-c-not-c-c/</link>
                        <pubDate>Thu, 16 Apr 2026 01:35:40 +0000</pubDate>
                        <description><![CDATA[Introduction

This may be well known but I see evidence of it in a number of posts and feel compelled to mention it.
A misconception about the Arduino IDE is that it uses a C/C++ compiler...]]></description>
                        <content:encoded><![CDATA[<h2>Introduction</h2>
<blockquote>
<p>This may be well known but I see evidence of it in a number of posts and feel compelled to mention it.</p>
<p>A misconception about the Arduino IDE is that it uses a C/C++ compiler.</p>
<p>It doesn't.</p>
<p>The Arduino IDE uses a variant of gcc, a compiler that will compile the C++ or C language. The compiler doesn't compile the C/C++ language.</p>
<p>There's no C/C++ programming language.</p>
<p>The Arduino IDE compiler default language is C++.</p>
<p>The Arduino IDE can detect a C file (via the file extension) and override the compiler language setting to accept only the C language.</p>
</blockquote>
<h2>Definitive Test</h2>
<blockquote>
<p>Prove this to yourself with the following:</p>
<p>Create an Arduino sketch (<code>cpp_only.ino</code>) that has the files</p>
<ul>
<li><code>cpp_only.ino</code></li>
<li><code>c_file.c</code></li>
<li><code>cpp_file.cpp</code></li>
<li><code>cc_file.cc</code></li>
</ul>
<p>Each file contains</p>
<pre contenteditable="false">#ifndef __cplusplus
#error This is not a C++ compiler
#endif
// ...everything else
</pre>
<p>Compiling the sketch produces the error only for the <code>c_file.c</code>:</p>
<pre contenteditable="false">1&gt;...\cpp_only\c_file.c(2,2): error GD8C7B0A7: #error This is not a C++ compiler
1&gt;    2 | #error This is not a C++ compiler
1&gt;      |  ^~~~~
1&gt;
</pre>
<p>All the other files will successfully compile.</p>
</blockquote>
<h2>What this means</h2>
<blockquote>
<p>As a general rule, the sketch .ino file is a C++ file, and unless you're using a marked C-language file, you're programming in the C++ language. Specifically, this means that the sketch .ino file can freely use C++ features. FWIW, I have never seen a sketch or library that uses C; you have to go to some effort to do so.</p>
</blockquote>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>TFMcCarthy</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/c-plus-plus/its-c-not-c-c/</guid>
                    </item>
				                    <item>
                        <title>Raspberry Pi End-Cloud Collaboration --- Intelligent AI Glasses</title>
                        <link>https://forum.dronebotworkshop.com/show-tell/raspberry-pi-end-cloud-collaboration-intelligent-ai-glasses/</link>
                        <pubDate>Tue, 14 Apr 2026 02:16:04 +0000</pubDate>
                        <description><![CDATA[Hey everyone, I’d like to share a new project I’ve been working on today. I’d really appreciate any valuable feedback from you all. Below are the details of my project—please check it out!
...]]></description>
                        <content:encoded><![CDATA[<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-KxI9fv9CydTQHacgK3KcD2lUnJf">Hey everyone, I’d like to share a new project I’ve been working on today. I’d really appreciate any valuable feedback from you all. Below are the details of my project—please check it out!</div>
</div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div> </div>
<div class=" old-record-id-PGg1fsdKNdPoBHcfcIQc76mmnVg"><strong>Introduction</strong></div>
</div>
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-Fiypfe6ERdyLpqcpI5xcfMCinqg">This project is specifically developed to be used with drones, aiming to create a pairable AI glasses solution that can collaborate with UAVs. Focusing on the collaborative architecture of on-device AI acceleration and cloud-based vision large models, it addresses the opportunities and challenges brought by the rapid popularization of generative AI on UAV edge devices.</div>
</div>
</div>
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-OlXofgJ3OdL8xBcWUq8cWxRInqd">Using Raspberry Pi as the core hardware, the project builds a lightweight, low-latency AI glasses prototype with real-time environmental perception and intelligent interaction capabilities. By integrating efficient real-time inference of on-device NPU with the powerful generation and understanding capabilities of the cloud, it forms a complete closed loop from real-time local data processing to in-depth cloud analysis and decision-making. This provides a practical end-cloud collaborative solution for UAV FPV assistance, situational awareness, AR-enhanced flight, real-time image information overlay and other scenarios.</div>
</div>
</div>
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-L0vafigC6dwjArcGKoWc2wHLn3g"><strong>Project Features</strong></div>
</div>
</div>
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-U6b5ffRgNdV9sFcmhAtcpOzgniy">&#x2611;&#xfe0f; On-device NPU real-time vision detection</div>
</div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-LuXmfsAoVdFu6AcEtrscb1rwnlb">&#x2611;&#xfe0f; Deep semantic analysis via cloud large models</div>
</div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-Suoqf349wdPL3GcTqppcVJ0Vnfe">&#x2611;&#xfe0f; Only upload objects of interest to the cloud, saving bandwidth and protecting the privacy of full images</div>
</div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-Cf5bfNH3idEJmvcbAuVcCLGUnlf">&#x2611;&#xfe0f; Designed for pairing and collaboration with drones</div>
</div>
</div>
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-NIZYflmsldUNgscYicYcN104nVe"><strong>Hardware Structure</strong></div>
</div>
</div>
<p>Bill of Materials:</p>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-SufrfMiTkdWQUsckotJcCns9nud">- Raspberry Pi 5 Motherboard</div>
</div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-JZoHfTLq6dvTaBc0wScc6rKsnJY">- Raspberry Pi AI HAT+ (13 TOPS NPU)</div>
</div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-B6EUfcGaYdDbr0caC3XcDvtPnEg">- HDMI LCD Display</div>
</div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-MmjIfyg5JdVgv7c2yrecO9Wjn2b">- IMX219 USB Camera</div>
</div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-DoZwfVFwide6ECcj1eCcQs53nNU">- USB HID Buttons</div>
</div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-JZuYfMoIWdF5FKcDS4pcVmqsnje"><strong>Software Framework</strong></div>
</div>
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-E41rfz1epd8epAcFSS6cwl69nxg">Operating Environment:</div>
</div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-AnOxfTPwUdkuUocsX6XceDounbe">- Python 3.11</div>
<div> </div>
</div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-S4SifQSiUdYgF3c0SDJcSyNdn5c">Software Models:</div>
</div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-PQQNfJfybdTegect4Mpcm89InFe">- On-device Small Model: YOLO</div>
</div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-H8YCffTuNdeehQcb4HBcIGlSnrf">- Cloud-based Large Model: Qwen3-VL-Plus</div>
<div> </div>
</div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-V8Dif2Y3mdaVcTcE9MMc5mMXnee">Collaboration Mechanism:</div>
</div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-PqWCfZyeNdf1hFcNBAhcpIpvnyX">- HTTPS Network Communication</div>
</div>
</div>
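<div>The end-cloud collaboration above can be sketched schematically; the function names below are placeholders for illustration, not the project's actual API:</div>
<div><pre style="color: #3366ff">
def end_cloud_loop(detect, crop, analyze_in_cloud, display):
    """Local NPU detects; only the cropped target goes to the cloud over HTTPS."""
    frame, detections = detect()           # on-device NPU inference
    target = crop(frame, detections)       # crop only the object of interest
    if target is not None:
        display(analyze_in_cloud(target))  # HTTPS call to the cloud model
        return True
    return False

# stub wiring that shows the data flow
shown = []
ok = end_cloud_loop(
    detect=lambda: ("frame", ["person"]),
    crop=lambda frame, dets: "person-crop" if dets else None,
    analyze_in_cloud=lambda c: "analysis of " + c,
    display=shown.append,
)
</pre></div>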
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class=" old-record-id-EtHEfI0Wod2g5zcZ4JDctRVInng"><strong>AI HAT+ Installation and Configuration</strong></div>
<div> </div>
</div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-LSO2f2Ixsd0nvmcAaJUcB6tmnyc">Introduction:</div>
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-CU1HfYXr3dgzDLcJHO9cnDQzn3e">The AI HAT+ is built around the Hailo-8L and Hailo-8 neural network inference accelerators and comes in two models, 13 TOPS and 26 TOPS. This project uses the 13 TOPS model, which suits medium workloads and performs similarly to the Raspberry Pi AI Kit. The AI HAT+ communicates over the Raspberry Pi 5's PCIe interface; the host automatically detects the on-board Hailo accelerator and offloads supported AI computing tasks to the NPU.</div>
</div>
</div>
</div>
</div>
</div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div> </div>
<div class="ace-line ace-line old-record-id-QkRmfGUmFd8KdLc1NZBcfRyGnCd">Hardware Installation:</div>
</div>
<div> </div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-BiycfTJCndbVOZc1Z6GcaRWDnNb">Connect the AI HAT+ to the Raspberry Pi 5's PCIe interface, seat it on the GPIO pin header, and secure it with the supplied standoffs.</div>
</div>
</div>
<div>
</div>
<div><strong>Software Installation:</strong>
<div> </div>
Update System Software:</div>
<div><span style="color: #00ccff">sudo apt update &amp;&amp; sudo apt full-upgrade</span></div>
<div>
</div>
<p><strong>Check EEPROM Firmware Version:</strong></p>
<div><span style="color: #00ccff">sudo rpi-eeprom-update</span></div>
<div>
<div data-page-id="NOLcfZ72vd0Az6cGr1ncpO8Nnth" data-lark-html-role="root" data-docx-has-block-data="false">
<div class="ace-line ace-line old-record-id-F62sf9FmndMDahco8QcchpJinNh"><strong>Install NPU Dependencies:</strong></div>
<div>
<div><span style="color: #00ccff">sudo apt install hailo-all</span></div>
<div>
<p><strong>Restart to Take Effect:</strong></p>
<div><span style="color: #00ccff">sudo reboot</span></div>
<div> </div>
<div>
<div><strong>Verify Installation:</strong></div>
<div><span style="color: #00ccff">hailortcli fw-control identify</span>
</div>
<div><strong><span style="color: #000000">Run the Demo:</span></strong></div>
<div><span style="color: #00ccff">sudo apt update &amp;&amp; sudo apt install rpicam-apps<br />rpicam-hello -t 0 --post-process-file /usr/share/rpi-camera-assets/hailo_yolov6_inference.json</span></div>
<div>
</div>
</div>
<div><strong>Software Development:</strong>
<div> </div>
This project's software is a secondary development based on the official detection routine. To let the wearer select a target of interest, the first step is to draw a crosshair at the center of the screen.</div>
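<div>As a minimal sketch of the crosshair step, assuming an OpenCV frame (sizes and colors here are illustrative):</div>
<div><pre style="color: #3366ff">
import cv2
import numpy as np

def draw_crosshair(frame, size=20, color=(0, 255, 0), thickness=2):
    """Draw a crosshair at the center of the frame and return the center point."""
    h, w = frame.shape[:2]
    cx, cy = w // 2, h // 2
    cv2.line(frame, (cx - size, cy), (cx + size, cy), color, thickness)  # horizontal bar
    cv2.line(frame, (cx, cy - size), (cx, cy + size), color, thickness)  # vertical bar
    return cx, cy

frame = np.zeros((480, 640, 3), dtype=np.uint8)
cx, cy = draw_crosshair(frame)
</pre></div>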
<div>
<div>Next, add a text display layer so that the cloud model's in-depth analysis results can be presented on screen.</div>
<div>
</div>
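<div>If you composite frames yourself with OpenCV, one simple way to build such a text layer is to wrap the reply text and render it line by line (a hedged sketch; fonts and positions are illustrative):</div>
<div><pre style="color: #3366ff">
import textwrap
import cv2
import numpy as np

def draw_text_overlay(frame, text, width=40, line_height=20):
    """Render wrapped analysis text onto the lower part of the frame."""
    lines = textwrap.wrap(text, width=width)
    y = frame.shape[0] - line_height * len(lines) - 10
    for line in lines:
        cv2.putText(frame, line, (10, y), cv2.FONT_HERSHEY_SIMPLEX,
                    0.5, (255, 255, 255), 1, cv2.LINE_AA)
        y += line_height
    return lines

frame = np.zeros((480, 640, 3), dtype=np.uint8)
lines = draw_text_overlay(frame, "cloud model analysis result goes here")
</pre></div>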
</div>
</div>
<div>Since this project does not upload the entire image to the cloud for analysis (which would waste network bandwidth and could leak background privacy information), the program crops out the image of the target of interest. The key code for cropping the target image is as follows:</div>
</div>
</div>
</div>
<div> </div>
<div><pre style="color: #3366ff">
# Check whether a detection box contains the center point (normalized coordinates)
if (bbox.xmin() &lt;= frame_center_x &lt;= bbox.xmax() and
        bbox.ymin() &lt;= frame_center_y &lt;= bbox.ymax()):
    # Calculate the bbox area
    bbox_area = bbox.width() * bbox.height()

    # Keep the smallest detection that covers the center
    if bbox_area &lt; min_area_in_center:
        min_area_in_center = bbox_area
        center_detection = {
            'detection': detection,
            'bbox': bbox,
            'confidence': confidence,
            'area': bbox_area
        }

if center_detection is not None:
    center_bbox = center_detection['bbox']

    # Convert normalized coordinates to pixel coordinates
    xmin = int(center_bbox.xmin() * frame.shape[1])
    ymin = int(center_bbox.ymin() * frame.shape[0])
    xmax = int(center_bbox.xmax() * frame.shape[1])
    ymax = int(center_bbox.ymax() * frame.shape[0])

    # Clamp the coordinates to the valid range
    xmin = max(0, xmin)
    ymin = max(0, ymin)
    xmax = min(frame.shape[1] - 1, xmax)
    ymax = min(frame.shape[0] - 1, ymax)

    if xmax &gt; xmin and ymax &gt; ymin:
        # Crop the image
        cropped_frame = frame[ymin:ymax, xmin:xmax]
        # Convert to BGR and save
        cropped_bgr = cv2.cvtColor(cropped_frame, cv2.COLOR_RGB2BGR)
        cv2.imwrite("test.jpg", cropped_bgr)
        trig_ai_vl_event.set()
</pre></div>
<div>
<p>The cropped target-of-interest image is then sent to the cloud-based large model for in-depth analysis. The key code is as follows:</p>
<div><pre style="color: #3366ff">
def ai_vl_thread():
    # Initialize the OpenAI-compatible client
    client = OpenAI(
        api_key="sk-xxxxxxxxxxxxxx",
        base_url="https://dashscope.aliyuncs.com/compatible-mode/v1"
    )
    while True:
        trig_ai_vl_event.wait()
        trig_ai_vl_event.clear()

        # Convert the local image to base64
        base64_image = encode_image("test.jpg")

        # Create the chat completion request
        completion = client.chat.completions.create(
            model="qwen3-vl-plus",
            messages=[
                {
                    "role": "user",
                    "content": [
                        {"type": "image_url",
                         "image_url": {"url": f"data:image/jpeg;base64,{base64_image}"}},
                        {"type": "text", "text": "Describe this object."},  # example prompt
                    ],
                },
            ],
            stream=True
        )

        # Stream the reply and push it to the on-screen text layer
        ai_vl_reply = ""
        for chunk in completion:
            if chunk.choices:
                delta = chunk.choices[0].delta
                if hasattr(delta, 'content') and delta.content:
                    ai_vl_reply += delta.content
                    display_text_pipeline.set_property("text", ai_vl_reply)
</pre></div>
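<p>The encode_image helper called above is not shown in the routine; a minimal version might look like this:</p>
<div><pre style="color: #3366ff">
import base64

def encode_image(image_path):
    """Read an image file and return its contents as a base64 string."""
    with open(image_path, "rb") as f:
        return base64.b64encode(f.read()).decode("utf-8")
</pre></div>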
<div> </div>
<div>After completing the hardware installation and software configuration, the project now achieves the basic functions of drone-paired AI glasses. In actual use, however, several key challenges and open questions remain, and we would welcome suggestions from fellow engineers:</div>
<div> </div>
<ol>
<li>How can compatibility between the AI HAT+ and the Raspberry Pi board be further improved, to avoid unstable NPU operation caused by firmware version mismatches?</li>
<li>In real drone flight scenarios, how can the latency of cloud model calls be reduced so that analysis results come back in time to assist flight decisions?</li>
<li>How should on-device AI inference performance be balanced against device power consumption to support long outdoor flight sessions?</li>
<li>For the privacy of drone flight data, what more targeted measures can be taken to prevent leakage of sensitive information?</li>
</ol>
<div> </div>
<div>We sincerely look forward to your opinions and solutions to the problems above, so that we can keep improving the project's stability and practicality and better realize the pairing and collaboration between the AI glasses and drones.</div>
</div>
<div> </div>
<div>Cheers,<br />Yassin</div>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>Yassin</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/show-tell/raspberry-pi-end-cloud-collaboration-intelligent-ai-glasses/</guid>
                    </item>
				                    <item>
                        <title>converting from breadboard to completed project question</title>
                        <link>https://forum.dronebotworkshop.com/electronic-components/converting-from-breadboard-to-completed-project-question/</link>
                        <pubDate>Mon, 13 Apr 2026 22:11:20 +0000</pubDate>
                        <description><![CDATA[Hello,
I&#039;m sure this is a dumb question but I am new to Arduino.
The question is once I have a breadboard working correctly with the correct sketch in the Arduino Uno, How do I transition ...]]></description>
                        <content:encoded><![CDATA[<p>Hello,</p>
<p>I'm sure this is a dumb question but I am new to Arduino.</p>
<p>The question is: once I have a breadboard working correctly with the correct sketch on the Arduino Uno, how do I transition to a real-world implementation?</p>
<p>I know how to wire up the hardware, but with the Uno once it is disconnected from the USB on the computer and powered down, does it retain the program so when power is applied it keeps running the script?</p>
<p>Does it keep doing this after it is powered down and up again?</p>
<p>Sorry I'm not clear on the process.</p>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>Lom</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/electronic-components/converting-from-breadboard-to-completed-project-question/</guid>
                    </item>
				                    <item>
                        <title>Ble project: what version of the esp32 ble library</title>
                        <link>https://forum.dronebotworkshop.com/help-wanted/ble-project-what-version-of-the-esp32-ble-library/</link>
                        <pubDate>Mon, 13 Apr 2026 20:20:25 +0000</pubDate>
                        <description><![CDATA[Hi, I copied the source of the BLE client from the dronebot workshop and I keep getting compiler errors that classble has not a member SETmtu. When I comment it I get another weird error.
&amp;...]]></description>
                        <content:encoded><![CDATA[<p>Hi, I copied the source of the BLE client from the dronebot workshop and I keep getting compiler errors that classble has not a member SETmtu. When I comment it I get another weird error.</p>
<p>&nbsp;</p>
<p>I know that there are lots of versions of libraries written by different programmers. What version did you use?</p>
<p>&nbsp;</p>
<p>deedee</p>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>Deedee</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/help-wanted/ble-project-what-version-of-the-esp32-ble-library/</guid>
                    </item>
				                    <item>
                        <title>I Got a $1 8051 Chip to Work with Arduino IDE</title>
                        <link>https://forum.dronebotworkshop.com/show-tell/i-got-a-1-8051-chip-to-work-with-arduino-ide/</link>
                        <pubDate>Sat, 11 Apr 2026 06:23:39 +0000</pubDate>
                        <description><![CDATA[Hey everyone!

8051 chips from STC Micro are ridiculously cheap — under $1 — and they actually have pretty solid specs. But they&#039;re a pain for beginners: complicated setup, and to this day...]]></description>
                        <content:encoded><![CDATA[<div><span><span> </span></span>
<span>Hey everyone!</span></div>
<br />
<div><span>8051 chips from STC Micro are ridiculously cheap — under $1 — and they actually have pretty solid specs. But they're a pain for beginners: complicated setup, and to this day there's no open-source C++ compiler for 8051. The only option is SDCC, which only supports C. You can use Arduino IDE with it, but without C++, most Arduino libraries just won't compile.</span></div>
<br />
<div>
<div>
<div><span>So I decided to fix that. It took a while, but I now have a full Arduino Core for the STC8H8K64U. You install it through the Board Manager, write normal Arduino code — digitalWrite, analogRead, Serial, etc. — and it just works. No external programmer needed either, just a USB cable, same as an Uno.</span></div>
</div>
</div>
<br />
<div><span>The chip itself is honestly pretty nice for the price: 64KB flash, 8KB RAM, native USB, 4 UARTs, 12-bit ADC, SPI, I2C, PWM, up to 45MHz. I even got FreeRTOS running on it, so you can do multitasking.</span></div>
<br />
<div><span>The fun part is how it works under the hood — since there's no C++ compiler for 8051, I use RISC-V GCC to compile the Arduino code, then a tiny emulator written in 8051 assembly interprets it on the chip. Sounds crazy, but it works!</span></div>
<br />
<div><span>It's all open-source if anyone wants to check it out: https://github.com/thevien257/STC_Arduino_Core</span></div>
<br />
<div><span>Would love to hear what you all think!</span></div>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>thevien257</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/show-tell/i-got-a-1-8051-chip-to-work-with-arduino-ide/</guid>
                    </item>
				                    <item>
                        <title>Arduino on Sub-$1 Microcontrollers — STC8H 8051 Series</title>
                        <link>https://forum.dronebotworkshop.com/suggest-content/arduino-on-sub-1-microcontrollers-stc8h-8051-series/</link>
                        <pubDate>Sat, 11 Apr 2026 06:10:33 +0000</pubDate>
                        <description><![CDATA[Hi Bill,

I&#039;d love to see a video about using the Arduino IDE with STC8H microcontrollers.

These are modern 8051 chips that cost under $1, but they&#039;re surprisingly capable — 64KB flash,...]]></description>
                        <content:encoded><![CDATA[<div>
<div><span>Hi Bill,</span></div>
<br />
<div><span>I'd love to see a video about using the Arduino IDE with STC8H microcontrollers.</span></div>
<br />
<div><span>These are modern 8051 chips that cost under $1, but they're surprisingly capable — 64KB flash, 8KB RAM, built-in USB, 4 UARTs, 12-bit ADC, SPI, I2C, PWM, and they run at up to 45MHz. No external programmer needed — you just plug in a USB cable and hit upload, same as an Arduino Uno.</span></div>
<br />
<div><span>There's an open-source Arduino Core that makes these chips fully Arduino-compatible. You write standard Arduino code — digitalWrite, analogRead, Serial.print, Wire, SPI — and it just works. Most Arduino libraries are compatible too, so you're not starting from scratch.</span></div>
<br />
<div><span>It even has a FreeRTOS port, so you can run multiple tasks on a chip that costs less than a cup of coffee.</span></div>
<br />
<div><span>The whole project is open-source: https://github.com/thevien257/STC_Arduino_Core</span></div>
<br />
<div><span>I think the community would find it interesting — a legit Arduino-compatible board for under a dollar, no programmer, no adapters, just USB and go.</span></div>
<br />
<div><span>Thanks!</span></div>
</div>]]></content:encoded>
						                            <category domain="https://forum.dronebotworkshop.com/"></category>                        <dc:creator>thevien257</dc:creator>
                        <guid isPermaLink="true">https://forum.dronebotworkshop.com/suggest-content/arduino-on-sub-1-microcontrollers-stc8h-8051-series/</guid>
                    </item>
							        </channel>
        </rss>
		