<?xml version='1.0'?><rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:georss="http://www.georss.org/georss" xmlns:atom="http://www.w3.org/2005/Atom" >
<channel>
	<title><![CDATA[PublMe - Space: Posted Reaction by PublMe bot in PublMe]]></title>
	<link>https://publme.space/reactions/v/57679</link>
	<atom:link href="https://publme.space/reactions/v/57679" rel="self" type="application/rss+xml" />
	<description><![CDATA[]]></description>
	
	<item>
	<guid isPermaLink="true">https://publme.space/reactions/v/57679</guid>
	<pubDate>Sat, 23 Aug 2025 22:00:31 +0200</pubDate>
	<link>https://publme.space/reactions/v/57679</link>
	<title><![CDATA[Posted Reaction by PublMe bot in PublMe]]></title>
	<description><![CDATA[
<p>LeRobot Brings Autonomy to Hobby Robots</p>
<div><img width="800" height="449" src="https://hackaday.com/wp-content/uploads/2025/08/Running-AI-robotics-experiments-at-home-with-LeRobot-and-SO-ARM100-35-25-screenshot-banner.png?w=800" alt="" srcset="https://hackaday.com/wp-content/uploads/2025/08/Running-AI-robotics-experiments-at-home-with-LeRobot-and-SO-ARM100-35-25-screenshot-banner.png 1011w, https://hackaday.com/wp-content/uploads/2025/08/Running-AI-robotics-experiments-at-home-with-LeRobot-and-SO-ARM100-35-25-screenshot-banner.png?resize=250, 140 250w, https://hackaday.com/wp-content/uploads/2025/08/Running-AI-robotics-experiments-at-home-with-LeRobot-and-SO-ARM100-35-25-screenshot-banner.png?resize=400, 225 400w, https://hackaday.com/wp-content/uploads/2025/08/Running-AI-robotics-experiments-at-home-with-LeRobot-and-SO-ARM100-35-25-screenshot-banner.png?resize=800, 449 800w" data-attachment-id="805605" data-permalink="https://hackaday.com/2025/08/23/lerobot-brings-autonomy-to-hobby-robots/running-ai-robotics-experiments-at-home-with-lerobot-and-so-arm100-35-25-screenshot-banner/" data-orig-file="https://hackaday.com/wp-content/uploads/2025/08/Running-AI-robotics-experiments-at-home-with-LeRobot-and-SO-ARM100-35-25-screenshot-banner.png" data-orig-size="1011,568" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="Running AI robotics experiments at home with LeRobot and SO-ARM100 35-25 screenshot-banner" data-image-description="" data-image-caption="" 
data-medium-file="https://hackaday.com/wp-content/uploads/2025/08/Running-AI-robotics-experiments-at-home-with-LeRobot-and-SO-ARM100-35-25-screenshot-banner.png?w=400" data-large-file="https://hackaday.com/wp-content/uploads/2025/08/Running-AI-robotics-experiments-at-home-with-LeRobot-and-SO-ARM100-35-25-screenshot-banner.png?w=800"></div><p>Robotic arms have a lot in common with CNC machines in that they are usually driven by a fixed script of specific positions to move to, and actions to perform. Autonomous behavior isn’t the norm, especially not for hobby-level robotics. That’s changing rapidly with <a rel="nofollow" href="https://github.com/huggingface/lerobot" target="_blank">LeRobot</a>, an open-source machine learning framework from the Hugging Face community.</p><figure aria-describedby="caption-attachment-805607"><a rel="nofollow" href="https://hackaday.com/wp-content/uploads/2025/08/so101.webp"><img data-attachment-id="805607" data-permalink="https://hackaday.com/2025/08/23/lerobot-brings-autonomy-to-hobby-robots/so101/" data-orig-file="https://hackaday.com/wp-content/uploads/2025/08/so101.webp" data-orig-size="2048,1536" data-comments-opened="1" data-image-meta="{&quot;aperture&quot;:&quot;0&quot;,&quot;credit&quot;:&quot;&quot;,&quot;camera&quot;:&quot;&quot;,&quot;caption&quot;:&quot;&quot;,&quot;created_timestamp&quot;:&quot;0&quot;,&quot;copyright&quot;:&quot;&quot;,&quot;focal_length&quot;:&quot;0&quot;,&quot;iso&quot;:&quot;0&quot;,&quot;shutter_speed&quot;:&quot;0&quot;,&quot;title&quot;:&quot;&quot;,&quot;orientation&quot;:&quot;0&quot;}" data-image-title="so101" data-image-description="" data-image-caption="&lt;p&gt;The SO-101 arm is an economical way to get started.&lt;/p&gt;" data-medium-file="https://hackaday.com/wp-content/uploads/2025/08/so101.webp?w=400" data-large-file="https://hackaday.com/wp-content/uploads/2025/08/so101.webp?w=800" src="https://hackaday.com/wp-content/uploads/2025/08/so101.webp?w=400" alt="" width="400" height="300" 
srcset="https://hackaday.com/wp-content/uploads/2025/08/so101.webp 2048w, https://hackaday.com/wp-content/uploads/2025/08/so101.webp?resize=250, 188 250w, https://hackaday.com/wp-content/uploads/2025/08/so101.webp?resize=400, 300 400w, https://hackaday.com/wp-content/uploads/2025/08/so101.webp?resize=800, 600 800w, https://hackaday.com/wp-content/uploads/2025/08/so101.webp?resize=1536, 1152 1536w"></a><figcaption>The SO-101 arm is an economical way to get started.</figcaption></figure><p>If a quick browse of the project page still leaves you with questions, you’re not alone. Thankfully, [Ilia] has <a rel="nofollow" href="https://www.youtube.com/watch?v=DeBLc2D6bvg" target="_blank">a fantastic video that explains and demonstrates</a> the fundamentals wonderfully. In it, he shows how LeRobot allows one to train an economical 3D-printed robotic arm by example, teaching it to perform a task autonomously. In this case, the task is picking up a ball and putting it into a cup.</p><p>[Ilia] first builds a dataset by manually operating the arm to pick up a ball and place it in a cup. Then, with a dataset consisting of only about fifty such examples, he creates a machine learning model capable of driving the arm to autonomously pick up a ball and place it in a cup, regardless of where the ball and cup actually are. It even gracefully handles things like color changes and [Ilia] moving the cup and ball around mid-task. You can <a rel="nofollow" href="https://youtu.be/DeBLc2D6bvg?t=2056" target="_blank">skip directly to 34:16</a> to see this autonomous behavior in action, but we do recommend watching the whole video for a highly accessible yet deeply technical overview.</p><p></p><p>LeRobot is a very flexible framework, capable of much more than just doing imitation learning on 3D-printed low-cost robot arms. But the main goal is to make this sort of thing accessible to just about anyone, as [Ilia] aptly demonstrates. 
We have seen tons of <a rel="nofollow" href="https://hackaday.com/2025/08/20/building-a-robotic-arm-without-breaking-the-bank/">high-quality DIY robot arms</a>, and since the LeRobot framework is developing quickly and isn’t tied to any particular hardware, it might be powering the next robot project sooner than you think.</p>]]></description>
	<dc:creator>PublMe bot</dc:creator>
</item>

</channel>
</rss>