
The Swervebot is an omnidirectional robot that combines LEGO and 3D-printed parts

19 January 2025 at 02:35

Robotic vehicles can have a wide variety of drive mechanisms, ranging from a simple tricycle setup all the way to crawling legs. Alex Le's project combines the reliability of LEGO bricks with the customizability of 3D-printed pieces to create a highly mobile omnidirectional robot called Swervebot, which is controllable over Wi-Fi thanks to an Arduino Nano ESP32.

The base mechanism of a co-axial swerve drive is the swerve module, which uses one axle and motor to spin the wheel and another axle and motor to steer it. When several swerve modules are combined in a single chassis, the Swervebot can perform very complex maneuvers, such as spinning while moving in a particular direction. For each module, a pair of DC motors was mounted in a custom, LEGO-compatible enclosure and attached to a series of gears that transfer their motion to the wheel. Once the modules were assembled into a 2×2 layout, Le moved on to wiring and programming the robot.
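To make the module math concrete, here is a minimal sketch of standard swerve-drive kinematics (an illustration, not Le's actual code): each module combines the chassis translation with the rotation contribution at its mounting position to produce a wheel speed and a steering angle.

```cpp
// Standard swerve-drive kinematics (illustrative, not Le's code): turn a desired
// chassis velocity (vx, vy) and spin rate omega into a per-module wheel speed
// and steering angle, based on where the module sits on the chassis.
#include <cmath>
#include <cstdio>

struct ModuleCommand {
    double speed;     // wheel speed, m/s
    double angleRad;  // steering angle, radians
};

// (x, y): module position relative to the chassis center, in meters.
ModuleCommand swerveModule(double vx, double vy, double omega, double x, double y) {
    double wx = vx - omega * y;  // rotation contributes -omega*y to the x velocity
    double wy = vy + omega * x;  // and +omega*x to the y velocity
    return { std::hypot(wx, wy), std::atan2(wy, wx) };
}

int main() {
    // Spin at 1 rad/s while translating forward at 0.2 m/s, evaluated for the
    // front-right module of a hypothetical 2x2 layout mounted at (+0.1 m, +0.1 m).
    ModuleCommand c = swerveModule(0.2, 0.0, 1.0, 0.1, 0.1);
    std::printf("speed %.3f m/s, angle %.1f deg\n", c.speed, c.angleRad * 57.29578);
}
```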

The Nano ESP32 is attached to two TB6612 motor drivers and a screen that displays fun, animated eyes while the robot is in motion or idling. Controlling the Swervebot is easy, too, as the ESP32 hosts a webpage full of buttons and other inputs for setting speeds and directions.
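As a rough illustration of that control scheme (the routes, page markup, and credentials below are placeholders, not Le's implementation), an ESP32 sketch using the Arduino core's WiFi and WebServer libraries might look like this:

```cpp
// Hedged sketch of Wi-Fi control on an Arduino Nano ESP32: serve a small page
// whose buttons set a speed value over HTTP. Route names and the page are
// invented for illustration.
#include <WiFi.h>
#include <WebServer.h>

WebServer server(80);
int targetSpeed = 0;  // hypothetical global consumed by the motor-control loop

void handleRoot() {
  server.send(200, "text/html",
    "<h1>Swervebot</h1>"
    "<button onclick=\"fetch('/speed?value=100')\">Fast</button>"
    "<button onclick=\"fetch('/speed?value=0')\">Stop</button>");
}

void handleSpeed() {
  targetSpeed = server.arg("value").toInt();  // read ?value= from the request
  server.send(200, "text/plain", "OK");
}

void setup() {
  WiFi.begin("your-ssid", "your-password");  // placeholder credentials
  while (WiFi.status() != WL_CONNECTED) delay(100);
  server.on("/", handleRoot);
  server.on("/speed", handleSpeed);
  server.begin();
}

void loop() {
  server.handleClient();
  // ...drive the swerve modules toward targetSpeed here...
}
```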

For more details on the Swervebot, you can read Le's write-up here on Instructables.

The post The Swervebot is an omnidirectional robot that combines LEGO and 3D-printed parts appeared first on Arduino Blog.


ARMOR: Egocentric Perception for Humanoid Robot Powered by XIAO ESP32S3

16 January 2025 at 01:31

Daehwa Kim (Carnegie Mellon University), Mario Srouji, Chen Chen, and Jian Zhang (Apple) have developed ARMOR, an innovative egocentric perception hardware and software system for humanoid robots. By combining Seeed Studio XIAO ESP32S3-based wearable depth sensor networks and transformer-based policies, ARMOR tackles the challenges of collision avoidance and motion planning in dense environments. This system enhances spatial awareness and enables nimble and safe motion planning, outperforming traditional perception setups. ARMOR was deployed on the GR1 humanoid robot from Fourier Intelligence, showcasing its real-world applications.

[Source: Daehwa Kim]

Hardware Used

ARMOR uses the following hardware components:

    • XIAO ESP32S3 microcontrollers: Collect depth data from the ToF sensors over I2C and stream it to the robot's onboard computer.
    • Onboard computer: NVIDIA Jetson Xavier NX processes sensor inputs.
    • GPU (NVIDIA GeForce RTX 4090): Handles ARMOR-Policy’s inference-time optimization for motion planning.
    • SparkFun VL53L5CX Time-of-Flight (ToF) lidar sensors: Distributed across the robot’s body for comprehensive point cloud perception.

How ARMOR Works

ARMOR's egocentric perception hardware is a distributed network of ToF lidar sensors. Groups of four ToF sensors are connected to Seeed Studio XIAO ESP32S3 microcontrollers, capturing high-precision depth information from the environment. The XIAO ESP32S3 serves as an intermediary controller that manages real-time sensor data transmission: it streams the collected depth data via USB to the robot's onboard computer, the NVIDIA Jetson Xavier NX, which then wirelessly transmits the data to a Linux machine equipped with an NVIDIA GeForce RTX 4090 GPU for processing. This pipeline produces an occlusion-free point cloud around the humanoid robot, giving the ARMOR neural motion planning algorithm the environmental awareness data it needs. The distributed, lightweight hardware setup also enhances spatial awareness and overcomes the limitations of head-mounted or external cameras, which often fail in cluttered or occluded environments.
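As a sketch of one leg of that pipeline, the code below shows a XIAO ESP32S3 reading a single SparkFun VL53L5CX over I2C with SparkFun's Arduino library and streaming each 8x8 depth frame over USB serial. The real system connects four sensors per microcontroller and presumably uses its own framing protocol, so treat this as an assumption-laden illustration, not ARMOR's firmware:

```cpp
// Read one VL53L5CX ToF sensor over I2C and stream 8x8 depth frames over USB
// serial (illustrative stand-in for ARMOR's multi-sensor firmware).
#include <Wire.h>
#include <SparkFun_VL53L5CX_Library.h>

SparkFun_VL53L5CX tof;
VL53L5CX_ResultsData frame;  // holds one 8x8 ranging result

void setup() {
  Serial.begin(115200);
  Wire.begin();
  Wire.setClock(400000);     // 400 kHz I2C
  if (!tof.begin()) {        // default I2C address
    Serial.println("VL53L5CX not found");
    while (true) delay(100);
  }
  tof.setResolution(8 * 8);  // 64 ranging zones
  tof.startRanging();
}

void loop() {
  if (tof.isDataReady() && tof.getRangingData(&frame)) {
    // Emit one comma-separated 64-zone frame per line.
    for (int i = 0; i < 64; i++) {
      Serial.print(frame.distance_mm[i]);
      Serial.print(i < 63 ? ',' : '\n');
    }
  }
  delay(5);
}
```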

Daehwa Kim, one of the core developers of this project, mentions why they selected the Seeed Studio XIAO for this project.

"We might imagine a future where users easily plug and play with wearable sensors for humanoids and augment robots' perceptions in various tasks. XIAO ESP32 series makes the wearable sensor system easily modularizable. We specifically adopted the XIAO ESP32S3 in ARMOR because of its powerful computing and tiny form factor."
ARMOR-Policy, the transformer-based policy [Source: Daehwa Kim]

The neural motion planning system, ARMOR-Policy, is built on a transformer-based architecture called the Action Chunking Transformer. This policy was trained on 86 hours of human motion data from the AMASS dataset using imitation learning. ARMOR-Policy processes the robot’s current state, goal positions, and sensor inputs to predict safe and efficient trajectories in real-time. The system leverages latent variables to explore multiple trajectory solutions during inference, ensuring flexibility and robustness.
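To make the chunking idea concrete, here is a rough C++ sketch (not the authors' code) of how an Action Chunking Transformer style controller blends overlapping action chunks at inference time, with a stub standing in for the transformer; the chunk length, joint count, and decay constant are invented for illustration:

```cpp
// Action chunking with temporal ensembling, sketched in plain C++. The policy
// predicts a chunk of K future actions each step; overlapping predictions for
// the same timestep are averaged with exponential weights. predictChunk() is a
// stub where real transformer inference would run.
#include <algorithm>
#include <array>
#include <cmath>
#include <cstdio>
#include <vector>

constexpr int K = 8;    // actions per predicted chunk (illustrative)
constexpr int DOF = 7;  // joints commanded per action (illustrative)
using Action = std::array<double, DOF>;

std::vector<Action> predictChunk(const Action& state) {
  // Stub: "hold the current state" for K steps; a real policy would map the
  // state, goal, sensor encoding, and latent variable to a trajectory.
  return std::vector<Action>(K, state);
}

Action blendedAction(int t, const std::vector<std::vector<Action>>& chunks) {
  // chunks[s] was predicted at step s and covers timesteps s .. s+K-1.
  Action out{};
  double wsum = 0.0;
  for (int s = std::max(0, t - K + 1); s <= t; ++s) {
    double w = std::exp(-0.25 * (t - s));  // older predictions weigh less
    const Action& a = chunks[s][t - s];
    for (int d = 0; d < DOF; ++d) out[d] += w * a[d];
    wsum += w;
  }
  for (int d = 0; d < DOF; ++d) out[d] /= wsum;
  return out;
}

int main() {
  std::vector<std::vector<Action>> chunks;
  Action state{};  // stand-in for the robot state
  for (int t = 0; t < 20; ++t) {
    chunks.push_back(predictChunk(state));  // one new chunk per control step
    Action cmd = blendedAction(t, chunks);
    state = cmd;  // pretend the robot tracked the command exactly
    std::printf("t=%d first joint command: %.3f\n", t, cmd[0]);
  }
}
```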

Trained on 86 hours of human motion data [Source: Daehwa Kim]

ARMOR was rigorously tested in both simulated and real-world scenarios. It demonstrated remarkable improvements in performance, reducing collisions by 63.7% and increasing success rates by 78.7% compared to exocentric systems with dense head-mounted cameras. Additionally, the transformer-based ARMOR-Policy reduced computational latency by 26× compared to sampling-based motion planners like cuRobo, enabling efficient and nimble collision avoidance.

Real World Hardware Deployment [Source: Daehwa Kim]

Discover more about ARMOR

Want to explore ARMOR’s capabilities? The research team will soon release the source code, hardware details, and 3D CAD files on their GitHub repository. Dive deeper into this cutting-edge project by reading their paper on arXiv. Stay tuned for updates to replicate and innovate on this revolutionary approach to humanoid robot motion planning! To see ARMOR in action, check out their demonstration video on YouTube.

End Note

Hey community, we’re curating a monthly newsletter centering around the beloved Seeed Studio XIAO. If you want to stay up-to-date with:

🤖 Cool Projects from the Community to get inspiration and tutorials
📰 Product Updates: firmware updates, new product spoilers
📖 Wiki Updates: new wikis + wiki contributions
📣 News: events, contests, and other community stuff

Please click the image below 👇 to subscribe now!

The post ARMOR: Egocentric Perception for Humanoid Robot Powered by XIAO ESP32S3 appeared first on Latest Open Tech From Seeed.


This robot can dynamically change its wheel diameter to suit the terrain

14 January 2025 at 02:25

A vehicle's wheel diameter has a dramatic effect on several aspects of performance. The most obvious is gearing, with larger wheels increasing the ultimate gear ratio (though transmission and transfer case gearing can counteract that). But wheel size also affects mobility over terrain, which is why Gourav Moger and Huseyin Atakan Varol's prototype mobile robot, called Improbability Roller, can dynamically alter its wheel diameter.

If all else were equal (including final gear ratio), smaller wheels would be better, because they result in less unsprung mass. But that only holds on perfectly flat surfaces; as terrain becomes more irregular, larger wheels become more practical. Stairs are an extreme example: only a vehicle with very large wheels can climb them.

Most vehicles sacrifice either efficiency or capability through their wheel size, but this robot doesn't have to. Each of its wheels is a unique collapsing mechanism that can expand or shrink as necessary to alter its effective rolling diameter. Rope-and-pulley actuators on each wheel perform that change; they are driven by Dynamixel geared motors, which an Arduino Mega 2560 board commands through a Dynamixel shield. A single drive motor spins the wheels through a rigid gear set mounted on the axles, and a third omni wheel provides stability.
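As a hedged sketch of that actuation path (the servo ID, serial port, direction pin, and diameter-to-angle mapping are all placeholders, not values from the project), commanding one wheel's Dynamixel through the Dynamixel2Arduino library might look like this:

```cpp
// Command a Dynamixel servo from an Arduino Mega 2560 to wind a wheel's pulley
// rope to a target position (illustrative only; not Moger & Varol's firmware).
#include <Dynamixel2Arduino.h>

Dynamixel2Arduino dxl(Serial1, /*dirPin=*/2);  // Mega hardware serial + DIR pin
const uint8_t WHEEL_DXL_ID = 1;                // placeholder servo ID

// Hypothetical linear map from desired wheel diameter to winding angle.
float diameterToAngleDeg(float diameterMm) {
  const float minDia = 80.0, maxDia = 160.0;   // collapsed vs. expanded (made up)
  float t = constrain((diameterMm - minDia) / (maxDia - minDia), 0.0, 1.0);
  return t * 300.0;                            // 0..300 degrees of rope winding
}

void setWheelDiameter(float diameterMm) {
  dxl.setGoalPosition(WHEEL_DXL_ID, diameterToAngleDeg(diameterMm), UNIT_DEGREE);
}

void setup() {
  dxl.begin(57600);                  // Dynamixel bus baud rate
  dxl.setPortProtocolVersion(2.0);
  dxl.torqueOff(WHEEL_DXL_ID);
  dxl.setOperatingMode(WHEEL_DXL_ID, OP_POSITION);
  dxl.torqueOn(WHEEL_DXL_ID);
  setWheelDiameter(120.0);           // mid-size wheel
}

void loop() {}
```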

This unique arrangement has benefits beyond terrain accommodation. The robot can, for instance, shrink its wheels to fit through tight spaces. It can also increase the size of one wheel relative to the other to turn, without a dedicated steering rack or differential drive system.
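That steering trick follows from simple differential-drive geometry: since one motor spins both wheels at the same angular rate, ground speed scales with wheel radius, and the mismatch sets the turn radius. The short sketch below works through the numbers (a back-of-the-envelope derivation, not the project's code):

```cpp
// Turn radius from unequal wheel radii at equal angular velocity. With ground
// speeds v = w*r and track width W, the differential-drive relation
// R = (W/2)*(v1+v2)/(v1-v2) reduces to a ratio of the radii (w cancels).
#include <cstdio>

double turnRadius(double r1Mm, double r2Mm, double trackMm) {
    return 0.5 * trackMm * (r1Mm + r2Mm) / (r1Mm - r2Mm);
}

int main() {
    // Example: 80 mm vs 70 mm wheel radii with a 200 mm track width.
    std::printf("turn radius: %.0f mm\n", turnRadius(80.0, 70.0, 200.0));  // 1500 mm
}
```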

The post This robot can dynamically change its wheel diameter to suit the terrain appeared first on Arduino Blog.

The LattePanda Mu SoM is now available with Intel Core i3-N305 octa-core SoC

2 January 2025 at 20:35
LattePanda Mu Intel Core i3-N305 CPU

Launched last year with an Intel Processor N100, the LattePanda Mu system-on-module is now available with an Intel Core i3-N305 octa-core processor that delivers higher single-core and multi-core performance as well as faster 3D graphics acceleration. All interfaces are unchanged, exposed through a 260-pin SO-DIMM edge connector, and include up to 9x PCIe Gen3 lanes, two SATA ports, eDP, HDMI, and DisplayPort interfaces, twelve USB interfaces, and more. The LattePanda Mu launched with 8GB RAM last year, but both the N100 and Core i3-N305 models are now available with up to 16GB of LPDDR5 IBECC memory, while the eMMC flash capacity remains at 64GB for all variants.

LattePanda Mu specifications:

    • SoC (one or the other)
        • Intel Processor N100 quad-core Alder Lake-N processor @ up to 3.4 GHz (Turbo) with 6MB cache, 24EU Intel HD graphics @ 750 MHz; TDP: 6W
        • Intel Core i3-N305 octa-core Alder Lake-N processor @ up to 3.8 GHz (Turbo) [...]

The post The LattePanda Mu SoM is now available with Intel Core i3-N305 octa-core SoC appeared first on CNX Software - Embedded Systems News.
