
Security through transparency: RP2350 Hacking Challenge results are in

14 January 2025 at 22:00

We launched our second-generation microcontroller, RP2350, in August last year. Building on the success of its predecessor, RP2040, it adds faster processors, more memory, lower power states, and a security model built around Arm TrustZone for Cortex-M. Alongside our own Raspberry Pi Pico 2 board and numerous partner boards, RP2350 also featured on the DEF CON badge, designed by Entropic Engineering, with firmware by our friend Dmitry Grinberg.

Two RP2350 chips, seen from above and below; the underside shows the exposed die pad and surrounding lead frame of the package.

All chips have vulnerabilities, and most vendors’ strategy is not to talk about them. We consider this to be suboptimal, so instead, we entered into the DEF CON spirit by offering a one-month, $10,000 prize to the first person to retrieve a secret value from the one-time-programmable (OTP) memory on the device. Our aim was to smoke out weaknesses early, so that we could fix them before RP2350 became widely deployed in secure applications. This open approach to security engineering has been generally well received: call it “security through transparency”, in contrast with the “security through obscurity” philosophy of other vendors.

Nobody claimed the prize by the deadline, so in September we extended the deadline to the end of the year and doubled the prize to $20,000. Today, we’re pleased (ish) to announce that we received not one but four valid submissions, all of which require physical access to the chip, with varying degrees of intrusiveness. Outside of the contest, Thomas “stacksmashing” Roth and the team at Hextree also discovered a vulnerability, which we describe below.

So, without further ado, the winners are:

“Hazardous threes” – Aedan Cullen

RP2350’s antifuse OTP memory is a security-critical component: security configuration bits are stored in OTP and read early in the reset process. A state machine called the OTP PSM is responsible for these reads. Unfortunately, it turns out that the OTP PSM has an exploitable weakness.

The antifuse array is powered via the USB_OTP_VDD pin. To protect against power faults, the PSM uses “guard reads”: reads of known data very close to reads of security-critical data. A power fault should cause a mismatch in the known guard data, indicating that the associated security-critical read is untrustworthy. We use a single guard word: 0x333333.

However, the OTP may retain the last sensed read data during a power fault, so that subsequent reads return the most recently read data from when power was good. This is not itself a flaw, but it interacts poorly with the choice of guard word: if USB_OTP_VDD is dropped precisely after a guard read has occurred, 0x333333 will be returned by every read until power is restored. An attacker can therefore cause security-critical configuration data to be read as this value.
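The interaction between retained reads and a constant guard word can be sketched in a few lines. The Python below is a toy model, not the real OTP PSM: the class, function names, and addresses are invented for illustration, and only the "stuck at the last good value" behaviour matters.

```python
# Toy model of an antifuse OTP whose sense circuitry retains the last
# value read while power is faulted (illustrative, not RP2350 internals).

GUARD_WORD = 0x333333

class OtpModel:
    def __init__(self, fuses):
        self.fuses = fuses        # address -> stored 24-bit word
        self.power_good = True
        self.last_sensed = 0      # value retained across a power fault

    def read(self, addr):
        if self.power_good:
            self.last_sensed = self.fuses[addr]
        # During a fault, reads return the most recently sensed value.
        return self.last_sensed

def psm_read(otp, guard_addr, crit_addr):
    """Guard read immediately before the security-critical read."""
    if otp.read(guard_addr) != GUARD_WORD:
        raise RuntimeError("power fault detected")
    return otp.read(crit_addr)

fuses = {0x10: GUARD_WORD, 0x20: 0x00BEEF}  # 0x20 holds critical config
otp = OtpModel(fuses)
assert psm_read(otp, 0x10, 0x20) == 0x00BEEF  # normal operation

# Attack: drop power precisely after a guard read has latched 0x333333.
otp2 = OtpModel(fuses)
otp2.read(0x10)           # guard read senses 0x333333...
otp2.power_good = False   # ...then USB_OTP_VDD is pulled down
# The guard check still passes, but the critical read returns 0x333333.
assert psm_read(otp2, 0x10, 0x20) == GUARD_WORD
```

Because the guard check compares against the very value that a faulted read gets stuck at, it cannot distinguish "power good" from "power dropped just after a guard read".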

A gold-coloured probe making contact with a pin of an RP2350 on a test board.
Image courtesy of Aedan Cullen

If the CRIT0 and CRIT1 words are replaced by 0x333333 during the execution of the OTP PSM, the RISCV_DISABLE and ARM_DISABLE bits will be set, and the DEBUG_DISABLE bit will be cleared. ARM_DISABLE takes precedence, so the chip leaves reset with the RISC-V cores running and debugging allowed, regardless of the actual configuration written in the fuses. Dumping secret data from the OTP is then straightforward.

More information can be found in Aedan’s GitHub repository here, and in his Chaos Communication Congress presentation here.

No mitigation is currently available for this vulnerability, which has been assigned erratum number E16. It is likely to be addressed in a future stepping of RP2350.

USB bootloader single-instruction fault with supply-voltage injection – Marius Muench

A foundational security feature of RP2350 is secure boot, which restricts the chip to only run code signed with a specific private key. If an attacker can bypass or break out of secure boot, they can run their own unsigned code, which can potentially dump secret data from the OTP.

Marius discovered a weakness in the boot ROM’s reboot API. This supports several different reboot modes, one of which is REBOOT_TYPE_PC_SP, which reboots and starts execution with a specific program counter and stack pointer. This can only be triggered from secure firmware already running on the chip, but if an attacker could trigger this boot mode externally, and with controlled parameters, we would start executing code at an attacker-supplied address – without verifying the signature of the code!

But how can one enter this boot mode, if it is only accessible to signed and verified firmware?

The answer (of course) is fault injection. By issuing a normal reboot command to the USB bootloader, and injecting a fault (in this case by glitching the supply voltage) so that an instruction is skipped at just the right time, it is possible to trick the reboot API into believing that REBOOT_TYPE_PC_SP was requested. If an attacker has loaded malicious code into RAM beforehand, this code can be executed and used to extract the secret.

An interesting aspect of this attack is that the code for accepting the reboot command is actually hardened against fault injection. Unfortunately, the function implementing the reboot logic itself assumes that the incoming parameters (including the requested boot mode) are sanitised. Due to an unlucky arrangement of instructions emitted by the compiler, injecting a fault which skips one out of two very specific instructions confuses the chip into rebooting to the hazardous boot type.
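The shape of this failure can be modelled in a few lines. The Python below is a toy model with invented names, not the boot ROM's actual code: it simply shows how an inner routine that trusts its arguments is defeated when a glitch skips one of two specific "instructions" in the hardened outer handler.

```python
# Toy model (invented names, not the boot ROM's code) of how skipping a
# single instruction defeats hardened input checking when the inner
# routine assumes its parameters are already sanitised.

REBOOT_NORMAL = 0
REBOOT_PC_SP = 7   # hazardous mode: boot to a caller-supplied PC/SP

def reboot_logic(mode):
    # Assumes the caller has already sanitised `mode`.
    return "unsigned code runs" if mode == REBOOT_PC_SP else "normal reboot"

def handle_reboot(requested, skip=None):
    """Runs a straight-line 'instruction' sequence; `skip` names the one
    instruction a well-timed supply-voltage glitch causes to be skipped."""
    reg = REBOOT_PC_SP              # stale value left in a register
    if skip != "check":
        if requested != REBOOT_NORMAL:
            raise ValueError("rejected")   # hardened input check
    if skip != "mov":
        reg = requested                    # mov reg, requested
    return reboot_logic(reg)

assert handle_reboot(REBOOT_NORMAL) == "normal reboot"
# Skipping the `mov` leaves the hazardous mode in the register...
assert handle_reboot(REBOOT_NORMAL, skip="mov") == "unsigned code runs"
# ...and skipping the check lets a hazardous request through directly.
assert handle_reboot(REBOOT_PC_SP, skip="check") == "unsigned code runs"
```

Skipping either of the two marked instructions yields the hazardous reboot, mirroring the "one out of two very specific instructions" arrangement described above.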

Marius says: “While this break may seem straightforward in retrospect, reality is quite different. Identifying and exploiting these types of issues is far from trivial. Overall, this hacking challenge was a multi-month project for me, with many dead-ends explored along the way and countless iterations of attack code and setups to confirm or refute potential findings. Nonetheless, I had plenty of fun digging deep into the intricacies of the new RP2350 microcontroller, and I would like to thank Raspberry Pi and Hextree for hosting the challenge!”

Several effective mitigations are available against this attack, which has been assigned erratum number E20. The most precise mitigation is to set the OTP flag BOOT_FLAGS0.DISABLE_WATCHDOG_SCRATCH, which disables the ability to reboot to a particular PC/SP, and is suitable where application code does not require that function.

Signature check single-instruction fault with laser injection – Kévin Courdesses

Kévin discovered an exploitable weakness in the secure boot path, just after the firmware to be validated has been loaded into RAM, and just before the hash function needed for the signature check is computed. Injecting a single precisely timed fault at this stage can cause the hash function to be computed over a different piece of data, controlled by the attacker. If that data is a valid signed firmware, the signature check will pass, and the attacker’s unsigned firmware will run!
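The effect of the fault can be modelled compactly. The Python below is a hedged toy model, not the boot ROM's code: the function name and `fault_swaps_to` parameter are invented, and the real signature check is stood in for by membership in a set of signed SHA-256 hashes.

```python
import hashlib

# Toy model of the secure boot hash/verify step (illustrative only; the
# real chip verifies a signature, modelled here as a set of signed hashes).

def verify_and_boot(loaded, signed_hashes, fault_swaps_to=None):
    """Hash the loaded image and check it before running it.
    `fault_swaps_to` models a fault that redirects the hash computation
    to attacker-chosen data just before hashing begins."""
    hashed = loaded if fault_swaps_to is None else fault_swaps_to
    if hashlib.sha256(hashed).hexdigest() not in signed_hashes:
        raise RuntimeError("signature check failed")
    return loaded  # the image that actually runs

genuine = b"genuine signed firmware"
signed = {hashlib.sha256(genuine).hexdigest()}
malicious = b"attacker firmware"

assert verify_and_boot(genuine, signed) == genuine  # normal secure boot

# Without a fault, the malicious image is rejected...
try:
    verify_and_boot(malicious, signed)
    raise AssertionError("should have been rejected")
except RuntimeError:
    pass

# ...but if the fault points the hash at a genuine image instead, the
# check passes while the malicious image is what ends up running.
assert verify_and_boot(malicious, signed, fault_swaps_to=genuine) == malicious
```

The key property is the gap between the data that is hashed and the data that is executed: a single fault in that window breaks the chain of trust.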

Image courtesy of Kévin Courdesses

The most common method of introducing faults, seen in Marius’s attack, is to briefly pull down the supply voltage, introducing a “glitch” which causes the digital logic in the chip to misbehave. RP2350 contains glitch detector circuitry, which is designed to spot most voltage glitches and to purposely halt the chip in response. To permit the injection of faults without triggering the glitch detectors, Kévin built a custom laser fault injection system; this applies a brief pulse of laser light to the back of the die, which has been exposed by grinding away part of the package. And, although several technical compromises were necessary to keep the setup within a limited budget, it worked!

More information can be found in Kévin’s paper here.

No mitigation is available for this attack, which has been assigned erratum number E24. It is likely to be addressed in a future stepping of RP2350.

Extracting antifuse secrets from RP2350 by FIB/PVC – IOActive

OTP memories based on antifuses are widely used for storing small amounts of data (such as serial numbers, keys, and factory trimming) in integrated circuits because they are inexpensive and require no additional mask steps to fabricate. RP2350 uses an off-the-shelf antifuse memory block for storing secure boot keys and other sensitive configuration data.

Antifuses are widely considered to be a “high security” storage medium, meaning that they are significantly more difficult for an attacker to extract data from than other types of memory, such as flash or mask ROM. However, with this attack, IOActive has (almost) demonstrated that data bits stored in the RP2350 antifuse memory array can be extracted using a well-known semiconductor failure analysis technique: passive voltage contrast (PVC) with a focused ion beam (FIB).

Image courtesy of IOActive

The current form of the attack recovers the bitwise OR of two physically adjacent memory cells sharing common metal-1 contacts. However, with some per-bit effort it may be possible for an attacker to separate the even/odd cell values by taking advantage of the circuit-editing capabilities of the FIB.

IOActive has not yet tested the technique against other antifuse IP blocks or on other process nodes. Nonetheless, it is believed to have broad applicability to all antifuse-based memories. Dr Andrew Zonenberg, who led the technical team on this project along with Antony Moor, Daniel Slone, Lain Agan, and Mario Cop, commented: “Our team found a unique attack vector for reading data out of antifuse memory, which we intend to further develop. Those who rely on antifuse memory for confidentiality should immediately reassess their security posture.”

The suggested mitigation for this attack is to employ a “chaffing” technique, storing either {0, 1} or {1, 0} in each pair of bit cells, as the attack in its current form is unable to distinguish between these two states. To guard against a hypothetical version of the attack which uses circuit editing to distinguish between these states, it is recommended that keys and other secrets be stored as larger blocks of chaffed data, from which the secret is recovered by hashing.
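The chaffing idea is easy to sketch. The Python below is an illustrative model, not the actual SDK encoding; `chaff_encode`, `chaff_decode`, and `pvc_readout` are invented names.

```python
# Illustrative model of "chaffing": each secret bit is stored as a
# complementary pair of antifuse cells, so the OR of adjacent cells --
# which is what the PVC attack recovers -- is always 1 and leaks nothing.

def chaff_encode(bits):
    """Store each bit b as the complementary pair (b, 1 - b)."""
    return [(b, 1 - b) for b in bits]

def chaff_decode(pairs):
    """A legitimate reader sees both cells, so recovery is trivial."""
    return [b for b, _ in pairs]

def pvc_readout(pairs):
    """What the attack in its current form sees: the OR of each pair."""
    return [a | b for a, b in pairs]

secret = [1, 0, 1, 1, 0]
stored = chaff_encode(secret)

assert chaff_decode(stored) == secret          # legitimate read works
assert pvc_readout(stored) == [1, 1, 1, 1, 1]  # attacker learns nothing
```

Storing secrets as larger hashed blocks of chaffed data, as suggested above, additionally ensures that an attacker who can separate a few pairs by circuit editing still cannot recover the key without resolving essentially all of them.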

Glitch detector evaluation, and OTP read double-instruction fault with EM injection – Hextree

We commissioned the Hextree team to evaluate the secure boot process, and the effectiveness of the redundancy coprocessor (RCP) and glitch detectors. They found that at the highest sensitivity setting, the glitch detectors can detect many voltage glitches; however, the rate of undetected glitches is still high enough to make attacks feasible with some effort.

The majority of their work focused on electromagnetic fault injection (EMFI), which delivers a high-voltage pulse to a small coil positioned on top of the chip. The collapsing electromagnetic field induces currents in the chip, allowing very localised faults to be injected without disturbing the glitch detectors. Testing yielded multiple security-relevant results, notably that it is possible to corrupt values read from OTP by injecting faults very early in the boot process, and that the random delays provided by the RCP are susceptible to side-channel measurements.

The team also found a path to bypass an aspect of the OTP protection of the chip using a double fault: the s_varm_crit_nsboot function, which locks down the OTP permissions prior to entering BOOTSEL mode, has two instructions which, when both are disturbed by precisely timed faults, can prevent an OTP page from being correctly locked, effectively allowing the user to read-out and write to the OTP even when the chip configuration forbids this. The double fault can be triggered with reasonable reliability by EMFI.

Several effective mitigations are available against this attack, which has been assigned erratum number E21. The attack occurs when the device is running non-secure bootloader code, and the OTP keys are extracted via the PICOBOOT interface. The USB bootloader can be disabled by setting the OTP flags BOOT_FLAGS0.DISABLE_BOOTSEL_USB_PICOBOOT_IFC and BOOT_FLAGS0.DISABLE_BOOTSEL_USB_MSD_IFC, which mitigates this vulnerability at the cost of removing the ability to update firmware on the device over USB.

Image courtesy of NewAE and Fritz

We’d also like to express gratitude to Colin O’Flynn and his team at NewAE for collaborating with both us and Thomas Roth / Hextree on this advanced silicon security research, and for supporting us with their fantastic ChipWhisperer kit.

What’s next?

We’d like to thank everyone who participated in the challenge. While the rules specify a single $20,000 prize for the “best” attack, we were so impressed by the quality of the submissions that we have chosen to pay the prize in full for each of them.

As expected, we’ve learned a lot. In particular, we’ve revised downward our estimates of the effectiveness of our glitch detection scheme, of the difficulty of reliably injecting multiple faults even in the presence of timing uncertainty, and of the cost and complexity of laser fault injection. We’ll take these lessons into account as we work to harden future chips and anticipated future steppings of RP2350.

And while this hacking challenge is over, another one is about to start. As a component of the broader RP2350 security architecture, we’ve been working to develop an implementation of AES which is hardened against side-channel attacks (notably differential power analysis), and we’ll be challenging you to defeat it. Check back next week for more details.

All vendors have security vulnerabilities in their chips. We are unusual because we talk about them, and aim to fix them, rather than brushing them under the carpet. Security through transparency is here to stay.

The post Security through transparency: RP2350 Hacking Challenge results are in appeared first on Raspberry Pi.


16GB Raspberry Pi 5 on sale now at $120

9 January 2025 at 14:58

We first announced Raspberry Pi 5 back in the autumn of 2023, with just two choices of memory density: 4GB and 8GB. Last summer, we released the 2GB variant, aimed at cost-sensitive applications. And today we’re launching its bigger sibling, the 16GB variant, priced at $120.

Why 16GB, and why now?

We’re continually surprised by the uses that people find for our hardware. Many of these fit into 8GB (or even 2GB) of SDRAM, but the threefold step up in performance between Raspberry Pi 4 and Raspberry Pi 5 opens up use cases like large language models and computational fluid dynamics, which benefit from having more memory per core. And while Raspberry Pi OS has been tuned to have low base memory requirements, heavyweight distributions like Ubuntu benefit from additional memory capacity for desktop use cases.

The optimised D0 stepping of the Broadcom BCM2712 application processor includes support for memories larger than 8GB. And our friends at Micron were able to offer us a single package containing eight of their 16Gbit LPDDR4X die, making a 16GB product feasible for the first time.

Carbon Removal Credits

We’re proud of the low environmental impact of Raspberry Pi computers. They are small and light, which translates directly into a small upfront carbon footprint for manufacturing, logistics and disposal. With an idle power consumption in the 2–3W range, and a fully loaded power consumption of less than 10W, replacing a legacy x86 PC with a Raspberry Pi typically results in a significant reduction in operating power consumption, and thus ongoing carbon footprint.

But while our upfront carbon footprint is small, it is not zero. So today, we’re launching Raspberry Pi Carbon Removal Credits, priced at $4, giving you the option to mitigate the emissions associated with the manufacture and disposal of a modern Raspberry Pi.

How does it work?

We commissioned Inhabit to conduct an independent assessment of the carbon footprint of manufacturing, shipping, and disposing of a Raspberry Pi 4 or 5, which came to 6.5kg of CO₂ equivalent. When you buy a Raspberry Pi Carbon Removal Credit from one of our Approved Resellers, we pay our friends at UNDO Carbon to begin capturing that quantity of CO₂ from the atmosphere using enhanced rock weathering (ERW) technology.

It’s that simple.

What is enhanced rock weathering?

As rain falls through the atmosphere, it combines with CO₂ to form carbonic acid. When this weak acid falls on mountains, forests and grassland, the CO₂ interacts with rocks and soil, mineralises, and is safely stored in solid carbonate form. The natural process of weathering already accounts for the removal of one billion tonnes of CO₂ from the atmosphere every year.

ERW accelerates this natural process by spreading crushed silicate rock (in our case, basalt) on agricultural land, increasing the surface area of the rock and therefore increasing its contact with CO₂. Overall, this reduces the timescales involved from millions of years to mere decades. Once the reaction takes place, the CO₂ is permanently locked away for 100,000+ years.
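The underlying chemistry is textbook silicate weathering. The reactions below use wollastonite (CaSiO₃) as an idealised stand-in for the mixed calcium and magnesium silicates found in basalt, so they illustrate the mechanism rather than describing UNDO's exact process:

```latex
% Rain absorbs CO2 on the way down, forming weak carbonic acid:
\mathrm{CO_2 + H_2O \longrightarrow H_2CO_3}
% The acid attacks silicate minerals, converting the carbon into
% dissolved bicarbonate ions that remain stable for very long periods:
\mathrm{CaSiO_3 + 2\,CO_2 + 3\,H_2O \longrightarrow Ca^{2+} + 2\,HCO_3^{-} + H_4SiO_4}
```

Each formula unit of silicate thus locks away two molecules of CO₂ as bicarbonate; crushing the rock simply exposes far more surface area for this reaction to proceed on.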

In addition to capturing CO₂, spreading basalt on agricultural land also brings with it significant co-benefits. Silicate rocks are mineral-rich; as they weather, they release nutrients such as magnesium, calcium and potassium, improving soil health and reducing the need for fertilisers. Trials with the University of Newcastle have shown an increase in crop yield following the application of crushed basalt rock. In addition, the alkaline bicarbonate ions captured during the ERW process are eventually washed out to sea, where they help to deacidify our oceans.

You can find out more about UNDO’s work here.

Why capture carbon in the future, not the past?

Generally, when you buy carbon offsets, you are paying for carbon capture which has taken place in the past (for example by planting and growing trees). When you buy Raspberry Pi Carbon Removal Credits, UNDO spreads basalt now, which then captures the rated quantity of carbon over, roughly, the next twenty years.

We’ve chosen ERW because we believe it’s a more rigorous, scalable, verifiable approach to carbon capture than traditional approaches like planting (or, more ridiculously, agreeing not to cut down) trees: quite simply, it’s our best shot at drawing down a material fraction of humanity’s carbon emissions in our lifetimes. But, as it is a relatively new technology, there is no pool of offsets corresponding to historical capture available for us to purchase.

So, we’re doing the next best thing: paying UNDO to start an irrevocable process of carbon capture which will continue over the next two decades and beyond. We hope that our embrace of ERW will help raise awareness of this world-changing technology, and perhaps inspire others to take their first steps with it.

The post 16GB Raspberry Pi 5 on sale now at $120 appeared first on Raspberry Pi.

Raspberry Pi Pico MIDI Gesture Controller

8 January 2025 at 16:57

Extracting an arresting array of sounds from a guitar became a mission for keen coder Gary. In the latest issue of The MagPi, he tells Rosie Hattersley how he built a Raspberry Pi-based expression pedal.

The MIDI Gesture Controller is a sort of musical expression pedal that rotates and rolls around a ball joint, providing six degrees of freedom

Guitarist and keen coder Gary Rigg says he always thought floor-based controllers — particularly expression pedals — should have a more prominent role. They are usually operated by pressing your foot down for a subtle or more obvious wah-wah or delay effect, but only in a single direction, also known as one degree of freedom (DOF). 

You use your foot to “control the pitch of the pedal, and the pitch determines the parameter value.” Gary reasoned that adding degrees of freedom such as yaw (rotation around a vertical axis) and roll to an expression pedal could extend it beyond a single pitch-controlled parameter. He began pondering what new sounds could be achieved by redesigning how the humble foot pedal was operated. The result is the MIDI Gesture Controller, a Raspberry Pi Pico-based expression pedal that can control three parameters, “which ought to lead to more control while playing live.”
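To make the mapping concrete, here is a hedged sketch (not Gary's actual code) of how one orientation angle might be scaled to a MIDI continuous-controller value; the function name and the ±45° calibration range are invented for illustration.

```python
# Illustrative sketch of scaling one orientation angle from an IMU to a
# MIDI continuous-controller value. MIDI CC data bytes are 7-bit, so the
# calibrated angle range maps linearly onto 0-127.

def angle_to_cc(angle_deg, lo=-45.0, hi=45.0):
    """Clamp an angle (degrees) to the calibrated [lo, hi] range and
    scale it linearly to the MIDI CC range 0-127."""
    angle_deg = max(lo, min(hi, angle_deg))
    return round((angle_deg - lo) * 127 / (hi - lo))

assert angle_to_cc(-45.0) == 0    # pedal fully one way
assert angle_to_cc(45.0) == 127   # pedal fully the other way
assert angle_to_cc(-90.0) == 0    # out-of-range readings are clamped
```

On the real device, three such mappings (pitch, yaw, and roll from the IMU) would each feed a different CC number, sent over USB using Adafruit's MIDI library.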

The Gesture Controller can be plugged into a PC as a MIDI control device and works with synthesisers and samplers

New musical direction

Gary hit upon a ball and socket setup, since these move through three or more planes of motion in multiple directions. He soon settled on a desk-based rotating puck design, realising that since the expression pedal did not necessarily need to be foot-operated, it could have several additional uses: “it works as well as a hand controller as a foot controller, so could be used for DJs or in a studio.” Camera controllers, stage lighting, and other non-musical applications also came to mind. Gary points out that MIDI is simply a protocol and could be swapped for something else, such as an HID controlling gameplay, for example. Sensor values are sent down a serial line, so the Gesture Controller could theoretically be used in “any situation needing a multi-axis controller.”

Give it a try

Gary uses Python regularly for his job as a software developer for websites and mobile devices. In “paid work land” he’s used Raspberry Pi for IoT projects to control lights and smart devices, in fire alarm panels, and alongside NFC cards and in MQTT Edge devices. As a hobbyist, Gary has created Raspberry Pi-based retro games consoles, set up sensors, and designed a Ghostbusters PKE Meter, so he is fairly confident with prototyping and seeing diverse projects through to completion.

Prototyping the MIDI Gesture Controller with Raspberry Pi Pico, which runs CircuitPython code

He made use of Adafruit’s MIDI library, and says programming in CircuitPython using Thonny IDE on Raspberry Pi Pico made a lot of sense: “an incredible bit of kit as a low-cost microcontroller, and being in Python-land feels like home.” He also found it to be the best value for money, and the most reliable board for his project. Other components — including the 6DOF AHRS IMU sensor, arcade joystick ball, 3D printer, and neoprene rubber for grip — were bought from The Pi Hut and other stores. The wiring setup was straightforward enough, with the IMU (inertial measurement unit) and yaw reset button connected to Raspberry Pi Pico.

Despite Gary’s years of experience as a computer scientist and software engineer, the MIDI Gesture Controller project took him several weeks to complete and provided plenty of challenges. Getting a smooth motion on the ball joint was particularly difficult. Having designed the casing in CAD software, Gary says he must have 3D-printed nearly 20 variants to get it right. Another challenge involved getting actual pitch, yaw, and roll values from the IMU. “It took a bit of effort, as did calibrating the ranges and limits of minimums and maximums.”

Gary’s YouTube video amply demonstrates the extra sound possibilities his Gesture Controller can generate

Gary first contemplated a multi-DOF expression pedal a few years ago; the MIDI Gesture Controller is now up and running, and he continues to tweak and improve it, planning to add a few extra features. He always likes to have a project on the go, is unafraid to try things, and is a big advocate for experimenting with designs in Tinkercad. A few years ago, he launched a Raspberry Pi-based Wi-Fi blocker that caught the press’s attention. The Kickstarter campaign wasn’t successful, but it was a fun project, and he still owns the trademark for a Wi-Fi ‘notspot’.

The MagPi #149 out NOW!

You can grab the new issue right now from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available at our online store, which ships around the world. You can also get it via our app on Android or iOS.

You can also subscribe to the print version of The MagPi. Not only do we deliver it globally, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico W!

The post Raspberry Pi Pico MIDI Gesture Controller appeared first on Raspberry Pi.

Get started with the Raspberry Pi AI HAT+

6 January 2025 at 17:23

In this quick tutorial, our documentation lead Nate Contino explains how to get your Raspberry Pi AI HAT+ working.

If you’re interested in learning more about Raspberry Pi’s imaging and computer vision tools, our Senior Principal Engineer, Naushir Patuck, recently hosted a webinar with DigiKey. You can sign up to watch it on demand if you missed it.

A Raspberry Pi 5 with the AI HAT+ attached via the GPIO pins; the Hailo accelerator on board is marked “26 TOPS”, referring to its ability to perform 26 trillion operations per second.
Raspberry Pi AI HAT+ 26 TOPS model

The Raspberry Pi AI HAT+ add-on board has a built-in Hailo AI accelerator compatible with Raspberry Pi 5. The NPU in the AI HAT+ can be used for applications including process control, security, home automation, and robotics.

The AI HAT+ is available in 13 and 26 tera-operations per second (TOPS) variants, built around the Hailo-8L and Hailo-8 neural network inference accelerators. The 13 TOPS variant works best with moderate workloads. The 26 TOPS variant can run larger networks, can run networks faster, and can more effectively run multiple networks simultaneously.

The AI HAT+ communicates using Raspberry Pi 5’s PCIe interface. The host Raspberry Pi 5 automatically detects the on-board Hailo accelerator and uses the NPU for supported AI computing tasks. Raspberry Pi OS’ built-in rpicam-apps camera applications automatically use the NPU to run compatible post-processing tasks.

Install

To use the AI HAT+, you will need a Raspberry Pi 5.

Each AI HAT+ comes with a ribbon cable, GPIO stacking header, and mounting hardware. Complete the following instructions to install your AI HAT+:

  1. First, ensure that your Raspberry Pi runs the latest software. Run the following command to update: $ sudo apt update && sudo apt full-upgrade
  2. Next, ensure that your Raspberry Pi firmware is up to date. Run the following command to see which firmware you’re running: $ sudo rpi-eeprom-update. If the reported date is 6 December 2023 or later, proceed to the next step. If it is earlier, run the following command to open the Raspberry Pi Configuration CLI: $ sudo raspi-config. Under Advanced Options > Bootloader Version, choose Latest, then exit raspi-config with Finish or the Escape key. Run $ sudo rpi-eeprom-update -a to update your firmware to the latest version, then reboot with $ sudo reboot.
  3. Disconnect the Raspberry Pi from power before beginning installation.
  4. For the best performance, we recommend using the AI HAT+ with the Raspberry Pi Active Cooler. If you have an Active Cooler, install it before installing the AI HAT+.
  5. Install the spacers using four of the provided screws. Firmly press the GPIO stacking header on top of the Raspberry Pi GPIO pins; orientation does not matter as long as all pins fit into place. Disconnect the ribbon cable from the AI HAT+, and insert the other end into the PCIe port of your Raspberry Pi. Lift the ribbon cable holder from both sides, then insert the cable with the copper contact points facing inward, towards the USB ports. With the ribbon cable fully and evenly inserted into the PCIe port, push the cable holder down from both sides to secure the ribbon cable firmly in place. Set the AI HAT+ on top of the spacers, and use the four remaining screws to secure it in place.
  6. Insert the ribbon cable into the slot on the AI HAT+. Lift the ribbon cable holder from both sides, then insert the cable with the copper contact points facing up. With the ribbon cable fully and evenly inserted into the port, push the cable holder down from both sides to secure the ribbon cable firmly in place.
  7. Congratulations, you have successfully installed the AI HAT+. Connect your Raspberry Pi to power; Raspberry Pi OS will automatically detect the AI HAT+.

Get started with AI on your Raspberry Pi

To start running AI-accelerated applications on your Raspberry Pi, check out our Getting Started with AI guide.

For more information about the AI HAT+, including mechanical specifications and operating environment limitations, see the product brief.

Don’t forget to sign up to watch Naushir’s webinar with DigiKey on demand.

The post Get started with the Raspberry Pi AI HAT+ appeared first on Raspberry Pi.

Did you dream of a Raspberry Pi Christmas?

25 December 2024 at 17:03

Season’s greetings! I set this up to auto-publish while I’m off sipping breakfast champagne, so don’t yell at me in the comments — I’m not really here.

I hope you’re having the best day, and if you unwrapped something made by Raspberry Pi for Christmas, I hope the following helps you navigate the first few hours with your shiny new device.

Power and peripherals

If you’ve received, say, a Raspberry Pi 5 or 500 on its own and have no idea what you need to plug it in, the product pages on raspberrypi.com often feature sensible suggestions for additional items you might need.

Scroll to the bottom of the Raspberry Pi 5 product page, for example, and you’ll find a whole ‘Accessories’ section featuring affordable things specially designed to help you get the best possible performance from your computer.

You can find all our hardware here, so have a scroll to find your particular Christmas gift.

Dedicated documentation

There are full instructions on how everything works if you know where to look. Our fancy documentation site holds the keys to all of your computing dreams.

For beginners, I recommend our ‘Getting started’ guide as your entry point.

I need a book

If, like me, you prefer to scoot through a printed book, then Raspberry Pi Press has you covered.

The Official Raspberry Pi Beginner’s Guide 5th Edition is a good idea if you’re a newbie. If you already know what you’re doing but are in need of some inspiration, then the Book of Making 2025 and The Official Raspberry Pi Handbook 2025 are packed with suggestions for Pi projects to fill the year ahead.

Raspberry Pi Beginner's Guide English edition

We’ve also published bespoke titles to help with Raspberry Pi Camera projects or to fulfil your classic games coding desires.

The Official Raspberry Pi Camera Guide 2nd Edition cover

Your one-stop shop for all your Raspberry Pi questions

If all the suggestions above aren’t working out for you, there are approx. one bajillion experts eagerly awaiting your questions on the Raspberry Pi forums. Honestly, I’ve barely ever seen a question go unanswered. You can throw the most esoteric, convoluted problem out there and someone will have experienced the same issue and be able to help. Lots of our engineers hang out in the forums too, so you may even get an answer direct from Pi Towers.

Be social

Outside of our official forums, you’ve all cultivated an excellent microcosm of Raspberry Pi goodwill on social media. Why not throw out a question or a call for project inspiration on our official Facebook, Threads, Instagram, TikTok, or “Twitter” account? There’s every chance someone who knows what they’re talking about will give you a hand.

Also, tag us in photos of your festive Raspberry Pi gifts! I will definitely log on to see and share those.

Again, we’re not really here, it’s Christmas!

I’m off again now to catch the new Wallace and Gromit that’s dropping on Christmas Day (BIG news here in the UK), but we’ll be back in early January to hang out with you all in the blog comments and on social.

Glad tidings, joy, and efficient digestion wished on you all.

The post Did you dream of a Raspberry Pi Christmas? appeared first on Raspberry Pi.

Third Eye assistive vision | The MagPi #149

19 December 2024 at 19:54

This #MagPiMonday, we take a look at Md. Khairul Alam’s potentially life-changing project, which aims to use AI to assist people living with a visual impairment.

Technology has long had the power to make a big difference to people’s lives, and for those who are visually impaired, the changes can be revolutionary. Over the years, there has been a noticeable growth in the number of assistive apps. As well as JAWS — a popular computer screen reader for Windows — and software that enables users to navigate phones and tablets, there are audio-descriptive apps that use smart device cameras to read physical documents and recognise items in someone’s immediate environment.

Understanding the challenges facing people living with a visual impairment, maker and developer Md. Khairul Alam has sought to create an inexpensive, wearable navigation tool that will free up the user’s hands and describe what someone would see from their own eyes’ perspective. Based around a pair of spectacles, it uses a small camera sensor that gathers visual information which is then sent to a Raspberry Pi for interpretation. The user is able to hear an audio description of whatever is being seen.

There’s no doubting the positive impact this project could have on scores of people around the world. “Globally, around 2.2 billion people don’t have the capability to see, and 90% of them come from low-income countries,” Khairul says. “A low-cost solution for people living with a visual impairment is necessary to give them flexibility so they can easily navigate and, having carried out research, I realised edge computer vision can be a potential answer to this problem.”

Cutting edge

Edge computer vision is potentially transformative. It gathers visual data from edge devices such as a camera before processing it locally, rather than sending it to the cloud. Since information is being processed close to the data source, it allows for fast, real-time responses with reduced latency. This is particularly vital when a user is visually impaired and needs to be able to make rapid sense of the environment.

The connections are reasonably straightforward: plug the Xiao ESP32S3 Sense module into a Raspberry Pi

For his project, Khairul chose to use the Xiao ESP32S3 Sense module which, aside from a camera sensor and a digital microphone, has an integrated Xtensa ESP32-S3R8 SoC, 8MB of flash memory, and a microSD card slot. This was mounted onto the centre of a pair of spectacles and connected to a Raspberry Pi computer using a USB-C cable, with a pair of headphones then plugged into Raspberry Pi’s audio out port. With those connections made, Khairul could concentrate on the project’s software.

As you can imagine, machine learning is an integral part of this project; it needs to accurately detect and identify objects. Khairul used Edge Impulse Studio to train his object detection model. This tool is well equipped for building datasets and, in this case, one needed to be created from scratch. “When I started working on the project, I did not find any ready-made dataset for this specific purpose,” he tells us. “A rich dataset is very important for good accuracy, so I made a simple dataset for experimental purposes.”

To help test the device, Khairul has been using an inexpensive USB-C portable speaker

Object detection

Khairul initially concentrated on six objects, uploading 188 images to help identify items including chairs, tables, beds, and basins. The more images he could take of an object, the greater the accuracy — but it posed something of a challenge. “For this type of work, I needed a unique and rich dataset for a good result, and this was the toughest job,” he explains. Indeed, he’s still working on creating a larger dataset, and these things take a lot of time; but upon uploading the model to the Xiao ESP32S3 Sense, it has already begun to yield some positive results.

When an object is detected, the module returns the object’s name and position. “After detecting and identifying the object, Raspberry Pi is then used to announce its name — Raspberry Pi has built-in audio support, and Python has a number of text-to-speech libraries,” Khairul says. The project uses a free software package called Festival, written by the Centre for Speech Technology Research at the University of Edinburgh. This converts the text to speech, which can then be heard by the user.
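The announcement step Khairul describes can be sketched in Python. This is a hypothetical illustration, not Third Eye’s actual code: the detection tuple format and function names are our own, and the Festival call is guarded so the script simply prints the sentence when Festival isn’t installed.

```python
import shutil
import subprocess

def describe(detections):
    """Turn (label, x, y) detections into a short spoken sentence.
    The tuple format here is illustrative, not Third Eye's real output."""
    if not detections:
        return "No objects detected."
    names = ", ".join(label for label, _x, _y in detections)
    return f"I can see: {names}."

def speak(text):
    """Pipe text to Festival's text-to-speech mode if it is available."""
    if shutil.which("festival"):
        subprocess.run(["festival", "--tts"], input=text.encode(), check=True)
    else:
        print(text)  # fallback when Festival is not installed

speak(describe([("chair", 120, 80), ("table", 40, 200)]))
```

On a Raspberry Pi with Festival installed (`sudo apt install festival`), the sentence is spoken through the audio output instead of printed.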

A tidier solution will be needed — including a waterproof case — for real-world situations

For convenience, all of this is currently being powered by a small rechargeable lithium-ion battery, which is connected by a long wire to enable it to sit in the user’s pocket. “Power consumption has been another important consideration,” Khairul notes, “and because it’s a portable device, it needs to be very power efficient.” Since Third Eye is designed to be worn, it also needs to feel right. “The form factor is a considerable factor — the project should be as compact as possible,” Khairul adds.

Going forward

Third Eye is still in a proof-of-concept stage, and improvements are already being identified. Khairul knows that the Xiao ESP32S3 Sense will eventually fall short of fulfilling his ambitions for the project as it expands in the future and, with a larger machine learning model proving necessary, Raspberry Pi is likely to take on more of the workload.

“To be very honest, the ESP32S3 Sense module is not capable enough to respond using a big model. I’m just using it for experimental purposes with a small model, and Raspberry Pi can be a good alternative,” he says. “I believe for better performance, we may use Raspberry Pi for both inferencing and text-to-speech conversions. I plan to completely implement the system inside a Raspberry Pi computer in the future.”

Other potential future tweaks are also stacking up. “I want to include some control buttons so that users can increase and decrease the volume and mute the audio if required,” Khairul reveals. “A depth camera would also give the user important information about the distance of an object.” With the project shared on Hackster, it’s hoped the Raspberry Pi community could also assist in pushing it forward. “There is huge potential for a project such as this,” he says.

The MagPi #149 out NOW!

You can grab the new issue right now from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available at our online store, which ships around the world. You can also get it via our app on Android or iOS.

You can also subscribe to the print version of The MagPi. Not only do we deliver it globally, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico W!


PIOLib: A userspace library for PIO control

By: PhilE
17 December 2024 at 19:51

Dip your toes into the world of PIO on Raspberry Pi 5 using PIOLib

The launch of Raspberry Pi 5 represented a significant change from previous models. Building chips that run faster and use less power, while continuing to support 3.3V I/O, presents real, exciting challenges. Our solution was to split the main SoC (System on Chip) in two — the compute half, and the I/O half — and put a fast interconnect (4-lane PCIe Gen 3) between them. The SoC on Raspberry Pi 5 is the Broadcom BCM2712, and the I/O processor (which used to be known in the PC world as the ‘southbridge’) is Raspberry Pi RP1.

PIOLib: A userspace library for PIO control

Along with all the usual peripherals — USB, I2C, SPI, DMA, and UARTs — RP1 included something a bit more interesting. One of RP2040’s distinguishing features was a pair of PIO blocks, deceptively simple bits of Programmable I/O capable of generating and receiving patterns on a number of GPIOs. With sufficient cunning, users have been able to drive NeoPixel LEDs and HDMI displays, read from OneWire devices, and even connect to an Ethernet network.

RP1 is blessed with a single PIO block — almost identical to each of RP2040’s two — with four state machines and a 32-entry instruction memory. However, apart from a few hackers out there, it has so far lain dormant; it would be great to make this resource available to users for their own projects, but there’s a catch.

Need for speed

The connection between RP1’s on-board Arm Cortex-M3 cores and the PIO hardware was made as fast as possible, but at the cost of making the PIO registers inaccessible over PCIe; the only exceptions are the state machine FIFOs — the input and output data pipes — that can be reached by DMA (direct memory access). This makes it impossible to control PIO directly from the host processors, so an alternative is required. One option would be to allow the uploading of code to run on the M3 cores, but there are a number of technical problems with that approach:

1. We need to “link” the uploaded code with what is already present in the firmware — think of it as knitting together squares to make a quilt (or a cardigan for Harry Styles). For that to work, the firmware needs a list of the names and addresses of everything the uploaded code might want to access, something that the current firmware doesn’t have.

2. Third-party code running on M3 cores presents a security risk — not in the sense that it might steal your data (although that might be possible…), but that by accident or design it could disrupt the operation of your Raspberry Pi 5.

3. Once the M3s have been opened up in that way, we can’t take that access away again, and that’s not a step we’re prepared to take.

Not like that, like this

For these reasons, we took a different path. 

The latest RP1 firmware implements a mailbox interface: a simple mechanism for sending messages between two parties. The kernel has corresponding mailbox and firmware drivers, and an rp1-pio driver that presents an ioctl() interface to user space. The end result of adding all this software is the ability to write programs using the PIO SDK that can run in user space or in kernel drivers.

Latency trade-off

Most of the PIOLib functions cause a message to be sent to the RP1 firmware, which performs the operation — possibly just a single I/O access — and replies. Although this makes it simple to run PIO programs on Raspberry Pi 5 (and the rest of the Raspberry Pi family), it does come at a cost. All that extra software adds latency; most PIOLib operations take at least 10 microseconds. For PIO software that just creates a state machine and then reads or writes data, this is no problem — the WS2812 LED and PWM code are good examples of this. But anything that requires close coupling between the state machine and driver software is likely to have difficulties.

The first official use of PIOLib is the new pwm-pio kernel driver. It presents a standard Linux PWM interface via sysfs, and creates a very stable PWM signal on any GPIO on the 40-pin header (GPIOs 0 to 27). You can configure up to four of these PWM interfaces on Raspberry Pi 5; you are limited by the number of state machines. Like many peripherals, you create one with a Device Tree overlay:

dtoverlay=pwm-pio,gpio=7
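Once the overlay has loaded, the resulting PWM channel can be driven through the standard Linux sysfs PWM interface. The sketch below is a minimal illustration, assuming the new controller appears as `pwmchip0` (the chip number varies by system); the helper converts a frequency and duty fraction into the nanosecond values sysfs expects.

```python
from pathlib import Path

def pwm_ns(freq_hz, duty_fraction):
    """Convert frequency (Hz) and duty fraction (0..1) into the
    (period_ns, duty_cycle_ns) pair the sysfs PWM interface expects."""
    period_ns = round(1e9 / freq_hz)
    return period_ns, round(period_ns * duty_fraction)

def set_pwm(chip, channel, freq_hz, duty_fraction):
    """Drive a pwm-pio channel via sysfs; the chip number is an assumption."""
    base = Path(f"/sys/class/pwm/pwmchip{chip}")
    if not base.exists():
        print("PWM chip not present - is the dtoverlay loaded?")
        return
    ch = base / f"pwm{channel}"
    if not ch.exists():
        (base / "export").write_text(str(channel))
    period, duty = pwm_ns(freq_hz, duty_fraction)
    (ch / "period").write_text(str(period))
    (ch / "duty_cycle").write_text(str(duty))
    (ch / "enable").write_text("1")

# set_pwm(0, 0, 1000, 0.5)  # 1kHz at 50% duty on the GPIO chosen by the overlay;
#                           # run as root on a Raspberry Pi 5 with the overlay loaded
```

For example, `pwm_ns(50, 0.075)` yields the 20ms period and 1.5ms pulse a typical hobby servo expects.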

One feature absent from this first release is interrupt support. RP1 provides two PIO interrupts, which can be triggered by the PIO instruction IRQ (interrupt request), and these could be used to trigger actions on the SoC.

Over time, we may discover that there are some common usage patterns — groups of the existing PIOLib functions that often appear together. Adding those groups to the firmware as single, higher-level operations may allow more complex PIO programs to run. These and other extensions are being considered.

Let me play!

If you’d like to try PIOLib, you will need:

  • The library (and examples)
  • The latest kernel (sudo apt update; sudo apt upgrade)
  • The latest EEPROM (see the ‘Advanced Options’ section of raspi-config)

I’ll leave you with a video of some flashing lights — two strings of WS2812 LEDs being driven from a Raspberry Pi 5. It’s beginning to look a bit festive!


We made our own WOPR for Pi Towers

16 December 2024 at 19:48

Ah, the WOPR — or “War Operation Plan Response” for those who enjoy abbreviations that sound like a robot from the future, only less like a friend and more like an overzealous maths teacher.

The WOPR is the supercomputer from the 1983 movie WarGames. It doesn’t understand sarcasm, it can’t sense when it’s being pranked, and it certainly doesn’t know when it’s been told to “play a game” — much like our Maker in Residence, Toby, who built it to delight and entertain all visitors to the Pi Towers Maker Lab.

What’s inside?

A script runs on boot, which twinkles the NeoPixels in the traditional 1980s supercomputer colours, yellow and red.

Another script can be run to play a short clip from the film WarGames on the Touch Display 2 screen, explaining the WOPR. At the press of a button on the Touch Display, our faux WOPR also parrots famous lines from the film, such as: “Shall we play a game?” and “How about a nice game of chess?”

For those who wish to linger a little longer in the Maker Lab, Toby devised a game in which clips from 1980s films and music videos flash (a little too fast, in my opinion) up on the screen, with your job being to enthusiastically shout out where each clip is from.

Authentic enclosure

The body of the WOPR is a combination of 3D-printed plastics and laser-cut MDF painted in industrial grey, with Cricut silver lettering on the side. Everything is glued together, and a lot of sanding was required to make it appear as though it’s a sleek, fancy contraption from the future.


New Raspberry Pi 500 and Monitor: reviews, teardowns, builds

12 December 2024 at 22:30

After a bumper autumn of product launches, we thought why not go full Santa as we head towards our winter break and give you all another double product launch? On Monday, we released Raspberry Pi 500 and the Raspberry Pi Monitor into the world. Here’s what some of your favourite YouTubers did with them.

VEEB Projects

VEEB get major points for their impossibly simple yet genius idea, leaving us at Pi Towers wondering “why didn’t I think of that?” They mounted an SD card holder on the back of the Raspberry Pi Monitor’s kickstand, making it super easy to switch them out and giving them access to three different PC systems at their fingertips — a desktop PC, a retro gaming centre, and a music streamer.

VEEB Project Pi 500 Monitor SD card holder
If you’d like to perform the sincerest form of flattery, you can download the printable files for VEEB’s SD card storage case and make your own.

NetworkChuck

Chuck asks the question that Mad Men‘s Don Draper — actually, no, copywriter extraordinaire Peggy — would begin with: “who is this for?” Adorable cameos from The Littles in his review answer it for him, with the very littlest ably assisting in the plug-and-play set up of her new desktop PC before settling in to play some Roblox.

He also gives us a handy side-by-side comparison with his Raspberry Pi 400.

Disclaimer: Raspberry Pi 500 is not edible

Jeff Geerling

Jeff gets straight to the point: “the keyboard is the computer”. He also wins the prize for most avant-garde presentation of the Monitor and Pi 500 side by side in the above video thumbnail.

And while Jeff proper has decorum and self-restraint, Level 2 Jeff couldn’t help himself, going right ahead and cracking his Pi 500 open to see what’s inside.

Kevin McAleer

Kevin could not wait until his usual Sunday night livestream, and went live with a detailed demo of Raspberry Pi 500 and the Raspberry Pi Monitor the day after launch. If deep dives are your bag, grab snacks and settle in for this hour-long opus.

Kev’s a professional YouTuber, though, so if you haven’t the time, he obviously also rolled out a succinct six-minute video on our latest creations.

leepspvideo

And if you can’t get enough destruction, leepspvideo also did a nice teardown of Raspberry Pi 500, and tested the audio output on the Raspberry Pi Monitor, checking that it works great with his Raspberry Pi 5. Furthermore, he is accompanied by an excellent cat for the majority of the review.

Gary Explains

We really liked Gary’s straightforward “what is it, what does it do, how much does it cost?” approach. He too pops the hood to give you a nice clear look inside Raspberry Pi 500.

ETA Prime

We know where ETA Prime’s heart lies when they proclaim Raspberry Pi 500’s gaming possibilities right at the start of their review and teardown. In preparation for their gaming bonanza, a little overclocking is tested and some benchmarks run, but you’ll need to subscribe to ETA Prime’s channel to keep up with the promised gaming videos.

Did we miss anyone? These were all the videos we’d seen at the time of writing, but we’re 89% sure we’re horribly behind the times already. Drop a link to more reviews and leave a comment if you have an idea for a Raspberry Pi 500 project you’d like to see.


Raspberry Pi 500 and Raspberry Pi Monitor on sale now

9 December 2024 at 15:00

Just in time for Christmas, we’re delighted to announce the release of two hotly anticipated products that we think will look great under the tree. One of them might even fit in a stocking if you push hard enough. Introducing Raspberry Pi 500, available now at $90, and the Raspberry Pi Monitor, on sale at $100: together, they’re your complete Raspberry Pi desktop setup.

With Raspberry Pi, your desk can look this good

Integral calculus

Our original mission at Raspberry Pi was to put affordable, programmable personal computers in the hands of young people all over the world. And while we’ve taken some detours along the way – becoming one of the world’s largest manufacturers of industrial and embedded computers – this mission remains at the heart of almost everything we do. It drives us to make lower-cost products like the $15 Raspberry Pi Zero 2 W, and more powerful products, like our flagship Raspberry Pi 5 SBC. These products provide just the essential processing element of a computer, which can be combined with the family television, and second-hand peripherals, to build a complete and cost-effective system.

But over time we have come to understand the benefits of integration: some people are better served by a system that is ready to use straight out of the box. This need was dramatized during the early days of the COVID pandemic, when we worked with the Raspberry Pi Foundation to deliver thousands of Raspberry Pi 4 Desktop Kits and monitors to young people studying from home in the UK. Our experiences with that programme informed the development of Raspberry Pi 400, our all-in-one PC, whose form factor (and name) harks back to the great 8-bit and 16-bit computers – the BBC Micro, Sinclair Spectrum, and Commodore Amiga – of the 1980s and 1990s.

Meet Raspberry Pi 500

In the four years since it launched, Raspberry Pi 400 has become a hugely popular choice for enthusiasts and educators. And today, we’re launching its successor, Raspberry Pi 500, bringing the features and performance of the Raspberry Pi 5 platform to our all-in-one form factor:

  • 2.4GHz quad-core 64-bit Arm Cortex-A76 processor
  • 8GB LPDDR4X-4267 SDRAM
  • VideoCore VII GPU, supporting OpenGL ES 3.1 and Vulkan 1.3
  • Dual 4Kp60 HDMI® display output
  • Dual-band 802.11ac Wi-Fi® and Bluetooth 5.0
  • 2 × USB 3.0 ports, supporting simultaneous 5Gbps operation
  • 1 × USB 2.0 port
  • Gigabit Ethernet port
  • Horizontal 40-pin Raspberry Pi GPIO connector

The ultimate compact PC

Raspberry Pi 500 is priced at $90, including a 32GB Raspberry Pi-branded SD card, and is also available in a $120 Desktop Kit, which adds:

  • Raspberry Pi Mouse
  • Raspberry Pi 27W USB-C Power Supply
  • 2m micro HDMI to HDMI cable
  • Raspberry Pi Beginner’s Guide, 5th Edition

The vision thing – an official Raspberry Pi Monitor

Although it’s highly integrated, Raspberry Pi 500 is only half the story: to build a complete system, you still need a display device. Which is why we’re also launching the Raspberry Pi Monitor, available now at $100. Designed to coordinate perfectly with your Raspberry Pi 500 or cased Raspberry Pi 5, it incorporates a 15.6″ full HD IPS panel with a 45% colour gamut and an 80° viewing angle, together with a pair of 1.2W speakers, in a slender enclosure with a fold-away integrated stand and VESA mounting points.

The perfect desktop display companion for your Raspberry Pi or lesser computer

Power is provided via a USB-C connector. Cost-conscious users can power the monitor directly from their Raspberry Pi via the included USB-A to USB-C cable; in this mode display brightness is limited to 60% of maximum (still quite bright!) and volume to 50% of maximum (still quite loud!). Using a dedicated USB-C supply capable of delivering 5V/3A, like the Raspberry Pi 15W USB-C Power Supply, enables the full brightness and volume ranges.

Faster, better, cheaper: Raspberry Pi 400 price cuts

While we’re incredibly excited about Raspberry Pi 500, we need to remember that cost remains a barrier to access for many people, young and old. So we’re also taking this opportunity to cut the price of Raspberry Pi 400 from $70 to $60, and the Raspberry Pi 400 Personal Computer Kit from $100 to $80. We’re also bundling a Raspberry Pi-branded SD card with every Raspberry Pi 400, to help you get the best possible performance out of the system.

We know that quite a few of you have been eagerly awaiting both of our new products, and we hope you enjoy them now they’re here. We’ve seen Raspberry Pi 400 everywhere from retro gaming setups to university exam facilities and hospital offices; we’re really looking forward to finding out where Raspberry Pi 500 and our new Raspberry Pi Monitor end up.


Raspberry Pi Connect for Organisations, plus full-screen support

6 December 2024 at 22:48

Earlier this year we told you all about our awesome new remote access service, Raspberry Pi Connect. We said we wanted to make it as useful as possible for our individual users, and provide it for free on Raspberry Pi devices. But we knew our industrial and embedded customers would like to use the functionality it provided, and more. Since launching Raspberry Pi Connect, we’ve been gathering information from these customers to understand what they are using it for and what they’d like to see.

Also, for all you individual users, we’ve not stopped developing the service, so read on for new functionality for you too!

Connect for Organisations

Feedback from our commercial customers shows that Connect solves a problem many of them share. When supporting their products in the field, whether that’s fifty metres up a radio transmission tower or at a customer site, it is difficult to maintain those systems when things go wrong, and many commercial customers have found Connect the perfect solution. But the service had a limitation: each device is ‘owned’ by a single user, and no other users can access it. One customer was understandably worried about a member of their IT team disappearing with control of all their customers’ devices!

There are also situations where a customer has only a single Raspberry Pi, but wants to provide many users with access to it. Or where a school with a set of Raspberry Pis is giving each of their students access to them, so they can develop software remotely. Introducing Raspberry Pi Connect for Organisations!

Connect for Organisations allows you to create an organisation account which can own the Raspberry Pi devices registered to it:

Much like Raspberry Pi Connect for individual users, devices are added to the organisation’s account and can be controlled through the web page. To switch between your personal account and an organisation account, you can just click on the switch icon in the top left. Of course, now you have an organisation, it is going to need users:

Users can be invited into the organisation easily. Currently we’re neither limiting the number of users nor charging per user: we don’t anticipate users per se consuming much bandwidth, storage, or processing resource, so we suspect that would be an unnecessary complication. As you can see, there are only two roles, administrator and member; only administrators can add or remove devices.

What does it cost?

We’ve kept pricing simple. Raspberry Pi Connect for Organisations costs $0.50 per device per month, based on the maximum number of devices registered in the month, and you get unlimited users.

Next up

Now that organisation functionality is available, we’ve got some other things to start working on. To give you an idea of where we’re going with Connect, some of these are:

  • Device tagging: tag devices with your own labels, and use those tags to search and identify different classes of device
  • Access control lists: using tags to give users different levels of access to devices
  • Ability to sign devices up from Raspberry Pi Imager: boot direct to headless installation!
  • Capacity for bulk provisioning of Raspberry Pi Connect device secrets during manufacture of Compute Module- and Raspberry Pi-based products

Now for the eye candy

Some of you may have noticed a new button on the screen sharing interface:

The ability to enter full-screen mode at the click of a button gives you a much better view of the remote screen, and makes working on it feel more natural: a little bit of useful functionality for all Connect users. We hope you like it!


Valve’s Steam Link on Raspberry Pi

3 December 2024 at 17:38

Earlier this year we released Raspberry Pi Connect, which lets you access your Raspberry Pi from anywhere, either through a remote shell interface or by screen sharing. But perhaps, occasionally, you might need to screen share some other computer; what if you want to screen share your big PC, with its gaming graphics capabilities, around your house? Is it possible to use it to play your games from anywhere? Happily, thanks to Valve’s hugely popular Steam Link product, the answer is yes. With Steam Link, our kids can — OK, we can — play PC games on any computer in the house, without having to lug the PC around. And now, you can run Steam Link on your Raspberry Pi 5!

steam link running on Jeff Geerling's set up
Thanks for the image, Jeff Geerling!

Steam Link tackles some genuinely difficult challenges to let us play graphics-heavy games remotely. Screen sharing is not normally optimised for high-quality video, where you have to work quite hard to keep both the bitrate and the latency down; you also don’t normally transmit audio as well as video, and you need to do a bit of magic to talk to game controllers. The smart folks at Valve have solved quite a few hard problems to bring this into being.

Even better, Sam Lantinga from Valve — who is also the developer of SDL, a simple multimedia programming library — has been working for a little while on getting Steam Link to run on Raspberry Pi 5. The previous method used to run Steam Link on Raspberry Pi OS no longer worked very well after we moved away from the closed-source Broadcom multimedia libraries, and with the move to Wayland, a different approach was needed. Sam has been working with the Raspberry Pi software team to use our hardware in the most efficient way possible.

Valve’s announcement of Steam Link v1.3.13 shows that Sam has been able to get Steam Link working at some amazing rates on Raspberry Pi 5, including 4Kp60 and even 1080p240 (obviously you’ll need a suitable monitor for that!).

Jeff running Steam Link on Raspberry Pi 5

To install Steam Link yourself, grab yourself an up-to-date Raspberry Pi OS image and type:

sudo apt update
sudo apt upgrade
sudo apt install steamlink
steamlink

Enjoy!


Deploying Ultralytics YOLO models on Raspberry Pi devices

29 November 2024 at 21:18

In this guest post, Ultralytics, creators of the popular YOLO (You Only Look Once) family of convolutional neural networks, share their insights on deploying and running their powerful AI models on Raspberry Pi devices, offering solutions for a wide range of real-world problems.

Computer vision is redefining industries by enabling machines to process and understand visual data like images and videos. To truly grasp the impact of vision AI, consider this: Ultralytics YOLO models, such as Ultralytics YOLOv8 and the newly launched Ultralytics YOLO11, which support computer vision tasks like object detection and image classification, have been used over 100 billion times. There are 500 to 600 million uses every day and thousands of uses every second across applications like robotics, agriculture, and more.

YOLO can be used in the agriculture sector

To take this a step further, Ultralytics has partnered with Raspberry Pi to bring vision AI to one of the most accessible and versatile computing platforms. This collaboration makes it possible to deploy YOLO models directly on Raspberry Pi, enabling real-time computer vision applications in a compact, cost-effective, and easy-to-use way.

By supporting such integrations, Ultralytics aims to enhance model compatibility across diverse deployment environments. For instance, the Sony IMX500, the intelligent vision sensor with on-sensor AI processing capabilities included in the Raspberry Pi AI Camera, works with Raspberry Pi to run YOLO models, enabling advanced edge AI applications.

In this article, we’ll explore how YOLO models can be deployed on Raspberry Pi devices, look at real-world use cases, and highlight the benefits of this exciting collaboration for vision AI projects. Let’s get started!

Enabling edge AI solutions with Raspberry Pi and Ultralytics YOLO

Raspberry Pi is an affordable and widely used device, making it a great choice for deploying vision AI models like YOLO. Running Ultralytics YOLO models on Raspberry Pi enables real-time computer vision capabilities, such as object detection, directly on the device, eliminating the need for cloud resources. Local processing reduces latency and improves privacy, making it ideal for applications where speed and data security are essential.

Ultralytics offers optimized models, like YOLO11, that can run efficiently on relatively resource-constrained devices, with the Nano and Small model variants providing the best performance on lower-power hardware. Leveraging these optimized models on Raspberry Pi devices is easy with the Ultralytics Python API or CLI, ensuring smooth deployment and operation. In addition to this, Ultralytics also supports automated testing for Raspberry Pi devices on GitHub Actions to regularly check for bugs and ensure the models are ready for deployment.

Another interesting feature of the Ultralytics YOLO models is that they can be exported in various formats (as shown in the image below), including NCNN, a lightweight neural network inference framework. Designed for devices with relatively constrained computing power, such as Raspberry Pi’s Arm64 architecture, NCNN delivers faster inference times by optimizing model weights and activations through techniques like quantization.

Benchmarking Ultralytics YOLO11 inference on Raspberry Pi

Raspberry Pi, Sony IMX500, and YOLO for real-time AI applications

The Raspberry Pi AI Camera is a perfect example of how this integration helps support compatibility across a range of deployment environments. Its IMX500 intelligent vision sensor comes with on-sensor AI processing, allowing it to analyze visual data directly and output metadata rather than raw images. While the IMX500 is powerful on its own, it needs to be paired with a device like Raspberry Pi to run YOLO models effectively. In this setup, a Raspberry Pi acts as the host device, facilitating communication with the AI Camera and enabling real-time AI applications powered by YOLO.

Raspberry Pi AI Camera incorporates the Sony IMX500

Real-world examples of YOLO applications on Raspberry Pi

Raspberry Pi, combined with the Ultralytics YOLO models, unlocks countless possibilities for real-world applications. This collaboration bridges the gap between experimental AI setups and production-ready solutions, offering an affordable, scalable, and practical tool for a wide range of industries. 

Here are a few impactful use cases:

  • Robotics: YOLO can enable robots to navigate environments, recognize objects, and perform tasks with precision, making them more autonomous and efficient
  • Drones: With YOLO running on Raspberry Pi, drones can detect obstacles, track objects, and perform surveillance in real-time, enhancing their capabilities in industries like delivery and security
  • Quality control in manufacturing: YOLO can help identify defects in production lines, ensuring higher quality standards with automated inspection
  • Smart farming: By using YOLO to monitor crop health and detect pests, farmers can make data-driven decisions, improving yields and reducing resource waste

Benefits of running Ultralytics YOLO models on Raspberry Pi for edge AI

There are many advantages to deploying YOLO models on Raspberry Pi, making it a practical and affordable option for edge AI applications. For instance, performance can be boosted by using hardware accelerators like Google Coral Edge TPU, enabling faster and more efficient real-time processing.

Coral Edge TPU connected to a Raspberry Pi

Here are some of the other key benefits:

  • Scalability: The setup can be extended to multiple devices, making it a great choice for larger projects such as factory automation or smart city systems
  • Flexibility: YOLO’s compatibility ensures that developers can create solutions that work seamlessly across a variety of hardware setups, offering versatility for different applications
  • Community and support: With extensive resources, tutorials, and an active community, Ultralytics provides the support needed for smooth deployment and troubleshooting of YOLO models on Raspberry Pi

To the edge and beyond with Ultralytics YOLO and Raspberry Pi

YOLO and Raspberry Pi are making edge AI applications more accessible, impactful, and transformative than ever before. By combining the advanced capabilities of Ultralytics YOLO models with the cost-effectiveness and flexibility of Raspberry Pi, this partnership allows developers, researchers, and hobbyists to bring innovative ideas to life.

With support for devices like the Raspberry Pi AI Camera and scalable hardware options, this collaboration unlocks opportunities across industries, from robotics and agriculture to manufacturing and beyond.

Explore the possibilities of AI with Ultralytics: visit the Ultralytics GitHub repository to discover how vision AI is making a change in sectors like healthcare and self-driving cars, and join the Ultralytics community to be part of the future of vision AI.

The post Deploying Ultralytics YOLO models on Raspberry Pi devices appeared first on Raspberry Pi.

Compute Module 5 on sale now from $45

27 November 2024 at 14:59

Today we’re happy to announce the much-anticipated launch of Raspberry Pi Compute Module 5, the modular version of our flagship Raspberry Pi 5 single-board computer, priced from just $45.

An unexpected journey

We founded the Raspberry Pi Foundation back in 2008 with a mission to give today’s young people access to the sort of approachable, programmable, affordable computing experience that I benefitted from back in the 1980s. The Raspberry Pi computer was, in our minds, a spiritual successor to the BBC Micro, itself the product of the BBC’s Computer Literacy Project.

But just as the initially education-focused BBC Micro quickly found a place in the wider commercial computing marketplace, so Raspberry Pi became a platform around which countless companies, from startups to multi-billion-dollar corporations, chose to innovate. Today, between seventy and eighty percent of Raspberry Pi units go into industrial and embedded applications.

While many of our commercial customers continue to use the “classic” single-board Raspberry Pi form factor, there are those whose needs aren’t met by that form factor, or by the default set of peripherals that we choose to include on the SBC product. So, in 2014 we released the first Raspberry Pi Compute Module, providing just the core functionality of Raspberry Pi 1 – processor, memory, non-volatile storage and power regulation – in an easy-to-integrate SODIMM module.

Compute Modules make it easier than ever for embedded customers to build custom products which benefit from our enormous investments in the Raspberry Pi hardware and software platform. Every subsequent generation of Raspberry Pi, except for Raspberry Pi 2, has spawned a Compute Module derivative. And today, we’re happy to announce the launch of Compute Module 5, the modular version of our flagship Raspberry Pi 5 SBC.

Meet Compute Module 5

Compute Module 5 gives you everything you love about Raspberry Pi 5, but in a smaller package:

  • A 2.4GHz quad-core 64-bit Arm Cortex-A76 CPU
  • A VideoCore VII GPU, supporting OpenGL ES 3.1 and Vulkan 1.3
  • Dual 4Kp60 HDMI® display output
  • A 4Kp60 HEVC decoder
  • Optional dual-band 802.11ac Wi-Fi® and Bluetooth 5.0
  • 2 × USB 3.0 interfaces, supporting simultaneous 5Gbps operation
  • Gigabit Ethernet, with IEEE 1588 support
  • 2 × 4-lane MIPI camera/display transceivers
  • A PCIe 2.0 x1 interface for fast peripherals
  • 30 GPIOs, supporting 1.8V or 3.3V operation
  • A rich selection of peripherals (UART, SPI, I2C, I2S, SDIO, and PWM)

It is available with 2GB, 4GB, or 8GB of LPDDR4X-4267 SDRAM, and with 16GB, 32GB, or 64GB of MLC eMMC non-volatile memory. 16GB SDRAM variants are expected to follow in 2025.

Compute Module 5 is mechanically compatible with its predecessor, Compute Module 4, exposing all signals through a pair of high-density perpendicular connectors, which attach to corresponding parts on the customer’s carrier board. Additional stability is provided by four M2.5 mounting holes arranged at the corners of the board.

There are a small number of changes to the pin-out and electrical behaviour of the module, mostly associated with the removal of the two two-lane MIPI interfaces, and the addition of two USB 3.0 interfaces. A detailed summary of these changes can be found in the Compute Module 5 datasheet.

Accessories accessorise

But Compute Module 5 is only part of the story. Alongside it, we’re offering a range of new accessories to help you get the most out of our new modular platform.

IO Board

Every generation of Compute Module has been accompanied by an IO board, and Compute Module 5 is no exception.

The Raspberry Pi Compute Module 5 IO Board breaks out every interface from a Compute Module 5. It serves both as a development platform and as reference baseboard (with design files in KiCad format), reducing the time to market for your Compute Module 5-based designs.

The IO Board features:

  • A standard 40-pin GPIO connector
  • 2 × full-size HDMI 2.0 connectors
  • 2 × 4-lane MIPI DSI/CSI-2 FPC connectors (22-pin, 0.5mm pitch cable)
  • 2 × USB 3.0 connectors
  • A Gigabit Ethernet jack with PoE+ support (requires a separate Raspberry Pi PoE+ HAT+)
  • An M.2 M-key PCIe socket (for 2230, 2242, 2260 and 2280 modules)
  • A microSD card socket (for use with Lite modules)
  • An RTC battery socket
  • A 4-pin fan connector

Power is provided by a USB-C power supply (sold separately).

IO Case

As in previous generations, we expect some users to deploy the IO Board and Compute Module combination as a finished product in its own right: effectively an alternative Raspberry Pi form factor with all the connectors on one side. To support this, we are offering a metal case which turns the IO Board into a complete encapsulated industrial-grade computer. The Raspberry Pi IO Case for Raspberry Pi Compute Module 5 includes an integrated fan, which can be connected to the 4-pin fan connector on the IO Board to improve thermal performance.

Cooler

While Compute Module 5 is our most efficient modular product yet in terms of energy consumed per instruction executed, like all electronic products it gets warm under load. The Raspberry Pi Cooler for Raspberry Pi Compute Module 5 is a finned aluminium heatsink, designed to fit on a Compute Module 5, and including thermal pads to optimise heat transfer from the CPU, memory, wireless module and eMMC.

Antenna Kit

Wireless-enabled variants of Compute Module 5 provide both an onboard PCB antenna, and a UFL connector for an external antenna. Use of the Raspberry Pi Antenna Kit (identical to that already offered for use with Compute Module 4) with Compute Module 5 is covered by our FCC modular compliance.

Development Kit

The Raspberry Pi Development Kit for Raspberry Pi Compute Module 5 comprises a Compute Module 5, an IO Board, and all the other accessories you need to start building your own design:

  • CM5104032 (Compute Module 5, with wireless, 4GB RAM, 32GB eMMC storage)
  • IO Case for Compute Module 5
  • Compute Module 5 IO Board
  • Cooler for Compute Module 5
  • Raspberry Pi 27W USB-C PD Power Supply (local variant as applicable)
  • Antenna Kit
  • 2 × Raspberry Pi standard HDMI to HDMI Cable
  • Raspberry Pi USB-A to USB-C Cable

Early adopters

Today’s launch is accompanied by announcements of Compute Module 5-based products from our friends at KUNBUS and TBS, who have built successful products on previous Raspberry Pi Compute Modules and whom we have supported to integrate our new module into their latest designs. Other customers are preparing to announce their own Compute Module 5-powered solutions over the next weeks and months. The world is full of innovative engineering companies of every scale, and we’re excited to discover the uses to which they’ll put our powerful new module. Try Compute Module 5 for yourself and let us know what you build with it.

The post Compute Module 5 on sale now from $45 appeared first on Raspberry Pi.

Powering industrial innovation: Compute Module 5 meets Revolution Pi

By: Dave Lee
27 November 2024 at 14:58

Revolution Pi has been designing and manufacturing successful products with Raspberry Pi Compute Modules for years. In this guest post, they talk about why they continue to choose Raspberry Pi technology, and discuss their experience designing with our brand-new Compute Module 5.

Revolution Pi has been building flexible industrial devices with Raspberry Pi Compute Modules since the very beginning. As a long-time partner, we have witnessed their impressive evolution from the first to the fifth generation over the past ten years.

Technical advancements that matter

Raspberry Pi Compute Module 5’s enhancements directly address industrial requirements: it provides quad-core CPU performance up to 2.4GHz, a built-in USB 3.2 controller, and an improved PCIe controller. Raspberry Pi’s continuous integration of more interfaces directly on the Compute Module advances its capabilities while freeing up valuable space on our carrier board. These well-integrated interfaces within the Raspberry Pi ecosystem enable more flexible hardware designs. This allowed us to equip the RevPi Connect 5 with up to four multi-Gigabit Ethernet ports, letting industrial users connect multiple industrial fieldbuses and other networks with low latency.

The RevPi Connect 5 consists of two PCBs with a big bolted-on heat sink

Collaborative development process

Working with Raspberry Pi on this has been exceptional. They understand what industrial developers need. We received early samples to test with, which was critical. It allowed us to iterate and optimise our design solutions, especially when developing a custom heat sink. Managing the heat generated by the powerful new Compute Module in a DIN rail enclosure was an important part of the design process. Having real hardware to test with made all the difference.

Systematic thermal management

Maintaining Compute Module 5’s operating temperature below 85°C under heavy load required a methodical development process. We started with thermal simulation analysis to identify hotspots at full operating capacity. This analysis formed the basis for our practical prototyping. Through iterative testing under extreme conditions, we optimised the heatsink design before conducting extensive testing with the final housing inside our climatic chamber. The entire process culminated in establishing precise manufacturing standards with rigorous quality control.

Analysis of simulated airflow in the heatsink

Seamless software integration

On the software side, working with Raspberry Pi’s platform enables smooth integration. When we hit technical challenges, their engineering team was right there to support us. Their unified kernel approach across all products allowed us to focus on integrating new features like the CAN FD interfaces instead of wrestling with compatibility issues. This standardisation benefits Revolution Pi users as well — they can use our industrialised Raspberry Pi OS-based image consistently across all Revolution Pi devices.

A typical Revolution Pi system configuration, consisting of a RevPi Connect 5 and several expansion modules

A proven partnership

From the first Compute Module to now, Raspberry Pi has shown growing commitment to industrial computing. Compute Module 5, purpose-built for products like Revolution Pi, demonstrates what’s possible when combining Raspberry Pi’s innovation with our industrial-grade engineering. We’re excited to continue pushing the boundaries of industrial automation and IIoT applications together.

The post Powering industrial innovation: Compute Module 5 meets Revolution Pi appeared first on Raspberry Pi.

Raspberry Pi Christmas shopping guide

26 November 2024 at 22:40

It’s the most wonderful time of the year… to give someone on your gift list something (or all things) Raspberry Pi. The past year has seen many exciting new releases, so we understand if you’re sat scratching your head over what to buy your favourite Raspberry Pi fanatic. But look no further! For the sake of your peace of mind, and in a show of our goodwill, we elves have gone and done all the work for you. Good tidings we bring.

This image features a Raspberry Pi AI Camera Module connected to a long, curved orange ribbon cable. The small, square-shaped green circuit board has a black camera lens at its center and yellow mounting holes at each corner. The ribbon cable is flexed into a loop and prominently displays white text that reads "Raspberry Pi Camera Cable Standard – Mini – 200mm." The cable is designed to connect the camera to a Raspberry Pi device, and the image is set against a plain gray background.

Our newest stuff

If it’s a Raspberry Pi superfan you’ve got on your list, you might want to plump for one of our latest hardware releases to really impress them. After all, what do you get someone who has everything? The newest, shiniest thing they haven’t managed to get their hands on yet.

Raspberry Pi Pico 2 W

Launched just a couple of days ago, Raspberry Pi Pico 2 W is the wireless variant of Pico 2, giving you even more flexibility in your connected projects. It’s on sale now for just $7.

Raspberry Pi Touch Display 2

We also upgraded our touch display this year. Raspberry Pi Touch Display 2 is a seven-inch 720×1280px touchscreen display for Raspberry Pi. It’s ideal for interactive projects such as tablets, entertainment systems, and information dashboards, and it’s available for $60.

Raspberry Pi AI HAT+

For the more confident Raspberry Pi user, you might want something to tempt them to broaden their skills into the field of AI. The Raspberry Pi AI HAT+ features a built-in neural network accelerator, turning your Raspberry Pi 5 into a high-performance, accessible, and power-efficient AI machine. The Raspberry Pi AI HAT+ allows you to build a wide range of AI-powered applications for process control, home automation, research, and more. It’s on sale now from $70.

Raspberry Pi 5 with the Raspberry Pi AI HAT+ attached via the GPIO header. The HAT+ carries a Hailo accelerator capable of 26 trillion operations per second (26 TOPS), enabling real-time image processing and neural network acceleration on the device.

Raspberry Pi AI Camera

For more easy-to-deploy vision AI applications and neural network models, we’d recommend our new Raspberry Pi AI Camera, which takes advantage of Sony’s IMX500 Intelligent Vision Sensor. It’s available now for $70, and it works with any model of Raspberry Pi — including the super low-cost Zero family.

This image shows a Raspberry Pi setup on a wooden surface, featuring a Raspberry Pi board connected to an AI camera module via an orange ribbon cable. The Raspberry Pi board is attached to several cables: a red one on the left for power and a white HDMI cable on the right. The camera module sits in the lower right corner, with its lens facing up. Part of a white and red keyboard is visible on the right side of the image, and a small plant in a white pot is partially visible on the left. The scene suggests a Raspberry Pi project setup in progress.

Stocking stuffers

If you’re looking for some smaller-but-still-mighty bits to fit in a stocking, we have some great affordable options too. Below is a list of some of the very latest, including a recent fan favourite, the…

Raspberry Pi Bumper

Protect and secure your Raspberry Pi 5 with the Raspberry Pi Bumper, a snap-on silicone cover that protects the bottom and edges of the board. This is a lovely, affordable, and super useful gift for any Raspberry Pi user, and it costs just $3.

Raspberry Pi SD Cards

2024 saw the release of our first-party Raspberry Pi SD Cards. Rigorously tested to ensure optimal performance on Raspberry Pi computers, these Class A2 microSD cards help ensure you get the smoothest user experience from your device. They are available in three different capacities to fit your needs.

  • 32GB
  • 64GB
  • 128GB

Raspberry Pi SSD Kit

With a Raspberry Pi M.2 HAT+ and a Raspberry Pi NVMe SSD bundled together, the Raspberry Pi SSD Kit lets you unlock outstanding performance for I/O intensive applications on your Raspberry Pi 5 — including super-fast startup when booting from SSD. The Kit is available now, in 256GB or 512GB capacities, from $40.

You can also grab the SSDs on their own, starting from $30.

Raspberry Pi USB 3 Hub

Our Raspberry Pi USB 3 Hub is the solution to your need for more peripherals than you have ports: it provides extra connectivity for your devices by turning one USB-A port into four, and is compatible with all Raspberry Pi devices. We think it’s the best you can buy. You can get one now for just $12.

Mugs, stickers, and badges

If you’re looking for something super fun and easy, check out our Raspberry Pi-branded merchandise, available to buy online from your local Approved Reseller. If you’re in Cambridge, UK, a trip to the Raspberry Pi Store would put stickers, mugs, water bottles, t-shirts, and more in your hands right away. (More on that below.)

Books, books, and more books

A personal favourite of mine this Christmas, and certainly your dearest retro gamer’s, is Code the Classics Volume II (£24.99), which shows you how to create your own video games inspired by some of the seminal games of the 1980s.

The Official Raspberry Pi Camera Guide 2nd Edition cover

If you were thinking of getting your favourite tinkering photographer a Raspberry Pi Camera, it might also be a good idea to pick up a copy of The Official Raspberry Pi Camera Guide (£14.99) — we released an updated second edition just last week.

That’s not the only new title to hit the Raspberry Pi Press store this year. If it’s our newest releases you’re interested in, you have titles such as the Book of Making 2025 and The Official Raspberry Pi Handbook 2025 (both originally priced at £14) to choose from. A special 30% discount will be applied at checkout if you choose either of these books.

If you’d like to purchase a gift that keeps on giving all year round, you can subscribe to receive a brand new edition of the official Raspberry Pi magazine, The MagPi, on your doorstep each month. You’ll also get a free Raspberry Pi Pico W if you sign up to a six- or twelve-month subscription.

The Raspberry Pi Store

If you’d like to get out into the twinkling streets of Cambridge at Christmas time, the Raspberry Pi Store in the Grand Arcade (we’re upstairs!) has stock of everything above and much, much more. We’ve also picked some excellently knowledgeable staff who can help you choose something if you’re not sure what you’re looking for.

The exterior of the Raspberry Pi Store, with the Raspberry Pi name and logo displayed above the entrance and the well-lit interior and product displays visible through a large glass window.

The post Raspberry Pi Christmas shopping guide appeared first on Raspberry Pi.

Raspberry Pi Pico 2 W on sale now at $7

25 November 2024 at 14:59

Update: In advance of official MicroPython support for Pico 2 W, you can download our unofficial MicroPython build here; you’ll find the README here.

Today our epic autumn of product launches continues with Raspberry Pi Pico 2 W, the wireless-enabled variant of this summer’s Pico 2. Built around our brand new RP2350 microcontroller, featuring the tried and tested wireless modem from the original Pico W, and priced at just $7, it’s the perfect centrepiece for your connected Internet of Things projects.

raspberry pi pico 2 w hero

RP2350: the connoisseur’s microcontroller, redux

When we launched our debut microcontroller, RP2040, way back in 2021, we couldn’t have imagined the incredible range of products that would be built around it, or the uses that the community would put them to. Combining a symmetric pair of fast integer cores; a large, banked, on-chip memory; rich support for high-level languages; and our patented programmable I/O (PIO) subsystem, it quickly became the go-to device for enthusiasts and professional engineers seeking high-performance, deterministic interfacing at a low price point.

close up raspberry pi pico 2 w

RP2350 builds on this legacy, offering faster cores, more memory, floating point support, on-chip OTP, optimised power consumption, and a rich security model built around Arm’s TrustZone for Cortex-M. It debuted in August on Pico 2, on the DEF CON 32 badge (designed by our friends at Entropic Engineering, with firmware and a gonzo sidewalk badge presentation by the redoubtable Dmitry Grinberg), and on a wide variety of development boards and other products from our early-access partners.

Wireless things

Many of the projects and products that people build on top of our platforms — whether that’s our Linux-capable Raspberry Pi computers, our microcontroller boards, or our silicon products — answer to the general description “Internet of Things”. They combine local compute, storage, and interfacing to the real world with connectivity back to the cloud.

Raspberry Pi Pico 2 W brings all the power of RP2350 to these IoT projects. The on-board CYW43439 modem from our friends at Infineon provides 2.4GHz 802.11n wireless LAN and Bluetooth 5.2 connectivity, and is supported by C and MicroPython libraries. Enthusiasts benefit from the breadboard-friendly Pico form factor, while our upcoming RM2 radio module (already in use on Pimoroni’s Pico Plus 2 W) provides a route to scale for professional products which have been prototyped on the platform.

lifestyle raspberry pi pico 2 w

More to come

We’re very pleased with how Pico 2 W has turned out. And, where the Pico 1 series ended with Pico W, we have a few more ideas in mind for the Pico 2 series. Keep an eye out for more news in early 2025.

The post Raspberry Pi Pico 2 W on sale now at $7 appeared first on Raspberry Pi.

The Official Raspberry Pi Camera Module Guide out now: build amazing vision-based projects

22 November 2024 at 18:02

We are enormously proud to reveal The Official Raspberry Pi Camera Module Guide (2nd edition), which is out now. David Plowman, a Raspberry Pi engineer specialising in camera software, algorithms, and image-processing hardware, authored this official guide.

The Official Raspberry Pi Camera Guide 2nd Edition cover

This detailed book walks you through all the different types of Camera Module hardware, including Raspberry Pi Camera Module 3, High Quality Camera, Global Shutter Camera, and older models; discover how to attach them to Raspberry Pi and integrate vision technology with your projects. This edition also covers new code libraries, including the latest Picamera2 Python library and rpicam command-line applications, as well as integration with the new Raspberry Pi AI Kit.

Camera Guide - Getting Started page preview

Save time with our starter guide

Our starter guide has clear diagrams explaining how to connect various Camera Modules to the new Raspberry Pi boards. It also explains how to fit custom lenses to HQ and GS Camera Modules using C-CS adaptors. Everything is outlined in step-by-step tutorials with diagrams and photographs, making it quick and easy to get your camera up and running.

Camera Guide - connecting Raspberry Pi pages

Test your camera properly

You’ll discover how to connect your camera to a Raspberry Pi and test it using the new rpicam command-line applications — these replace the older libcamera-apps. The guide also covers the new Picamera2 Python library, for integrating Camera Module technology with your software.

Camera Guide - Raw images and Camera Tuning pages

Get more from your images

Discover detailed information about how Camera Module works, and how to get the most from your images. You’ll learn how to use RAW formats and tuning files, HDR modes, and preview windows; custom resolutions, encoders, and file formats; target exposure and autofocus; and shutter speed and gain, enabling you to get the very best out of your imaging hardware.

Camera Guide - Get started with Raspberry Pi AI kit pages

Build smarter projects with AI Kit integration

A new chapter covers the integration of the AI Kit with Raspberry Pi Camera Modules to create smart imaging applications. This adds neural processing to your projects, enabling fast inference of objects captured by the camera.

Camera Guide - Time-lapse capture pages

Boost your skills with pre-built projects

The Official Raspberry Pi Camera Module Guide is packed with projects. Take selfies and stop-motion videos, experiment with high-speed and time-lapse photography, set up a security camera and smart door, build a bird box and wildlife camera trap, take your camera underwater, and much more! All of the code is tested and updated for the latest Raspberry Pi OS, and is available on GitHub for inspection.

Click here to pick up your copy of The Official Raspberry Pi Camera Module Guide (2nd edition).

The post The Official Raspberry Pi Camera Module Guide out now: build amazing vision-based projects appeared first on Raspberry Pi.

Using Python with virtual environments | The MagPi #148

22 November 2024 at 00:17

Raspberry Pi OS comes with Python pre-installed, and you need to use its virtual environments to install packages. The latest issue of The MagPi, out today, features this handy tutorial, penned by our documentation lead Nate Contino, to get you started.

Raspberry Pi OS comes with Python 3 pre-installed. Interfering with the system Python installation can cause problems for your operating system. When you install third-party Python libraries, always use the correct package-management tools.

On Linux, you can install Python dependencies in two ways:

  • use apt to install pre-configured system packages
  • use pip to install libraries using Python’s dependency manager in a virtual environment

It is possible to create virtual environments inside Thonny as well as from the command line

Install Python packages using apt

Packages installed via apt are packaged specifically for Raspberry Pi OS. These packages usually come pre-compiled, so they install faster. Because apt manages dependencies for all packages, installing with this method includes all of the sub-dependencies needed to run the package. And apt ensures that you don’t break other packages if you uninstall.

For instance, to install the Python 3 library that supports the Raspberry Pi Build HAT, run the following command:

$ sudo apt install python3-build-hat

To find Python packages distributed with apt, use apt search. In most cases, Python packages use the prefix python- or python3-: for instance, you can find the numpy package under the name python3-numpy.

Install Python libraries using pip

In older versions of Raspberry Pi OS, you could install libraries directly into the system version of Python using pip. Since Raspberry Pi OS Bookworm, users cannot install libraries directly into the system version of Python.

Attempting to install packages with pip causes an error in Raspberry Pi OS Bookworm

Instead, install libraries into a virtual environment (venv). To install a library at the system level for all users, install it with apt.

Attempting to install a Python package system-wide outputs an error similar to the following:

$ pip install buildhat
error: externally-managed-environment

× This environment is externally managed
╰─> To install Python packages system-wide, try apt install
    python3-xyz, where xyz is the package you are trying to
    install.
    
    If you wish to install a non-Debian-packaged Python package,
    create a virtual environment using python3 -m venv path/to/venv.
    Then use path/to/venv/bin/python and path/to/venv/bin/pip. Make
    sure you have python3-full installed.
    
    For more information visit http://rptl.io/venv

note: If you believe this is a mistake, please contact your Python installation or OS distribution provider. You can override this, at the risk of breaking your Python installation or OS, by passing --break-system-packages.
hint: See PEP 668 for the detailed specification.

Python users have long dealt with conflicts between OS package managers like apt and Python-specific package management tools like pip. These conflicts include both Python-level API incompatibilities and conflicts over file ownership.

Starting in Raspberry Pi OS Bookworm, packages installed via pip must be installed into a Python virtual environment (venv). A virtual environment is a container where you can safely install third-party modules so they won’t interfere with your system Python.
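As a sketch of what `python -m venv` does under the hood, the standard library’s venv module can create an environment programmatically (the directory name here is arbitrary, and `with_pip=False` just keeps the example quick):

```python
import os
import tempfile
import venv

# Create a throwaway virtual environment in a temporary directory
env_dir = os.path.join(tempfile.mkdtemp(), "env")
venv.create(env_dir, with_pip=False)

# The environment gets its own interpreter and a pyvenv.cfg marker file
print(os.path.exists(os.path.join(env_dir, "bin", "python")))   # True on Linux
print(os.path.exists(os.path.join(env_dir, "pyvenv.cfg")))      # True
```

The pyvenv.cfg file is what tells Python it is running inside an isolated environment rather than the system installation.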

Use pip with virtual environments

To use a virtual environment, create a container to store the environment. There are several ways you can do this depending on how you want to work with Python:

per-project environments

Create a virtual environment in a project folder to install packages local to that project

Many users create separate virtual environments for each Python project. Locate the virtual environment in the root folder of each project, typically with a shared name like env. Run the following command from the root folder of each project to create a virtual environment configuration folder:

$ python -m venv env

Before you work on a project, run the following command from the root of the project to start using the virtual environment:

$ source env/bin/activate

You should then see a prompt similar to the following:

(env) $

When you finish working on a project, run the following command from any directory to leave the virtual environment:

$ deactivate
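Putting the per-project steps together, a typical first session might look like this (the project name is just an example):

```shell
# Create a project folder with its own virtual environment
mkdir -p myproject && cd myproject
python3 -m venv env

# Activate it; python and pip now refer to the environment's copies
source env/bin/activate
python -c 'import sys; print(sys.prefix != sys.base_prefix)'   # prints True inside a venv

# Leave the environment when done
deactivate
```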

per-user environments

Instead of creating a virtual environment for each of your Python projects, you can create a single virtual environment for your user account. Activate that virtual environment before running any of your Python code. This approach can be more convenient for workflows that share many libraries across projects.

When creating a virtual environment for multiple projects across an entire user account, consider locating the virtual environment configuration files in your home directory. Store your configuration in a folder whose name begins with a period to hide the folder by default, preventing it from cluttering your home folder.

Add a virtual environment to your home directory to use it in multiple projects and share the packages

Use the following command to create a virtual environment in a hidden folder in the current user’s home directory:

$ python -m venv ~/.env

Run the following command from any directory to start using the virtual environment:

$ source ~/.env/bin/activate

You should then see a prompt similar to the following:

(.env) $

To leave the virtual environment, run the following command from any directory:

$ deactivate

Create a virtual environment

Run the following command to create a virtual environment configuration folder, replacing <env-name> with the name you would like to use for the virtual environment (e.g. env):

$ python -m venv <env-name>

Enter a virtual environment

Then, execute the bin/activate script in the virtual environment configuration folder to enter the virtual environment:

$ source <env-name>/bin/activate

You should then see a prompt similar to the following:

(<env-name>) $

The (<env-name>) command prompt prefix indicates that the current terminal session is in a virtual environment named <env-name>.

To check that you’re in a virtual environment, use pip list to view the list of installed packages:

(<env-name>) $ pip list
Package    Version
---------- -------
pip        23.0.1
setuptools 66.1.1

The list should be much shorter than the list of packages installed in your system Python. You can now safely install packages with pip. Any packages you install with pip while in a virtual environment only install to that virtual environment. In a virtual environment, the python or python3 commands automatically use the virtual environment’s version of Python and installed packages instead of the system Python.
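A quick way to confirm which interpreter you are using from inside Python itself is to compare sys.prefix with sys.base_prefix; the two differ only inside a virtual environment:

```python
import sys

# sys.prefix points at the active environment; sys.base_prefix points at
# the installation it was created from. If they are equal, you are using
# the system Python.
in_venv = sys.prefix != sys.base_prefix
print("virtual environment" if in_venv else "system Python")
```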

Top Tip
Pass the --system-site-packages flag before the folder name to preload all of the currently installed packages in your system Python installation into the virtual environment.
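For example, creating an environment with this flag records the setting in the environment’s pyvenv.cfg, so libraries installed with apt (such as python3-numpy) remain importable; demo-env is an arbitrary name:

```shell
# Create a venv that can also see system-wide packages
python3 -m venv --system-site-packages demo-env

# The flag is recorded in the environment's configuration file
grep include-system-site-packages demo-env/pyvenv.cfg   # include-system-site-packages = true
```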

Exit a virtual environment

To leave a virtual environment, run the following command:

(<env-name>) $ deactivate

Use the Thonny editor

We recommend Thonny for editing Python code on the Raspberry Pi.

By default, Thonny uses the system Python. However, you can switch to using a Python virtual environment by clicking on the interpreter menu in the bottom right of the Thonny window. Select a configured environment or configure a new virtual environment with Configure interpreter.

The MagPi #148 out NOW!

You can grab the new issue right now from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available at our online store, which ships around the world. You can also get it via our app on Android or iOS.

You can also subscribe to the print version of The MagPi. Not only do we deliver it globally, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico W!

The post Using Python with virtual environments | The MagPi #148 appeared first on Raspberry Pi.

Bringing real-time edge AI applications to developers

19 November 2024 at 19:25

In this guest post, Ramona Rayner from our partner Sony shows you how to quickly explore different models and AI capabilities, and how you can easily build applications on top of the Raspberry Pi AI Camera.

The recently launched Raspberry Pi AI Camera is an extremely capable piece of hardware, enabling you to build powerful AI applications on your Raspberry Pi. By offloading the AI inference to the IMX500 accelerator chip, more computational resources are available to handle application logic right on the edge! We are very curious to see what you will be creating and we are keen to give you more tools to do so. This post will cover how to quickly explore different models and AI capabilities, and how to easily build applications on top of the Raspberry Pi AI Camera.

If you didn’t have the chance to go through the Getting Started guide, make sure to check that out first to verify that your AI Camera is set up correctly.

Explore pre-trained models

A great way to start exploring the possibilities of the Raspberry Pi AI Camera is to try out some of the pre-trained models that are available in the IMX500 Model Zoo. To simplify the exploration process, consider using a GUI Tool, designed to quickly upload different models and see the real-time inference results on the AI Camera.

In order to start the GUI Tool, make sure to have Node.js installed. (Verify Node.js is installed by running node --version in the terminal.) Then build and run the tool by running the following commands from the root of the repository:

make build
./dist/run.sh

The GUI Tool will be accessible on http://127.0.0.1:3001. To see a model in action:

  • Add a custom model by clicking the ADD button located at the top right corner of the interface.
  • Provide the necessary details to add a custom network and upload the network.rpk file, and the (optional) labels.txt file.
  • Select the model and navigate to Camera Preview to see the model in action!

Here are just a few of the models available in the IMX500 Model Zoo:

Network Name            Network Type  Post Processor                          Color Format  Preserve Aspect Ratio  Network File  Labels File
mobilenet_v2            packaged      Classification                          RGB           True                   network.rpk   imagenet_labels.txt
efficientdet_lite0_pp   packaged      Object Detection (EfficientDet Lite0)   RGB           True                   network.rpk   coco_labels.txt
deeplabv3plus           packaged      Segmentation                            RGB           False                  network.rpk   (none)
posenet                 packaged      Pose Estimation                         RGB           False                  network.rpk   (none)

Exploring the different models gives you insight into the camera’s capabilities and enables you to identify the model that best suits your requirements. When you think you’ve found it, it’s time to build an application.

Building applications

Plenty of CPU is available to run applications on the Raspberry Pi while model inference is taking place on the IMX500. To demonstrate this we’ll run a Workout Monitoring sample application.

The goal is to count real-time exercise repetitions by detecting and tracking people performing common exercises like pull-ups, push-ups, ab workouts and squats. The app will count repetitions for each person in the frame, making sure multiple people can work out simultaneously and compete while getting automated rep counting.

To run the example, clone the sample apps repository and make sure to download the HigherHRNet model from the Raspberry Pi IMX500 Model Zoo.

Make sure you have OpenCV with Qt available:

sudo apt install python3-opencv

And from the root of the repository run:

python3 -m venv venv --system-site-packages
source venv/bin/activate
cd examples/workout-monitor/
pip install -e .

Switching between exercises is straightforward; simply provide the appropriate --exercise argument as one of pullup, pushup, abworkout or squat.

workout-monitor --model /path/to/imx500_network_higherhrnet_coco.rpk \
  --exercise pullup
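A minimal sketch of how such an option could be validated with argparse; this is illustrative, not the sample app’s actual code:

```python
import argparse

# Restrict --exercise to the four supported values
parser = argparse.ArgumentParser()
parser.add_argument("--exercise",
                    choices=["pullup", "pushup", "abworkout", "squat"],
                    default="pullup")

args = parser.parse_args(["--exercise", "squat"])
print(args.exercise)  # squat
```

Passing any other value makes argparse exit with a helpful error listing the allowed choices.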

Note that this application is running:

  • Model post-processing to interpret the model’s output tensor into bounding boxes and skeleton keypoints
  • A tracker module (ByteTrack) to give the detected people a unique ID so that you can count individual people’s exercise reps
  • A matcher module to increase the accuracy of the tracker results, by matching people over frames so as not to lose their IDs
  • CV2 visualisation to visualise the results of the detections and see the results of the application
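To illustrate the idea behind rep counting, here is a hypothetical sketch that counts repetitions from a tracked keypoint’s normalised vertical position using simple hysteresis; the sample application’s actual post-processing is more sophisticated:

```python
def count_reps(y_positions, low=0.3, high=0.7):
    """Count down-then-up cycles in a stream of normalised y positions."""
    reps, went_down = 0, False
    for y in y_positions:
        if y < low:                    # keypoint dropped below the low threshold
            went_down = True
        elif y > high and went_down:   # came back up: one full repetition
            reps += 1
            went_down = False
    return reps

print(count_reps([0.8, 0.2, 0.8, 0.2, 0.9]))  # 2
```

Using two thresholds instead of one prevents jitter around a single boundary from being counted as extra reps.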

And all of this in real time, on the edge, while the IMX500 is taking care of the AI inference!

Now both you and the AI Camera are testing out each other’s limits. How many pull-ups can you do?

We hope by this point you’re curious to explore further; you can discover more sample applications on GitHub.

The post Bringing real-time edge AI applications to developers appeared first on Raspberry Pi.
