Season’s greetings! I set this up to auto-publish while I’m off sipping breakfast champagne, so don’t yell at me in the comments — I’m not really here.
I hope you’re having the best day, and if you unwrapped something made by Raspberry Pi for Christmas, I hope the following helps you navigate the first few hours with your shiny new device.
Power and peripherals
If you’ve received, say, a Raspberry Pi 5 or 500 on its own and have no idea what you need to plug into it, the product pages on raspberrypi.com often feature sensible suggestions for additional items you might need.
Scroll to the bottom of the Raspberry Pi 5 product page, for example, and you’ll find a whole ‘Accessories’ section featuring affordable things specially designed to help you get the best possible performance from your computer.
You can find all our hardware here, so have a scroll to find your particular Christmas gift.
Dedicated documentation
There are full instructions on how everything works if you know where to look. Our fancy documentation site holds the keys to all of your computing dreams.
Your one-stop shop for all your Raspberry Pi questions
If all the suggestions above aren’t working out for you, there are approx. one bajillion experts eagerly awaiting your questions on the Raspberry Pi forums. Honestly, I’ve barely ever seen a question go unanswered. You can throw the most esoteric, convoluted problem out there and someone will have experienced the same issue and be able to help. Lots of our engineers hang out in the forums too, so you may even get an answer direct from Pi Towers.
Be social
Outside of our official forums, you’ve all cultivated an excellent microcosm of Raspberry Pi goodwill on social media. Why not throw out a question or a call for project inspiration on our official Facebook, Threads, Instagram, TikTok, or “Twitter” account? There’s every chance someone who knows what they’re talking about will give you a hand.
Also, tag us in photos of your festive Raspberry Pi gifts! I will definitely log on to see and share those.
Again, we’re not really here, it’s Christmas!
I’m off again now to catch the new Wallace and Gromit that’s dropping on Christmas Day (BIG news here in the UK), but we’ll be back in early January to hang out with you all in the blog comments and on social.
Glad tidings, joy, and efficient digestion wished on you all.
This #MagPiMonday, we take a look at Md. Khairul Alam’s potentially life-changing project, which aims to use AI to assist people living with a visual impairment.
Technology has long had the power to make a big difference to people’s lives, and for those who are visually impaired, the changes can be revolutionary. Over the years, there has been a noticeable growth in the number of assistive apps. As well as JAWS — a popular computer screen reader for Windows — and software that enables users to navigate phones and tablets, there are audio-descriptive apps that use smart device cameras to read physical documents and recognise items in someone’s immediate environment.
Understanding the challenges facing people living with a visual impairment, maker and developer Md. Khairul Alam has sought to create an inexpensive, wearable navigation tool that will free up the user’s hands and describe what someone would see from their own eyes’ perspective. Based around a pair of spectacles, it uses a small camera sensor that gathers visual information which is then sent to a Raspberry Pi 1 Model B for interpretation. The user is able to hear an audio description of whatever is being seen.
There’s no doubting the positive impact this project could have on scores of people around the world. “Globally, around 2.2 billion don’t have the capability to see, and 90% of them come from low-income countries,” Khairul says. “A low-cost solution for people living with a visual impairment is necessary to give them flexibility so they can easily navigate and, having carried out research, I realised edge computer vision can be a potential answer to this problem.”
Cutting edge
Edge computer vision is potentially transformative. It gathers visual data from edge devices such as a camera before processing it locally, rather than sending it to the cloud. Since information is being processed close to the data source, it allows for fast, real-time responses with reduced latency. This is particularly vital when a user is visually impaired and needs to be able to make rapid sense of the environment.
The connections are reasonably straightforward: plug the Xiao ESP32S3 Sense module into a Raspberry Pi
For his project, Khairul chose to use the Xiao ESP32S3 Sense module which, aside from a camera sensor and a digital microphone, has an integrated Xtensa-based ESP32-S3R8 SoC, 8MB of flash memory, and a microSD card slot. This was mounted onto the centre of a pair of spectacles and connected to a Raspberry Pi computer using a USB-C cable, with a pair of headphones then plugged into Raspberry Pi’s audio out port. With those connections made, Khairul could concentrate on the project’s software.
As you can imagine, machine learning is an integral part of this project; it needs to accurately detect and identify objects. Khairul used Edge Impulse Studio to train his object detection model. This tool is well equipped for building datasets and, in this case, one needed to be created from scratch. “When I started working on the project, I did not find any ready-made dataset for this specific purpose,” he tells us. “A rich dataset is very important for good accuracy, so I made a simple dataset for experimental purposes.”
To help test the device, Khairul has been using an inexpensive USB-C portable speaker
Object detection
Khairul initially concentrated on six objects, uploading 188 images to help identify chairs, tables, beds, and basins. The more images he could take of an object, the greater the accuracy — but it posed something of a challenge. “For this type of work, I needed a unique and rich dataset for a good result, and this was the toughest job,” he explains. Indeed, he’s still working on creating a larger dataset, and these things take a lot of time; but upon uploading the model to the Xiao ESP32S3 Sense, it has already begun to yield some positive results.
When an object is detected, the module returns the object’s name and position. “After detecting and identifying the object, Raspberry Pi is then used to announce its name — Raspberry Pi has built-in audio support, and Python has a number of text-to-speech libraries,” Khairul says. The project uses a free software package called Festival, which has been written by The Centre for Speech Technology Research in the UK. This converts the text to speech, which can then be heard by the user.
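The article doesn’t include Khairul’s code, but the announcement step can be sketched in a few lines of Python. Piping text into `festival --tts` is Festival’s standard text-to-speech mode; the phrase format and the “chair” label here are illustrative assumptions, not the project’s actual output.

```python
import shutil
import subprocess

def announcement(label: str) -> str:
    # Turn a detection label from the model into a spoken phrase.
    return f"{label} detected"

def speak(text: str) -> None:
    # Festival reads text from stdin in --tts mode and plays it
    # through the default audio output.
    subprocess.run(["festival", "--tts"], input=text.encode())

if shutil.which("festival"):  # only speak if Festival is installed
    speak(announcement("chair"))
```

On a Raspberry Pi, installing the package (`sudo apt install festival`) is all that’s needed before this will talk.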
A tidier solution will be needed — including a waterproof case — for real-world situations
For convenience, all of this is currently being powered by a small rechargeable lithium-ion battery, which is connected by a long wire to enable it to sit in the user’s pocket. “Power consumption has been another important consideration,” Khairul notes, “and because it’s a portable device, it needs to be very power efficient.” Since Third Eye is designed to be worn, it also needs to feel right. “The form factor is a considerable factor — the project should be as compact as possible,” Khairul adds.
Going forward
Third Eye is still at the proof-of-concept stage, and improvements are already being identified. Khairul knows that the Xiao ESP32S3 Sense will eventually fall short of his ambitions as the project expands and, with a larger machine learning model proving necessary, Raspberry Pi is likely to take on more of the workload.
“To be very honest, the ESP32S3 Sense module is not capable enough to respond using a big model. I’m just using it for experimental purposes with a small model, and Raspberry Pi can be a good alternative,” he says. “I believe for better performance, we may use Raspberry Pi for both inferencing and text-to-speech conversions. I plan to completely implement the system inside a Raspberry Pi computer in the future.”
Other potential future tweaks are also stacking up. “I want to include some control buttons so that users can increase and decrease the volume and mute the audio if required,” Khairul reveals. “A depth camera would also give the user important information about the distance of an object.” With the project shared on Hackster, it’s hoped the Raspberry Pi community could also assist in pushing it forward. “There is huge potential for a project such as this,” he says.
The MagPi #149 out NOW!
You can grab the new issue right now from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available at our online store, which ships around the world. You can also get it via our app on Android or iOS.
You can also subscribe to the print version of The MagPi. Not only do we deliver it globally, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico W!
Dip your toes into the world of PIO on Raspberry Pi 5 using PIOLib
The launch of Raspberry Pi 5 represented a significant change from previous models. Building chips that run faster and use less power, while continuing to support 3.3V I/O, presents real, exciting challenges. Our solution was to split the main SoC (System on Chip) in two — the compute half, and the I/O half — and put a fast interconnect (4-lane PCIe Gen 3) between them. The SoC on Raspberry Pi 5 is the Broadcom BCM2712, and the I/O processor (which used to be known in the PC world as the ‘southbridge’) is Raspberry Pi RP1.
Along with all the usual peripherals — USB, I2C, SPI, DMA, and UARTs — RP1 included something a bit more interesting. One of RP2040’s distinguishing features was a pair of PIO blocks, deceptively simple bits of Programmable I/O capable of generating and receiving patterns on a number of GPIOs. With sufficient cunning, users have been able to drive NeoPixel LEDs and HDMI displays, read from OneWire devices, and even connect to an Ethernet network.
RP1 is blessed with a single PIO block — almost identical to the two that RP2040 has — with four state machines and a 32-entry instruction memory. However, apart from a few hackers out there, it has so far lain dormant; it would be great to make this resource available to users for their own projects, but there’s a catch.
Need for speed
The connection between RP1’s on-board Arm Cortex-M3 microcontrollers and the PIO hardware was made as fast as possible, but at the cost of making the PIO registers inaccessible over PCIe; the only exceptions are the state machine FIFOs — the input and output data pipes — which can be reached by DMA (direct memory access). This makes it impossible to control PIO directly from the host processors, so an alternative is required. One option would be to allow the uploading of code to run on the M3 cores, but there are a number of technical problems with that approach:
1. We need to “link” the uploaded code with what is already present in the firmware — think of it as knitting together squares to make a quilt (or a cardigan for Harry Styles). For that to work, the firmware needs a list of the names and addresses of everything the uploaded code might want to access, something that the current firmware doesn’t have.
2. Third-party code running on M3 cores presents a security risk — not in the sense that it might steal your data (although that might be possible…), but that by accident or design it could disrupt the operation of your Raspberry Pi 5.
3. Once the M3s have been opened up in that way, we can’t take that access away, and that’s not a step we’re prepared to take.
Not like that, like this
For these reasons, we took a different path.
The latest RP1 firmware implements a mailbox interface: a simple mechanism for sending messages between two parties. The kernel has corresponding mailbox and firmware drivers, and an rp1-pio driver that presents an ioctl() interface to user space. The end result of adding all this software is the ability to write programs using the PIO SDK that can run in user space or in kernel drivers.
Latency trade-off
Most of the PIOLib functions cause a message to be sent to the RP1 firmware, which performs the operation — possibly just a single I/O access — and replies. Although this makes it simple to run PIO programs on Raspberry Pi 5 (and the rest of the Raspberry Pi family), it does come at a cost. All that extra software adds latency; most PIOLib operations take at least 10 microseconds. For PIO software that just creates a state machine and then reads or writes data, this is no problem — the WS2812 LED and PWM code are good examples of this. But anything that requires close coupling between the state machine and driver software is likely to have difficulties.
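That per-operation latency puts a hard ceiling on how many PIOLib calls you can make per second, which is why bulk data should flow through the DMA-driven FIFOs rather than individual operations. A back-of-the-envelope check:

```python
def max_calls_per_second(latency_us: float) -> float:
    # Each PIOLib operation costs at least this round-trip latency,
    # so the achievable call rate is bounded by its reciprocal.
    return 1_000_000 / latency_us

# At ~10 microseconds per operation you get at most ~100,000 calls per
# second — far too slow to bit-bang a waveform call-by-call, but ample
# for one-off state machine set-up followed by FIFO/DMA data transfer.
print(max_calls_per_second(10))
```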
The first official use of PIOLib is the new pwm-pio kernel driver. It presents a standard Linux PWM interface via sysfs, and creates a very stable PWM signal on any GPIO on the 40-pin header (GPIOs 0 to 27). You can configure up to four of these PWM interfaces on Raspberry Pi 5; you are limited by the number of state machines. Like many peripherals, you create one with a Device Tree overlay:
dtoverlay=pwm-pio,gpio=7
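Once the overlay has loaded, the pwm-pio instance appears as an ordinary Linux sysfs PWM chip, so it can be driven from any language. A minimal Python sketch, assuming a chip number of 2 (check `/sys/class/pwm` on your own system to find the right one):

```python
from pathlib import Path

def pwm_ns(freq_hz: float, duty: float) -> tuple[int, int]:
    # The sysfs PWM interface works in nanoseconds: a period and an
    # on-time (duty_cycle) within that period.
    period = round(1e9 / freq_hz)
    return period, round(period * duty)

def drive_pwm(chip: int, channel: int, freq_hz: float, duty: float) -> None:
    base = Path(f"/sys/class/pwm/pwmchip{chip}")
    (base / "export").write_text(str(channel))  # create pwm<channel>
    ch = base / f"pwm{channel}"
    period, on_time = pwm_ns(freq_hz, duty)
    (ch / "period").write_text(str(period))
    (ch / "duty_cycle").write_text(str(on_time))
    (ch / "enable").write_text("1")

# Example: a 1kHz, 50% duty cycle signal on an assumed chip number.
# drive_pwm(chip=2, channel=0, freq_hz=1000, duty=0.5)
```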
One feature absent from this first release is interrupt support. RP1 provides two PIO interrupts, which can be triggered by the PIO instruction IRQ (interrupt request), and these could be used to trigger actions on the SoC.
Over time, we may discover that there are some common usage patterns — groups of the existing PIOLib functions that often appear together. Adding those groups to the firmware as single, higher-level operations may allow more complex PIO programs to run. These and other extensions are being considered.
To try this out, you will need:
The latest kernel (sudo apt update; sudo apt upgrade)
The latest EEPROM (see the ‘Advanced Options’ section of raspi-config)
I’ll leave you with a video of some flashing lights — two strings of WS2812 LEDs being driven from a Raspberry Pi 5. It’s beginning to look a bit festive!
Ah, the WOPR — or “War Operation Plan Response” for those who enjoy abbreviations that sound like a robot from the future, only less like a friend and more like an overzealous maths teacher.
The WOPR is the supercomputer from the 1983 movie WarGames. It doesn’t understand sarcasm, it can’t sense when it’s being pranked, and it certainly doesn’t know when it’s been told to “play a game” — much like our Maker in Residence, Toby, who built it to delight and entertain all visitors to the Pi Towers Maker Lab.
A script runs on boot, which twinkles the NeoPixels in the traditional 1980s supercomputer colours, yellow and red.
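Toby’s script isn’t published, but the effect can be sketched as a frame generator. The colour values, palette, and lit fraction below are illustrative assumptions; the resulting list of RGB tuples would be handed to a NeoPixel library to display each frame.

```python
import random

# Assumed 1980s-supercomputer palette: amber-ish yellow and red.
PALETTE = [(255, 180, 0), (255, 0, 0)]
OFF = (0, 0, 0)

def twinkle_frame(n_pixels: int, lit_fraction: float = 0.3,
                  rng: random.Random = random) -> list:
    # Light a random subset of pixels each frame, leaving the rest
    # dark, so the panel appears to twinkle.
    return [rng.choice(PALETTE) if rng.random() < lit_fraction else OFF
            for _ in range(n_pixels)]
```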
Another script can be run to play a short clip from the film WarGames on the Touch Display 2 screen, explaining the WOPR. At the press of a button on the Touch Display, our faux WOPR also parrots famous lines from the film, such as: “Shall we play a game?” and “How about a nice game of chess?”
For those who wish to linger a little longer in the Maker Lab, Toby devised a game in which clips from 1980s films and music videos flash (a little too fast, in my opinion) up on the screen, with your job being to enthusiastically shout out where each clip is from.
Authentic enclosure
The body of the WOPR is a combination of 3D-printed plastics and laser-cut MDF painted in industrial grey, with Cricut silver lettering on the side. Everything is glued together, and a lot of sanding was required to make it appear as though it’s a sleek, fancy contraption from the future.
After a bumper autumn of product launches, we thought why not go full Santa as we head towards our winter break and give you all another double product launch? On Monday, we released Raspberry Pi 500 and the Raspberry Pi Monitor into the world. Here’s what some of your favourite YouTubers did with them.
VEEB Projects
VEEB get major points for their impossibly simple yet genius idea, leaving us at Pi Towers wondering “why didn’t I think of that?” They mounted an SD card holder on the back of the Raspberry Pi Monitor’s kickstand, making it super easy to switch them out and giving them access to three different PC systems at their fingertips — a desktop PC, a retro gaming centre, and a music streamer.
If you’d like to perform the sincerest form of flattery, you can download the printable files for VEEB’s SD card storage case and make your own.
NetworkChuck
Chuck asks the question that Mad Men’s Don Draper — actually, no, copywriter extraordinaire Peggy — would begin with: “who is this for?” Adorable cameos from The Littles in his review answer it for him, with the very littlest ably assisting in the plug-and-play setup of her new desktop PC before settling in to play some Roblox.
He also gives us a handy side-by-side comparison with his Raspberry Pi 400.
Disclaimer: Raspberry Pi 500 is not edible
Jeff Geerling
Jeff gets straight to the point: “the keyboard is the computer”. He also wins the prize for most avant-garde presentation of the Monitor and Pi 500 side by side in the above video thumbnail.
And while Jeff proper has decorum and self-restraint, Level 2 Jeff couldn’t help himself, going right ahead and cracking his Pi 500 open to see what’s inside.
Kevin McAleer
Kevin could not wait until his usual Sunday night livestream, and went live with a detailed demo of Raspberry Pi 500 and the Raspberry Pi Monitor the day after launch. If deep dives are your bag, grab snacks and settle in for this hour-long opus.
Kev’s a professional YouTuber, though, so if you haven’t the time, he obviously also rolled out a succinct six-minute video on our latest creations.
leepspvideo
And if you can’t get enough destruction, leepspvideo also did a nice teardown of Raspberry Pi 500, and tested the audio output on the Raspberry Pi Monitor, checking that it works great with his Raspberry Pi 5. Furthermore, he is accompanied by an excellent cat for the majority of the review.
Gary Explains
We really liked Gary’s straightforward “what is it, what does it do, how much does it cost?” approach. He too pops the hood to give you a nice clear look inside Raspberry Pi 500.
ETA Prime
We know where ETA Prime’s heart lies when they proclaim Raspberry Pi 500’s gaming possibilities right at the start of their review and teardown. In preparation for their gaming bonanza, a little overclocking is tested and some benchmarks run, but you’ll need to subscribe to ETA Prime’s channel to keep up with the promised gaming videos.
Did we miss anyone? These were all the videos we’d seen at the time of writing, but we’re 89% sure we’re horribly behind the times already. Drop a link to more reviews and leave a comment if you have an idea for a Raspberry Pi 500 project you’d like to see.
Just in time for Christmas, we’re delighted to announce the release of two hotly anticipated products that we think will look great under the tree. One of them might even fit in a stocking if you push hard enough. Introducing Raspberry Pi 500, available now at $90, and the Raspberry Pi Monitor, on sale at $100: together, they’re your complete Raspberry Pi desktop setup.
With Raspberry Pi, your desk can look this good
Integral calculus
Our original mission at Raspberry Pi was to put affordable, programmable personal computers in the hands of young people all over the world. And while we’ve taken some detours along the way – becoming one of the world’s largest manufacturers of industrial and embedded computers – this mission remains at the heart of almost everything we do. It drives us to make lower-cost products like the $15 Raspberry Pi Zero 2 W, and more powerful products, like our flagship Raspberry Pi 5 SBC. These products provide just the essential processing element of a computer, which can be combined with the family television, and second-hand peripherals, to build a complete and cost-effective system.
But over time we have come to understand the benefits of integration: some people are better served by a system that is ready to use straight out of the box. This need was dramatised during the early days of the COVID pandemic, when we worked with the Raspberry Pi Foundation to deliver thousands of Raspberry Pi 4 Desktop Kits and monitors to young people studying from home in the UK. Our experiences with that programme informed the development of Raspberry Pi 400, our all-in-one PC, whose form factor (and name) harks back to the great 8-bit and 16-bit computers – the BBC Micro, Sinclair Spectrum, and Commodore Amiga – of the 1980s and 1990s.
Meet Raspberry Pi 500
In the four years since it launched, Raspberry Pi 400 has become a hugely popular choice for enthusiasts and educators. And today, we’re launching its successor, Raspberry Pi 500, bringing the features and performance of the Raspberry Pi 5 platform to our all-in-one form factor:
2.4GHz quad-core 64-bit Arm Cortex-A76 processor
8GB LPDDR4X-4267 SDRAM
VideoCore VII GPU, supporting OpenGL ES 3.1 and Vulkan 1.3
Dual 4Kp60 HDMI® display output
Dual-band 802.11ac Wi-Fi® and Bluetooth 5.0
2 × USB 3.0 ports, supporting simultaneous 5Gbps operation
1 × USB 2.0 port
Gigabit Ethernet port
Horizontal 40-pin Raspberry Pi GPIO connector
The ultimate compact PC
Raspberry Pi 500 is priced at $90, including a 32GB Raspberry Pi-branded SD card, and is also available in a $120 Desktop Kit, which adds:
Raspberry Pi Mouse
Raspberry Pi 27W USB-C Power Supply
2m micro HDMI to HDMI cable
Raspberry Pi Beginner’s Guide, 5th Edition
The vision thing – an official Raspberry Pi Monitor
Although it’s highly integrated, Raspberry Pi 500 is only half the story: to build a complete system, you still need a display device. Which is why we’re also launching the Raspberry Pi Monitor, available now at $100. Designed to coordinate perfectly with your Raspberry Pi 500 or cased Raspberry Pi 5, it incorporates a 15.6″ full HD IPS panel with a 45% colour gamut and an 80° viewing angle, together with a pair of 1.2W speakers, in a slender enclosure with a fold-away integrated stand and VESA mounting points.
The perfect desktop display companion for your Raspberry Pi or lesser computer
Power is provided via a USB-C connector. Cost-conscious users can power the monitor directly from their Raspberry Pi via the included USB-A to USB-C cable; in this mode display brightness is limited to 60% of maximum (still quite bright!) and volume to 50% of maximum (still quite loud!). Using a dedicated USB-C supply capable of delivering 5V/3A, like the Raspberry Pi 15W USB-C Power Supply, enables the full brightness and volume ranges.
Faster, better, cheaper: Raspberry Pi 400 price cuts
While we’re incredibly excited about Raspberry Pi 500, we need to remember that cost remains a barrier to access for many people, young and old. So we’re also taking this opportunity to cut the price of Raspberry Pi 400 from $70 to $60, and the Raspberry Pi 400 Personal Computer Kit from $100 to $80. We’re also bundling a Raspberry Pi-branded SD card with every Raspberry Pi 400, to help you get the best possible performance out of the system.
We know that quite a few of you have been eagerly awaiting both of our new products, and we hope you enjoy them now they’re here. We’ve seen Raspberry Pi 400 everywhere from retro gaming setups to university exam facilities and hospital offices; we’re really looking forward to finding out where Raspberry Pi 500 and our new Raspberry Pi Monitor end up.
Earlier this year we told you all about our awesome new remote access service, Raspberry Pi Connect. We said we wanted to make it as useful as possible for our individual users, and provide it for free on Raspberry Pi devices. But we knew our industrial and embedded customers would like to use the functionality it provided, and more. Since launching Raspberry Pi Connect, we’ve been gathering information from these customers to understand what they are using it for and what they’d like to see.
Also, for all you individual users, we’ve not stopped developing the service, so read on for new functionality for you too!
Connect for Organisations
Feedback from our commercial customers shows that Connect has hit on a particular problem many of them have. When supporting their products in the field, whether that’s fifty metres up a radio transmission tower or at a customer site, it is difficult to maintain those systems when things go wrong. There are many commercial customers who have found Connect the perfect solution to this problem. But the service had a limitation: the devices are ‘owned’ by a single user, and no other users can access them. The worry one customer had, about one of their IT team disappearing with control of all their customers’ devices, was clear!
There are also situations where a customer has only a single Raspberry Pi, but wants to provide many users with access to it. Or where a school with a set of Raspberry Pis is giving each of their students access to them, so they can develop software remotely. Introducing Raspberry Pi Connect for Organisations!
Connect for Organisations allows you to create an organisation account which can own the Raspberry Pi devices registered to it:
Much like Raspberry Pi Connect for individual users, devices are added to the organisation’s account and can be controlled through the web page. To switch between your personal account and an organisation account, just click the switch icon in the top left. Of course, now that you have an organisation, it’s going to need users:
Users can be invited into the organisation easily. Currently we’re not limiting the number of users or charging per user — we don’t expect users themselves to consume much bandwidth, storage, or processing resource, so that would be an unnecessary complication. As you can see, there are only two roles, administrator and member; only administrators can add or remove devices.
What does it cost?
We’ve kept pricing simple. Raspberry Pi Connect for Organisations costs $0.50 per device per month, based on the maximum number of devices registered in the month, and you get unlimited users.
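In other words, the bill is just the month’s peak device count multiplied by the per-device rate; a one-line sketch:

```python
def monthly_bill(max_devices_in_month: int, rate_usd: float = 0.50) -> float:
    # Billing is based on the maximum number of devices registered
    # during the month; users are unlimited and free.
    return max_devices_in_month * rate_usd

print(monthly_bill(200))  # a fleet that peaked at 200 devices costs $100
```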
Next up
Now that organisation functionality is available, we’ve got some other things to start working on. To give you an idea of where we’re going with Connect, some of these are:
Device tagging: tag devices with your own labels, and use those tags to search and identify different classes of device
Access control lists: using tags to give users different levels of access to devices
Ability to sign devices up from Raspberry Pi Imager: boot direct to headless installation!
Capacity for bulk provisioning of Raspberry Pi Connect device secrets during manufacture of Compute Module- and Raspberry Pi-based products
Now for the eye candy
Some of you may have noticed a new button on the screen sharing interface:
The ability to enter full-screen mode at the click of a button gives you a much better view of the remote screen: a little bit of useful functionality for all Connect users. We hope you like it!
Earlier this year we released Raspberry Pi Connect, which lets you access your Raspberry Pi from anywhere, either through a remote shell interface or by screen sharing. But perhaps, occasionally, you might need to screen share some other computer; what if you want to screen share your big PC, with its gaming graphics capabilities, around your house? Is it possible to use it to play your games from anywhere? Happily, thanks to Valve’s hugely popular Steam Link product, the answer is yes. With Steam Link, our kids can — OK, we can — play PC games on any computer in the house, without having to lug the PC around. And now, you can run Steam Link on your Raspberry Pi 5!
Steam Link is actually tackling some quite difficult challenges to enable us to play graphics-heavy games remotely. Firstly, screen sharing is not normally optimised for sending high quality images, since you have to work quite hard to keep both the bitrate and the latency down; you also don’t normally transmit audio as well as video, and you need to do a bit of magic to talk to game controllers. So the smart folks at Valve have successfully solved quite a few hard problems to bring this into being.
Even better, Sam Lantinga from Valve — who is also the developer of SDL, a simple multimedia programming library — has been working for a little while on getting Steam Link to run on Raspberry Pi 5. The previous method used to run Steam Link on Raspberry Pi OS no longer worked very well after we moved away from the closed-source Broadcom multimedia libraries, and with Wayland, a different approach was needed. Sam has been working with the Raspberry Pi software team to use our hardware in the most efficient way possible.
Valve’s announcement of Steam Link v1.3.13 shows that Sam has been able to get Steam Link working at some amazing rates on Raspberry Pi 5, including 4Kp60 and even 1080p240 (obviously you’ll need a suitable monitor for that!).
In this guest post, Ultralytics, creators of the popular YOLO (You Only Look Once) family of convolutional neural networks, share their insights on deploying and running their powerful AI models on Raspberry Pi devices, offering solutions for a wide range of real-world problems.
Computer vision is redefining industries by enabling machines to process and understand visual data like images and videos. To truly grasp the impact of vision AI, consider this: Ultralytics YOLO models, such as Ultralytics YOLOv8 and the newly launched Ultralytics YOLO11, which support computer vision tasks like object detection and image classification, have been used over 100 billion times. There are 500 to 600 million uses every day and thousands of uses every second across applications like robotics, agriculture, and more.
YOLO can be used in the agriculture sector
To take this a step further, Ultralytics has partnered with Raspberry Pi to bring vision AI to one of the most accessible and versatile computing platforms. This collaboration makes it possible to deploy YOLO models directly on Raspberry Pi, enabling real-time computer vision applications in a compact, cost-effective, and easy-to-use way.
By supporting such integrations, Ultralytics aims to enhance model compatibility across diverse deployment environments. For instance, the Sony IMX500, the intelligent vision sensor with on-sensor AI processing capabilities included in the Raspberry Pi AI Camera, works with Raspberry Pi to run YOLO models, enabling advanced edge AI applications.
In this article, we’ll explore how YOLO models can be deployed on Raspberry Pi devices, look at real-world use cases, and highlight the benefits of this exciting collaboration for vision AI projects. Let’s get started!
Enabling edge AI solutions with Raspberry Pi and Ultralytics YOLO
Raspberry Pi is an affordable and widely used device, making it a great choice for deploying vision AI models like YOLO. Running Ultralytics YOLO models on Raspberry Pi enables real-time computer vision capabilities, such as object detection, directly on the device, eliminating the need for cloud resources. Local processing reduces latency and improves privacy, making it ideal for applications where speed and data security are essential.
Ultralytics offers optimized models, like YOLO11, that can run efficiently on relatively resource-constrained devices, with the Nano and Small model variants providing the best performance on lower-power hardware. Leveraging these optimized models on Raspberry Pi devices is easy with the Ultralytics Python API or CLI, ensuring smooth deployment and operation. In addition to this, Ultralytics also supports automated testing for Raspberry Pi devices on GitHub Actions to regularly check for bugs and ensure the models are ready for deployment.
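As a concrete example, running detection with the Ultralytics Python API takes only a couple of lines: construct a `YOLO` model from a weights file and call it on an image. The variant-picking helper and its 4GB cut-off below are illustrative assumptions, not an Ultralytics rule.

```python
import sys

def pick_variant(ram_gb: float) -> str:
    # Nano suits the most constrained boards; Small is a step up.
    # The 4GB cut-off here is an illustrative assumption.
    return "yolo11n.pt" if ram_gb <= 4 else "yolo11s.pt"

if "--image" in sys.argv:  # pass --image <path> to actually run detection
    from ultralytics import YOLO
    model = YOLO(pick_variant(ram_gb=8))
    results = model(sys.argv[sys.argv.index("--image") + 1])
    for box in results[0].boxes:
        print(int(box.cls), float(box.conf))  # class index and confidence
```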
Another interesting feature of the Ultralytics YOLO models is that they can be exported in various formats (as shown in the image below), including NCNN, a lightweight neural network inference framework. Designed for devices with relatively constrained computing power, such as Raspberry Pi’s Arm64 architecture, NCNN ensures faster inference times by optimizing model weights and activations through techniques like quantization.
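Exporting is a single call in the same Python API, with the target format passed as a string. The helper mapping a target platform to a format is an illustrative assumption; `export(format="ncnn")` is the documented Ultralytics usage.

```python
import sys

def export_format(target: str) -> str:
    # NCNN is the natural fit for Arm boards like Raspberry Pi;
    # ONNX is a widely supported fallback (this mapping is an assumption).
    return "ncnn" if target in ("arm64", "raspberry-pi") else "onnx"

if "--export" in sys.argv:  # pass --export to actually run the export
    from ultralytics import YOLO
    YOLO("yolo11n.pt").export(format=export_format("arm64"))
```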
Benchmarking Ultralytics YOLO11 inference on Raspberry Pi
Raspberry Pi, Sony IMX500, and YOLO for real-time AI applications
The Raspberry Pi AI Camera is a perfect example of how this integration helps support compatibility across a range of deployment environments. Its IMX500 intelligent vision sensor comes with on-sensor AI processing, allowing it to analyze visual data directly and output metadata rather than raw images. While the IMX500 is powerful on its own, it needs to be paired with a device like Raspberry Pi to run YOLO models effectively. In this setup, a Raspberry Pi acts as the host device, facilitating communication with the AI Camera and enabling real-time AI applications powered by YOLO.
Raspberry Pi AI Camera incorporates the Sony IMX500
Real-world examples of YOLO applications on Raspberry Pi
Raspberry Pi, combined with the Ultralytics YOLO models, unlocks countless possibilities for real-world applications. This collaboration bridges the gap between experimental AI setups and production-ready solutions, offering an affordable, scalable, and practical tool for a wide range of industries.
Here are a few impactful use cases:
Robotics: YOLO can enable robots to navigate environments, recognize objects, and perform tasks with precision, making them more autonomous and efficient
Drones: With YOLO running on Raspberry Pi, drones can detect obstacles, track objects, and perform surveillance in real-time, enhancing their capabilities in industries like delivery and security
Quality control in manufacturing: YOLO can help identify defects in production lines, ensuring higher quality standards with automated inspection
Smart farming: By using YOLO to monitor crop health and detect pests, farmers can make data-driven decisions, improving yields and reducing resource waste
Benefits of running Ultralytics YOLO models on Raspberry Pi for edge AI
There are many advantages to deploying YOLO models on Raspberry Pi, making it a practical and affordable option for edge AI applications. For instance, performance can be boosted by using hardware accelerators like Google Coral Edge TPU, enabling faster and more efficient real-time processing.
Coral Edge TPU connected to a Raspberry Pi
Here are some of the other key benefits:
Scalability: The setup can be extended to multiple devices, making it a great choice for larger projects such as factory automation or smart city systems
Flexibility: YOLO’s compatibility ensures that developers can create solutions that work seamlessly across a variety of hardware setups, offering versatility for different applications
Community and support: With extensive resources, tutorials, and an active community, Ultralytics provides the support needed for smooth deployment and troubleshooting of YOLO models on Raspberry Pi
To the edge and beyond with Ultralytics YOLO and Raspberry Pi
YOLO and Raspberry Pi are making edge AI applications more accessible, impactful, and transformative than ever before. By combining the advanced capabilities of Ultralytics YOLO models with the cost-effectiveness and flexibility of Raspberry Pi, this partnership allows developers, researchers, and hobbyists to bring innovative ideas to life.
With support for devices like the Raspberry Pi AI Camera and scalable hardware options, this collaboration unlocks opportunities across industries, from robotics and agriculture to manufacturing and beyond.
Today we’re happy to announce the much-anticipated launch of Raspberry Pi Compute Module 5, the modular version of our flagship Raspberry Pi 5 single-board computer, priced from just $45.
An unexpected journey
We founded the Raspberry Pi Foundation back in 2008 with a mission to give today’s young people access to the sort of approachable, programmable, affordable computing experience that I benefitted from back in the 1980s. The Raspberry Pi computer was, in our minds, a spiritual successor to the BBC Micro, itself the product of the BBC’s Computer Literacy Project.
But just as the initially education-focused BBC Micro quickly found a place in the wider commercial computing marketplace, so Raspberry Pi became a platform around which countless companies, from startups to multi-billion-dollar corporations, chose to innovate. Today, between seventy and eighty percent of Raspberry Pi units go into industrial and embedded applications.
While many of our commercial customers continue to use the “classic” single-board Raspberry Pi form factor, there are those whose needs aren’t met by that form factor, or by the default set of peripherals that we choose to include on the SBC product. So, in 2014 we released the first Raspberry Pi Compute Module, providing just the core functionality of Raspberry Pi 1 – processor, memory, non-volatile storage and power regulation – in an easy-to-integrate SODIMM module.
Compute Modules make it easier than ever for embedded customers to build custom products which benefit from our enormous investments in the Raspberry Pi hardware and software platform. Every subsequent generation of Raspberry Pi, except for Raspberry Pi 2, has spawned a Compute Module derivative. And today, we’re happy to announce the launch of Compute Module 5, the modular version of our flagship Raspberry Pi 5 SBC.
Meet Compute Module 5
Compute Module 5 gives you everything you love about Raspberry Pi 5, but in a smaller package:
A 2.4GHz quad-core 64-bit Arm Cortex-A76 CPU
A VideoCore VII GPU, supporting OpenGL ES 3.1 and Vulkan 1.3
Dual 4Kp60 HDMI® display output
A 4Kp60 HEVC decoder
Optional dual-band 802.11ac Wi-Fi® and Bluetooth 5.0
2 × USB 3.0 interfaces, supporting simultaneous 5Gbps operation
Gigabit Ethernet, with IEEE 1588 support
2 × 4-lane MIPI camera/display transceivers
A PCIe 2.0 x1 interface for fast peripherals
30 GPIOs, supporting 1.8V or 3.3V operation
A rich selection of peripherals (UART, SPI, I2C, I2S, SDIO, and PWM)
It is available with 2GB, 4GB, or 8GB of LPDDR4X-4267 SDRAM, and with 16GB, 32GB, or 64GB of MLC eMMC non-volatile memory. 16GB SDRAM variants are expected to follow in 2025.
Compute Module 5 is mechanically compatible with its predecessor, Compute Module 4, exposing all signals through a pair of high-density perpendicular connectors, which attach to corresponding parts on the customer’s carrier board. Additional stability is provided by four M2.5 mounting holes arranged at the corners of the board.
There are a small number of changes to the pin-out and electrical behaviour of the module, mostly associated with the removal of the two two-lane MIPI interfaces, and the addition of two USB 3.0 interfaces. A detailed summary of these changes can be found in the Compute Module 5 datasheet.
Accessories accessorise
But Compute Module 5 is only part of the story. Alongside it, we’re offering a range of new accessories to help you get the most out of our new modular platform.
IO Board
Every generation of Compute Module has been accompanied by an IO board, and Compute Module 5 is no exception.
The Raspberry Pi Compute Module 5 IO Board breaks out every interface from a Compute Module 5. It serves both as a development platform and as a reference baseboard (with design files in KiCad format), reducing the time to market for your Compute Module 5-based designs. Its features include:
A Gigabit Ethernet jack with PoE+ support (requires a separate Raspberry Pi PoE+ HAT+)
An M.2 M-key PCIe socket (for 2230, 2242, 2260 and 2280 modules)
A microSD card socket (for use with Lite modules)
An RTC battery socket
A 4-pin fan connector
Power is provided by a USB-C power supply (sold separately).
IO Case
As in previous generations, we expect some users to deploy the IO Board and Compute Module combination as a finished product in its own right: effectively an alternative Raspberry Pi form factor with all the connectors on one side. To support this, we are offering a metal case which turns the IO Board into a complete encapsulated industrial-grade computer. The Raspberry Pi IO Case for Raspberry Pi Compute Module 5 includes an integrated fan, which can be connected to the 4-pin fan connector on the IO Board to improve thermal performance.
Cooler
While Compute Module 5 is our most efficient modular product yet in terms of energy consumed per instruction executed, like all electronic products it gets warm under load. The Raspberry Pi Cooler for Raspberry Pi Compute Module 5 is a finned aluminium heatsink, designed to fit on a Compute Module 5, and including thermal pads to optimise heat transfer from the CPU, memory, wireless module and eMMC.
Antenna Kit
Wireless-enabled variants of Compute Module 5 provide both an onboard PCB antenna, and a UFL connector for an external antenna. Use of the Raspberry Pi Antenna Kit (identical to that already offered for use with Compute Module 4) with Compute Module 5 is covered by our FCC modular compliance.
Raspberry Pi 27W USB-C PD Power Supply (local variant as applicable)
Antenna Kit
2 × Raspberry Pi standard HDMI to HDMI Cable
Raspberry Pi USB-A to USB-C Cable
Early adopters
Today’s launch is accompanied by announcements of Compute Module 5-based products from our friends at KUNBUS and TBS, who have built successful products on previous Raspberry Pi Compute Modules and whom we have supported to integrate our new module into their latest designs. Other customers are preparing to announce their own Compute Module 5-powered solutions over the next weeks and months. The world is full of innovative engineering companies of every scale, and we’re excited to discover the uses to which they’ll put our powerful new module. Try Compute Module 5 for yourself and let us know what you build with it.
Revolution Pi has been designing and manufacturing successful products with Raspberry Pi Compute Modules for years. In this guest post, they talk about why they continue to choose Raspberry Pi technology, and discuss their experience designing with our brand-new Compute Module 5.
Revolution Pi has been building flexible industrial devices with Raspberry Pi Compute Modules since the very beginning. As a long-time partner, we have witnessed their impressive evolution from the first to the fifth generation over the past ten years.
Technical advancements that matter
Raspberry Pi Compute Module 5’s enhancements directly address industrial requirements: it provides quad-core CPU performance up to 2.4GHz, a built-in USB 3.2 controller, and an improved PCIe controller. Raspberry Pi’s continuous integration of more interfaces directly on the Compute Module advances its capabilities while freeing up valuable space on our carrier board. These well-integrated interfaces within the Raspberry Pi ecosystem enable more flexible hardware designs. This allowed us to equip the RevPi Connect 5 with up to four multi-Gigabit Ethernet ports, letting industrial users connect multiple industrial fieldbuses and other networks with low latency.
The RevPi Connect 5 consists of two PCBs with a big bolted-on heat sink
Collaborative development process
Working with Raspberry Pi on this has been exceptional. They understand what industrial developers need. We received early samples to test with, which was critical. It allowed us to iterate and optimise our design solutions, especially when developing a custom heat sink. Managing the heat generated by the powerful new Compute Module in a DIN rail enclosure was an important part of the design process. Having real hardware to test with made all the difference.
Systematic thermal management
Maintaining Compute Module 5’s operating temperature below 85°C under heavy load required a methodical development process. We started with thermal simulation analysis to identify hotspots at full operating capacity. This analysis formed the basis for our practical prototyping. Through iterative testing under extreme conditions, we optimised the heatsink design before conducting extensive testing with the final housing inside our climatic chamber. The entire process culminated in establishing precise manufacturing standards with rigorous quality control.
Analysis of simulated airflow in the heatsink
Seamless software integration
On the software side, working with Raspberry Pi’s platform enables smooth integration. When we hit technical challenges, their engineering team was right there to support us. Their unified kernel approach across all products allowed us to focus on integrating new features like the CAN FD interfaces instead of wrestling with compatibility issues. This standardisation benefits Revolution Pi users as well — they can use our industrialised Raspberry Pi OS-based image consistently across all Revolution Pi devices.
A typical Revolution Pi system configuration, consisting of a RevPi Connect 5 and several expansion modules
A proven partnership
From the first Compute Module to now, Raspberry Pi has shown growing commitment to industrial computing. Compute Module 5, purpose-built for products like Revolution Pi, demonstrates what’s possible when combining Raspberry Pi’s innovation with our industrial-grade engineering. We’re excited to continue pushing the boundaries of industrial automation and IIoT applications together.
It’s the most wonderful time of the year… to give someone on your gift list something (or all things) Raspberry Pi. The past year has seen many exciting new releases, so we understand if you’re sat scratching your head at what to buy your favourite Raspberry Pi fanatic. But look no further! For the sake of your peace, and in a show of our goodwill, we elves have gone and done all the work for you. Good tidings we bring.
Our newest stuff
If it’s a Raspberry Pi superfan you’ve got on your list, you might want to plump for one of our latest hardware releases to really impress them. After all, what do you get someone who has everything? The newest, shiniest thing they haven’t managed to get their hands on yet.
Raspberry Pi Pico 2 W
Launched just a couple of days ago, Raspberry Pi Pico 2 W is the wireless variant of Pico 2, giving you even more flexibility in your connected projects. It’s on sale now for just $7.
Raspberry Pi Touch Display 2
We also upgraded our touch display this year. Raspberry Pi Touch Display 2 is a seven-inch 720×1280px touchscreen display for Raspberry Pi. It’s ideal for interactive projects such as tablets, entertainment systems, and information dashboards, and it’s available for $60.
Raspberry Pi AI HAT+
For the more confident Raspberry Pi user, you might want something to tempt them to broaden their skills into the field of AI. The Raspberry Pi AI HAT+ features a built-in neural network accelerator, turning your Raspberry Pi 5 into a high-performance, accessible, and power-efficient AI machine. The Raspberry Pi AI HAT+ allows you to build a wide range of AI-powered applications for process control, home automation, research, and more. It’s on sale now from $70.
Raspberry Pi AI Camera
For more easy-to-deploy vision AI applications and neural network models, we’d recommend our new Raspberry Pi AI Camera, which takes advantage of Sony’s IMX500 Intelligent Vision Sensor. It’s available now for $70, and it works with any model of Raspberry Pi — including the super low-cost Zero family.
Stocking stuffers
If you’re looking for some smaller-but-still-mighty bits to fit in a stocking, we have some great affordable options too. Below is a list of some of the very latest, including a recent fan favourite, the…
Raspberry Pi Bumper
Protect and secure your Raspberry Pi 5 with the Raspberry Pi Bumper, a snap-on silicone cover that protects the bottom and edges of the board. This is a lovely, affordable, and super useful gift for any Raspberry Pi user, and it costs just $3.
Raspberry Pi SD Cards
2024 saw the release of our first-party Raspberry Pi SD Cards. Rigorously tested to ensure optimal performance on Raspberry Pi computers, these A2-class microSD cards help ensure you get the smoothest user experience from your device. They are available in three different capacities to fit your needs.
32GB, 64GB, and 128GB
Raspberry Pi SSD Kit
With a Raspberry Pi M.2 HAT+ and a Raspberry Pi NVMe SSD bundled together, the Raspberry Pi SSD Kit lets you unlock outstanding performance for I/O intensive applications on your Raspberry Pi 5 — including super-fast startup when booting from SSD. The Kit is available now, in 256GB or 512GB capacities, from $40.
You can also grab the SSDs on their own, starting from $30.
Raspberry Pi USB 3 Hub
Our Raspberry Pi USB 3 Hub is the solution to your need for more peripherals than you have ports: it provides extra connectivity for your devices by turning one USB-A port into four, and is compatible with all Raspberry Pi devices. We think it’s the best you can buy. You can get one now for just $12.
Mugs, stickers, and badges
If you’re looking for something super fun and easy, check out our Raspberry Pi-branded merchandise, available to buy online from your local Approved Reseller. If you’re in Cambridge, UK, a trip to the Raspberry Pi Store would put stickers, mugs, water bottles, t-shirts, and more in your hands right away. (More on that below.)
Books, books, and more books
A personal favourite of mine this Christmas, and certainly your dearest retro gamer’s, is Code the Classics Volume II (£24.99), which shows you how to create your own video games inspired by some of the seminal games of the 1980s.
If you were thinking of getting your favourite tinkering photographer a Raspberry Pi Camera, it might also be a good idea to pick up a copy of The Official Raspberry Pi Camera Guide (£14.99) — we released an updated second edition just last week.
That’s not the only new title to hit the Raspberry Pi Press store this year. If it’s our newest releases you’re interested in, you have titles such as the Book of Making 2025 and The Official Raspberry Pi Handbook 2025 (both originally priced at £14) to choose from. A special 30% discount will be applied at checkout if you choose either of these books.
If you’d like to purchase a gift that keeps on giving all year round, you can subscribe to receive a brand new edition of the official Raspberry Pi magazine, The MagPi, on your doorstep each month. You’ll also get a free Raspberry Pi Pico W if you sign up to a six- or twelve-month subscription.
The Raspberry Pi Store
If you’d like to get out into the twinkling streets of Cambridge at Christmas time, the Raspberry Pi Store in the Grand Arcade (we’re upstairs!) has stock of everything above and much, much more. We’ve also picked some excellently knowledgeable staff who can help you choose something if you’re not sure what you’re looking for.
Update: In advance of official MicroPython support for Pico 2 W, you can download our unofficial MicroPython build here; you’ll find the README here.
Today our epic autumn of product launches continues with Raspberry Pi Pico 2 W, the wireless-enabled variant of this summer’s Pico 2. Built around our brand new RP2350 microcontroller, featuring the tried and tested wireless modem from the original Pico W, and priced at just $7, it’s the perfect centrepiece for your connected Internet of Things projects.
RP2350: the connoisseur’s microcontroller, redux
When we launched our debut microcontroller, RP2040, way back in 2021, we couldn’t have imagined the incredible range of products that would be built around it, or the uses that the community would put them to. Combining a symmetric pair of fast integer cores; a large, banked, on-chip memory; rich support for high-level languages; and our patented programmable I/O (PIO) subsystem, it quickly became the go-to device for enthusiasts and professional engineers seeking high-performance, deterministic interfacing at a low price point.
RP2350 builds on this legacy, offering faster cores, more memory, floating point support, on-chip OTP, optimised power consumption, and a rich security model built around Arm’s TrustZone for Cortex-M. It debuted in August on Pico 2, on the DEF CON 32 badge (designed by our friends at Entropic Engineering, with firmware and a gonzo sidewalk badge presentation by the redoubtable Dmitry Grinberg), and on a wide variety of development boards and other products from our early-access partners.
Wireless things
Many of the projects and products that people build on top of our platforms — whether that’s our Linux-capable Raspberry Pi computers, our microcontroller boards, or our silicon products — answer to the general description “Internet of Things”. They combine local compute, storage, and interfacing to the real world with connectivity back to the cloud.
Raspberry Pi Pico 2 W brings all the power of RP2350 to these IoT projects. The on-board CYW43439 modem from our friends at Infineon provides 2.4GHz 802.11n wireless LAN and Bluetooth 5.2 connectivity, and is supported by C and MicroPython libraries. Enthusiasts benefit from the breadboard-friendly Pico form factor, while our upcoming RM2 radio module (already in use on Pimoroni’s Pico Plus 2 W) provides a route to scale for professional products which have been prototyped on the platform.
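To make that concrete, here is an illustrative MicroPython sketch for joining a wireless network from a Pico 2 W. It runs on the board itself (the network module is MicroPython-specific and not available on desktop Python), and the SSID and password are placeholders:

```python
# MicroPython sketch for Pico 2 W: runs on the board, not on desktop Python.
# "MyNetwork" and "secret" are placeholder credentials.
import network
import time

def connect(ssid, password, timeout_s=20):
    wlan = network.WLAN(network.STA_IF)   # station (client) mode
    wlan.active(True)
    wlan.connect(ssid, password)
    deadline = time.time() + timeout_s
    while not wlan.isconnected():
        if time.time() > deadline:
            raise RuntimeError("Wi-Fi connection timed out")
        time.sleep(0.5)
    return wlan.ifconfig()[0]             # the board's IP address

print(connect("MyNetwork", "secret"))
```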
More to come
We’re very pleased with how Pico 2 W has turned out. And, where the Pico 1 series ended with Pico W, we have a few more ideas in mind for the Pico 2 series. Keep an eye out for more news in early 2025.
We are enormously proud to reveal The Official Raspberry Pi Camera Module Guide (2nd edition), which is out now. David Plowman, a Raspberry Pi engineer specialising in camera software, algorithms, and image-processing hardware, authored this official guide.
This detailed book walks you through all the different types of Camera Module hardware, including Raspberry Pi Camera Module 3, High Quality Camera, Global Shutter Camera, and older models; discover how to attach them to Raspberry Pi and integrate vision technology with your projects. This edition also covers new code libraries, including the latest PiCamera2 Python library and rpicam command-line applications, as well as integration with the new Raspberry Pi AI Kit.
Save time with our starter guide
Our starter guide has clear diagrams explaining how to connect various Camera Modules to the new Raspberry Pi boards. It also explains how to fit custom lenses to HQ and GS Camera Modules using C-CS adaptors. Everything is outlined in step-by-step tutorials with diagrams and photographs, making it quick and easy to get your camera up and running.
Test your camera properly
You’ll discover how to connect your camera to a Raspberry Pi and test it using the new rpicam command-line applications — these replace the older libcamera-apps. The guide also covers the new PiCamera2 Python library, for integrating Camera Module technology with your software.
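For a flavour of the Python side, a minimal Picamera2 capture looks like the sketch below. It must run on a Raspberry Pi with a camera attached, and the output filename is arbitrary:

```python
# Minimal Picamera2 sketch: run on a Raspberry Pi with a camera connected.
from picamera2 import Picamera2

picam2 = Picamera2()
config = picam2.create_still_configuration()  # full-resolution still capture
picam2.configure(config)
picam2.start()
picam2.capture_file("test.jpg")               # arbitrary output filename
picam2.stop()
```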
Get more from your images
Discover detailed information about how Camera Module works, and how to get the most from your images. You’ll learn how to use RAW formats and tuning files, HDR modes, and preview windows; custom resolutions, encoders, and file formats; target exposure and autofocus; and shutter speed and gain, enabling you to get the very best out of your imaging hardware.
Build smarter projects with AI Kit integration
A new chapter covers the integration of the AI Kit with Raspberry Pi Camera Modules to create smart imaging applications. This adds neural processing to your projects, enabling fast inference of objects captured by the camera.
Boost your skills with pre-built projects
The Official Raspberry Pi Camera Module Guide is packed with projects. Take selfies and stop-motion videos, experiment with high-speed and time-lapse photography, set up a security camera and smart door, build a bird box and wildlife camera trap, take your camera underwater, and much more! All of the code is tested and updated for the latest Raspberry Pi OS, and is available on GitHub for inspection.
Raspberry Pi OS comes with Python pre-installed, and you need to use its virtual environments to install packages. The latest issue of The MagPi, out today, features this handy tutorial, penned by our documentation lead Nate Contino, to get you started.
Raspberry Pi OS comes with Python 3 pre-installed. Interfering with the system Python installation can cause problems for your operating system. When you install third-party Python libraries, always use the correct package-management tools.
On Linux, you can install Python dependencies in two ways:
use apt to install pre-configured system packages
use pip to install libraries using Python’s dependency manager in a virtual environment
It is possible to create virtual environments inside Thonny as well as from the command line
Install Python packages using apt
Packages installed via apt are packaged specifically for Raspberry Pi OS. These packages usually come pre-compiled, so they install faster. Because apt manages dependencies for all packages, installing with this method includes all of the sub-dependencies needed to run the package. And apt ensures that you don’t break other packages if you uninstall.
For instance, to install the Python 3 library that supports the Raspberry Pi Build HAT, run the following command:
$ sudo apt install python3-build-hat
To find Python packages distributed with apt, use apt search. In most cases, Python packages use the prefix python- or python3-: for instance, you can find the numpy package under the name python3-numpy.
Install Python libraries using pip
In older versions of Raspberry Pi OS, you could install libraries directly into the system version of Python using pip. Since Raspberry Pi OS Bookworm, this is no longer permitted.
Attempting to install packages with pip causes an error in Raspberry Pi OS Bookworm
Instead, install libraries into a virtual environment (venv). To install a library at the system level for all users, install it with apt.
Attempting to install a Python package system-wide outputs an error similar to the following:
$ pip install buildhat
error: externally-managed-environment
× This environment is externally managed
╰─> To install Python packages system-wide, try apt install
python3-xyz, where xyz is the package you are trying to
install.
If you wish to install a non-Debian-packaged Python package,
create a virtual environment using python3 -m venv path/to/venv.
Then use path/to/venv/bin/python and path/to/venv/bin/pip. Make
sure you have python3-full installed.
For more information visit http://rptl.io/venv
note: If you believe this is a mistake, please contact your Python installation or OS distribution provider. You can override this, at the risk of breaking your Python installation or OS, by passing --break-system-packages.
hint: See PEP 668 for the detailed specification.
Python users have long dealt with conflicts between OS package managers like apt and Python-specific package management tools like pip. These conflicts include both Python-level API incompatibilities and conflicts over file ownership.
Starting in Raspberry Pi OS Bookworm, packages installed via pip must be installed into a Python virtual environment (venv). A virtual environment is a container where you can safely install third-party modules so they won’t interfere with your system Python.
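A virtual environment, incidentally, is just a directory: it contains its own interpreter link, a pyvenv.cfg file, and a private site-packages folder. You can see this from Python itself using the standard-library venv module (the temporary location below is arbitrary):

```python
# Create a throwaway virtual environment with only the standard library,
# then check what it contains. The temporary directory is arbitrary.
import os
import tempfile
import venv

target = os.path.join(tempfile.mkdtemp(), "env")
venv.create(target, with_pip=False)  # with_pip=True would also bootstrap pip

# The environment gets its own interpreter and a configuration file
interpreter = os.path.join(target, "bin", "python")
config_file = os.path.join(target, "pyvenv.cfg")
print(os.path.exists(interpreter), os.path.exists(config_file))
```

Activating an environment simply puts its bin directory at the front of your PATH, so python and pip resolve to the environment's own copies.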
Use pip with virtual environments
To use a virtual environment, create a container to store the environment. There are several ways you can do this depending on how you want to work with Python:
per-project environments
Create a virtual environment in a project folder to install packages local to that project
Many users create separate virtual environments for each Python project. Locate the virtual environment in the root folder of each project, typically with a shared name like env. Run the following command from the root folder of each project to create a virtual environment configuration folder:
$ python -m venv env
Before you work on a project, run the following command from the root of the project to start using the virtual environment:
$ source env/bin/activate
You should then see a prompt similar to the following:
(env) $
When you finish working on a project, run the following command from any directory to leave the virtual environment:
$ deactivate
per-user environments
Instead of creating a virtual environment for each of your Python projects, you can create a single virtual environment for your user account. Activate that virtual environment before running any of your Python code. This approach can be more convenient for workflows that share many libraries across projects.
When creating a virtual environment for multiple projects across an entire user account, consider locating the virtual environment configuration files in your home directory. Store your configuration in a folder whose name begins with a period to hide the folder by default, preventing it from cluttering your home folder.
Add a virtual environment to your home directory to use it in multiple projects and share the packages
Use the following command to create a virtual environment in a hidden folder in the current user’s home directory:
$ python -m venv ~/.env
Run the following command from any directory to start using the virtual environment:
$ source ~/.env/bin/activate
You should then see a prompt similar to the following:
(.env) $
To leave the virtual environment, run the following command from any directory:
$ deactivate
Create a virtual environment
Run the following command to create a virtual environment configuration folder, replacing <env-name> with the name you would like to use for the virtual environment (e.g. env):
$ python -m venv <env-name>
Enter a virtual environment
Then, execute the bin/activate script in the virtual environment configuration folder to enter the virtual environment:
$ source <env-name>/bin/activate
You should then see a prompt similar to the following:
(<env-name>) $
The (<env-name>) command prompt prefix indicates that the current terminal session is in a virtual environment named <env-name>.
To check that you’re in a virtual environment, use pip list to view the list of installed packages:
(<env-name>) $ pip list
Package Version
---------- -------
pip 23.0.1
setuptools 66.1.1
The list should be much shorter than the list of packages installed in your system Python. You can now safely install packages with pip. Any packages you install with pip while in a virtual environment only install to that virtual environment. In a virtual environment, the python or python3 commands automatically use the virtual environment’s version of Python and installed packages instead of the system Python.
Top Tip: Pass the --system-site-packages flag before the folder name to preload all of the currently installed packages in your system Python installation into the virtual environment.
Exit a virtual environment
To leave a virtual environment, run the following command:
(<env-name>) $ deactivate
Use the Thonny editor
We recommend Thonny for editing Python code on the Raspberry Pi.
By default, Thonny uses the system Python. However, you can switch to using a Python virtual environment by clicking on the interpreter menu in the bottom right of the Thonny window. Select a configured environment or configure a new virtual environment with Configure interpreter.
The MagPi #148 out NOW!
You can grab the new issue right now from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available at our online store, which ships around the world. You can also get it via our app on Android or iOS.
You can also subscribe to the print version of The MagPi. Not only do we deliver it globally, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico W!
In this guest post, Ramona Rayner from our partner Sony shows you how to quickly explore different models and AI capabilities, and how you can easily build applications on top of the Raspberry Pi AI Camera.
The recently launched Raspberry Pi AI Camera is an extremely capable piece of hardware, enabling you to build powerful AI applications on your Raspberry Pi. By offloading the AI inference to the IMX500 accelerator chip, more computational resources are available to handle application logic right on the edge! We are very curious to see what you will be creating and we are keen to give you more tools to do so. This post will cover how to quickly explore different models and AI capabilities, and how to easily build applications on top of the Raspberry Pi AI Camera.
If you didn’t have the chance to go through the Getting Started guide, make sure to check that out first to verify that your AI Camera is set up correctly.
Explore pre-trained models
A great way to start exploring the possibilities of the Raspberry Pi AI Camera is to try out some of the pre-trained models that are available in the IMX500 Model Zoo. To simplify the exploration process, consider using a GUI Tool, designed to quickly upload different models and see the real-time inference results on the AI Camera.
To start the GUI Tool, make sure you have Node.js installed (you can verify this by running node --version in the terminal), then build and run the tool by running the following commands in the root of the repository:
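The exact commands are in the tool's repository, but with standard Node.js tooling the build-and-run step typically looks like the following (the npm script names here are assumptions, not taken from the repository, so check its README for the exact commands):

```shell
# Check Node.js is available first
node --version

# From the root of the GUI Tool repository -- the script names below
# are assumptions; the repository's README has the exact commands
npm install      # fetch the tool's dependencies
npm run build    # build the tool
npm start        # launch it
```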
Exploring the different models gives you insight into the camera’s capabilities and enables you to identify the model that best suits your requirements. When you think you’ve found it, it’s time to build an application.
Building applications
Plenty of CPU is available to run applications on the Raspberry Pi while model inference is taking place on the IMX500. To demonstrate this we’ll run a Workout Monitoring sample application.
The goal is to count real-time exercise repetitions by detecting and tracking people performing common exercises like pull-ups, push-ups, ab workouts and squats. The app will count repetitions for each person in the frame, making sure multiple people can work out simultaneously and compete while getting automated rep counting.
The latest release of Raspberry Pi OS includes an all-new, native panel plugin for Raspberry Pi Connect, our secure remote access solution that allows you to connect to your Raspberry Pi desktop and command line directly from your web browser.
By default, Raspberry Pi Connect will be installed but disabled, only becoming active for your current user if you choose ‘Turn On Raspberry Pi Connect’ from the menu bar or run rpi-connect on from the terminal.
If this is your first time trying the service, using the menu bar will open your browser to sign up for a free Raspberry Pi Connect account; alternatively, you can run rpi-connect signin from the terminal to print a unique URL that you can open on any device you like. Once signed up and signed in, you can then connect to your device either via screen sharing (if you’re using Raspberry Pi desktop) or via remote shell from your web browser on any computer.
You can now stop and disable the service for your current user by choosing ‘Turn Off Raspberry Pi Connect’ or running rpi-connect off from the terminal.
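The three commands mentioned above, collected in one place (they only do something useful on a Raspberry Pi with Connect installed):

```shell
# Enable Raspberry Pi Connect for the current user
rpi-connect on

# Print a unique sign-in URL you can open on any device
rpi-connect signin

# Stop and disable the service for the current user
rpi-connect off
```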
With the latest release of 2.1.0 (available via software update), we now include a new rpi-connect doctor command that runs a series of connectivity tests to check the service can establish connections properly. We make every effort to ensure you can connect to your device without having to make any networking changes or open ports in your firewall — but if you’re having issues, run the command like so:
$ rpi-connect doctor
✓ Communication with Raspberry Pi Connect API
✓ Authentication with Raspberry Pi Connect API
✓ Peer-to-peer connection candidate via STUN
✓ Peer-to-peer connection candidate via TURN
Full documentation for Raspberry Pi Connect can be found on our website, or via man rpi-connect in the terminal when installed on your device.
Updates on updates
We’ve heard from lots of users about the features they’d most like to see next, and we’ve tried to prioritise the things that will bring the largest improvements in functionality to the largest number of users. Keep an eye on this blog to see our next updates.
This #MagPiMonday, we’re hoping to inspire you to add artificial intelligence to your Raspberry Pi designs with this feature by Phil King, from the latest issue of The MagPi.
With their powerful AI accelerator modules, Raspberry Pi’s Camera Module and AI Kit open up exciting possibilities in computer vision and machine learning. The versatility of the Raspberry Pi platform, combined with these AI capabilities, makes a world of innovative smart projects possible. From creative experiments to practical applications like smart pill dispensers, makers are harnessing the kit’s potential to push the boundaries of AI. In this feature, we explore some standout projects, and hope they inspire you to embark on your own.
AI computer vision can identify objects within a live camera view. In this project, VEEB’s Martin Spendiff and Vanessa Bradley have used it to detect humans in the frame, so you can tell if your boss is approaching behind you as you sit at your desk!
The project comprises two parts. A Raspberry Pi 5 equipped with a Camera Module and AI Kit handles the image recognition and also acts as a web server. This uses web sockets to send messages wirelessly to the ‘detector’ part — a Raspberry Pi Pico W and a voltmeter whose needle moves to indicate the level of AI certainty for the ID.
Having got their hands on an AI Kit — “a nice intro into computer vision” — it took the pair just three days to create Peeper Pam. “The most challenging bit was that we’d not used sockets — more efficient than the Pico constantly asking Raspberry Pi ‘do you see anything?’,” says Martin. “Raspberry Pi does all the heavy lifting, while Pico just listens for an ‘I’ve seen something’ signal.”
While he notes that you could get Raspberry Pi 5 to serve both functions, the two-part setup means you can place the camera in a different position to monitor a spot you can’t see. Also, by adapting the code from the project’s GitHub repo, there are lots of other uses if you get the AI to detect other objects. “Pigeons in the window box is one that we want to do,” Martin says.
Never one to do things by halves, Jeff Geerling went overboard with Raspberry Pi AI Kit and built a Monster AI Pi PC with a total of eight neural processors. In fact, with 55 TOPS (trillions of operations per second), it’s faster than the latest AMD, Qualcomm, and Apple Silicon processors!
The NPU chips — including the AI Kit’s Hailo-8L — are connected to a large 12× PCIe slot card with a PEX 8619 switch capable of handling 16 PCI Express Gen 2 lanes. The card is then mounted on a Raspberry Pi 5 via a Pineboards uPCIty Lite HAT, which has an additional 12V PSU to supply the extra wattage needed for all those processors.
With a bit of jiggery-pokery with the firmware and drivers on Raspberry Pi, Jeff managed to get it working.
As a proof of concept, Japanese maker Naveen aimed to implement an automated system for identifying and monitoring cars at toll plazas to get an accurate tally of the vehicles entering and exiting.
With the extra processing power provided by a Raspberry Pi AI Kit, the project uses Edge Impulse computer vision to detect and count cars in the view from a Camera Module Wide. “We opted for a wide lens because it can capture a larger area,” he says, “allowing the camera to monitor multiple lanes simultaneously.” He also needed to train and test a YOLOv5 machine learning model. All the details can be found on the project page via the link above, which could prove useful for learning how to train custom ML models for your own AI project.
Wearing a safety helmet on a building site is essential and could save your life. This computer vision project uses Raspberry Pi AI Kit with the advanced YOLOv8 machine learning model to quickly and accurately identify objects within the camera view, running at an impressive inference speed of 30fps.
The project page has a guide showing how to make use of Raspberry Pi AI Kit to achieve efficient AI inferencing for safety helmet detection. This includes details of the software installation and model training process, for which the maker has provided a link to a dataset of 5000 images with bounding box annotations for three classes: helmet, person, and head.
Google’s MediaPipe is an open-source framework developed for building machine learning pipelines, especially useful for working with videos and images.
Having used MediaPipe on other platforms, Mario Bergeron decided to experiment with it on a Raspberry Pi AI Kit. On the project page (linked above) he details the process, including using his Python demo application with options to detect hands/palms, faces, or poses.
Mario’s test results show how much better the AI Kit’s Hailo-8L AI accelerator module performs compared to running reference TensorFlow Lite models on Raspberry Pi 5 alone: up to 5.8 times faster. With three models running for hand and landmarks detection, the frame rate is 26–28fps with one hand detected, and 22–25fps for two.
The MagPi #147 out NOW!
You can grab the new issue right now from Tesco, Sainsbury’s, Asda, WHSmith, and other newsagents, including the Raspberry Pi Store in Cambridge. It’s also available at our online store, which ships around the world. You can also get it via our app on Android or iOS.
You can also subscribe to the print version of The MagPi. Not only do we deliver it globally, but people who sign up to the six- or twelve-month print subscription get a FREE Raspberry Pi Pico W!
Most Raspberry Pi single-board computers, with the exception of the Raspberry Pi Zero and A+ form factors, incorporate an on-board USB hub to fan out a single USB connection from the core silicon, and provide multiple downstream USB Type-A ports. But no matter how many ports we provide, sometimes you just need more peripherals than we have ports. And with that in mind, today we’re launching the official Raspberry Pi USB 3 Hub, a high-quality four-way USB 3.0 hub for use with your Raspberry Pi or other, lesser, computer.
Key features include:
A single upstream USB 3.0 Type-A connector on an 8 cm captive cable
Four downstream USB 3.0 Type-A ports
Aggregate data transfer speeds up to 5 Gbps
USB-C socket for optional external 3A power supply (sold separately)
Race you to the bottom
Why design our own hub? Well, we’d become frustrated with the quality and price of the hubs available online. Either you pay a lot of money for a nicely designed and reliable product, which works well with a broad range of hosts and peripherals; or you cheap out and get something much less compatible, or unreliable, or ugly, or all three. Sometimes you spend a lot of money and still get a lousy product.
It felt like we were trapped in a race to the bottom, where bad quality drives out good, and marketplaces like Amazon end up dominated by the cheapest thing that can just about answer to the name “hub”.
So, we worked with our partners at Infineon to source a great piece of hub silicon, CYUSB3304, set Dominic to work on the electronics and John to work on the industrial design, and applied our manufacturing and distribution capabilities to make it available at the lowest possible price. The resulting product works perfectly with all models of Raspberry Pi computer, and it bears our logo because we’re proud of it: we believe it’s the best USB 3.0 hub on the market today.
Grab one and have a play: we think you’ll like it.
Meet Kari Lawler, a YouTuber with a passion for collecting and fixing classic computers, as well as retro gaming. This interview first appeared in issue 147 of The MagPi magazine.
Kari Lawler has a passion for retro tech — and despite being 21, her idea of retro fits with just about everyone’s definition, as she collects and restores old Commodore 64s, Amiga A500s, and Atari 2600s. Stuff from before even Features Editor Rob was born, and he’s rapidly approaching 40. Kari has been involved in the tech scene for ten years though, doing much more than make videos on ’80s computers.
“I got my break into tech at around 11 years old, when I hacked together my very own virtual assistant and gained some publicity,” Kari says. “This inspired me to learn more, especially everything I could about artificial intelligence. Through this, I created my very own youth programme called Youth4AI, in which I engaged with and taught thousands of young people about AI. As well as my youth programme, I was lucky enough to work on many AI projects and branch out into government advisory work as well. Culminating, at 18 years old, in being entered into the West Midlands Women in Tech Hall of Fame, with a Lifetime Achievement Award of all things.”
What’s your history with making?
“Being brought up in a family of makers, I suppose it was inevitable I got the bug as well. From an early age, I fondly remember being surrounded by arts and crafts, and attending many sessions. From sewing to pottery and basic electronics to soldering, I enjoyed everything I did. Which resulted in me creating many projects, from a working flux capacitor (well, it lit up) for school homework, to utilising what I learned to make fun projects to share with others when I volunteered at my local Raspberry Pi Jam. Additionally, at around the age of 12 I was introduced to the wonderful world of 3D printing, and I’ve utilised that knowledge in many of the projects I’ve shared online. Starting with the well-received ’24 makes for Christmas’ I did over on X [formerly Twitter] in 2017, aged 14, which featured everything from coding Minecraft to robot sprouts. And I’ve been sharing what I make over on my socials ever since.”
Fun fact: The code listings in The MagPi are inspired by magazines from the 1980s, which also printed code listings, although you can download all of ours as well.
How did you get into retro gaming?
“Both my uncle and dad had a computer store in the ’90s, the PS1/N64 era, and while they have never displayed any of it, what was left of the shop was packed up and put into storage. And, me being me, I was quite interested in learning more about what was in those boxes. Additionally, I grew up with a working BBC Micro in the house, so have fond memories playing various games on it, especially Hangman — I think I was really into spelling bees at that point. So, with that and the abundance of being surrounded by old tech, I really got into learning about the history of computing and gaming. Which led me to getting the collecting bug, and to start adding to the collection myself so I could experience more and more tech from the past.”
One of Kari’s more recent projects was fixing a PSOne, the smaller release of the original PlayStation but with a screen attached
What’s your favourite video that you’ve made?
“Now that’s a hard one to answer. But if I go back to one of my first videos, Coding games like it’s the ’80s, it’s one that resonates with how I got my first interest in programming. My dad introduced me to Usborne computer books from the 1980s, just after I started learning Python, and said ‘try and convert some of these’. I accepted that challenge, and that’s what got me fascinated with ’80s programming books, hence the video I made. With the Usborne books specifically, there is artwork and a back story for each game. And while technically not great games, I just love how they explain the code and challenge the reader to improve. For which, I’m sure some of my viewers will be pleased to hear, I have in the works more videos exploring programming books/magazine type-in listings from the ’80s.”
Recreating classic NES Tetrominoes with a 3D printer to make cool geometric magnets