โŒ

Reading view

There are new articles available, click to refresh the page.

OpenUC2 10x is an ESP32-S3 portable microscope with AI-powered real-time image analysis

Seeed Studio OpenUC2 10x AI Microscope

Seeed Studio has recently launched the OpenUC2 10x AI portable microscope built around the XIAO ESP32-S3 Sense module. Designed for educational, environmental research, health monitoring, and prototyping applications, the microscope features an OV2640 camera with 10x magnification, precise motorized focusing, high-resolution imaging, and real-time TinyML image processing. The design is modular and open source, making it easy to customize and expand with 3D-printed parts, motorized stages, and additional sensors. It offers Wi-Fi connectivity and a durable body, is powered over USB-C, and takes swappable objectives, which makes it usable in a wide range of applications.

Previously, we have written about similar portable microscopes such as the ioLight microscope and the KoPa W5 Wi-Fi Microscope, and Jean-Luc also tested a cheap USB microscope to read part numbers of components. Feel free to check those out if you are looking for a cheap microscope.

OpenUC2 10x specifications: Wireless MCU – Espressif Systems ESP32-S3 CPU [...]

The post OpenUC2 10x is an ESP32-S3 portable microscope with AI-powered real-time image analysis appeared first on CNX Software - Embedded Systems News.

Using Arduino UNO to sync a visual neuroscience lab

Common research methods for studying the visual system in the laboratory include recording and monitoring neural activity in the presence of sensory stimuli, helping scientists study how neurons encode and respond to, for example, specific visual inputs.

One of the biggest technical problems in the neural recording setups used in such experiments is achieving precise synchronization between the multiple devices communicating with each other, including the microscope and the screens displaying the stimuli, in order to accurately map neural responses to the visual events.

For example, in the Rompani Lab, a visual neuroscience laboratory at the European Molecular Biology Laboratory (EMBL) in Rome, the recording system (a two-photon microscope) needs to communicate with the visual stimulation system (composed of two screens) used to show visual stimuli while neural activity is being recorded. To synchronize these systems efficiently, the team turned to an Arduino UNO Rev3. “Its simplicity, reliability, and ease of integration made it an ideal tool for handling the timing and communication between different devices in the lab,” says Pietro Micheli, PhD student at EMBL Rome.

How the setup works

The Arduino UNO Rev3 is used to signal to the microscope when the stimulus (which is basically just a short video) starts and when it ends. While the microscope is recording and acquiring frames, a simple firmware running on the UNO listens to the data stream coming over a COM port from the computer used to control the visual stimulation.

Within the Python® script used to control the screens, a command is written to the serial port every time a new stimulus starts or ends. The microcontroller reads the command, which can be either ‘H’ or ‘L’, and sets the TTL output on pin 9 to 5V or 0V, respectively. This TTL signal goes to the microscope controller, which generates timestamps for the microscope status. These timestamps contain the exact frame numbers of the microscope recording at which the stimulus started (rising edge of the TTL) and ended (falling edge of the TTL).
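A minimal sketch of the firmware side, assuming a 115200 baud serial link and the single-character ‘H’/‘L’ protocol described above (the lab's actual firmware may differ in the details), could look like this:

// Minimal Arduino UNO sketch for the 'H'/'L'-to-TTL bridge described above.
// The 115200 baud rate is an assumption; only the 'H'/'L' commands and pin 9
// come from the article.
const int TTL_PIN = 9;            // TTL output wired to the microscope controller

void setup() {
  pinMode(TTL_PIN, OUTPUT);
  digitalWrite(TTL_PIN, LOW);     // idle low until a stimulus starts
  Serial.begin(115200);           // serial/COM link to the stimulation computer
}

void loop() {
  if (Serial.available() > 0) {
    char cmd = Serial.read();
    if (cmd == 'H') {
      digitalWrite(TTL_PIN, HIGH);  // rising edge: stimulus started
    } else if (cmd == 'L') {
      digitalWrite(TTL_PIN, LOW);   // falling edge: stimulus ended
    }
  }
}

Keeping the interface to a single output pin keeps things simple: the stimulation script only has to write one byte before and after each video, and the microscope controller only has to timestamp rising and falling edges.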

All this information is essential for the analysis of the recording, as it allows the researchers at EMBL Rome to align the recorded neural responses with the stimulation protocol presented. Once the neural activity is aligned, the downstream analysis can begin, focusing on understanding the underlying brain activity.

Ever wonder what firing neurons look like?

Micheli shared with us an example of the type of neural activity acquired during an experimental session with the setup described above.

The small blinking dots are individual neurons recorded from the visual cortex of an awake, behaving mouse. The signal being monitored is the fluorescence of a particular protein produced by neurons, which indicates their activity level. After the light emitted by the neurons has been recorded and digitised, researchers extract fluorescence traces for each neuron. At this point, they can proceed with the analysis of the neural activity, to try to understand how the visual stimuli shown are actually encoded by the recorded neural population.

The post Using Arduino UNO to sync a visual neuroscience lab appeared first on Arduino Blog.

โŒ