Exploring fungal intelligence with biohybrid robots powered by Arduino

At Cornell University, Dr. Anand Kumar Mishra and his team have been conducting groundbreaking research that brings together the fields of robotics, biology, and engineering. Their recent experiments, published in Science, explore how fungal mycelia can be used to control robots. The team has successfully created biohybrid robots that move based on electrical signals generated by fungi – a fascinating development in the world of robotics and biology.

A surprising solution for robotics: fungi

Biohybrid robots have traditionally relied on animal or plant cells to control movements. However, Dr. Mishra’s team is introducing an exciting new component into this field: fungi – which are resilient, easy to culture, and able to thrive in a wide range of environmental conditions. This makes them ideal candidates for long-term applications in biohybrid robotics.

Dr. Mishra and his colleagues designed two robots: a soft, starfish-inspired walking robot and a wheeled one. Both can be controlled using the natural electrophysiological signals produced by fungal mycelia. These signals are harnessed using a specially designed electrical interface that allows the fungi to control the robot’s movement.

The implications of this research extend far beyond robotics. The integration of living systems with artificial actuators presents an exciting new frontier in technology, and the potential applications are vast – from environmental sensing to pollution monitoring.

How it works with Arduino

At the heart of this innovative project is the Arduino platform, which served as the main interface to control the robots. As Dr. Mishra explains, he has been using Arduino for over 10 years and naturally turned to it for this experiment: “My first thought was to control the robot using Arduino.” The choice was ideal in terms of accessibility, reliability, and ease of use – and allowed for a seamless transition from prototyping with the UNO R4 WiFi to the final solution built around the Arduino Mega.

To capture and process the tiny electrical signals from the fungi, the team used a high-resolution 32-bit ADC (analog-to-digital converter) to achieve the necessary precision. “We processed each spike from the fungi and used the delay between spikes to control the robot’s movement. For example, the width of the spike determined the delay in the robot’s action, while the height was used to adjust the motor speed,” Dr. Mishra shares.
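
To illustrate the idea, here is a minimal sketch of that kind of spike processing. The readBioADC() placeholder, threshold, and mapping ranges are assumptions for demonstration only – the actual system uses a dedicated high-resolution 32-bit ADC and the team’s own calibration.

```cpp
// Illustrative only: detect a spike in the fungal bio-signal, measure its
// width (duration above threshold) and height (peak value), then map width
// to an action delay and height to a motor speed.

const int SPIKE_THRESHOLD = 600;   // placeholder threshold in raw ADC counts

int readBioADC() {
  // Placeholder for reading the external high-resolution ADC;
  // here we simply fall back to the built-in analogRead().
  return analogRead(A0);
}

void setup() {
  pinMode(9, OUTPUT);              // PWM pin driving the motor controller
}

void loop() {
  int sample = readBioADC();

  if (sample > SPIKE_THRESHOLD) {
    // A spike has started: time how long it stays above threshold
    // and track its peak value.
    unsigned long start = millis();
    int peak = sample;
    int s = sample;
    while (s > SPIKE_THRESHOLD) {
      if (s > peak) peak = s;
      s = readBioADC();
    }
    unsigned long widthMs = millis() - start;

    // Spike width sets how long the robot waits before acting;
    // spike height sets how fast the motor turns.
    unsigned long actionDelay = constrain(widthMs * 10, 100, 3000);
    int motorSpeed = map(peak, SPIKE_THRESHOLD, 1023, 80, 255);

    delay(actionDelay);
    analogWrite(9, motorSpeed);
  }
}
```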

The team also experimented with pulse width modulation (PWM) to control the motor speed more precisely, and managed to create a system where the fungi’s spikes could increase or decrease the robot’s speed in real-time. “This wasn’t easy, but it was incredibly rewarding,” says Dr. Mishra. 
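
The sketch below shows one way that real-time adjustment could look in principle: each detected spike nudges a PWM duty cycle up or down depending on how quickly it followed the previous spike. The detectSpike() helper, interval cutoff, and step size are illustrative assumptions, not the team’s actual parameters.

```cpp
// Illustrative only: spikes arriving close together raise the motor speed,
// while long gaps between spikes lower it.

const int MOTOR_PWM_PIN = 9;
int currentSpeed = 150;                 // 0-255 PWM duty cycle
unsigned long lastSpikeMs = 0;

bool detectSpike() {
  // Placeholder for the spike-detection logic shown earlier.
  return analogRead(A0) > 600;
}

void setup() {
  pinMode(MOTOR_PWM_PIN, OUTPUT);
  analogWrite(MOTOR_PWM_PIN, currentSpeed);
}

void loop() {
  if (detectSpike()) {
    unsigned long interval = millis() - lastSpikeMs;
    lastSpikeMs = millis();

    // Rapid spiking speeds the robot up; sparse spiking slows it down.
    if (interval < 1000) {
      currentSpeed = min(currentSpeed + 10, 255);
    } else {
      currentSpeed = max(currentSpeed - 10, 60);
    }
    analogWrite(MOTOR_PWM_PIN, currentSpeed);
    delay(50);                          // crude debounce so one spike counts once
  }
}
```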

And it’s only the beginning. Now the researchers are exploring ways to refine the signal processing and enhance accuracy – again relying on Arduino’s expanding ecosystem, making the system even more accessible for future scientific experiments.

All in all, this project is an exciting example of how easy-to-use, open-source, accessible technologies can enable cutting-edge research and experimentation to push the boundaries of what’s possible in the most unexpected fields – even complex biohybrid experiments! As Dr. Mishra says, “I’ve been a huge fan of Arduino for years, and it’s amazing to see how it can be used to drive advancements in scientific research.”

The post Exploring fungal intelligence with biohybrid robots powered by Arduino appeared first on Arduino Blog.

This DIY smart chicken coop features AI-based predator detection

Raising chickens can be a very rewarding endeavor, as they can provide fresh daily eggs and help get rid of pests in the yard. But, like all animals, they require care. Most importantly, you’ll need to ensure that they have regular food and water, and you’ll need to protect them from predators like coyotes, foxes, and cats. To ease the workload, you may want to consider building Coders Cafe’s DIY smart chicken coop that features AI-based predator detection.

The purpose of a coop, aside from being a comfy place for chickens to roost, is to provide protection from weather and predators. This design is pretty small and is probably only suitable for one or two chickens, but the concepts can be applied to larger coops. It provides a few very useful features: remote or automated feeding, remote or automated door operation, and predator detection with remote notifications. You’ll never have to worry that you forgot to feed the chickens or that you left the door open, and you can respond immediately if you get a notification about a predator.

An Arduino UNO R4 WiFi board oversees those features, operating the door and dispensing food using simple motor-driven mechanisms. A companion app lets the user set an automated door and food schedule, or perform those actions with the tap of a button. A Twilio app integration enables SMS alerts.
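
As a rough sketch of the scheduling idea (not the Coders Cafe firmware), the code below dispenses feed from a servo-driven hopper at a fixed interval and drives a second servo for the door. The pins, angles, and 12-hour interval are illustrative assumptions; the real build layers app-based scheduling and Twilio notifications on top.

```cpp
// Minimal scheduling sketch: one servo gates the feed hopper, another
// opens and closes the coop door.

#include <Servo.h>

Servo feederServo;
Servo doorServo;

const unsigned long FEED_INTERVAL_MS = 12UL * 60UL * 60UL * 1000UL;  // every 12 hours
unsigned long lastFeedMs = 0;

void dispenseFeed() {
  feederServo.write(90);    // rotate the hopper gate open
  delay(1500);              // let a portion fall through
  feederServo.write(0);     // close the gate
}

void setDoor(bool open) {
  doorServo.write(open ? 120 : 0);
}

void setup() {
  feederServo.attach(5);
  doorServo.attach(6);
  setDoor(true);            // start with the door open for the day
}

void loop() {
  if (millis() - lastFeedMs >= FEED_INTERVAL_MS) {
    dispenseFeed();
    lastFeedMs = millis();
  }
}
```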

The predator-detecting magic works thanks to DFRobot’s HuskyLens AI camera sensor. Users can train the camera to recognize specific predators, and it will then tell the Arduino when it sees one. That communication occurs over I2C and is easy to set up, removing much of the difficulty of implementing AI.
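
A detection loop along those lines might look like the sketch below, assuming DFRobot’s HUSKYLENS Arduino library and its begin/request/available/read calls. The trained ID and the alert function are placeholders – in the actual coop, the alert goes out as an SMS via Twilio.

```cpp
// Poll the HuskyLens over I2C and raise an alert when a trained
// predator ID shows up in the results.

#include <Wire.h>
#include "HUSKYLENS.h"

HUSKYLENS huskylens;
const int PREDATOR_ID = 1;          // ID assigned when training the camera

void sendPredatorAlert() {
  // Placeholder: the real project triggers a Twilio SMS notification here.
  Serial.println("Predator detected!");
}

void setup() {
  Serial.begin(115200);
  Wire.begin();
  while (!huskylens.begin(Wire)) {
    Serial.println("HuskyLens not found, check I2C wiring...");
    delay(500);
  }
}

void loop() {
  if (huskylens.request()) {                 // ask the camera for its latest results
    while (huskylens.available()) {
      HUSKYLENSResult result = huskylens.read();
      if (result.ID == PREDATOR_ID) {
        sendPredatorAlert();
      }
    }
  }
  delay(200);
}
```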

The post This DIY smart chicken coop features AI-based predator detection appeared first on Arduino Blog.

Technology meets creativity in two interactive art student projects

Art and engineering are not separate concepts. There is a great deal of overlap between the two, and many modern disciplines increasingly blur those lines. Mónica Rikic is an “electronic artist and creative coder” who embodies that idea: you might remember her and her incredible Arduino UNO R4-powered installations from our blog post last year. In addition to her artistic practice, her technology-forward approach inspires her work as an educator, as she helps her master’s students develop hybrid concepts that use microcontrollers, sensors, lights, and a variety of other technologies to create interactive art pieces. The level of creativity that technology can unleash is readily apparent in two of her students’ projects: Flora and Simbioceno.

Flora, created by College of Arts & Design of Barcelona students Judit Castells, Paula Jaime, Daniela Guevara, and Mariana Pachón, is a board game in the form of an interactive art installation. It was inspired by nature, with gameplay occurring throughout a simulated ecosystem. An Arduino UNO R4 WiFi board handles the interactive elements, with additional hardware including NFC readers, motors and accompanying drivers, sensors, pumps, LEDs, and more. 

Simbioceno, by Ander Vallejo Larre, Andrea Galano Toro, Pierantonio Mangia, and Rocío Gomez, also uses an UNO R4 WiFi. It consists of two ecosystems: one aquatic and one aerial-terrestrial. They exist in symbiosis, communicating and sharing resources as necessary. Hardware includes LEDs, pumps, and biofeedback sensors. The students put particular thought into the construction materials, many of which are recycled or biomaterials. 

Both projects are interactive art and expressions of creativity. While they do integrate technology, that technology isn’t the focal point. Instead, the technology helps to bring the two experiences to life.

Feeling inspired by this creative use of the Arduino platform? We hope you’ll develop your own projects and share them with us and the entire community: contact creators@arduino.cc or upload directly to Project Hub! You could be our next Arduino Star.

The post Technology meets creativity in two interactive art student projects appeared first on Arduino Blog.

This perplexing robotic performer operates under the control of three different Arduino boards

Every decade or two, humanity seems to develop a renewed interest in humanoid robots and their potential within our world. Because the practical applications are actually pretty limited (given the high cost), we inevitably begin to consider how those robots might function as entertainment. But Jon Hamilton did more than just wonder: he actually built a robotic performer called Syntaxx, and it will definitely make you feel things.

It is hard to describe this robot without sounding like a Mad Libs game filled out by a cyberpunk-obsessed DJ. Hamilton designed it to give performances, primarily in the form of synthetic singing accompanied by electronic music. It looks like a crude Halloween mask given life by a misguided wizard sometime in the 1980s. It is pretty bonkers and you should probably watch the video of it in action to wrap your head around the concept.

Hamilton needed three different Arduino development boards to bring this robot to life. The first, an Arduino GIGA R1 WiFi, oversees the robot’s operation and handles voice interaction, as well as audio playback. The second, an Arduino Mega 2560, moves the robot’s neck according to input from two microphones (one on the left, the other on the right). The third, an Arduino UNO R4 WiFi, controls the rest of the servo movement.
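
To picture the neck-tracking behavior on the Mega 2560, here is a simplified sketch that compares the levels of a left and a right microphone and pans a neck servo toward the louder side. The pins, threshold, and step size are illustrative assumptions rather than Hamilton’s actual code.

```cpp
// Simplified sound-tracking neck: step a servo toward whichever
// microphone reads the higher level.

#include <Servo.h>

Servo neckServo;
int neckAngle = 90;                      // start facing forward

void setup() {
  neckServo.attach(9);
  neckServo.write(neckAngle);
}

void loop() {
  int leftLevel  = analogRead(A0);       // left microphone level
  int rightLevel = analogRead(A1);       // right microphone level
  int diff = leftLevel - rightLevel;

  // Only move when one side is clearly louder, to avoid jitter.
  if (abs(diff) > 50) {
    neckAngle += (diff > 0) ? 2 : -2;    // step toward the louder microphone
    neckAngle = constrain(neckAngle, 30, 150);
    neckServo.write(neckAngle);
  }
  delay(30);
}
```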

The result is a robot that is both impressive and pretty disconcerting.

The post This perplexing robotic performer operates under the control of three different Arduino boards appeared first on Arduino Blog.
