Mind-bending MIT project uses lasers to generate music from spiderwebs

Spiders may typically have eight eyes, but very few have good eyesight. Instead, they rely on vibrations to navigate and seek out their prey. That’s, in essence, what a spiderweb is: a giant, enormously complicated crisscross of tripwires that can tell a spider exactly when — and where — some delicious bit of food has landed.

As humans, we’re not exactly privy to what that experience would feel like. But Markus Buehler, a Professor of Engineering at the Massachusetts Institute of Technology, has come up with an intriguing way of simulating it — and it involves laser scanning, virtual reality, and the medium of music.

Spider With Prey
Markus J Buehler/MIT

“We have given the silent spiderweb, especially often-overlooked cobwebs, a voice, and shed light on their innate intricate structural complexity,” Buehler told Digital Trends. “[We] made it audible by developing an interactive musical instrument that lets us explore sonically [what] the spiderweb sounds like as it is being built.”

According to this creation, being on a spiderweb sounds a whole lot like an orchestra of wind chimes, scored by John Carpenter. No wonder spiders perpetually seem on edge!

A world of vibrations

Whether it’s Vivaldi’s “Four Seasons” quartet of string concertos or Mozart’s use of the Fibonacci sequence, plenty of musicians have been inspired by nature over the years. But none have turned the sounds of the natural world into music with quite the scientific fidelity of Buehler’s creation. To create his biofidelic soundscape, Buehler and fellow researchers used a laser scanner to record every line of webbing in a spiderweb. Not content with scanning the regular, boring webs of any old spider, they focused their efforts on the extremely complex web of Cyrtophora citricola, also called the tropical tent-web spider.

Using the sheet laser scanner, they captured these webs as a series of images, which an algorithm then reassembled into a three-dimensional computer model containing the exact location of every filament and connection point in the web. The researchers then calculated the “vibrational patterns” of each string in the web, drawing on the physics of string vibration to model resonance. This was a complex job, not just because of the massive number of strands, but because each strand has a different vibrational frequency according to its size and elasticity. Finally, they aggregated these frequencies to reflect the sonic qualities of the entire web.
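The string-physics step described above — assigning each strand a frequency from its length, tension, and mass — can be sketched with the textbook formula for an ideal taut string, f_n = (n / 2L) · √(T / μ). This is a minimal illustration of the principle, not the MIT team’s actual code; the function name and the strand values below are illustrative assumptions:

```python
import math

def strand_frequency(length_m: float, tension_n: float,
                     mu_kg_per_m: float, mode: int = 1) -> float:
    """Frequency (Hz) of the nth vibrational mode of an ideal taut string:
    f_n = (n / 2L) * sqrt(T / mu), where mu is mass per unit length."""
    return (mode / (2.0 * length_m)) * math.sqrt(tension_n / mu_kg_per_m)

# Hypothetical silk strands: (length in m, tension in N, linear density in kg/m).
# Shorter, tauter, lighter strands ring higher than long, slack, heavy ones.
strands = [
    (0.010, 1e-5, 4e-9),   # 1 cm strand -> 2500 Hz
    (0.040, 1e-5, 4e-9),   # same silk, 4x longer -> 625 Hz
]
for length, tension, mu in strands:
    print(f"{length * 100:.0f} cm strand: "
          f"{strand_frequency(length, tension, mu):.0f} Hz")
```

Repeating this calculation across thousands of strands, each with its own length and elasticity, and layering the results is what produces the web-wide soundscape the article describes.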

Thanks to the 3D model, the researchers (or anyone who dons the necessary headset) are able to dip into VR to explore different sections of the web, giving the user a sense of what the audioscape might sound like in each different area. The results are a weird blend of the artistic and the scientific — and Buehler wouldn’t have it any other way.

Spider web sonification: Less busy music, sonification of the porous web along z-axis

“[I’m interested in] pushing the way we create sound and music, by looking to natural phenomena to solicit vibrational patterns for new types of instruments rather than relying on the tradition of ‘harmonic’ tuning like equal temperament,” he said. “We have [so far] done this for proteins and folding, cracks and fractures in materials, and also for spiderwebs. In each case, [we’re] seeking to assess the innate vibrational patterns of these living materials to work out new ways to conceptualize musical structures.”

Spider music

Buehler said that the work is “driven by my long interest to push the boundary of how and why we create music — to use the universality of vibrations in nature as a direct compositional tool.” He noted: “As a composer of experimental and classical and electronic music, my artistic work explores the creation of new forms of musical expression — such as those derived from biological materials and living systems — as a means to better understand the underlying science and mathematics.”

It’s not just about creating unusual electronic music, though. Buehler noted that this work can be useful for students of the natural world who can better understand the geometries behind prey catching in the spider kingdom. It could also be used as a novel way to help design new materials, by applying this same process to help design by sound. “We find that opening up the brain to process more than just the raw data, but using images and sound as creative means, can be powerful in understanding biological methods — and to be creative as an engineer when it comes to out-of-the-box ideas,” he said.

For now, though, it’s enough that someone created a biofidelic spider theme. No, it probably won’t show up in Marvel’s next Spider-Man movie, and it doesn’t have the same relaxing qualities as whale song, but it’s pretty darn neat all the same. Even if it makes the sight of a spider perched on a web, waiting for flies, look a whole lot less peaceful.

Alongside Buehler, other people who contributed to the project included Ian Hattwick, Isabelle Su, Christine Southworth, Evan Ziporyn, and Tomas Saraceno.

Luke Dormehl
I'm a UK-based tech writer covering Cool Tech at Digital Trends. I've also written for Fast Company, Wired, the Guardian…