Fashion, Senses, and Spaces: the Technology That Built the Red Bull Playrooms

“Hey, do you want to come to the Netherlands and build the smart club
of the future?”

“Excuse me?”

“We need someone to lead a research team to work with a fashion designer here and help build wearable sensors for a two night party at the Amsterdam Dance Event. Can you do it?”

“How long has this project been going on?”

“It hasn’t; it starts in three weeks.”

“When’s the event?”

“In about three months.”

So, that’s how it started. Three weeks later in August, I was on a plane to Amsterdam to work with a new team in the Distributed and Interactive Systems Group at Centrum Wiskunde & Informatica (CWI) and was biking across town for meetings at the BYBORRE studio and IBM. The problem was interesting. Red Bull and BYBORRE were building the club of the future at Amsterdam Dance Event in a two-night experiment. The club was to be a set of rooms providing a collection of experiences: Lights, Sound, Taste, Smell, and Touch. Each one brought together specialists to curate that experience. We (CWI) were to build the sixth sense…the unseen entity that connects the senses. Whatever it was we were making, it had to have connected presence (people would know about it), it shouldn’t be a distractor (people shouldn’t get annoyed with it), and it should have meaning (people should care about it) even when powered down.

The device was to be a wearable built to sense the crowd: to see which room was hopping with energy and which room was calm and chill. It was to be a fashion object that could sense people in a club, building a tacit structure of linked data to provide interactions in the club and deliver post-club experiences to over 400 people per night. To do this, we needed to build out a sensor network. We had three months. We couldn’t custom fabricate sensors in time, so we hunted for existing chips.


We knew we needed to localize the sensors to rooms to see if people were in a dance hall or at a signature bar. We also knew we needed to measure each individual’s activity level (we called this their energy level); unlike the traditional activity tracking your phone might do (like walking or stair climbing), we needed to see if you were idle/standing, walking around, or dancing. Activity recognition typically uses accelerometers, sometimes paired with gyroscopes, so there was our first requirement. To find where a sensor is, there are a few tricks to measure signal strength depending on whether you’re using BLE, ZigBee, 6LoWPAN, or what have you. We looked at over a dozen off-the-shelf sensor solutions and two BLE sensors stood out: Estimote Beacons and the TI SensorTag 2650.
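To give a feel for the idle/walking/dancing distinction, here’s a minimal sketch of bucketing activity by accelerometer variance. This is an illustration, not our actual model; the window values, thresholds, and labels are all assumptions for the example:

```python
from statistics import pvariance

def energy_level(accel_magnitudes, idle_thresh=0.02, dance_thresh=0.5):
    """Bucket a window of accelerometer magnitudes (in g) into a coarse
    activity level. Thresholds here are illustrative, not tuned values."""
    v = pvariance(accel_magnitudes)
    if v < idle_thresh:
        return "idle"
    elif v < dance_thresh:
        return "walking"
    return "dancing"

# A calm window vs. a jittery one (synthetic data):
print(energy_level([1.0, 1.01, 0.99, 1.0]))     # low variance -> "idle"
print(energy_level([0.2, 1.8, 0.1, 2.0, 0.3]))  # high variance -> "dancing"
```

In practice a learned model beats fixed thresholds, but the intuition is the same: still wrists produce flat signals, dancing wrists do not.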

Estimotes come in a few flavors; we chose the Estimote Stickers for the project. These are neat, little, extremely affordable sensors (roughly $10 each). Each is a small BLE beacon that transmits accelerometer data and ambient temperature, and the battery lasts forever (well, Estimote estimates a year, but we only needed them to last two weeks). They normally come in a sticky silicone case; however, Estimote was kind enough to send us a box of just the circuit boards. We had 900 of these little chips powered and ready to go. In many ways, outside of its silicone casing, it’s a sweet sensor for embedded wearable tech. It’s lightweight, tiny, could be wrapped to make it more water resistant, and has a replaceable battery. By default, it sends out a BLE advertisement every 1.25 seconds when in motion and every 2.5 seconds when at rest (you can reprogram this, but we left it at the default).
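Those default intervals add up fast. As a back-of-envelope sketch (the 6-hour night and the 50/50 motion/rest split are assumptions, not event figures), the per-beacon packet volume looks like this:

```python
def packets_per_night(hours, motion_fraction,
                      motion_interval=1.25, rest_interval=2.5):
    """Estimate BLE advertisements one beacon emits in a night,
    given the Estimote default intervals (1.25 s moving, 2.5 s at rest)."""
    secs = hours * 3600
    return int(secs * motion_fraction / motion_interval
               + secs * (1 - motion_fraction) / rest_interval)

# A hypothetical 6-hour night, guests moving about half the time:
per_beacon = packets_per_night(6, 0.5)
print(per_beacon)        # 12960 advertisements from one beacon
print(per_beacon * 500)  # 6480000 across ~500 beacons, before reader overlap
```

With multiple readers hearing each advertisement, the raw count multiplies further, which is why the downstream pipeline needed de-duplication.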

We used the Texas Instruments SensorTag 2650 for a more sophisticated interaction: flashing lights. The SensorTag can be flashed with BLE, ZigBee, or 6LoWPAN firmware; it has 10 sensors (light, digital microphone, magnetic sensor, humidity, pressure, accelerometer, gyroscope, magnetometer, object temperature, and ambient temperature) and can talk to many cloud interfaces (like IBM’s BlueMix). The SensorTag is open source and open hardware; this is what excited me the most. We were able to program it to behave much like the Estimote (embedding the accelerometer and temperature data in the Bluetooth advertisement) as well as overload the JTAG pins for GPIO (to light up a DotStar strip through a library we ported to the SensorTag): Open Source Hardware FTW! It’s an amazing chip and ecosystem that comes in at about 3x the cost of the Estimote with a less impressive battery life. We made only 100 of these devices.


All 1,000 devices had to be registered. This meant having a program scan for BLE devices, inserting a battery, registering the device identifier, packing that device in a numbered envelope, and then sending them all to the fashion studio for embedding into wristbands. The wristbands themselves were designed and made at BYBORRE’s studio in Amsterdam. The Estimote bands were made entirely from a knit fabric, while the SensorTag bands combined a knit piece with a more luxurious leather band. I’m not going to go into the details of this process, but you should read more about Borre in Wired.
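The registration bookkeeping itself is simple. Here’s a minimal sketch of the envelope-numbering step; the list of scanned IDs stands in for a live BLE scan, and the CSV layout is an assumption, not our actual tooling:

```python
import csv

def register_devices(scanned_ids, out_path="registry.csv"):
    """Assign each powered-up device a sequential envelope number and
    persist the mapping. scanned_ids stands in for a live BLE scan."""
    registry = {}
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["envelope", "device_id"])
        for envelope, device_id in enumerate(scanned_ids, start=1):
            registry[device_id] = envelope
            writer.writerow([envelope, device_id])
    return registry

reg = register_devices(["d4:3a:01", "d4:3a:02", "d4:3a:03"])
print(reg["d4:3a:02"])  # 2
```

The real work was physical: a thousand batteries, a thousand envelopes, one scan at a time.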

Software Architecture

With the chips determined and the BYBORRE studio starting to design the textiles and wristbands for the various sensor chips, we turned our attention to the code. Our run was to be about 500 Estimotes and 50 SensorTags a night. Building this out without the ability to test onsite was the major obstacle for the hardware.

From a software perspective, we had a few challenges. First, we had a rough blueprint of the club space, but the space wouldn’t be ready for setup until the night before the event. Further, despite what you might think, there was no club-dancing dataset on which to train a neural net to classify activity behavior.

Also, most activity trackers sample data at 30–120 Hz; our chips broadcast at 0.8 Hz. While this is great for power consumption, training an accurate model would take some additional consideration. Finally, most BLE localization methods place the iBeacons in fixed locations and have you navigate with another device, like your phone. We were putting 500 iBeacons on moving, dancing people and had about 30 Raspberry Pis around the space to read the broadcast data. So the problem, as we designed it, was reversed.
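With the beacons moving and the readers fixed, a first-order room assignment reduces to asking which reader hears a beacon loudest over a short window. This sketch uses made-up RSSI values and room names, and ignores the smoothing and modeling a real deployment needs:

```python
from collections import defaultdict

def locate(readings):
    """readings: (beacon_id, room, rssi_dbm) tuples heard in one window.
    Assign each beacon to the room whose readers report the highest
    mean RSSI (closer usually means stronger)."""
    sums = defaultdict(lambda: [0, 0])  # (beacon, room) -> [total, count]
    for beacon, room, rssi in readings:
        s = sums[(beacon, room)]
        s[0] += rssi
        s[1] += 1
    best = {}
    for (beacon, room), (total, count) in sums.items():
        mean = total / count
        if beacon not in best or mean > best[beacon][1]:
            best[beacon] = (room, mean)
    return {b: room for b, (room, _) in best.items()}

window = [("aa:01", "dance_hall", -62), ("aa:01", "bar", -80),
          ("aa:01", "dance_hall", -58), ("aa:02", "bar", -55)]
print(locate(window))  # {'aa:01': 'dance_hall', 'aa:02': 'bar'}
```

RSSI in a crowded club is noisy (bodies attenuate 2.4 GHz signals), so strongest-reader voting is only the starting point.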

In the end, we built out the club space with several Pis reading about 550 sensors per night. This was fed via websocket to a central server, which de-duplicated the feed and sent another websocket to a second service containing our CNN for energy-level and location sensing. The output went to a real-time projected visualization via RESTful JSON and was also sent to the Overdrive Room (an audio and light installation) via OSC. The energy in the Overdrive Room was inversely correlated with that of the club. If you’re a big fan of block diagrams, this is what the overall system looked like:
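Since several Pis could hear the same advertisement, the central server had to drop repeats before the feed went downstream. One simple approach is to key on beacon ID within a short time window; this sketch assumes that scheme rather than reflecting our exact server code:

```python
def dedupe(packets, window=1.0):
    """Drop packets already seen from the same beacon within `window`
    seconds. packets: iterable of (beacon_id, timestamp, payload)."""
    last_seen = {}
    unique = []
    for beacon, ts, payload in packets:
        if beacon not in last_seen or ts - last_seen[beacon] >= window:
            last_seen[beacon] = ts
            unique.append((beacon, ts, payload))
    return unique

stream = [("aa:01", 0.0, "x"), ("aa:01", 0.2, "x"),  # two readers, one packet
          ("aa:01", 1.3, "y"), ("aa:02", 0.1, "z")]
print(len(dedupe(stream)))  # 3
```

The trade-off in any time-window scheme is that the window must be shorter than the beacon’s advertising interval, or genuine packets get dropped along with the duplicates.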

The projected visualization from our data was made by a Dutch company called Clever°Franke. After the event, each guest received an email with a personal visualization of their night as well as some details on the songs they danced the most to. A few special guests received a physical print of their night’s event artwork as well as a scarf that represented the evening’s activity in its 3D knit.

Artwork (left) and Energy Visualization (right) with DJ Names and track listing. Visualization by Clever°Franke.
A scarf that contains the visualization of the evening’s overall activity.

Designing the Future of Textiles and Technology

It was quite the three-month adventure: starting from scratch to build out a technology-enhanced club with a fashion designer and an energy drink company. In total, we collected some 50 million BLE packets across the two nights and are working on formal publications in HCI, data science, and AI. Our goal was to have the technology connect the senses throughout the curated club. With experts in audio, visuals, perfume, cooking, and music coming together for two nights, can technology further enhance the environment and experience? Science fiction author Karl Schroeder theorized about such a connection with the environment in the Hieroglyph Project.

It’s called sensory substitution. You can replace one sense with another…Gives you a new sense. It works best if you don’t consciously think about it. Just let it wash over you.

The future of wearable fashion experiences will be connected, unobtrusive, and aesthetic. We’ve taken a liking to the word ‘connected’ as meaning ‘I’ve accepted a friend request.’ However, IRL, sharing an experience binds us; we should design for experience sharing, be it fast or slow in pace. With many technology experiences acting as distractors, we have to design for unobtrusiveness, especially for wearables: they are always with us, strapped to our bodies, and often hard to readily put down or take off. We also have to design for the things we aesthetically care about. Here’s a tricky one: wearable objects need to be designed with personal aesthetics in mind. In the end, if you don’t love it for what it visibly is, you won’t wear it or value it. We found this important even for a single evening’s event; it’s more critical for the things we’d use every day, which are the experiences we aim to build out in the future.

Thanks to Demosthenis Katsouris and Julian Burford for the block diagram, Samuel de Goede for the Schroeder reference, DIS @ CWI, Clever°Franke, BYBORRE, Red Bull, and all the others involved. All photos are CC-BY ayman (that’s me!) on Flickr. Some professional photos from the event are also on Flickr, courtesy of Red Bull.
