Flashback Friday: The Kinetic Pavilion

I can’t believe it’s already five years ago that Elise Elsacker and I developed The Kinetic Pavilion. I’m really proud to say that the concept behind the project still stands today. We urgently need architects to become more involved in the Smart City discussion. The Kinetic Pavilion is a parametric structure able to respond to its environment and, more importantly, to user-generated inputs.


The Kinetic Pavilion is a research project for the development of a new kind of pavilion, one capable of acting upon changing weather conditions, human movement or human moods and mindsets. Its shape depends on environmental conditions and on parameters extracted from the pavilion’s surroundings. Just like a living organism, this new prototype changes itself when those parameters take on other values. The scale model (90×120 cm) was built on a rational grid, which allowed us to test the following parameters:

1. Weather data: a different reaction pattern in cold and warm climates.
  • In cold environments: The pavilion picks up the solar alignment and tries to catch as much solar irradiation (heat gain) as possible. Places with higher irradiation levels result in a height difference in the pavilion’s roof structure, so it expands and attracts more heat.
  • In warm environments: The pavilion takes on an aerodynamic shape and the available winds cool it down. Places with high irradiation levels trigger a change in the roof structure that creates shadow spots (a rough sketch of this mapping follows the list below). For the time being we’re using Autodesk’s Ecotect to translate weather data onto the pavilion.
2. Human movement.
  • Movement in space: This is also an architecture that reacts to movement. The roof structure responds to the dynamic movements of the people using the pavilion, creating a dialogue between the user, the architecture and the perception of the space they find themselves in. To illustrate this idea, we’re using an iPad to recreate human movements by touching and sliding over the screen. The finger’s coordinates are sent through OSCtouch to Grasshopper, which controls the height data of the pavilion (a stand-in for this chain is sketched after the ‘For the nerds’ list below).
  • Body movements: Webcams can process movement data into preset shape patterns of the pavilion; dancing people, for example, can trigger the pavilion to react to their dynamic movements. We used a Microsoft Kinect for the duration of our research.

3. Human behaviour and interactions.

Even in 2011 we couldn’t look past the steep rise of social media use in our society. Never has it been easier to gain access to a general feeling or mood. We’re capable of filtering Twitter feeds and picking up trends. For example: when a certain number of pavilion users tweet ‘party’ or a synonym, the pavilion starts moving actively. On the other hand, when users tweet messages containing words like ‘tired’, ‘lazy’, ‘sleepy’, etc., the pavilion starts swaying gently. This manner of interaction stimulates social cohesion and interaction between the people using the pavilion and the pavilion itself. In the future we hope to add deep learning capabilities to the pavilion.
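
The keyword-to-motion logic from the paragraph above can be sketched in a few lines of Python. This is a minimal sketch under assumptions: it starts from an already-collected list of message strings (access to the actual Twitter feed isn’t shown), and the keyword lists, thresholds and motion parameters are illustrative rather than the ones used in the project.

```python
# Minimal sketch: map recent pavilion-related messages to a motion mode.
# The keyword lists, threshold and motion parameters are illustrative only;
# fetching the actual Twitter feed is not shown here.
import math

PARTY_WORDS = {"party", "dance", "celebrate"}
SLEEPY_WORDS = {"tired", "lazy", "sleepy"}

def classify_mood(messages):
    """Count keyword hits and return 'party', 'sleepy' or 'neutral'."""
    party = sum(any(w in m.lower() for w in PARTY_WORDS) for m in messages)
    sleepy = sum(any(w in m.lower() for w in SLEEPY_WORDS) for m in messages)
    if party > sleepy and party >= 3:        # threshold is arbitrary
        return "party"
    if sleepy > party and sleepy >= 3:
        return "sleepy"
    return "neutral"

def roof_height(mode, node_index, t, base=30.0):
    """Height (mm) of one roof node at time t for the chosen mode."""
    if mode == "party":     # fast, high-amplitude motion
        return base + 25.0 * math.sin(4.0 * t + node_index)
    if mode == "sleepy":    # slow, swaying motion
        return base + 10.0 * math.sin(0.5 * t + 0.2 * node_index)
    return base             # neutral: roof stays flat

messages = ["so tired today", "sleepy afternoon", "lazy sunday", "nice pavilion"]
mode = classify_mood(messages)
print(mode, [round(roof_height(mode, i, t=1.0), 1) for i in range(5)])
```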
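
For the first parameter above (weather data), the irradiation-to-height mapping can be sketched in the same spirit. Again a rough sketch, not the actual Grasshopper/geco definition: in the real setup the per-node irradiation values come from Ecotect, and the travel range used here is a placeholder.

```python
# Rough sketch of the irradiation-to-height mapping described under point 1.
# In the real setup the per-node irradiation values come from Ecotect via geco;
# here they are just floats in W/m^2, and the travel range is a placeholder.

MIN_HEIGHT = 0.0    # mm, lowest roof-node position
MAX_HEIGHT = 60.0   # mm, highest roof-node position

def heights_from_irradiation(irradiation, climate):
    """Map per-node irradiation (W/m^2) to roof-node heights (mm)."""
    lo, hi = min(irradiation), max(irradiation)
    span = (hi - lo) or 1.0
    heights = []
    for value in irradiation:
        t = (value - lo) / span               # 0..1, relative irradiation
        if climate == "cold":
            # expand where irradiation is high, to catch more heat gains
            h = MIN_HEIGHT + t * (MAX_HEIGHT - MIN_HEIGHT)
        else:
            # warm climate: drop the roof where irradiation is high, to cast shade
            h = MAX_HEIGHT - t * (MAX_HEIGHT - MIN_HEIGHT)
        heights.append(h)
    return heights

nodes = [420.0, 610.0, 505.0, 700.0]          # sample irradiation per grid node
print(heights_from_irradiation(nodes, "cold"))
print(heights_from_irradiation(nodes, "warm"))
```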

For the nerds:
1. Input: iPad, Ecotect, Processing (sine function, webcam, Twitter feeds, …)
2. Process:
a) Input parameters are sent through OSCTouch, gHowl, geco (GH2Ecotect) and UDP to Grasshopper.
b) Different kinds of data are translated into height coordinates.
c) These processed coordinates are sent through Firefly to an Arduino board.
3. Output:
a) Arduino controls 28 servos
b) Spur gears translate these coordinates into a vertical movement, controlling the roof structure.
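
To make that chain a bit more concrete, here is a minimal Python stand-in for the processing and output steps. It is a sketch under assumptions, not the original Grasshopper definition: python-osc takes the place of OSCTouch/gHowl on the receiving end, pyserial takes the place of Firefly for talking to the Arduino, and the grid layout, travel range, spur-gear radius, OSC address and serial port are all made-up values.

```python
# Stand-in for the chain described above: receive finger coordinates over OSC,
# turn them into roof-node heights on the grid, convert those heights to servo
# angles via the spur-gear radius, and send the angles to the Arduino over serial.
# python-osc and pyserial replace gHowl and Firefly here; grid layout, travel
# range, gear radius, OSC address and serial port are illustrative assumptions.
import math
import serial                                   # pip install pyserial
from pythonosc.dispatcher import Dispatcher     # pip install python-osc
from pythonosc.osc_server import BlockingOSCUDPServer

COLS, ROWS = 7, 4                               # 7 x 4 grid = 28 actuated nodes
MIN_H, MAX_H = 0.0, 60.0                        # vertical travel per node, in mm
GEAR_RADIUS = 20.0                              # mm, spur-gear pitch radius (assumed)

heights = [MIN_H] * (COLS * ROWS)               # current target height per node
arduino = serial.Serial("/dev/ttyUSB0", 115200, timeout=1)

def height_to_angle(h):
    """Vertical travel -> servo rotation in degrees (arc length / pitch radius)."""
    return math.degrees((h - MIN_H) / GEAR_RADIUS)   # 60 mm / 20 mm ≈ 172°

def send_to_arduino():
    """One comma-separated line of angles, parsed by a small Arduino sketch."""
    line = ",".join(f"{height_to_angle(h):.1f}" for h in heights)
    arduino.write((line + "\n").encode())

def on_touch(address, x, y):
    """Handle /pavilion/touch messages carrying normalized finger coordinates."""
    tx, ty = x * (COLS - 1), y * (ROWS - 1)     # map 0..1 coordinates onto the grid
    for r in range(ROWS):
        for c in range(COLS):
            d = math.hypot(c - tx, r - ty)
            lift = max(0.0, 1.0 - d / 2.0)      # full lift at the finger, falloff over two cells
            heights[r * COLS + c] = MIN_H + lift * (MAX_H - MIN_H)
    send_to_arduino()

dispatcher = Dispatcher()
dispatcher.map("/pavilion/touch", on_touch)     # assumed OSC address pattern
BlockingOSCUDPServer(("0.0.0.0", 8000), dispatcher).serve_forever()
```

On the Arduino side a small sketch would parse the comma-separated angles and write them to the 28 servos; in the original setup Firefly handled that link directly.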
Got some ideas on how to enhance our Pavilion? Do drop a comment below!
