Category: Design process


control and looseness

Christian Korab commented on the panopticon photographic tool:

“It’s a tool that operates in a heuristic manner with stochastic information, so it’s not so much what the ego makes with the tool as it is what the tool does to guide the ego in the design process. It’s a tool that needs to be held loosely, in keeping with the nature of peripheral vision.”

This semester I taught two sections of Design Fundamentals 1, an undergraduate class. Every week I asked the students to allow their chosen material and tool to inform their design process. I asked them to strike a balance between control and looseness in their work so they would design an object that was neither forced nor a product of their preconceived idea of what it should be. Motion Lapse uses a tool that gives designers control of their bodily choices while the tool itself documents the stochasticity of the exercise.

numbers

  • 161 days: my camera was suspended above the Rapson Hall courtyard from November 21, 2010 to May 1, 2011
  • 150,293 photographs recorded
  • 385 multi-frame, auto-blended photographs
  • 225 GB of photographs produced
  • 2 backups

10.5 mm lens coverage on a Nikon D200

I just un-warped one of the photographs from my camera. I am surprised to see how much coverage a 10.5mm lens has. Because my Nikon D200 does not have a full-frame sensor, the lens behaves like a 15mm lens.

A 15mm lens has about a 114-degree field of view, while the human eye covers roughly 180 degrees. The star-like splay the 15mm lens sees would be even more exaggerated across the eye's wider field of view.
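A quick back-of-the-envelope check of that arithmetic, using my own assumptions: a DX crop factor of about 1.5, a full-frame diagonal of about 43.3 mm, and the simple rectilinear angle-of-view formula (which the fisheye projection does not literally obey, it covers a much wider angle before being un-warped):

    f_{\mathrm{eq}} \approx 10.5\,\mathrm{mm} \times 1.5 \approx 15.75\,\mathrm{mm}

    \theta_{\mathrm{diag}} = 2\arctan\!\left(\frac{43.3\,\mathrm{mm}}{2 \times 15\,\mathrm{mm}}\right) \approx 110^{\circ}

That lands in the same range as the 114-degree figure above.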

10.5mm or 15mm lens coverage

I have made a lot of mistakes in the last four days and learned valuable lessons about AgiSoft StereoScan, the free program I am using to convert stereo photographs into 3D meshes. I noticed a problem: when I blended the photos in Photoshop first, the floor often melded with the material and StereoScan treated it as a floor plane. Download the 3D model.

StereoScan lessons

After a lot of trial and error, I found a way to blend multiple photographs and build a 3D model in StereoScan. I ended up manually selecting and deleting everything except the illuminated material in each layer of the sequence. Then StereoScan could map the volumes of the multi-frame packet.
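I made those selections by hand, but as a rough sketch of how that step could be approximated automatically, something like the following keeps only the bright pixels in a dark frame. This is an illustration under my own assumptions: it uses OpenCV (not part of my actual workflow) and the threshold value would need tuning for the real photographs.

    // Rough sketch: isolate the illuminated material in a dark frame by
    // keeping only pixels brighter than a threshold (OpenCV assumed).
    #include <opencv2/opencv.hpp>

    int main(int argc, char** argv) {
        if (argc < 3) return 1;                       // usage: isolate <in> <out>
        cv::Mat frame = cv::imread(argv[1], cv::IMREAD_COLOR);
        if (frame.empty()) return 1;

        // Build a mask of bright pixels from the grayscale version of the frame.
        cv::Mat gray, mask;
        cv::cvtColor(frame, gray, cv::COLOR_BGR2GRAY);
        cv::threshold(gray, mask, 60, 255, cv::THRESH_BINARY);

        // Copy only the masked (illuminated) pixels onto a black background.
        cv::Mat isolated = cv::Mat::zeros(frame.size(), frame.type());
        frame.copyTo(isolated, mask);
        cv::imwrite(argv[2], isolated);
        return 0;
    }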

StereoScan volumes

This is the method for converting stereo pairs of photographs into a 3D model:

  • Record synchronized stereo photographs.
  • Manually delete everything in the frame except the illuminated material.
  • Load the blended left and right images into StereoScan.
  • Create a 3D model.

The models can, of course, be imported into other programs. I used Rhino with V-Ray to make a quick render of one of the simplified surfaces.

Rhino with V-Ray render of original form

Rhino with V-Ray render of rebuilt surface

AgiSoft StereoScan is a free program that can convert a stereo pair of photographs into a 3D model. I am exploring stereo photogrammetry as a way to convert ephemeral models into digital 3D surfaces. This is the first test. The model can be orbited by clicking and dragging the surface. Download a PDF of the stereo photogrammetry test. The newest version of Adobe Reader is required to view the 3D surface.
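For reference, this is the textbook relation behind stereo photogrammetry (it says nothing about StereoScan's internals, which I have not examined): for a rectified stereo pair with baseline B between the two camera positions, focal length f in pixels, and disparity d between a matched point in the left and right images, the depth of that point is roughly

    Z \approx \frac{f\,B}{d}

so points whose matched pixels sit far apart in the two frames are close to the cameras, and points with small disparity are far away.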

untangling thoughts

Throughout this project I have struggled to explain my ideas. This is another attempt to clarify this project, its relevance to the field of architecture and its assumptions.

second review interim report

“…it is the possibility of shifting our attention from the object to the experience of the object and in so doing reconceptualizing architectural design as the design of architectural experiences.”

Dr. Julio Bermudez in Visual Architectural Experiences

Dr. Bermudez makes a strong case for digital, virtual environments as a tool that lets designers design architectural experiences instead of architectural objects. He argued, in 1994, that traditional methods of architectural representation fail to adequately capture temporal phenomena; time, in particular, has been difficult to represent.

“… there remains the fact that the nature of our media and techniques of representation have generated and supported a structural weakness in how we deal with the phenomenology of architectural orders”

Dr. Julio Bermudez in Visual Architectural Experiences

He argued for the use of immersive, 3D digital environments as a tool to design experiences. Seventeen years later, a plethora of 3D virtual reality tools is available to designers. I am in no way an expert on virtual reality environments, but I have used one such laboratory. The tool helped me experience my design with my body. Still, I was viewing and experiencing the design within the constraints of the environment. I wore a headband that placed two screens in front of my eyes. Once my eyes adjusted to their resolution, I could trick my brain into seeing past the fact that they were just two screens. I was limited by the room as to where I could walk. The lab worked well, but I was experiencing my design through the filter of the lab.

Ephemeral models allow us to quickly and easily model experiences in real time, with real occupants under controlled circumstances. The blended photograph is the visual remnant of the experience of ephemeral modeling.

I’ve summarized my current thoughts on the history of similar photographic methods by Etienne-Jules Marey, Eadweard Muybridge, László Moholy-Nagy and Annie Halliday. Included are ephemeral models from the most recent modeling session with Elizabeth Turner and some tests of how to transform this information further.

Process, research and tests

from camera to model

I have been making models out of illuminated materials, time and blended photography as a test of a photographic tool built to understand an architectural space. I thought of the idea for this tool last summer but didn't know how to build it. I spoke about it with my dad and he built it; none of this would have been possible without his help. The tool consists of a digital camera with a fisheye lens, a motion sensor and an Arduino.

Nikon D200 motion lapse camera set up

The camera is set to respond to motion. If the sensor detects motion, it signals the Arduino to run the following program:

  1. pause one second
  2. trigger the shutter and make one photograph
  3. pause three seconds
  4. trigger the shutter and make one photograph
  5. pause three seconds
  6. trigger the shutter and make one photograph
  7. pause three seconds
  8. trigger the shutter and make one photograph
  9. pause three seconds
  10. trigger the shutter and make one photograph

If the sensor still senses motion, the program repeats; if not, the camera stops taking photographs. The camera has been running since November 21, 2010. Between then and March 1, 2011, the camera was only triggered between 7:00 a.m. and 7:00 p.m., and the sensor was located on a column about 25 feet from the camera. On March 1, 2011 I removed the time restriction and moved the sensor closer to the camera.
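For reference, a minimal Arduino sketch of that burst logic would look something like this. The pin numbers and the shutter-release wiring (a short HIGH pulse on a remote-release line) are my assumptions for illustration, not documentation of the rig my dad actually built.

    // Minimal sketch of the motion-triggered burst described above.
    // Assumed wiring: motion sensor output on pin 2, shutter release
    // driven by a short HIGH pulse on pin 8 (e.g. through an optocoupler).

    const int MOTION_PIN  = 2;
    const int SHUTTER_PIN = 8;

    void triggerShutter() {
      // Pulse the release line long enough for the camera to register it.
      digitalWrite(SHUTTER_PIN, HIGH);
      delay(200);
      digitalWrite(SHUTTER_PIN, LOW);
    }

    void setup() {
      pinMode(MOTION_PIN, INPUT);
      pinMode(SHUTTER_PIN, OUTPUT);
      digitalWrite(SHUTTER_PIN, LOW);
    }

    void loop() {
      if (digitalRead(MOTION_PIN) == HIGH) {
        delay(1000);                  // pause one second
        for (int i = 0; i < 5; i++) {
          triggerShutter();           // trigger the shutter and make one photograph
          if (i < 4) delay(3000);     // pause three seconds between frames
        }
        // loop() runs again immediately; if motion is still detected,
        // the ten-step program above simply repeats.
      }
    }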

The result of this experiment has been over 100,000 photographs of the same view. Until March 13, 2011, I was observing the activity the camera captured. I used the Auto-Blend function in Adobe Photoshop to merge multiple photographs together. Finding interesting packets of photographs, loading them into layers in Photoshop and auto-blending them resulted in blends of activity.

On March 23, 2011 I decided to begin using light and time to explore ideas of space and transparency. The camera is in shutter-priority mode, set to 1/15th of a second, which means that after dark the photographs are almost completely black. If there is a light on in the space below the camera, it contrasts strongly with the surrounding area. I began by collecting light tools and moving them around the space. Knowing that I would blend the photographs together later helped me understand how to make models with simple illuminated materials and time.
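Photoshop's Auto-Blend is its own algorithm, but for these mostly dark, light-painted frames a simple per-pixel lighten (maximum) blend gives a similar accumulation of the illuminated material. Here is a sketch of that stand-in, again assuming OpenCV, taking the input frames and an output path on the command line:

    // Lighten (per-pixel maximum) blend of a packet of frames: a rough
    // stand-in for Photoshop's Auto-Blend when the frames are mostly dark.
    #include <opencv2/opencv.hpp>

    int main(int argc, char** argv) {
        if (argc < 4) return 1;                 // usage: blend <in1> <in2> [...] <out>
        cv::Mat blend = cv::imread(argv[1], cv::IMREAD_COLOR);
        if (blend.empty()) return 1;

        for (int i = 2; i < argc - 1; i++) {
            cv::Mat frame = cv::imread(argv[i], cv::IMREAD_COLOR);
            if (frame.empty() || frame.size() != blend.size()) continue;
            cv::max(blend, frame, blend);       // keep the brighter pixel at each position
        }
        cv::imwrite(argv[argc - 1], blend);     // last argument is the output path
        return 0;
    }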

individual frames

blended photograph

inverted, blended model
