
Site Research: The Former Eglise de Saint Joseph

11/17/2015

 
Eglise de Saint Joseph, Photos by: Nima Navab
“We live from our bodies: from because human perception and action extend outward from the materiality of the flesh and organ and bone to items and sources of sensation that exist outside the boundaries of the skin. From because the body is the center of human experience... Our relationship to all else is structured from the position, location and attributes of our bodies.”

- Franck, From the Body, in “Architecture from the Inside Out” 46

The Former ‘Eglise de Saint Joseph’

In collaboration with Concordia’s Topological Media Lab (TML) and McGill’s Facility for Architectural Research in Media & Mediation (FARMM), I began working on project Sinter at the beginning of this semester. It is through this connection that I have gained access to the former church of Saint Joseph at the corner of Richmond and Notre-Dame in Little Burgundy. Upon my first visit it became clear that this church, currently under construction, exposes a multitude of narratives, which makes it an ideal subject for exploration.
Eglise de Saint Joseph, Main Interior, Photo by: Nima Navab
“The energetic effect of color affects our entire organism. It influences physical procedures. It also affects our psyche, our feelings, thought processes, and emotions. Through holistic associations and parallel sensations within our sensory organization, colors stimulate not only the sense of sight, but also other sensory organs. The intensity of color stimuli and the entire context in which they are perceived play a significant role... A certain color impression not only evokes a momentary visual sensation, but also involves our entire experience, memory, and thought processes.”
- Mahnke, Meerwein, Rodeck, “Colour: Communication in Architectural Space” 23
Eglise de Saint Joseph, Side Room, Photo by: Nima Navab
“Spatial construction expresses [the] desire in the projection of the body into space, in the enactment of the interaction of the body with structure, and in the dialectic within structure.”
- Hendrix, “Architectural Forms and Philosophical Structures” 229

“The concept of kinesthesia refers to a body which is sentient and which moves and engages with the world through a form of corporeal consciousness. In other words, perception (of the world) is not cognitive, whereby thinking is separated from the body and located within the mind, but rather occurs through a ‘thinking’ body, which is seen to have particular kinds of intelligences and competences.”

-Blackman, “The Key Concepts: The Body,” 84
Research Creation
The intention behind this research creation is first and foremost to capture the intra-actions and the experience (for example, the entanglement with light and colors) and to translate their significance (the story-so-far) into my creation piece. If the outcome can embody all the research within an audio/visual sensory intervention, then great; if not, I will create a piece that serves as a conceptual basis and as a start for the final project, which I can later implement with TML. For now I will continue exploring various entanglements based on my journal from visits to the site and begin to further narrow down my area of research.

Ideally I would like to use non-intrusive sensing mechanisms, such as light sensors, pickup mics and camera tracking, to capture some phenomenological characteristics of the space and of how users experience it, and to gather usable data that I can animate back into the space through the medium of light, sound, projection, etc., burning these experiences back into the space like shadows leaving traces.
Eglise de Saint Joseph, Sinter, Image Source: http://farmmresearch.com
The significance of the map: first, it gives an understanding of the urban layout of the church and its neighborhood close to the time of its creation. Second, through color mapping we get an overall sense of the materiality of the site: red: brick | yellow: wood | blue: stone, etc.
1912 Map of the church with legend, Fire Insurance Maps of Montreal, Image Source: http://www.banq.qc.ca/

Sensor Research

10/1/2015

 

AHRS (attitude and heading reference systems) & IMUs (inertial measurement units), consisting of sensors on three axes that provide attitude information: (1) gyroscope, (2) accelerometer, (3) magnetometer

Other interesting sensors: tilt, proximity, kinect, camera, mic, contact mic, pressure sensor, light sensor, heat sensor, humidity sensor, water sensor
Triple Axis Accelerometer Breakout - ADXL345 (click for datasheet)
Triple-Axis Digital-Output Gyro ITG-3200 Breakout (click for datasheet)
9 Degrees of Freedom IMU Breakout - LSM9DS0 (click for data sheet)

Accelerometer, Gyro & IMU Sensors (x-OSC Board User Manual)

x-OSC Board (source)

x-OSC is a wireless I/O board that gives just about any software access to 32 high-performance analogue/digital channels and on-board sensors (gyroscope, accelerometer, magnetometer) via OSC messages over WiFi. There is no user-programmable firmware and no software or drivers to install, making x-OSC immediately compatible with any WiFi-enabled platform. All internal settings can be adjusted using any web browser.

I/O channels:
  • 16× analogue/digital inputs
  • 16× digital/PWM outputs (up to 50 mA per channel)
  • 13-bit ADC with 400 Hz update rate per channel
  • Up to 16-bit PWM resolution for 5 Hz to 250 kHz
  • Control up to 400 RGB LEDs (NeoPixel)
  • 4× serial communication channels

On-board sensors:
  • Gyroscope (±2000°/s), accelerometer (±16 g) and magnetometer
  • 400 Hz update rate

Networking:
  • High-performance WiFi (802.11b/g, 54 Mbps)
  • Supports ad-hoc and infrastructure networks
  • Fully configurable by web browser

Other features:
  • Regulated 3.3 V output
  • Battery level monitor
  • Size: 45 × 32 × 10 mm

OSC Open Sound Control (source)

Open Sound Control (OSC) is a protocol for communication among computers, sound synthesizers, and other multimedia devices that is optimized for modern networking technology. Bringing the benefits of modern networking technology to the world of electronic musical instruments, OSC's advantages include interoperability, accuracy, flexibility, and enhanced organization and documentation.

This simple yet powerful protocol provides everything needed for real-time control of sound and other media processing while remaining flexible and easy to implement.

Features:
  • Open-ended, dynamic, URL-style symbolic naming scheme
  • Symbolic and high-resolution numeric argument data
  • Pattern matching language to specify multiple recipients of a single message
  • High resolution time tags
  • "Bundles" of messages whose effects must occur simultaneously
  • Query system to dynamically find out the capabilities of an OSC server and get documentation

There are dozens of implementations of OSC, including real-time sound and media processing environments, web interactivity tools, software synthesizers, a large variety of programming languages, and hardware devices for sensor measurement. OSC has achieved wide use in fields including computer-based new interfaces for musical expression, wide-area and local-area networked distributed music systems, inter-process communication, and even within a single application.
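The wire format behind these features is compact: an address pattern string, a type-tag string, and big-endian arguments, each padded to a 4-byte boundary. A minimal stdlib-only sketch of encoding and decoding an all-float message; the address `/imu/accel` is an illustrative example, not a documented x-OSC address:

```python
import struct

def _pad(b: bytes) -> bytes:
    """Null-terminate and pad a byte string to a 4-byte boundary, per the OSC spec."""
    b += b"\x00"
    return b + b"\x00" * (-len(b) % 4)

def encode_osc(address: str, *floats: float) -> bytes:
    """Encode an OSC message whose arguments are all 32-bit floats."""
    msg = _pad(address.encode()) + _pad(("," + "f" * len(floats)).encode())
    for f in floats:
        msg += struct.pack(">f", f)  # OSC numbers are big-endian
    return msg

def decode_osc(msg: bytes):
    """Decode an all-float OSC message back to (address, [values])."""
    end = msg.index(b"\x00")
    address = msg[:end].decode()
    cursor = (end + 1 + 3) // 4 * 4           # skip padding after the address
    tag_end = msg.index(b"\x00", cursor)
    tags = msg[cursor + 1:tag_end].decode()   # drop the leading ","
    cursor = (tag_end + 1 + 3) // 4 * 4       # skip padding after the type tags
    values = [struct.unpack_from(">f", msg, cursor + 4 * i)[0] for i in range(len(tags))]
    return address, values

addr, vals = decode_osc(encode_osc("/imu/accel", 0.5, -1.0, 0.25))
print(addr, vals)  # → /imu/accel [0.5, -1.0, 0.25]
```

In practice a library such as the one bundled with Max or any OSC client would handle this framing; the sketch just shows why the protocol stays easy to implement.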

Interaction Scenarios

  1. Interactive Scenography (AV installation):
    Through movement of the body, projected visuals and the spatialization of the auditory environment are affected. For example, with x-OSC attached to the performer's hand, he/she will be able to manipulate multiple textures and their orientation in OpenGL (flight simulation, etc.)

  2. Racket Orchestra (composing through movement):
    A considerable array of actuators, motors and various noise makers is spread out in the space, and through the movement of the hand and body you can control all the mechanical instruments around you. This is a more rhythmic, choreographic piece.

  3. Surface Manipulation (real-time sculpting)
    The last scenario revolves around actual manipulation of the architectural surroundings through the movement and orientation of x-OSC. This can be achieved by inflating balloon nodes behind spandex on the ceiling, controlling the tilt and rotation of motorized surfaces, or through a spatial installation consisting of threads, etc.
Steer by W. Yong in collaboration with Jerôme Delapierre and Navid Navab, Montreal 2014 (click for link)
Thorax by Jean-P. Gauthier, Montreal 2011 (click for link)
Kinetic Sculpture by Bayerische Motoren Werke AG, Munich 2009 (click for link)

Extra Info from Live Science (source)

What is a gyroscope?
A gyroscope is a device that uses the conservation of angular momentum to help determine orientation. Its design consists of a freely rotating disk, called a rotor, mounted on a spinning axis in the center of a larger, more stable wheel. As the mounting turns, the rotor's spin axis resists changes to its orientation, providing a stable reference against which to judge which way is "down."


What is an accelerometer?
An accelerometer is a compact device designed to measure non-gravitational acceleration. When the object it is integrated into goes from a standstill to any velocity, the accelerometer responds to the vibrations associated with that movement. It uses microscopic crystal structures that become stressed when vibration occurs; that stress generates a voltage (the piezoelectric effect), from which a reading of the acceleration is produced. Accelerometers are important components of devices that track fitness and other measurements in the quantified-self movement.

Uses of a gyroscope or accelerometer
The main difference between the two devices is simple: one can sense rotation, whereas the other cannot. The accelerometer can gauge the orientation of a stationary item relative to Earth's surface, but when accelerating in a particular direction it is unable to distinguish that acceleration from the acceleration provided by Earth's gravitational pull. Considering this handicap in an aircraft, the accelerometer alone quickly loses much of its appeal.

The gyroscope maintains its effectiveness by measuring the rate of rotation around a particular axis. When gauging the rate of rotation around the roll axis of an aircraft, it reports a non-zero value until the aircraft levels out. Using the key principles of angular momentum, the gyroscope indicates orientation; the accelerometer, in comparison, measures linear acceleration.

The typical two-axis accelerometer gives users a direction of gravity in an aircraft, smartphone, car or other device. In comparison, a gyroscope is intended to determine an angular position based on the principle of rigidity of space. The applications of each device vary quite drastically despite their similar purpose. A gyroscope, for example, is used in navigation on unmanned aerial vehicles, compasses and large boats, ultimately assisting with stability in navigation. Accelerometers are equally widespread in use and can be found in engineering, machinery, hardware monitoring, building and structural monitoring, navigation, transport and even consumer electronics.

The appearance of the accelerometer in the consumer electronics market, with the introduction of such widespread devices as the iPhone using it for the built-in compass app, has boosted its popularity across software. Determining screen orientation, acting as a compass and undoing actions by shaking the phone are a few basic functions that rely on an accelerometer. In recent years its application in consumer electronics has extended to personal laptops.

Sensors in use
Real-world usage best illustrates the differences between these sensors. Accelerometers are used to determine acceleration, though a three-axis accelerometer can identify the orientation of a platform relative to the Earth's surface. However, once that platform begins moving, its readings become harder to interpret. In free fall, for example, the accelerometer shows zero acceleration. In an aircraft holding a 60-degree bank in a turn, a three-axis accelerometer registers a 2-G vertical acceleration, ignoring the tilt entirely. Ultimately, an accelerometer cannot be used alone to keep an aircraft properly oriented.
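That 2-G figure follows from simple trigonometry: in a coordinated level turn the lift vector must both support the aircraft's weight and pull it around the turn, so the load along the aircraft's vertical axis is 1/cos(bank angle). A quick check:

```python
import math

def load_factor(bank_deg: float) -> float:
    """G-load along the aircraft's vertical axis in a coordinated level turn."""
    return 1.0 / math.cos(math.radians(bank_deg))

# A 60-degree bank doubles the apparent weight, which is exactly
# what a body-mounted three-axis accelerometer would report.
print(round(load_factor(60), 3))  # → 2.0
```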

Accelerometers instead find use in a variety of consumer electronic items. For example, among the first smartphones to make use of it was Apple’s iPhone 3GS with the introduction of such features as the compass app and shake to undo.

A gyroscope would be used in an aircraft to help indicate the rate of rotation around the roll axis. As the aircraft rolls, the gyroscope measures non-zero values until the platform levels out, whereupon it reads zero again. The best-known instrument built on a gyroscope is the attitude indicator on typical aircraft: a circular display divided in half, the top half blue to indicate sky and the bottom half brown to indicate ground. As the aircraft banks into a turn, the display shifts with the bank to track the actual direction of the ground.

The intended use of each device ultimately influences their practicality in each platform used. Many devices benefit from the presence of both sensors, though many rely on the use of but one. Depending on the type of information you need to collect — acceleration or orientation — each device will provide different results.
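Because the accelerometer gives an absolute but noisy tilt reference while the gyroscope gives a smooth but drifting rate, the two readings are commonly fused with a complementary filter. A minimal single-axis sketch, where the 0.98 blend weight is an illustrative choice rather than a tuned value:

```python
import math

def accel_pitch(ax: float, az: float) -> float:
    """Pitch in degrees estimated from gravity as seen by the accelerometer."""
    return math.degrees(math.atan2(ax, az))

def complementary_filter(pitch: float, gyro_rate: float, ax: float, az: float,
                         dt: float, alpha: float = 0.98) -> float:
    """Blend the integrated gyro rate (deg/s) with the accelerometer tilt estimate."""
    return alpha * (pitch + gyro_rate * dt) + (1.0 - alpha) * accel_pitch(ax, az)

# Held still and level: gyro reads 0 deg/s and gravity sits entirely on the
# z axis, so the estimate decays toward 0 degrees wherever it started.
pitch = 45.0
for _ in range(500):
    pitch = complementary_filter(pitch, gyro_rate=0.0, ax=0.0, az=1.0, dt=0.01)
```

The same structure would apply to x-OSC's streamed gyroscope and accelerometer values at its 400 Hz update rate, with `dt = 1/400`.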

Story Telling Balloon Prototype

9/29/2015

 
Prototype @ the sensor lab, Concordia University

Story Telling Balloon:

The story telling balloon project is an exploration into making audible activity in a space tangible, visible and feel-able through air. The project captures spoken word in the space and, as words are spoken, gradually inflates a balloon until it is full, then deflates it, spilling its contents back out into the space. The output scenario is not yet fixed; possible scenarios include projecting the words, or playing them back as the balloon deflates. Given that a balloon pushed beyond its capacity will explode, the project needs a safe kill switch. Two reliable options came to mind: a pressure sensor inside a chamber, or strategically placed conductive material that the full balloon presses against to close a connection and trigger deflation. The Max patch is on its way but was not included in the prototype.

Story Board:

  1. mic set up far away to bypass the feedback loop (in a 2nd iteration, voice recognition will replace the simple threshold, so that actual speech, and not just any noise, activates the valve)
  2. amplitude over a certain threshold activates the pneumatic valve
  3. amplitude under the threshold shuts off the pneumatic valve
  4. the process keeps iterating until the balloon is full, which is where the kill switch comes into play
  5. when the balloon is full, its top touches the bottom of the cabinet, where 2 pieces of conductive tape are hanging
  6. the balloon pushes against the tape and makes a connection, which in turn turns on the solenoid for deflation
  7. simultaneously, the words spoken into the balloon play back in reverse, timed to the deflation
  8. i.e. if the total inflation time (words spoken) is 30 seconds and the deflation time is 10 seconds, playback speeds up 3 times and runs in reverse while the balloon deflates
  9. with voice recognition, the words would ideally be scattered onto wherever the balloon deflates
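Step 8's time compression can be sketched directly: the speed-up factor is the ratio of inflation time to deflation time, and reversed playback is just the sample buffer read backwards with that stride. A rough sketch using naive decimation (a real Max patch would resample rather than drop samples):

```python
def deflation_playback(samples: list, inflation_s: float, deflation_s: float) -> list:
    """Reverse the recorded samples and decimate them so that playback
    fits the deflation window (naive decimation, no interpolation)."""
    rate = round(inflation_s / deflation_s)  # e.g. 30 s / 10 s -> 3x speed-up
    return samples[::-1][::rate]

# 30 "seconds" of recording squeezed into a 10-sample reversed playback:
recording = list(range(30))
out = deflation_playback(recording, inflation_s=30, deflation_s=10)
print(len(out))  # → 10
```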

Schematic:


Code:

// Pin roles inferred from the storyboard and schematic above:
// input 2: amplitude-over-threshold trigger (from the mic chain),
// input 4: conductive-tape kill switch (closes when the balloon is full),
// output 12: inflation valve, output 13: deflation solenoid.

const int INFLATE_VALVE = 12;  // pneumatic inflation valve
const int MIC_TRIGGER   = 2;   // HIGH while amplitude is over the threshold
const int DEFLATE_VALVE = 13;  // deflation solenoid
const int KILL_SWITCH   = 4;   // conductive tape contact

void setup()
{
  pinMode(INFLATE_VALVE, OUTPUT);
  pinMode(MIC_TRIGGER, INPUT);
  pinMode(DEFLATE_VALVE, OUTPUT);
  pinMode(KILL_SWITCH, INPUT);
}

void loop()
{
  // Inflate for 10 s whenever the mic trigger is HIGH; otherwise close the valve.
  if (digitalRead(MIC_TRIGGER) == HIGH)
  {
    digitalWrite(INFLATE_VALVE, HIGH);
    delay(10000);
  }
  else
  {
    digitalWrite(INFLATE_VALVE, LOW);
  }

  // Kill switch: once the balloon presses the tape together, open the deflation solenoid.
  if (digitalRead(KILL_SWITCH) == HIGH)
  {
    digitalWrite(DEFLATE_VALVE, HIGH);
  }
  else
  {
    digitalWrite(DEFLATE_VALVE, LOW);
  }
}

Video:

Materials Research

9/17/2015

 
Other interesting materials: spandex, lycra, latex, vinyl, burnt motor oil, graphite (pencil lead), conductive thread, conductive paint, water, ink, liquid-based materials, invisible wire, threads, wire, dough, recyclable waste (tires, bottles, etc.), sand, air, moisture, fabric, plants (leaves)

Conductive Paint (data sheet)

material properties
  • physical:
    liquid form (paint), water-based, nontoxic. A hardener can be used to adjust density.
    Standard acrylic or water-based paints can even be used alongside Electric Paint to act as insulation or to create multi-layer circuitry!

  • electrical:
    electrically conductive! Can be used as a controller or potentiometer. Works with low-voltage DC power.

  • structural:
    wires can be painted onto models, clothes, furniture, walls, almost anything you can think of.

  • perceptual and aesthetics:
    by itself it doesn't offer much (it drips, it dries), but where and how it is painted, and what it is painted onto, is an open field for perceptual and aesthetic experimentation

  • production/supply/ cost:
    here or here... no production or supply problems; it can be found and ordered online or bought at Spikenzie, Abra, etc.
    $25 for 50 ml at SparkFun (link), or you can make it yourself for a couple of dollars

  • environmental impact and safety:
    the water-based version, although costly if store-bought, is pretty safe; as long as there's no lead in there you'll be fine

  • types of manipulation and processing:
    density can be adjusted based on use; it can be processed with glue or water-based liquids generally, and its texture can be changed by blending the paint with flour or something similarly neutral

Interaction Scenarios

As mentioned above, the paint can be applied to almost anything to demand a reaction, produce an output, or get some data out of it, or it can be used as a potentiometer with up to a 5-sensor capacity (this is with the silver paint). Ideas involve painting a surface with some sort of labyrinth pattern so that passersby actuate sound, light or something else, all in the interest of manipulating and accentuating some perceptual qualities of the space. Alternatively, conductive thread can be used to achieve something similar.
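Used as a potentiometer, a painted strip acts as a voltage divider: touch position along the strip maps (roughly linearly, if the paint is laid down evenly) to the voltage the ADC sees. A hedged sketch of that mapping, using the 13-bit ADC depth from the x-OSC specs above; the 200 mm strip length is an invented placeholder:

```python
def touch_position(adc_reading: int, adc_bits: int = 13, strip_mm: float = 200.0) -> float:
    """Map a raw ADC reading from a painted voltage divider to a position
    along the strip. Assumes uniform paint resistance; strip_mm is illustrative."""
    full_scale = (1 << adc_bits) - 1  # 8191 for a 13-bit ADC
    return strip_mm * adc_reading / full_scale

# A mid-scale reading lands at roughly the middle of the strip:
print(round(touch_position(4096), 1))  # → 100.0
```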
thread art (click for source)
playing with sense place (click for source)

In/Decline (the making/ process/ experiments & more):

5/29/2015

 
The project was installed at P.A.R.E (Place, Architecture, and Responsive Environments): a three-week cross-disciplinary, inter-university/institutional research residency that investigated responsive computational environments in relation to place-making and storytelling, perception, time, ambiance and atmosphere studies. While showcased at the residency, the project was explored in many different ways, including:
  • non-active behavior of the surfaces, i.e. modulation with algorithmic behavioral programming; standalone, without human interaction
  • inflation/deflation based on human presence via motion tracking with a Kinect
  • inflation/deflation based on procedural OpenGL 3D landscapes, where the Z axis controlled the pneumatics
  • inflation/deflation through performance/engagement with the interaction surface underneath the structure

ideas for future projects:

1/28/2015

 

entangled performance

Hacking into as many telecommunication devices as possible so that, essentially, I can turn them into a massive drum machine, adding scanners etc., practicing a lot of electronic hacking/circuit bending, and making the whole setup into a performance in which, as the piece goes on, I become more and more engaged and entangled with a stage full of these devices.

stairwell orchestra

Our built auditory urban environment highlights the direct effect of industry and technology on people, drawing attention to the hegemonic nature of technology as a prime contributor to the tense and restless condition of modern life. As Godfrey Reggio states, “technology has become as ubiquitous as the air we breathe” (Essence of Life, 2002). I want to turn this around with this project. I want to install many, many noise makers in the EV stairwell and have them activate with the movements of passersby in transition. This would build on a previous residency (http://concreteresidency.tumblr.com/nimanavab) where I took a very different approach, producing frequencies of 16,000 Hz and higher, the same range as the frequencies generated by the technologies that surround us (lights, projectors, engines). By placing that range of frequencies in the stairwell, the sound becomes contained and thus amplified to an almost nauseating pitch. So when I said I wanted to have fun and go rhythmic this semester, this stands completely in opposition to the last iteration of the project. Same topic, different outlook.
By me for the Concrete Residency with Spatial Theory class with Cynthia Hammond (click for residency link)

rhythmic acoustic shell

This idea revolves around building a modular rhythmic acoustic shell. The work would potentially translate some sensor data outside such as wind, light, etc into rhythm, molding physical properties of space with reactive acoustical rhythm sculpture. The end piece would be meditative.

tangible physics engine/ gestural particle system

Using what I learn in Creative Computation II, which covers Daniel Shiffman's Nature of Code (natural simulations with programming), I will create a physics engine revolving around physical objects in space using OpenCV. Once that is achieved, I will hack into Leap Motion for gesture tracking to release particles onto the projection space. Here is the proposal:

Imagine blurring the line between the physical world and the virtual world. It's possible, by tracking gestural movement. This is done by projecting the chosen code's canvas onto a surface and placing real-life objects onto it. The chosen objects become registered as object-oriented objects within the computer code and are able to function as them. This allows all sorts of phenomena, events and special effects to occur on the screen in real time. Furthermore, if more real-life objects are placed onto the screen, interactions can occur between them, demonstrating the code's ability to control action flow while encouraging human participation!