The Loss of Sensible Referents: Architecture in Non-place-Space



Self-generating architectural systems within a computational environment


Table of Contents

Thesis Statement

Thesis Abstract

Chapter 01: Technological Historical Accounts

Chapter 02: Technology and Nothingness

Precedent

Architectural Process

Theory

Application

Program

Setting Up Structure

Defining Resultant Terminology

Architectural Ramifications--Design

Bibliography


Time is the accident of accidents-

-Epicurus

The laws of harmony that are internal today will become external tomorrow-

-Kandinsky

You cannot think about thinking, without thinking about thinking about something.

-Seymour Papert

That theory is worthless. It isn't even wrong!

-Wolfgang Pauli

It has been the persuasion of an immense majority of human beings that sensibility and thought [as distinguished from matter] are, in their own nature, less susceptible of division and decay, and that, when the body is resolved into its elements, the principle which animated it will remain perpetual and unchanged. However, it is probable that what we call thought is not an actual being, but no more than the relation between certain parts of that infinitely varied mass, of which the rest of the universe is composed, and which ceases to exist as soon as those parts change their position with respect to each other.

-Percy Bysshe Shelley

Introduction:

The development of civilization is motivated by the substantial interaction of various discourses. These interactions allow important thought and developments within individual fields of study to influence alternative discourses. Architecture represents this fact by being a profession with roots and involvement in multiple fields. Architecture as an apparatus of education fundamentally allows the systematic, personal exploration of any number of these alternative applicable areas.

The importance of theory and process

The future of architecture

The experiment's intent is to investigate the possibility of borrowing from the fields of science, mathematics, and technology. The process sets up a hypothesis similar to that of a scientific experiment. The hypothesis revolves around the possibility of removing the conventional product of architecture as a fixed object and replacing it with objectified thought processes modeled by the computer. The purpose is to utilize the computer as an interactive and advantageous design instrument, allowing it to become a necessary extension of the thought process and the architectural method. The thesis is to explore architectural theory using the tools and modeling methods of science and technology.

The investigation will try to define and explore an architecture in non-place-space[1]: an experimental field where the interactions of architectural thought are explored, described, and created. An attempt is made to examine the significance of architectural thoughts as interrelational multi-dimensional processes[2], which most commonly result in built form, though the necessary product of architecture does not have to reside in built form. Architecture must remove the stigma of a singularly motivated product and replace it with a generational process producing multiple justifiable conditions derived from the structure of procedural thought.[3] The exploration is to understand the potential of an architecture which is temporal, unpredictable, transformable, and mutable as its condition of existence. In essence it is a process which very closely relates to the condition of built architecture--being occupied, changed, renovated, and contextually challenged--but having these potentially uncontrollable circumstances as resultant factors influencing the system. The purpose of this exploration is to visualize the potential of manifesting thought and creativity by adhering to an objective scientific model. The model will then be applied and tested systematically on architectural objects at multiple scales.[4] Theoretically, this forecasts the possibility of using computer methodologies within the design process so that the produced architectural form more closely resembles the process of thinking: transforming, manipulating, and not seeing architecture as a frozen completed object. The computer will act as the systematic generator and information organizer. Utilizing the computer as an essential design tool will allow new paths within architecture to be developed.

Abstract--Technological/Historical Account

We are witnessing a change, a shift of paradigms, one of fantastic unfolding challenges. The world is involved in changing media, mediums, and conventional definitions--essentially a world challenged by electric/electronic technologies. With the advent of mass quantities of personal computers, televisions, and telephones--reaching into everyone's life--we challenge our personal and conditioned perceptions. We are in effect dealing in paradoxical spatial relationships--where there is no here or there--we can, unbound by distance, be everywhere. We enter constructed space, a space unconfined and unchallenged, literally moving beyond Euclidean geometries into technological 'space-time'. We have the ability not only to push the architectural envelope but in effect to fold, break, and turn it inside out. We can exist virtually without day or night, weather, or gravity, but now with the new constraints of the electronic day--a day which has the potential to expose itself instantaneously, where speed-distance will replace physical dimension.

If architectonics once measured itself according to geology, according to the tectonics of natural reliefs, with pyramids, towers and other neo-gothic tricks, today it measures itself according to state-of-the-art technologies, whose vertiginous prowess exiles all of us from the terrestrial horizon.

Jean-Pierre Changeux[5]

The existing convention of reality--basing itself truly in perceptions and physical beliefs--now falls to the wayside, giving rise to a multitude of alternative realities, existing in other forms of appearance with the absence of shape, dimension, and meaning.

Precision is the relationship of measured value to the value of its uncertainty. One could say that precision is its inverse: relative uncertainty.

Patrick Bouchareine

The epistemological movement from the solar day to the chemical day to the electric day to the electronic day: everlasting.[6] The materiality of mental images can no longer be doubted. This perhaps has the potential to become real, or postreal, for the belief that mental images can be recorded and seen visually on a screen does not appear to be too far-fetched, and is actually obtainable.

Is it possible to intellectualize architectural thinking? Which is to say that the notion of architectural training, instead of being primarily visual, is essentially a personal yet self-constructed method for thinking. The abstract notion of communicating rhetoric between architects is accepted but not defined; we morphologically generalize terms describing planes, structure, systems, site, precedent, aesthetics, truth, language, etc. They are in essence notational constructs, for there exist no precise definitions, only implied understandings. The history of architectural realization allows for adaptation and flexibility. If physics and mathematics operate in a world of determinate precision and relative structure, what happened to architecture? Certainly architecture, revolving around artistic principles which are most of the time lost from drawing to building, has loosened its relationship to science and mathematics. Where was the disjuncture? The initial division is understandable, based upon physically unrepresentable or not conventionally describable paradigm shifts in mathematics and science. Architecture is being forced, cornered, and made palatable by unseen forces of great economic, political, and social structures. In effect it is at the whim of convention, for in itself architecture remains luxurious, unnecessary, trivial, and trapped. This thesis, therefore, does not advocate the liberation or exoneration of architecture, but by investigating the possibility of grafting architectural thought and method with mathematical principles and scientific theory within a technological framework, allows architecture to explore itself unbound by perceptible conventions.

What potential is there in non-perceptible space? It is in the investigation of a constructed scientific model--matrix modeling, Heisenberg's uncertainty principle, and notions of space--which produces this non-perceptible concept. Initially, 'primary' energy acts (structure) influence and force theoretical objects placed into 'constructed space' (site), which in turn define the boundaries of that 'space' by their interaction (program); this methodology produces an architectural thought model. Forces, and the breakdown of energies within the system (entropy), will produce perpetual change within the system, creating continual action, movement, adjacencies, relationships, and possibilities.
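The thought model described above can be sketched computationally. The following is a minimal illustration in Python; every rule, name, and value is an assumption made for demonstration, not part of the thesis apparatus: energy acts move objects placed in a constructed space, the objects' positions define the boundary of that space, and energy breakdown (entropy) keeps the system in perpetual change.

```python
import random

def step(objects, decay=0.9):
    """Advance each object one tick: move it by its energy, then decay it."""
    for obj in objects:
        obj["x"] += obj["energy"] * obj["direction"]
        obj["energy"] *= decay              # entropy: energy breaks down
        if obj["energy"] < 0.01:            # re-energize: change is perpetual
            obj["energy"] = random.random()
            obj["direction"] *= -1

def boundary(objects):
    """The 'space' has no fixed extent; it is defined by the objects in it."""
    xs = [obj["x"] for obj in objects]
    return min(xs), max(xs)

# Two theoretical objects placed into the constructed space (illustrative).
objects = [{"x": 0.0, "energy": 1.0, "direction": 1},
           {"x": 5.0, "energy": 0.5, "direction": -1}]
for _ in range(10):
    step(objects)
lo, hi = boundary(objects)                  # boundary after ten energy acts
```

Each run of the loop shifts the boundary, so the "site" is never a fixed container but a resultant of the interactions within it, as the paragraph above describes.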

The vanishing point, the horizon line

The vanishing point is the anchor of a system which incarnates the viewer, renders him tangible and corporeal, a measurable, and above all a visible object in a world of absolute visibility.[7]

Perspectival construction is a system for transcribing how we perceive the world through our eyes. The vanishing point in art and architecture was an important part of manufacturing a simulation of the world in two dimensions. Brunelleschi, integral in this development, conducted an experiment in 1425. Having painted a perspective of the front of the Baptistery, he turned the painting around so it faced the Baptistery and placed a mirror in front of the painting; from the backside of the canvas (now looking towards the Baptistery) he pierced the canvas, gazed through the hole, and saw the mirror reflecting the painting and the Baptistery at the same time--both represented in two dimensions--thus proving his perspectival construction technique, for the two were equivalent.[8] The importance of the perspective method is that it allows the transfer of information from reality to a tangible representation of reality. Essentially the vanishing point is the point at which our focus is located. The vanishing point represents a fixed and perceptible view of the world. It represents a definite location within a real and physical scene. The importance of understanding the nature of the concept is that where we look from depicts a point zero, and the vanishing point is bound by the limitation of our sight and is thus a concept terminating our sensorial range. The vanishing point represents a point of view which, if mirrored, would reflect the opposing view of the world as seen by the viewer. The vanishing point is the anti-point of the viewing point.

Sense and Perception--Speed Distance

Our perceptions are inherently rooted in our lack of understanding of the concept of Nothing. We can only mentally construct the representation of Nothing. To imagine a condition of Nothingness we must remove ourselves as time-bound material objects and appoint concepts hierarchically above the object. Perhaps we can imagine space without anything in it, but only in the sense of visually accepting Nothing, for in the confines of our universe we do not perceive the presence of energy--or the emission of waves. On earth our environment unfolds before us as perceived through our senses. Senses are channels of sensations--to sense something is to detect something--and it is not possible to detect 'Nothing'. Logically our senses prevent us from assigning a real value to Nothing. What happens to our perceptions once technology moves, and continuously advances, forward? Within this century alone we have gained access and the ability to move physically, and mentally, unbound in our world and universe. It is at this departure point that perceptions can become confused. The contemporary manifestations of transportation and telecommunication technologies result in the proliferation of Nothingness. It is from here on that the concept of the conventional vanishing point is departed from. We now have the potential to experience multiple vanishing points, where perspectival reduction is no longer evident or, in any case, relevant. Fundamentally we are preparing to occupy all space between the conventional static point of viewing and the horizon line. It is this concept of existing nowhere which is becoming part of our contemporary being. It is also where assigning a value to space becomes confused.

Perspective construction is how we visually translate our world into two-dimensional representation, but an alternative definition of perspective is the ability to see relevant data in a meaningful relationship. With new developments in technology, telecommunications, and telescience we witness a changing paradigm--one which presupposes the breakdown of conditional and conventional perceptual systems. These new technologies allow the dismantling of distance through the intrinsically rooted concept of speed. Our perceptions regarding the physical movement from place to place are no longer applicable when we are connected to 'place' by telephone, or roaming within the labyrinth of information through computer technologies. The highest reach of this notion would be perhaps existing physically in one place but experiencing every place--being nowhere. These concepts allow the breakdown of place-hood perception, for now our mind and senses do not inhabit the physical space which our body does. Basically this exploits the very notion of the conflict between perception and Nothingness. This movement, which seemingly developed from the notion of assigned worth, is a dilemma of our time.

Sartre develops an explicit explanation of Nothing. He describes that if an object is to be posited as absent or not existing, then there must be involved the ability to constitute an emptiness or Nothingness with respect to it.[9] Sartre goes further than this and says that in every act of imagination there is a double nihilation. In this connection he makes an important distinction between being-in-the-world and being-in-the-midst-of-the-world. To be in-the-midst-of-the-world is to be one with the world, as in the case of objects. But consciousness is not in-the-midst-of-the-world; it is in-the-world. This means that consciousness is inevitably involved in the world (both because we have bodies and because by definition consciousness is consciousness of a transcendent object) but that there is a separation between consciousness and the things in the world.[10] For consciousness in its primary form, as we saw earlier, is a non-positional self-consciousness; hence if consciousness is consciousness of an object, it is consciousness of not being the object. There is, in short, a power of withdrawal in consciousness such that it can nihilate (encase with a region of non-being) the objects of which it is conscious. Imagination requires two of these nihilating acts. When we imagine, we posit a world in which an object is not present in order that we may imagine a world in which our imagined object is present. I do not imagine a tree so long as I am looking at one. To accomplish this imagining act, we must first be able to posit the world as a synthetic totality. This is possible only for a consciousness capable of effecting a nihilating withdrawal from the world. Then we posit the imagined object as existing somehow apart from the world, thus denying it as being part of the existing world.

It is this notion of positing imagination, and the property of being within-the-world, which will be described as we approach a technological media epoch. For in this 'media' environment we essentially occupy space in between being in-the-world and being in-the-midst-of-the-world. As for Nothingness, it would derive its origin from negative judgments; it would be a concept establishing the transcendent unity of all these judgments, a propositional function of the type, "X is not."[11] This negative judgment establishes itself as soon as reality, and human perspective in the conventional perceptually representable sense, is abandoned for a simulated virtual one.

Precedent analysis

This investigation of precedent leads a wandering path through discourses, all adhering to the initial thesis idea: creating a placeless architecture; reduced thought and the secondariness of form to process; descriptions of space and of processes of representation in space; and practical applications derived from mathematics, science, and process.

01 Fin d'Ou T Hou S by Peter Eisenman

02 Buckminster Fuller's conceptual connections between science/math/architecture.

03 Space as defined by modern physicists.

04 x-dimensional architecture[12]

05 Heisenberg's Uncertainty Principle

06 Hyper Graphic n-dimensional manipulations.

07 A brief discussion on the concept of Nothingness.

Fin d'Ou T Hou S

What can be the model for architecture when the essence of what was effective in the classical model--the presumed rational value of structures, representations, methodologies of origins and ends, and deductive processes--has been shown to be delusory?[13]

The Fin d'Ou T Hou S by Peter Eisenman is an exploration in architecture which moves beyond the limitations presented by the classical model to the realization of architecture as an independent discourse, free from external values; that is, the intersection of the meaningful, the arbitrary and the timeless in the artificial.

Traditionally, the architectural object was assigned a value based upon the strength and visibility of its connections to a set of programmatic requirements including function, structure, meaning, and aesthetics. Judgment of value on the basis of extrinsic criteria was perceived and defined as rational. Non-conformity in this context marked valueless architecture.

The first premise of the Fin d'Ou T Hou S is that the world can no longer be understood in relation to any 'absolute' frame of reference devised by man. If one accepts this presupposition then the concept of extrinsic or relative value becomes meaningless and traditional rationalism merely arbitrary. Fin d'Ou T Hou S suggests the architectural object must become internalized so that its value lies in its own processes. Those programmatic requirements which had previously been seen as the causes must now become the effects of architecture. Fin d'Ou T Hou S is not rational architecture in the traditional sense. It proposes an intrinsic value system which is alternative to a context of arbitrariness; it is true to its own logic. Faced with an object that admits no discursive element external to its own processes, our customary role as subject is futile, and we are bereft of our habitual modes of understanding and appraising architecture. Eisenman suggests that Fin d'Ou T Hou S requires a new reader, willing to suspend previous modes of deciphering for an attitude of receptive investigation.

While Fin d'Ou T Hou S claims to be self-definitive, it does not claim to be self-explanatory. The process records its own history at every point in its development, but any one of the steps shown, including the last, is no more than an artificial representation of a single frame from a seamless continuity which would be self-explanatory if it could be recreated. This in turn becomes a departure point for the thesis as a representation of spatial-temporal physical development--actually regenerating and changing in real time... alive. Traditionally, the necessity of a score or a text devalued the architectural project. Here Fin d'Ou T Hou S is presented as a score of its process and an explanation of the analysis and processes discovered in the initial configuration. This presentation is consistent with the devaluation of object in favor of process. The process consists of multiple stages of development. What is important for the analysis, and justly for the thesis, is the condition of an ever-changing system.

Object States:

A constructed system which, in effect, will develop autonomously. The system's basis is the process of decomposition, or rather an approximation of decomposition. Inherent from the initial stage of intervened action are the forms of two el shapes. There is an initial dichotomy set up by the relationships of the parts, in effect preconceiving conflict on behalf of the initial phase state. One is a present solid el, and the other, one half the other's size, is a present void. The process of decomposition happens when the smaller el moves closer to the larger el. This in effect sets the stage for interaction between the two forms.

From the outset a determinate structure was designed to reflect the interactions between developed pieces of the initial equilibrium system. This structure is the rules of interaction: as a volume enters another volume, the secondary volume will be displaced and changed.

The initial postulations are as follows:[14]

Active Passive Result

Presence over Presence creates Absence

Absence over Absence creates Presence

Presence over Absence creates Presence

Absence over Presence creates Absence

Void over Void creates Solid

Solid over Solid creates Void

Void over Solid creates Solid

Solid over Void creates Void
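Because the eight postulates above form a complete rule table, the system can be evaluated mechanically. The following sketch (in Python; an illustrative encoding, not a reconstruction of Eisenman's actual procedure) transcribes the table verbatim as a lookup:

```python
# The eight postulates encoded as a lookup table: given an active state
# passing over a passive state, return the resulting state. The dictionary
# transcribes the table above; the function name is an assumption.

RULES = {
    ("presence", "presence"): "absence",
    ("absence",  "absence"):  "presence",
    ("presence", "absence"):  "presence",
    ("absence",  "presence"): "absence",
    ("void",  "void"):  "solid",
    ("solid", "solid"): "void",
    ("void",  "solid"): "solid",
    ("solid", "void"):  "void",
}

def interact(active, passive):
    """Result when the active volume passes over the passive volume."""
    return RULES[(active, passive)]
```

Once encoded this way, any sequence of moves of the smaller el can be replayed step by step, with each overlap resolved by a single table lookup.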

Notation and Trace:

Notations of Presence:

Physical representations of notation only appear on the surface of form. These frozen states represent transitional departures from which to view the process.[15]

Present Solid : Opaque

Present Void : Transparent Color

Absent Solid : Translucent Plane Grids

Absent Void : Translucent Line Grids

Notations of Trace:

As the form moves it leaves a trace of its previous position, a notation of the previous state will appear on the surface of the form.

Present Solid : Void Line Grid

Present Void : Solid Line Grid

Absent Solid : Solid Plane Grid

Absent Void : Solid Line Grid (Void Plane Grid)

Fin d'Ou T Hou S represents a point of departure, of formal and analytic structure, from which to follow. The relative aspects are to present a rationally logical structure--in the description of empirical, arbitrary, and relevant argument--with which the project develops. It is the relationship of the author to the object, a subject of fatherless development, resulting in controlled genetic development rather than careful nurturing of the object into its primacy. Thus the process divorces the architect from the object, seen only as coded determination. Logic and rationality play a large role in the analysis, for by reducing the interactions/objects/formal existences one can legitimately predict the possible representational outcomes but not the formal outcomes. It is the objective persistence of viewing which allows Eisenman to continue his experiment. It is a game of which the outcome is not predictable, and definitely not preconceived, for in preconception we limit the possibilities of radical development. The representation of the diagrams as frozen objects is misleading, for in essence the object is in a continual state of developmental disarray. It is here where the departure from modernism takes place: the acknowledgment of the existence of multiple, and multiply correct, forms. Noting that function is but one possibility of form, Eisenman argued that all such possibilities cannot be known a priori or discovered empirically. Architecture in its essence cannot break beyond its own compositional subjectiveness unless it breaks these bonds, and the connections between author and object are removed as much as possible.

[The Moderns proposed to extract architecture from history by identifying its essential, therefore a priori, purpose. They selected one aspect of function, use, to elevate to an a priori principle of architecture. It is obvious now that the actual function of architecture is far more complex than the efficient use of building, but even within the bounds of their own postulate, the possible uses of form that they considered to be self-evident were known only to them through a tradition, a history, and a use. Therefore form cannot follow function until function has emerged as a possibility of form. Even the possibility of utility cannot be known empirically.][16]

The process of weaning oneself from subjectiveness is relatively difficult; it is a procedurally, cyclically infinite argument: the more one removes him or herself from authorship, the greater the struggle to take the initial footstep. It is possible, in effect, to work within a range of subjectivities and objectivities, sliding the balance within the fuzzy zone, never reaching either end and always finding relative traces of the inescapable opposite.

Hyperobjects

The possibility of displaying n-dimensional hyperobjects by computer.

Edwin Abbott, in his book Flatland, describes worlds restricted to two dimensions and to one dimension, and describes the social order and the attitude about space within each. The inhabitants of both worlds were completely unable to visualize a third dimension and were baffled by its weird contortions. Man finds himself in a similar predicament when he begins to describe the properties of objects within spatial dimensions higher than three. Many alternative investigations lead to principles for justifying relationships within greater than three dimensions. One case in particular points to the ability to use projected geometries to translate the positions of points or objects in higher dimensions into perceptible three-dimensional objects. This method is very similar to the perspectival methods used today in transcribing a three-dimensional world into two-dimensional images/representations of the object in three.

Projective geometries can be used on any number of dimensions, so that an n-dimensional hyperobject can be mathematically projected into an (n-1)-dimensional space. Such projection can be applied repetitively until finally reaching a dimension discernible by our perceptions. Motion of a hyperobject can also be charted; the most basic type of movement would be rotation of a hyperobject in n-dimensional space.[17]
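The repeated projection can be sketched briefly; in this illustration (Python; the viewer distance d and the use of a simple perspective divide are assumptions) the sixteen vertices of a four-dimensional hypercube are projected down one dimension at a time until they become perceptible:

```python
import itertools

# Repeated perspective projection: drop the last coordinate by scaling the
# remaining ones by d / (d - last), moving a point from dim n to dim n-1.
# The viewer distance d is an illustrative assumption; it must exceed every
# coordinate magnitude so the divisor never reaches zero.

def project_once(points, d=3.0):
    """Project each point from n dimensions to n-1 along the last axis."""
    return [tuple(c * d / (d - p[-1]) for c in p[:-1]) for p in points]

def project_to(points, target_dim, d=3.0):
    """Apply the projection repetitively until target_dim is reached."""
    while len(points[0]) > target_dim:
        points = project_once(points, d)
    return points

# Vertices of a 4-dimensional hypercube: every combination of +/-1.
tesseract = list(itertools.product((-1.0, 1.0), repeat=4))
shadow_3d = project_to(tesseract, 3)   # a perceptible 3-D 'shadow'
shadow_2d = project_to(tesseract, 2)   # projected once more, onto the page
```

Rotating the hyperobject before projecting (a rotation mixes any two of the n axes) and re-projecting each frame would chart the motion described above.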

[Picture][18]

Space, Space-Time

Buckminster Fuller

526.00 Space

526.01 There is no universal space or static space in Universe. The word space is conceptually meaningless except in reference to intervals between high-frequency events momentarily "constellar" in specific local systems. There is no shape of Universe. There is only omnidirectional, non conceptual "out" and the specifically directional, conceptual "in." We have time relationships but not static-space relationships.

526.02 Time and space are simply functions of velocity. You can examine the time increment or the space increment separately, but they are never independent of each other.

526.03 Space is absence of events, metaphysically. Space is the absence of energy events, physically.

526.04 The atmosphere's molecules over any place on Earth's surface are forever shifting position. The air over the Himalayas is enveloping California a week later. The stars now overhead are underfoot twelve hours later. The stars themselves are swiftly moving with respect to one another. Many of them have not been where you see them for millions of years; many burnt out long ago. The Sun's light takes eight minutes to reach us. We have relationships- but not space.

526.05 You cannot get out of Universe. You are always in Universe.[19]

From this brief text on space, it is reasonable to assume that the investigations which Buckminster Fuller undertook were very scientifically oriented. Fuller also, in his book Synergetics, grapples with the forces of math and science to, in his view, discover the cosmic relationships of form, force, and human ideology. The framework investigated most is the nature of the complexities of geometry giving rise to a structured formal architecture. Fuller describes synergy as meaning the behavior of whole systems unpredicted by the behavior of their parts taken separately. The words synergy (syn-ergy) and energy (en-ergy) are companions. Energy studies are familiar. Energy relates to differentiating out subfunctions of nature, studying objects isolated out of the whole complex of Universe--for instance, studying soil minerals without consideration of hydraulics or of plant genetics. But synergy represents the integrated behaviors, instead of all the differentiated behaviors, of nature's galaxy of systems and galaxy of galaxies.[20]

It is from this departure point that Fuller analyzes the beauty of a scientific and artistic model, which to him was where invention resided. It is this notion of a comprehensive background and research within the scientific discourse which gave rise to Fuller's profound and elaborate concepts.

Project 402c Computer Generated Architecture

This project was a topic studio, taken within the computer studio, and was an experiment in form, process, and program. The project was to develop a space analog station in Antarctica: a complex to house 16 scientists year-round with enough resources and storage for their scientific experiments.

Concept: Volume exists only as mandated by the technological systems' concerns.

Permanent structures may well underlie all modes of communication but that of a serial technique (technique rather than thought- a technique that may imply a vision of the world, without being itself a philosophy) is the construction of new structured realities and not the discovery of eternal structural principles.

Umberto Eco[21]

What begins is our ability to comprehend that on the contrary change ought to be very controlled. In using tables in general or a series of tables, I believe one can arrive at direct form-- is what interests everyone unfortunately-- it is wherever you are, and there is no place where it isn't- highest truth that is. Eventually everything will be happening at once nothing behind the screen, unless the screen happens to be in front. All that is necessary is an empty space of time and let it act in its magnetic way eventually there will be so much in it that whistles in order to apply to all these various characteristics he necessarily reduces it to numbers he has also gone the mathematical way of making a correspondence between roles.

John Cage[22]

Theory

Experimental form in a space analog station: the form moves past predictable convention, displacing tradition, reality, and gravity.

Antarctica -- Abstract and conceptual, a place for experimentation; the explorers who reached the barren plateau were moving towards the future, discovering. Today the research in Antarctica follows a similar path but pushes the frontier to the simulation of a space analog station. This experimentation leads us to believe that the die-hard notion of exploration is still very present in human cognition. This conceptual lesson we can learn from and (re)apply when investigating the architectural possibilities for Antarctica, space, and the future of the profession.

Architecture, while continually on an experimental journey, searches to enter the 21st century--a century which will be defined by information, communication, and networking, all governed by advancements in technology. As architects we must not only understand this paradigm shift but define, explore, and create it.

The project is an experiment attempting to define an x-dimensional architecture: an architecture which can truly be derived from, and responsive to, evolving and fluctuating fields of 'Speed', culture, and information. This architecture, loaded with information, bypasses the conventional form-making methods of sketching, drawing, and self-righteous biased determinacy. This x-dimensional architecture is rooted in, created by, and reduced to instantaneously transforming numeric data. The information is based upon, and reflects, the fluctuating loads and demands on the essential components of the occupiable building: the systems. The attempt is to analyze the systems' demands on individual programmatic elements; thus consumption and waste are reduced and efficiency is optimized. The transformable form holds promise in not only mimicking this 'information shift' but implementing it and allowing this hyperreal interactive situation to occur.

We notice that in a society which is fragmented, completely changing, and constantly in flux, Marx's allegorical modernist aphorism- that all 'fast-frozen and fixed relationships' are destroyed- depicts reality. This reference holds true today, but now we can capture that transformation formally, especially with the advent of networks whose boundary- and barrier-breaking connections will, in the near future, connect everyone. These interspatially woven relationships are continually changing. By investigating the possibilities of utilizing computers in design we place ourselves at the threshold- defining the next steps toward the future of technological and human development.

Process

Instead of designing with the classical methods of design--for example composition, symmetry, aesthetics--the experiment investigates the possibility of encoding data and transforming it into form. This is done by writing a program in AutoLisp. The importance of this process is to show how data, however changed once entered, can produce unexpected and unpredicted form. The program has the capability to run multiple series of data entry to produce a wide variety of possibilities.

01 The program enters an intensive investigation of technological systems concerns. These concerns are evaluated from derived equations and used as form influencing data.

This experiment examines each programmatic element and its need for, supply of, or demand on the following systems: Air Conditioning System (ACS), Re-supply System (RS), Waste Management (WM), Fire Dampening System (FDDS), Electrical Power Supply (EPS), Data Management System (DMS), Communication System (CS), and Water Supply System (WSS). Standard, invented, and intuitive equations were derived to create a 'unit-less' database in which to configure percentages and allocations for multiple simulations of load. For example, the loads would change substantially from day to night, or similarly from summer to winter.
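The 'unit-less' allocation step can be sketched in Python (the thesis programs were written in AutoLisp; the system names follow the text, but the values and function name here are illustrative, not the thesis code):

```python
def normalize_loads(raw_loads):
    """Convert raw system loads (in any units) into a 'unit-less'
    database of percentage allocations.

    raw_loads: dict mapping a system acronym to a numeric load.
    Returns a dict mapping each system to its fraction of the total.
    """
    total = sum(raw_loads.values())
    return {system: load / total for system, load in raw_loads.items()}

# Two simulations of load, day versus night (illustrative values):
day   = normalize_loads({"ACS": 40, "EPS": 30, "WSS": 20, "WM": 10})
night = normalize_loads({"ACS": 15, "EPS": 50, "WSS": 15, "WM": 20})
```

Because each allocation is a fraction of the whole, two simulations with different raw units remain directly comparable, which is the point of the unit-less database.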

02 A computer program is written (Tech15.lsp for surface models, and Tech25.lsp for solid models)[23] which (re)analyzes the data and plots it in space. The data is represented as folded 2-dimensional planes in three-dimensional space, each graphing an individual program space for a particular instant in time. This is done through circular graphing, radiating from a center point which is determined by the highest load capacities alongside the highest-importance data.
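Tech15.lsp and Tech25.lsp are not reproduced here; a minimal Python sketch of the circular-graphing idea, under the assumption that each load becomes a radius plotted at an evenly spaced angle around the center point:

```python
import math

def radial_plot(loads):
    """Graph one programmatic space for one instant in time.

    Each system load becomes a radial distance at an evenly spaced
    angle around a center point, radiating outward; the returned
    (x, y) vertices describe one folded planar figure.
    """
    n = len(loads)
    vertices = []
    for i, load in enumerate(loads):
        angle = 2 * math.pi * i / n          # evenly spaced spokes
        vertices.append((load * math.cos(angle),
                         load * math.sin(angle)))
    return vertices

# Five system loads for a single program space (illustrative values):
form = radial_plot([3.0, 1.5, 4.2, 2.8, 3.6])
```

Re-running the same function with a different instant's loads yields a different figure, which is how the changing data reads directly as changing form.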

03 The program is analyzed 'personally' and relationships between programmatic spaces are determined. These relationships are the psychological human concerns between spaces; they are constant and do not change under any manipulation. The relationships are broken into 3 types:

01 Primary- Solid Relationships

02 Secondary-Transparent Relationships

03 Tertiary- Frame Relationships

The relationships between the program spaces are fixed. The primary (solid) relationships connect the quarters to hygiene, the kitchen to dining, and the quarters to the workspace. The secondary (transparent) relationships connect the quarters to the recreation room, the office to communication control, recreation to the biosphere, and general storage to vehicular storage. The tertiary (frame) relationships connect hygiene to the kitchen, the chemistry lab to the workspace, the workspace to the laser lab, and waste storage to fuel storage. The indeterminate, chance interconnection between crossing relationships demonstrates the capacity for unpredictable change.
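The fixed relationship table can be encoded as data; a sketch in Python (the space names follow the text, but the dictionary encoding and lookup function are illustrative):

```python
# Fixed relationships between program spaces, by type.
RELATIONSHIPS = {
    "primary":   [("quarters", "hygiene"),
                  ("kitchen", "dining"),
                  ("quarters", "workspace")],
    "secondary": [("quarters", "recreation"),
                  ("office", "communication control"),
                  ("recreation", "biosphere"),
                  ("general storage", "vehicular storage")],
    "tertiary":  [("hygiene", "kitchen"),
                  ("chemistry lab", "workspace"),
                  ("workspace", "laser lab"),
                  ("waste storage", "fuel storage")],
}

def related(a, b):
    """Return the relationship type linking spaces a and b, or None.

    The table is symmetric: order of the two spaces does not matter.
    """
    for kind, pairs in RELATIONSHIPS.items():
        if (a, b) in pairs or (b, a) in pairs:
            return kind
    return None
```

Because the table is constant, any manipulation of the load data reshapes the form while these psychological adjacencies persist unchanged.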

05 The form, once output, is then analyzed and examined: horizontal, vertical cross, and vertical transverse sections are made through the form. This type of analysis begins to describe the overall volume of the form, and also shows the potential for a structural wrapping around the shell.

The data, which can be- and essentially is- constantly changing, is based upon the continually updated loads of the building's systems. This representative flux is produced visually, as active, engaging architecture whose form is only a by-product of the state of the given information at any time.

Specialized Studies

Program Analysis:

How is non-place-space created?

Program:

The theory of programming, in architectural terms, is to develop an information processing system. It is within this system that judgments are made, hierarchies established, and design decisions taken. Working from the investigations of precedents, the thesis is finally establishing various connections- and starting to focus and narrow its considerations and exploration. The thesis now reduces the process to a dynamic modeling system for the creation of architectural 'thought', evaluated by the performance of its conceptual process.

The basis and product of the programmatic requirements inherently lie in the ability to make qualified judgments. The purpose of this programmatic evaluation is to set up the structural models for the development of the experiment. A software program is written to begin to develop the program. In this thesis, the resultant developments will be in the realm of transient, transformable, and unpredictable ramifications. The program will begin to set up the structure of the possibilities of the interactions.

Essentially the program will be derived from the physical interactions found in nature. It is here that architectural methodology is applied to the investigation. To explain it 'simply', and to escape the rhetoric of cyclical argument: the program sets up how particles will interact with other particles in order to create form through the tracing of the resultant movements. These fundamental ideas derive from the principles of physics, especially gravitational attraction.

The process will be first to generate a two-dimensional model that shows the interactions of autonomous particles. Particles represent the smallest physical representation of architectural form. The particles will have specific variables: mass, charge, density, and position. The particles have the capability to interact with each other, being either 'pushed away' or attracted. Particles also have the capacity to be connected based upon their relative positions; this will be signified as a line. A line is the product of two points- architecturally, the first form-giving development. These lines will move and change based upon the relative positions of the moving connected particles; they also have the potential to split based upon their inherent mass/density/distance relationships. When three or more particles connect simultaneously their resultant form will produce triangulated, or orthogonal, forms; it is here that planes can be established.
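A minimal sketch of the two-dimensional particle model in Python, assuming pairwise gravitational attraction and a simple Euler time step (charge and density are omitted for brevity; this is illustrative, not the thesis program):

```python
import math

def gravitational_step(particles, G=1.0, dt=0.01):
    """Advance a 2-D particle system by one discrete interval.

    Each particle is a dict with 'mass', 'pos' (x, y) and 'vel' (x, y).
    Forces are computed from the current positions for all particles,
    then every particle is moved, so the update order is consistent.
    """
    for p in particles:
        fx = fy = 0.0
        for q in particles:
            if q is p:
                continue
            dx = q["pos"][0] - p["pos"][0]
            dy = q["pos"][1] - p["pos"][1]
            r = math.hypot(dx, dy) or 1e-9   # avoid division by zero
            f = G * p["mass"] * q["mass"] / r**2
            fx += f * dx / r
            fy += f * dy / r
        # a = F/m, then v' = v + a*dt  (Euler integration)
        p["vel"] = (p["vel"][0] + fx / p["mass"] * dt,
                    p["vel"][1] + fy / p["mass"] * dt)
    for p in particles:
        p["pos"] = (p["pos"][0] + p["vel"][0] * dt,
                    p["pos"][1] + p["vel"][1] * dt)

# Two unit masses one unit apart drift toward each other:
system = [{"mass": 1.0, "pos": (0.0, 0.0), "vel": (0.0, 0.0)},
          {"mass": 1.0, "pos": (1.0, 0.0), "vel": (0.0, 0.0)}]
gravitational_step(system)
```

Tracing `pos` over repeated steps yields exactly the kind of resultant movement the text proposes to interpret as line, plane, and volume.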

The site, or field of operation, where the interactions take place can either be bounded by a physical source, such as the dimensions of the screen, or be unbound, allowing the interactions to take place infinitely. The bounds vary with dimension. In one dimension, space is bounded by two fixed points; the interactions take place between the points, and any interaction at the extremities forces the particles back into the 'field of operation'. In a two-dimensional universe the field is bounded by a plane, and the interactions are forced within it. In a three-dimensional universe the field is bounded by a volume- for instance a cube or sphere- but it could also be bound by contextual issues such as building positions, subsequently allowing rebound interactions. In four-dimensional considerations the universe would be bounded by a quasi-three-dimensional object or an inherently four-dimensional quadrant system, etc.
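The rebound at the field's extremities can be sketched per coordinate axis; a minimal Python version (the bounds and function name are illustrative assumptions):

```python
def rebound(position, velocity, lower=0.0, upper=10.0):
    """Force a particle back into the bounded field of operation.

    When a coordinate crosses a boundary it is reflected back inside
    and its velocity component is reversed, producing the rebound
    interaction described in the text. Bounds are illustrative.
    """
    if position < lower:
        return lower + (lower - position), -velocity
    if position > upper:
        return upper - (position - upper), -velocity
    return position, velocity

# A particle overshooting the upper bound rebounds inward:
pos, vel = rebound(11.5, 2.0)   # -> (8.5, -2.0)
```

Applying the same function independently to x, y (and z) gives the planar, cubic, or spherical-approximating bounds for the higher-dimensional fields.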

In a potentially boundless universe the interactions could inherently be controlled by entropy- the systematic breakdown of the particles' energies into smaller constituent portions of unusable energy. This could also be controlled by timing devices tuned for possibilities of non-interaction, allowing a meta-command to take over and randomly overdrive a particle's vector/position.

The potential of architectural interpretation lies in the process of programming, or structuring, the potential opportunities. Obviously all interactions cannot be predicted immediately, but reducing the form-making instruments to fundamental architectural concepts is important. The model can be seen in many different positions: 01, the temporality of the form created by its interactions is the result; 02, the tracing of the fundamental, basic particles and their subsequent interactions is the result; 03, the tracing of the hybrid resultants of the interactions- such as the tracing of a line into a plane, or the tracing of a 4-particle non-planar object through space creating volume- is the result. The trace of a particle's motion can also be interpreted into architectural form; fundamentally it is completely contextual in a reductionist sense, allowing the path to be created by the interaction with the environment.

Terminology:

Particles

The particle is the basic element to which forces are applied; particles have initial positions, velocities, mass, and density.

2object

The 2object is the connection of two particles creating a line.

3object

The 3object is the connection of three particles, creating a triangulated surface with the potential of being solid, transparent, or negative depending on the combination of the particles.

4+object

The 4+object is the connection of four or more particles creating a plane; if the particles are non-planar, this plane becomes a curved surface. The 4+object has the potential of embodying other 'architectural' material- form-making- characteristics.

Mass

Theoretical mass of the object- a constant which might not be applied to the system, but which helps in the modeling of curvilinear movement.

Density

Density, in kilograms per cubic meter- a constant which might not be applied to the system, but which helps in the modeling of curvilinear movement.

Position

This will be determined by the dimensional constant; potential dimensions:

+/-0 dimension: begins to investigate the potential of operating with alternative constants and variables, without direct movement but with the exchange of electrons or particles between objects.

1-dimension: This dimension is bounded by a line with two end points- movement of particles is along the line. Position is determined by location measured in x.

2-dimension: This dimension is bounded by a plane, where interactions are flat- allowing for a high probability of 'crashes' between particles. Position is determined by location measured in x, y.

3-dimension: This dimension is limited by three vectorial position variables: x, y, and z.

Velocity

In meters/second

v = v0 + at

Acceleration

a = F/m

Motion

x = x0 + vt
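Taken together, the three formulas above define one discrete update step per particle per axis; in Python (a sketch, since the thesis code was AutoLisp; the function name and sample values are illustrative):

```python
def step(x, v, force, mass, dt):
    """Advance one particle along one axis by one time interval.

    Applies the glossary's three formulas in order:
    a = F/m, then v = v0 + a*t, then x = x0 + v*t.
    """
    a = force / mass          # acceleration from the applied force
    v = v + a * dt            # new velocity
    x = x + v * dt            # new position, using the new velocity
    return x, v

# One step of 0.5 s with a 2 N force on a 1 kg particle:
x, v = step(x=0.0, v=1.0, force=2.0, mass=1.0, dt=0.5)
```

Repeated calls with a small `dt` trace the 'lot of tiny straight lines' that the Time Interval entry below describes.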

Time

Time Interval: This is the length of time that a gravitational force acts on objects, in seconds- in other words, the time elapsed between calculations of new velocity and position. The particle motions are calculated at discrete intervals: the particles, when positioned or plotted within the computer, aren't moving in curves but in a lot of tiny straight lines. The time interval relates to the length of those lines; in real space those lines approach zero length.

Total energy

Total Energy: Particles in higher orbits have more total energy than particles in lower orbits, even though the lower orbiting ones have a higher speed. This is how spacecraft are able to "slingshot" out of a system. They steal energy from an orbiting body by throwing it into a lower orbit.

Momentum

Total Momentum: To calculate the total momentum of a system, multiply the mass of each object by its speed, then sum the results: m1·s1 + m2·s2 + m3·s3 + .... If the total momentum is not zero, the whole system will "drift" through space. The positions of the planets only come into play when calculating total angular momentum.
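The same sum in Python, following the scalar mass-times-speed simplification used above (momentum is properly a vector, but the sketch keeps the text's form; the values are illustrative):

```python
def total_momentum(masses, speeds):
    """Total momentum of the system: m1*s1 + m2*s2 + m3*s3 + ...

    If the result is non-zero, the whole system drifts through space.
    """
    return sum(m * s for m, s in zip(masses, speeds))

# Opposing motions can cancel, leaving a system with no drift:
drift = total_momentum([2.0, 1.0, 3.0], [1.0, -2.0, 0.0])   # = 0.0
```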

Newton's second law

F = ma

Newton = kg m / sec^2

Gravitational Constant

Gravitational Constant: This is the strength of the gravitational field induced by a mass. You should not need to change this value, but it is available for versatility. This value affects the units used for mass, density, distance, and velocity. The value found in nature is +6.67E-11 N(m^2)/(kg^2).

G = N m^2 / kg^2

= (kg m / sec^2) m^2 / kg^2

= kg m^3 / sec^2 kg^2

= m^3 / sec^2 kg
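The constant in use, as a minimal Python sketch of Newton's law of gravitation (the function name and sample masses are illustrative):

```python
G = 6.67e-11   # N m^2 / kg^2, the value found in nature

def gravity(m1, m2, r):
    """Newton's law of gravitation: F = G * m1 * m2 / r^2.

    Masses in kilograms, distance in meters, force in newtons.
    """
    return G * m1 * m2 / r**2

# Two 1000 kg masses one meter apart attract only very weakly,
# which is why gravity matters mainly for planet-scale bodies:
f = gravity(1000.0, 1000.0, 1.0)   # 6.67e-5 N
```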

Considerations

Range of interaction

Number of objects in the environment

Combinatorial and subtractive connections

The conversion factor, therefore, between the SI gravitational constant and its equivalent in our units is:

K = AU^3 / YEAR^2 M_SUN
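The conversion can be checked numerically; a Python sketch (the rounded physical constants are assumptions of this example, not values from the thesis):

```python
import math

AU    = 1.496e11      # meters per astronomical unit
YEAR  = 3.156e7       # seconds per year
M_SUN = 1.989e30      # kilograms per solar mass
G_SI  = 6.67e-11      # m^3 / (sec^2 kg)

# Express G in AU^3 / (YEAR^2 M_SUN) units:
K = G_SI * M_SUN * YEAR**2 / AU**3

# Sanity check via Kepler's third law: the Earth orbits at 1 AU in
# 1 year, so G * M_sun should come out near 4*pi^2 in these units.
```

Working in astronomical units keeps the simulation's numbers near unity instead of spanning forty orders of magnitude.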

Programmatic Outline for Procedural Studio Scientific Investigation

Specialized Studies

This section will deal with certain explorations, concepts, and definitions which have motivated the thesis. They are explanations of scientific phenomena, and rooted within their computational designs are important relationships between parts, sub-parts, and wholes. There is an explanation of Heisenberg's uncertainty principle, of natural physical causal forces within our environment, and of understandings of how the brain works.

Uncertainty Principle-

The importance of the uncertainty principle is the relative unpredictability of the universe's most fundamental particles. It serves as a definition set against the pseudo-determinant ideologies of truth. The questions raised by the uncertainty principle are very relevant to the seemingly visible and noticeable events which occur in the universe. Is it possible to predict future events if enough specialized information is taken into account? For some events, surely: we know the sun will rise tomorrow, and that Halley's comet will return in roughly 76 years. But what about events such as hurricanes, earthquakes, or winning the lottery?

In order to predict the future position and velocity of a particle, one has to be able to measure its present position and velocity accurately. The obvious way to do this is to shine light on the particle. Some of the waves of light will be scattered by the particle and this will indicate its position. However, one will not be able to determine the position of the particle more accurately than the distance between the wave crests of the light, so one needs to use light of a shorter wavelength in order to measure the position of the particle precisely. Now, by Planck's quantum hypothesis, one cannot use an arbitrarily small amount of light; one has to use at least one quantum. This quantum will disturb the particle and change its velocity in a way which cannot be predicted.[24] Moreover, the more accurately one measures the position, the shorter the wavelength of the light that one needs and hence the higher the energy of a single quantum. So the velocity of the particle will be disturbed by a larger amount. In other words, the more accurately you try to measure the position of the particle, the less accurately you can measure the speed, and vice versa.

This in itself signals a fundamental shortfall in any account that tries to make the world completely deterministic: one certainly cannot predict future events exactly if one cannot even measure the present state of the universe precisely.

It is this very fact of unpredictability, as an investigatory postulate, which will also be threaded through and represented. The uncertainty principle closely resembles computer random processes, which in themselves are not random but specific coded signifiers for the next possible outcome. The generative random function has a "seed value" associated with it. Each time you reset the seed, the computer generates new random numbers based upon that seed. A given seed value will always generate the same sequence of random numbers; changing the seed value advances the computer along a different random number sequence.
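The seed behavior described above can be demonstrated directly; a sketch using Python's standard pseudo-random generator (the thesis environment was AutoLisp, so this stands in for its random function):

```python
import random

# The same seed always generates the same 'random' sequence;
# changing the seed advances along a different sequence.
random.seed(42)
first_run = [random.random() for _ in range(3)]

random.seed(42)            # reset to the same seed value
second_run = [random.random() for _ in range(3)]

random.seed(7)             # a different seed, a different sequence
third_run = [random.random() for _ in range(3)]
```

This is exactly the sense in which computer randomness is coded rather than truly random: the outcome is fully determined once the seed is fixed.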

Def. Uncertainty principle: one can never be exactly sure of both the position and the velocity of a particle; the more accurately one knows the one, the less accurately one can know the other.[25]
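The definition above can also be stated as a formal inequality (in modern notation, which the thesis text gives only in words): for position uncertainty Δx and momentum uncertainty Δp,

```latex
\Delta x \,\Delta p \;\geq\; \frac{\hbar}{2}
```

where ħ is the reduced Planck constant; sharpening either measurement necessarily blurs the other.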

Forces

Force-carrying particles can be grouped into four categories according to the strength of the force and the particles with which they interact. It should be emphasized that the division into four classes is man-made, for the representational construction of partial theories.[26]

Gravity- This force is universal; that is, every particle is influenced by the force of gravity according to its mass or energy. Gravity is the weakest of the four forces. The important properties of gravity are that it can act over very long distances and that, within larger bodies such as the planets and the sun, it can add up to produce a significant amount of force.

Electromagnetic Force- This force interacts with electrically charged particles, for example electrons and quarks, but not with uncharged particles. It is about 10^42 times stronger than gravity. There are two kinds of charged particles, positive and negative. The force of attraction is between positive and negative particles.[27]

Weak Nuclear Force- This force is principally responsible for radioactivity, which acts on all matter particles of spin 1/2, but not on particles of spin 0,1,2.[28]

Strong Nuclear Force- This force holds the quarks together in the proton and neutron, and holds the protons and neutrons together in the nucleus of the atom. It is believed that this force is carried by another spin-1 particle, called the gluon, which interacts only with itself and with quarks. The strong nuclear force has a curious property called confinement: it always binds particles together into combinations that have no color. One cannot have a single quark on its own because it would have a color (red, green, or blue). Instead, a red quark has to be joined to a green and a blue quark by a string of gluons (red + green + blue = white). Such a triplet constitutes a proton or a neutron. Another possibility is a pair consisting of a quark and an antiquark (red + antired, or green + antigreen, or blue + antiblue = white).[29] Such combinations make up the particles known as mesons, which are unstable because the quark and antiquark can annihilate each other, producing electrons and other particles.

The importance of this explanation is to show not only the rigid scientific, and perhaps complicated, structure of the system, but also the hierarchical relationships between systems. It also begins to develop the contrived system of rules for a describable operational system- not unlike a department of buildings, allowing and determining regulations and codes.

Bibliography

Abbott, Edwin, Flatland, New York, Penguin Books, 1952.

Brisson, David, W., Hypergraphics, Visualizing Complex Relationships in Art, Science and Technology, Westview Press, Boulder, 1968.

Bryson, N., Vision and Painting: The Logic of the Gaze, Macmillan, London, 1983.

Fuller, R. Buckminster, Synergetics, Explorations in the Geometry of Thinking, New York, Macmillan Publishing Co., 1975.

Gibson, James, The Senses Considered as Perceptual Systems, Houghton Mifflin, Boston, 1966.

Golubitsky, Martin; Stewart, Ian, Fearful Symmetry- Is God A Geometer?, Cambridge, Blackwell Publishers, 1992.

Hawking, Stephen, W., A Brief History of Time, Bantam Books, New York, 1988.

Kappraff, Jay, Connections, New York, McGraw-Hill, Inc., 1990.

Klein, J., Greek Mathematical Thought and the Origin of Algebra, MIT Press, Cambridge, 1968.

Kipnis, Jeffrey, 'Architecture Unbound', pp. 12-23, AA, London, 1985.

Krieger, Martin, Doing Physics, How Physicists Take Hold of the World, Indiana University Press, Indianapolis, 1992.

Krieger, Martin, Marginalism and Discontinuity, Tools for the Craft of Knowledge and Decision, Russel Sage, New York, 1989.

Lerner, Trigg, The Encyclopedia of Physics, VCH Publishers, New York, 1991.

Mitchell, William, The Logic of Architecture, MIT Press, Cambridge, 1990.

Morrison, Foster, The Art of Modeling Dynamic Systems, Forecasting for Chaos, Randomness, and Determinism, John Wiley & Sons, Inc., New York, 1991.

Paulos, John-Allen, Beyond Numeracy, Vintage Books, Random House, 1992.

Rotman, Brian, Signifying Nothing, Stanford University Press, Stanford, 1983.

Sartre, Jean-Paul, Being and Nothingness, Washington Square Press, New York, 1956.

Sklar, Lawrence, Space, Time, and Space Time, University of California Press, Berkeley, 1976.

Stewart, Ian, The Problem of Mathematics, Oxford, Oxford University Press, 1992.

Virilio, Paul, The Lost Dimension, Semiotext(e), Columbia University, New York, 1991.

Woolley, Benjamin, Virtual Worlds, Blackwell Publishers, Oxford, 1992.