Programming Sustainability + Strategies

If we step back and examine the architecture profession’s current relationship with computational assistance as a whole, we can see a progressive trend toward a new paradigm, one that has only recently begun moving with real momentum. The cutting edge of software adoption has passed from CAD into the multidimensional world of BIM. And yet, as much as our livelihoods now depend upon specialized software applications, architects have been reluctant to delve into their inner workings.

This hesitancy is understandable. Few professions require as much specialized knowledge in as many distinctly separate fields as architecture; computer expertise can seem a necessary evil tacked onto an already overwhelming knowledge base. Our pride in hand-wrought creativity has left many architects wary of the perceived sterile precision of technology and hesitant to lean too heavily on computational processes during the design phase. At the heart of this mistrust is the multifaceted meaning of the word “design”. We are accustomed to maintaining active control of all situations, and so “design” can imply a very literal (and very linear) journey from unsolved to solved. That meaning contrasts with the relatively new definition associated with computational design, in which we specify criteria for a desired outcome even when the form that outcome will take is unknown.

Over the last decade, it is the introduction of these unknowns that has met the most resistance, as seemingly random and infinitely complex forms have appeared at the forefront of avant-garde architecture. Forms derived from geometric extrapolations of biological processes are fascinating proof-of-concept experiments and research topics, but in their current state they may be better suited to elaborate artistic installations than to templates for the contemporary built environment. However, the same processes that can manipulate such huge quantities of data into nearly unimaginable forms have a much more immediate and relevant application in performance-based computational design. This approach is driven not by preconceived formal characteristics but by performance data, with form emerging as a byproduct of the data that generates it. The product is then not merely a formal exploration, but can expand to include daylighting control, energy analysis, community analytics, and planning operations, to name a few.
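As a purely illustrative sketch of this data-first mindset (the design-day sun altitudes, window height, and the simplifying assumption that the sun directly faces each window at its design hour are all invented for this example, not project data), consider letting a solar criterion, rather than a formal preference, size a horizontal shade:

```python
import math

# Performance-driven rule of thumb: size a horizontal overhang so it blocks
# direct sun at a chosen design condition. The dimension is a byproduct of
# the solar data, not a formal decision made in advance.

def shade_depth(window_height_m, solar_altitude_deg):
    """Depth of a horizontal overhang that fully shades a window of the
    given height when the sun sits at the given altitude angle."""
    return window_height_m / math.tan(math.radians(solar_altitude_deg))

# Hypothetical design-day sun altitudes per facade orientation (degrees).
design_altitudes = {"south": 72.0, "east": 35.0, "west": 30.0}

for orientation, altitude in design_altitudes.items():
    depth = shade_depth(window_height_m=1.5, solar_altitude_deg=altitude)
    print(f"{orientation}: overhang depth = {depth:.2f} m")
```

The point is the inversion: the designer fixes the performance criterion (no direct sun at the design hour) and lets the dimension fall out of the data, here immediately revealing why low east and west sun demands a different shading strategy than a simple overhang.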

Our industry is currently witnessing a new romance with technology in the form of fabrication, simulation, parametric modeling, and building information modeling. These ideas are not truly new (similar manufacturing revolutions occurred a generation ago in the automotive and aerospace industries), but the methods are new to the AEC industry and are rapidly gaining mainstream acceptance. A confluence of factors has brought us to this point, but paramount to this shift in practice has been a demand for system-oriented building designs.

We are acutely aware of the built environment’s inordinate reliance upon fossil fuels; buildings account for an estimated 40% of overall energy use, and we must prepare for a reality in which every expenditure of energy is precious. HKS’ commitment to the 2030 Challenge is proof of this acknowledgement, but current sustainable endeavors will only get us so far toward that goal. Our structures and occupants must actively work together as singular, cohesive wholes in order to optimize the use of resources while maintaining human comfort. Hyper-aware structures will exist as a symbiotic entanglement of building systems, requiring the real-time processing and manipulation of staggeringly large and complex data sets.
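In miniature, such a symbiotic system is a loop of readings and responses. The sketch below is a toy illustration only; the sensor fields, thresholds, and setpoints are invented for the example and stand in for the far richer data streams a real building would reconcile:

```python
from dataclasses import dataclass

# A toy rule-based control loop hinting at how a "hyper-aware" building
# might balance energy use against occupant comfort from live readings.
# All names and numbers here are assumptions, not a real control spec.

@dataclass
class ZoneReading:
    air_temp_c: float     # interior air temperature
    solar_gain_w: float   # solar load through the glazing
    occupied: bool        # occupancy sensor state

def blind_position(reading: ZoneReading, setpoint_c: float = 24.0) -> float:
    """Return a blind position from 0.0 (fully open) to 1.0 (fully closed)."""
    if not reading.occupied:
        # Unoccupied: close blinds whenever solar gain would add cooling load.
        return 1.0 if reading.solar_gain_w > 200 else 0.0
    if reading.air_temp_c > setpoint_c and reading.solar_gain_w > 400:
        return 0.8  # occupied and overheating: shade, but preserve some view
    return 0.2      # default: mostly open for daylight and outlook

print(blind_position(ZoneReading(air_temp_c=26.5, solar_gain_w=550, occupied=True)))
```

Scale this handful of rules up to thousands of sensors, dozens of interacting systems, and predictive rather than reactive logic, and the data-processing challenge described above comes into focus.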

Successful manipulation of similar data sets can also control digital fabrication methods, power parametric planning tools, generate environmentally responsive building geometry, and drive a myriad of other innovative operations. We have reached the point where we can design systems of elements and parameters that work in concert to create adaptable architectural ecosystems rather than static monuments to the present worldview.

Key to the development and implementation of these techniques is a solid grasp of the mechanics of computer programming. To be considered literate in the near future, one may well need the ability to read and write the language of our machines. It should then come as no surprise that academia has pushed persistently in recent years to introduce scripting into design curricula at universities around the world. Fluency in this new architectonic language will allow us to work directly with the big data behind our analyses, carry custom components through to fabrication, and enhance the performance of our buildings.

One of the earliest benefits of merging design and programming training has been a steady reclamation of our digital toolsets. As hand-drawn production drafting gradually gave way to CAD, the maintenance of a sizeable part of our craft was relinquished to software developers, leaving us largely dependent upon an outside party to determine which features were important. Only the most influential firms have had any persuasive say in the composition of our software, and even with this influence we have sometimes been forced to wait through many product release cycles before vital functions could be added to our workflow.

Efforts to reclaim our tools have started small, but are rapidly gaining momentum. Through the use of APIs (Application Programming Interfaces) we have been able to create custom extensions that enhance a host program’s functionality. This allows us to sidestep the software industry’s notoriously slow implementation schedule and empowers us to create custom solutions for our needs. Because the architecture we create is bound to the medium in which we choose to work, this matters: it affords the architect the ability to break free from the mold in which off-the-shelf software packages are cast. The visual programming environment of Grasshopper has lowered the barrier to entry into the world of scripting by allowing us to experiment with code in a more fluid, intuitive way, one that comes naturally to a field dependent upon visual and spatial intelligence.

More important, however, is the camaraderie of the architectural scripting community that has sprung up around open programming environments, where people freely exchange new ideas and methods. Bits of code are posted, analyzed, rebuilt, and shared again for the benefit of the entire community rather than the financial benefit of a few developers. We are now empowered to develop our own tools instead of waiting for third-party implementation, and because of the increased focus on programming education, we are amassing more staff who are literate in both architectural design and our newly adopted languages of logic and mathematics.
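To make that lowered barrier concrete, the fragment below is the kind of first exercise a designer might write inside a Grasshopper Python (GhPython) component. The input names (crv, attractor, count) are assumptions chosen for this sketch and would be defined as component inputs; the rhinoscriptsyntax calls are the standard Rhino scripting library:

```python
# Runs inside a Grasshopper Python component, where crv (a curve),
# attractor (a point), and count (an integer) arrive as component inputs.
# Places circles along a curve and sizes each by its distance to an
# attractor point -- a classic introduction to attractor-driven geometry.
import rhinoscriptsyntax as rs

circles = []
points = rs.DivideCurve(crv, count)    # sample points along the input curve
for pt in points:
    d = rs.Distance(pt, attractor)     # distance to the attractor drives size
    circles.append(rs.AddCircle(pt, 0.2 + d * 0.05))

a = circles                            # 'a' is the component's default output
```

Wired to sliders, count and the attractor location become live parameters, and the geometry regenerates instantly as they change; this is exactly the fluid, visual experimentation described above.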

So where does this leave us? It is clear that the future will require the ability to competently create, process, and manage massive quantities of data. It is also clear that these skills must supplement the tried-and-true fundamentals of architectural design if we are to remain competitive as a firm in the years to come. Our ability to create data already outpaces our ability to understand it, and the innovative transformation of data into useful information will be the goal of a new generation of creative professionals. We stand at an exciting crossroads straddling two eras and, for the first time in history, are poised to fundamentally rethink the ways in which buildings are designed, constructed, and occupied. Our commitment to mastering the digital controls before us will lead to systems and processes that allow us to reestablish control over our intellectual product, recapture services that have been subdivided and subcontracted out over the years, and ultimately arrive at a more sustainable environment through the implementation of big data.

It is the charge of LINE to understand, develop, and implement these strategies across sectors within HKS. As the 2030 Challenge deadline draws ever closer, it will become increasingly important to define and implement sustainable solutions for our built projects. This will likely come in the form of increased simulation technology, algorithmic and parametric modeling, and energy analysis. As the encyclopedia of available software packages expands and their use becomes standard, achieving streamlined workflows and integrated models without loss of data will require that we become fluent in programming languages and adept at customizing those packages to meet our growing demands. With the introduction of LINE and HKS’ commitment to research, HKS is poised to be at the forefront of development in these areas.