Large Language Models in Architecture & Design (Impact & Use Cases)

Welcome to this week’s edition of Architecture Insights.

Large Language Models (LLMs) have major implications for the profession of architecture, and this will become increasingly evident as practices incorporate them into daily operations. They are the foundation of AI tools like ChatGPT, and closely related generative models power Adobe's Firefly and Midjourney; soon they may underpin all of our tools.

As always, here is this week’s latest news in the AI world that impacts architects and designers.

News & Updates

1. Google unveiled Genie, an AI that can turn text or image prompts into 2D video games.

Genie is another impressive product of AI. It is the world's first text-to-interactive-world AI model that is learning how to create realistic virtual worlds. To achieve this, the system must have a strong understanding of how the real world works.

2. Adobe announces an AI music/audio tool.

This will be a platform that can generate audio from text descriptions, and allow users to customize it. When it launches, users will be able to adjust tempo, intensity, repeating patterns, and structure, extend tracks, and remix music all from one platform.

Large Language Models

Why are they important?

In the context of architecture, think of LLMs as a massive digital brain that has been fed a large amount of architectural and design knowledge: site plans, images, historical styles, landscape design principles, and material information.

The model stores and generalizes over this information, and when prompted it can answer a specific request from the user based on that data.

Tools like ChatGPT interact with the knowledge base a large language model learned during training; image tools like Midjourney and Stable Diffusion work the same way with generative models trained on image data.

ChatGPT: You can ask "What are some sustainable design strategies for a desert community center?" It will rattle off ideas drawn from its training data on sustainable design and community centers.

Midjourney/Stable Diffusion: Describe a "zen garden with a basalt water feature," and, similar to the above, Midjourney will tap into its training data on zen gardens and basalt water features to get you your results.
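To make the prompt-to-answer flow above concrete, here is a toy Python sketch. The `KNOWLEDGE` dictionary is a hypothetical stand-in for a model's trained parameters; a real LLM generalizes from patterns rather than looking up stored text, but the input/output shape is the same: a natural-language prompt goes in, relevant ideas come out.

```python
# Toy stand-in for an LLM's trained knowledge. In a real model this
# information lives in learned weights, not a lookup table.
KNOWLEDGE = {
    "sustainable design": [
        "passive cooling and shading",
        "rainwater harvesting",
        "locally sourced, low-embodied-carbon materials",
    ],
    "zen garden": [
        "raked gravel representing water",
        "basalt stones as focal features",
    ],
}

def answer(prompt: str) -> list[str]:
    """Return design ideas whose topic keywords appear in the prompt."""
    ideas = []
    for topic, suggestions in KNOWLEDGE.items():
        if topic in prompt.lower():
            ideas.extend(suggestions)
    return ideas

print(answer("What are some sustainable design strategies for a desert community center?"))
```

The point of the sketch is the interface, not the mechanism: your prompt is matched against what the system already "knows," which is why the quality and scope of the training data determine the quality of the answer.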

LLMs vs Design processes

As a designer, you need to know the systems and processes of design to get the best result. The same applies to AI: you need to understand how the model is trained and what it is trained on to get the best results.

Take Stable Diffusion, a generative image model popular in the image generation space. By building on Stable Diffusion's existing open model, users have created tools such as ControlNet (arguably one of the most fine-tuned and capable image editing and generation extensions) purely through customization of the existing model.

As time goes on, a designer’s job in creating project renderings may increasingly become one of “prompting” the AI rather than manually making edits.

Why every firm will have its own custom LLM

Do you have a meeting that you need to organize, coordinate, and prepare summaries for afterward?

Here are some of the tasks you may be responsible for:

  • Make a project schedule

  • Send meeting invites

  • Create a PowerPoint presentation

  • Take notes during meetings

  • Summarize notes after meeting

  • Send meeting summary to attendees

Some of these tasks are shorter than others, but all consume valuable time and energy in a project manager's day.

If the PM had access to an AI chatbot connected to a trained model on the other end, they would simply need to prompt it to carry out each of those tasks in turn.

Keep in mind that this LLM is now trained and has knowledge of past projects, documents, contacts, and relevant files it was given when trained.

Beginning with the following prompt: “Send an email meeting invite to x,y,z for Friday at 2:30PM EST, subject: Kickoff Team meeting”.

Not sure how to structure the meeting? Try a prompt like: “I am running a kickoff meeting for a large master planning project and need a meeting agenda created along with key points I should touch on. After reviewing your work I will need everything to be generated into a PowerPoint presentation following our standard template. Include a varying range of site images in the presentation from the images project folder.”
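In practice, prompts like the invite above can be generated from structured project data rather than typed by hand each time. The sketch below is a hypothetical template helper; the assistant on the receiving end, and its ability to act on the instruction, are assumptions.

```python
# Hypothetical prompt template: turns structured meeting details into the
# kind of natural-language instruction an LLM assistant would receive.
def invite_prompt(attendees, day, time, subject):
    """Build a meeting-invite prompt from meeting details."""
    names = ", ".join(attendees)
    return (
        f"Send an email meeting invite to {names} "
        f"for {day} at {time}, subject: {subject}"
    )

print(invite_prompt(["x", "y", "z"], "Friday", "2:30PM EST", "Kickoff Team meeting"))
```

The same templating idea extends to agendas, summaries, and slide requests: the PM's tooling fills in the specifics, and the model handles the task.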

During the meeting, you can use a tool like Otter.ai to listen and take notes, then create a summary and send it to all involved parties afterward. Like other AI tools, Otter relies on language models to perform these functions.

Brainstorm a task you think could be automated; odds are it is one an LLM can complete. Our jobs will not become scarce, but they will require a major shift in perspective on day-to-day operations.

For exploring LLMs

A platform like Hugging Face is a great place to browse the world of custom-trained models, from text-to-text and text-to-image to text-to-video, and more.

Cohere is an AI company that makes LLMs built to be trained and deployed into specific industry products.

In an entirely different discipline, healthcare, there are tools like Suki.ai, an AI assistant that Suki says generates an average of $54,000 in increased revenue per user by helping doctors save time through automating simple tasks.

Important notes

Custom LLMs will eventually become tools in your kit, just like CAD software. They are evolving at breakneck speed.

My opinion? LLMs aren't a threat to architects, they're power-ups. Fear of new technology is understandable, but those who jump in and experiment are the ones who'll shape how these tools change the architecture and landscape design fields for the better.

AI Image of the week

Thank you for reading this week’s issue, check past issues here. Share this newsletter with colleagues, friends, or anyone interested in the combined world of architecture and artificial intelligence.

Until next Friday,

A.I.
