Working with AI, not for it: a conversation with Amirsalar
As an AI-native company building the operating system for real estate development, our engineers were among the first in our industry to put large language models into daily use. What is more interesting is not that we adopted early, but how we use what we adopted. We work with AI, not for it. That distinction shapes the way the OMRT / hub gets built.
To understand what that means in practice, we sat down with Amirsalar, computational engineer at OMRT, who has been one of the most active users of AI inside the team.

A technical environment without easy answers
Most software work has a head start. If you are stuck, someone has probably solved your problem already and posted the answer somewhere. Computational engineering for real estate looks nothing like that.
"A lot of what we do has not been done before," Amirsalar says. "You search and nobody has the same issue, because we are probably the first ones trying it."
The team writes bespoke logic in Grasshopper, Python and C#. They build the engines behind our variant studies, our program distribution algorithms, and the data layer that makes the OMRT / hub responsive in real time. There is no off-the-shelf code for any of it. Every project demands custom tooling, and the tooling has to stand up to scrutiny from architects, contractors, municipalities and investors.
This is the environment AI walked into.
What it actually unlocks
Amirsalar gives a concrete example that captures the point.
Grasshopper, the parametric design tool used across our industry, cannot handle real loops. Loops matter enormously for program distribution, the work of figuring out which mix of apartments, commercial space and amenities maximises value on a given plot. You need to test thousands of combinations to know which one wins.
"Before, we had to fake the loop or write the code ourselves. Two days of work, plus debugging, plus hoping I had not made a typo. Most of the time we just used brute force, which means the algorithm makes one decision and then lives with it, without knowing the consequences. To have the best outcome, we need to loop through all the possibilities and pick the one that matches our desired outcome the best."
Work that used to take a week now takes a day. And it is not just faster.
"My decisions are not linked to each other anymore. I can look at every possible case and choose the strongest one. More options, more accurate, fine-tuned per project."
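The idea Amirsalar describes can be sketched in a few lines: enumerate every feasible program mix for a plot, score each one, and keep the strongest, rather than committing to one greedy decision at a time. The value function, the constraint, and all the numbers below are invented for illustration; the real OMRT tooling is far more involved.

```python
from itertools import product

# Illustrative weights and plot size (not real OMRT values)
APARTMENT_VALUE = 3.0   # value per unit of floor area
COMMERCIAL_VALUE = 4.5
AMENITY_VALUE = 1.0

def score(apartments: int, commercial: int, amenities: int) -> float:
    """Toy value function for a candidate program mix."""
    return (apartments * APARTMENT_VALUE
            + commercial * COMMERCIAL_VALUE
            + amenities * AMENITY_VALUE)

def best_mix(plot_area: int):
    """Loop through every feasible mix and return the strongest one,
    instead of making one decision at a time and living with it."""
    best, best_score = None, float("-inf")
    for apartments, commercial, amenities in product(range(plot_area + 1), repeat=3):
        if apartments + commercial + amenities != plot_area:
            continue  # toy constraint: the mix must use the whole plot
        s = score(apartments, commercial, amenities)
        if s > best_score:
            best, best_score = (apartments, commercial, amenities), s
    return best, best_score

mix, value = best_mix(10)
print(mix, value)
```

Under these toy weights the all-commercial mix wins, which is exactly the point: only by looking at every case do you know which decision is strongest, instead of inheriting the consequences of an early greedy choice.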
The wider effect is that the team can now build properly bespoke tooling on the timelines our clients actually have. Logic that used to live as 300 floating components in a Grasshopper file, fragile and impossible to reuse, can now be packaged into one reusable component that another engineer can pick up and adapt. That is what compounding capability looks like inside an engineering team.
The boundaries, deliberately drawn
This is where the OMRT philosophy is sharper than most.
Amirsalar is openly sceptical of engineers who plug AI straight into their tools and never look at the code that comes back. He sees it as a problem in the wider industry, not just inside our team.
"You type something, it writes something, it lands in your file, you see a result. Do you even know how it works? No. That is scary. You are making yourself irrelevant, and at some point anyone can do that."
His rules for the team are deliberately old-fashioned. Read every line. Set up the inputs yourself. Name them. Understand the function before you trust it.
"It is boring," he admits. "But you will not be surprised when something does not work, because you know exactly how it behaves."
The deeper principle behind those rules is the one worth quoting in full.
"It used to be about how to make something. Now it is about what to make. The how, you can figure out. The what is still on you."
That is the boundary. AI takes the boring part of the work, the writing, the commenting, the long stretches of code that nobody enjoys producing. The thinking, the structure, the direction, the judgement about what is worth building, all of that stays with the engineer. An OMRT engineer who hands all of that over to a model is no longer doing OMRT engineering.
"We are not writers. We are the people thinking about what gets written. That is the part that matters."
What this looks like in numbers
The output of working this way is hard to argue with. In less than a month of using AI seriously, Amirsalar built around twenty Grasshopper components he had been wanting to build for two years.
"I did all of them while I had crazy deadlines. Without this tool I could not have said yes to those deadlines. It runs in the background, turning my logic into a working production tool. So now I can customise tooling per project, not despite the deadline but because of it."
That is the line worth holding onto. Customised tooling per project, on the timelines our clients actually have, was not realistic a year ago. It is now. And it is happening inside an engineering culture that refuses to let the tools do the thinking.
This is what AI-native means inside the OMRT / hub. Not a chatbot bolted onto a product. Not engineers outsourcing their judgement to a model. A team at the frontier of what AI can do for real estate development, with the discipline to stay in control of what they ship.