Designing for the AI, not just for the user: Rethinking enterprise automation with Lutra

A conversation with Jiquan Ngiam, Co-founder & CEO of Lutra AI, on a new blueprint for low-code automation across enterprise applications

About Lutra & Jiquan Ngiam

We recently sat down with Jiquan Ngiam, Co-founder & CEO of Lutra AI. Leveraging his deep learning experience at Google Brain, early work as Coursera’s first engineer, and collaboration with Andrew Ng at Stanford, he’s reimagining how messy enterprise data meets AI-driven workflows.

Lutra aims to transform the traditional low-code approach by “designing for the AI” first—empowering business users to automate manual, repetitive tasks across multiple applications. In this conversation, we dive into how business process automation is being rebuilt from first principles via code-generation techniques, letting users seamlessly “chat” with their enterprise apps. By codifying procedural knowledge, the AI identifies underlying applications and process steps, builds integrations, and verifies results with human oversight. This new UX paradigm goes beyond static low-code/no-code tools, creating a modern “business logic and integrations layer” for legacy enterprise applications.

Key Takeaways

1) Evolving UX paradigm for workflow automation

“RPA was designed to make coding easy for business users. We are designing to make coding easy for the AI, and AI management easy for the user.”

Traditional business automation tools rely on drag-and-drop workflows. By contrast, chat-based, iterative interfaces offer a new UX paradigm, merging process discovery with code generation. The AI observes user actions, identifies which apps are involved, generates deterministic scripts, and uses reasoning models to refine workflows. This approach slashes development costs, reduces back-and-forth between business and IT, and transforms how enterprises envision low-code/no-code automation.
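
As a rough illustration of that pattern, here is a minimal Python sketch in which the model is asked to emit a reviewable, deterministic script rather than to act on the apps directly. The call_llm stub, the prompt wording, and the connector names are hypothetical stand-ins, not Lutra's actual interface.

```python
# Minimal sketch of the "chat turn -> generated deterministic script" pattern.
# call_llm and the connector names are hypothetical stand-ins, not Lutra's API.

WORKFLOW_PROMPT = """\
You are a workflow compiler. Given the user's request and the connected apps,
return a single Python function named run_workflow() that calls only the
provided connector helpers. Return code only, no explanation.

Connected apps: {apps}
User request: {request}
"""

def call_llm(prompt: str) -> str:
    """Placeholder for any chat-completion API; returns generated source code."""
    # A real implementation would call a model here; this stub keeps the sketch runnable.
    return "def run_workflow():\n    pass  # generated connector calls would go here"

def generate_workflow(request: str, apps: list[str]) -> str:
    """One chat turn: the model emits a script for review, not a one-off answer."""
    return call_llm(WORKFLOW_PROMPT.format(apps=", ".join(apps), request=request))

if __name__ == "__main__":
    script = generate_workflow(
        request="Copy new invoice emails into the 'AP Inbox' spreadsheet",
        apps=["email", "spreadsheet"],
    )
    print(script)   # a human reviews the generated script...
    # exec(script)  # ...and only the approved version is run, repeatably
```

The design choice worth noting is that the artifact the user manages is the generated script itself, which can be inspected, edited, and re-run, rather than an opaque sequence of one-off AI actions.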

2) Redesigning middleware & interoperability

“We design an AI–computer interface layer that handles how APIs are exposed to the model. That design is crucial. If you just throw the raw API at the AI, you’ll get poor results.”

Building a genuinely AI-friendly interface changes how applications communicate behind the scenes. Much like Anthropic’s Model Context Protocol (MCP), this means defining a consistent “language” that teaches the model each API’s format and constraints. The result moves beyond static, point-to-point integrations toward a more flexible, adaptive layer for legacy systems.
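
To make that concrete, here is one shape an AI-facing tool definition can take, loosely following MCP-style tool declarations with a name, a description, and a constrained JSON Schema instead of the raw vendor endpoint. The tool name and fields below are invented for illustration.

```python
# Sketch of an AI-facing tool definition, in the spirit of MCP-style tools:
# a curated name, description, and constrained schema rather than the raw
# vendor API. The tool and its fields are illustrative, not a product schema.

create_invoice_tool = {
    "name": "create_invoice",
    "description": (
        "Create a draft invoice in the accounting system. "
        "Amounts are in minor units (cents). Dates are ISO 8601."
    ),
    "inputSchema": {
        "type": "object",
        "properties": {
            "customer_id": {"type": "string"},
            "amount_cents": {"type": "integer", "minimum": 1},
            "due_date": {"type": "string", "format": "date"},
            "currency": {"type": "string", "enum": ["USD", "EUR", "GBP"]},
        },
        "required": ["customer_id", "amount_cents", "due_date"],
        "additionalProperties": False,
    },
}
```

The curation is the point: units, date formats, and allowed values are spelled out explicitly, leaving the model far less room to misuse the underlying API.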

3) Onboarding the AI via Process Discovery

Foundation models deliver raw text-generation power, but real business value emerges when they are contextually aligned with enterprise data. Through pre-training, fine-tuning, or retrieval-augmented generation (RAG), the model “learns” each company’s data structures and operational rules. This onboarding process, effectively a form of process discovery, helps the AI behave deterministically and accurately, even in the messy reality of day-to-day enterprise environments.
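
As a toy example of the RAG flavor of this onboarding, the sketch below grounds a question in retrieved company rules before the model answers. A naive keyword-overlap retriever stands in for a real embedding index, and the example rules are invented.

```python
# Toy sketch of retrieval-augmented generation over company documentation.
# The rules and the keyword-overlap retriever are invented for illustration;
# a production system would use an embedding index instead.

COMPANY_DOCS = [
    "Purchase orders require a cost center code in the form CC-1234.",
    "Vendors must be onboarded in the ERP system before an invoice is booked.",
    "Expense reports above $5,000 need VP approval.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank documents by naive word overlap with the query."""
    terms = set(query.lower().split())
    ranked = sorted(docs, key=lambda d: len(terms & set(d.lower().split())), reverse=True)
    return ranked[:k]

def build_prompt(question: str) -> str:
    """Ground the model in retrieved company rules before it answers."""
    context = "\n".join(retrieve(question, COMPANY_DOCS))
    return (
        "Company rules:\n" + context +
        "\n\nQuestion: " + question +
        "\nAnswer using only the rules above."
    )

print(build_prompt("What must happen before a vendor invoice is booked?"))
```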

4) Inference-time reasoning for enterprise document extraction

A large portion of enterprise workflows still revolves around scanned or unstructured PDFs. Reasoning models can iteratively refine OCR outputs by reading extracted tables, checking for errors, and verifying fields, significantly boosting data accuracy. This method extends beyond document parsing: it applies wherever data quality is critical, from compliance checks to data migrations—unlocking fresh efficiencies in older processes.
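
A minimal sketch of that extract, verify, and retry loop might look like the following. It assumes an upstream OCR step and a model call (stubbed here as llm_extract) that returns structured fields; the sum-to-total check is just one example of a cheap deterministic verifier.

```python
# Sketch of an extract -> verify -> re-extract loop for scanned invoices.
# llm_extract stands in for an OCR + model call; the deterministic check
# (line items must sum to the stated total) drives the retries.

def llm_extract(ocr_text: str, feedback: str = "") -> dict:
    """Ask a model for {'line_items': [{'amount': ...}], 'total': ...}.
    feedback carries the verifier's complaint on a retry. Stubbed here."""
    return {"line_items": [{"amount": 120.00}, {"amount": 30.50}], "total": 150.50}

def verify(invoice: dict) -> str | None:
    """Cheap deterministic check: do the line items sum to the stated total?"""
    computed = round(sum(item["amount"] for item in invoice["line_items"]), 2)
    if abs(computed - invoice["total"]) > 0.01:
        return f"Line items sum to {computed}, but the stated total is {invoice['total']}."
    return None

def extract_with_retries(ocr_text: str, max_rounds: int = 3) -> dict:
    feedback = ""
    for _ in range(max_rounds):
        invoice = llm_extract(ocr_text, feedback)
        error = verify(invoice)
        if error is None:
            return invoice      # passed the check: accept the extraction
        feedback = error        # feed the discrepancy back to the model
    raise ValueError("extraction kept failing verification; route to human review")

print(extract_with_retries("...raw OCR text of a scanned invoice..."))
```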

5) New model architectures are a rare sighting

Convolutional networks reshaped computer vision; Transformers transformed language modeling. Given their scalable attention mechanisms and capacity for multi-modal inputs, Transformers are likely to remain the backbone for the next few years. Even as new variants emerge, the new era of iPaaS and workflow automation still has ample runway to optimize around Transformer-based approaches.

Thanks for reading AIconomics! Feel free to share the post!
