More AI Engineering Podcast episodes


From Blind Spots to Observability: Operationalizing LLM Apps with OpenLit

Published 15 Feb 2026

Duration: 00:50:36

Current AI app development trends are hindered by challenges such as vendor lock-in, unclear value creation, and limited understanding of AI behavior, but open-source tools like OpenLit can help address these issues.

Episode Description

Summary: In this episode of the AI Engineering Podcast, Aman Agarwal, creator of OpenLit, discusses the operational foundations required to run LLM-powe...

Overview

The podcast explores current trends in AI app development, noting a rapid increase in the number of AI applications being created without a clear purpose or strategy. Key challenges include a lack of understanding of AI behavior in production environments, difficulty in identifying value-generating aspects of AI systems, and the risk of vendor lock-in due to reliance on proprietary tools and formats. The discussion highlights the importance of essential requirements for building effective production AI apps, including prompt management, observability, cost tracking, and maintainability.
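Among the requirements listed above, cost tracking is the most mechanical to illustrate. The sketch below is a minimal, self-contained example of per-request LLM cost accounting; the model name and per-token prices are hypothetical placeholders, not real vendor pricing, and this is generic Python rather than any tool's API.

```python
# Illustrative per-request LLM cost tracking.
# The model name and prices below are hypothetical placeholders.

# USD per 1K tokens, as (input_rate, output_rate)
PRICES = {
    "example-model": (0.0005, 0.0015),
}

def request_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Return the estimated USD cost of a single LLM request."""
    in_rate, out_rate = PRICES[model]
    return (input_tokens / 1000) * in_rate + (output_tokens / 1000) * out_rate

# Example: 1200 prompt tokens and 300 completion tokens
cost = request_cost("example-model", 1200, 300)
```

In production, an observability layer would attach a figure like this to every traced request so that spend can be aggregated per model, per feature, or per user.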

Open source solutions are positioned as critical tools for addressing these challenges, with OpenLit specifically noted as a key platform. It offers features such as centralized configuration, zero-code observability through a Kubernetes operator, and support for tracking and evaluating large language model responses. The platform enables the export of traces to various tools and provides observability across entire AI workflows, including vector and content stores. OpenLit has evolved from a telemetry tool into a comprehensive AI engineering platform, focusing on user experience, flexible integration, and future improvements in context management and functionality. Additional challenges discussed include managing memory and avoiding context overload in AI systems, as well as the need to balance the flexibility of open-source solutions with the robust features required by enterprise environments.
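To make the observability discussion concrete, here is a minimal sketch of the kind of data a tracing layer records for each LLM call: an operation name, latency, and status. This is plain Python for illustration only, not OpenLit's actual API; OpenLit instruments calls automatically ("zero-code") and exports OpenTelemetry traces to external backends rather than keeping them in a local list.

```python
# Illustrative sketch of per-call span capture, NOT the OpenLit API.
import time
from functools import wraps

SPANS = []  # in a real system, spans would be exported (e.g. via OTLP)

def traced(name):
    """Record name, duration, and status for each call to the wrapped function."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            start = time.perf_counter()
            status = "error"
            try:
                result = fn(*args, **kwargs)
                status = "ok"
                return result
            finally:
                SPANS.append({
                    "name": name,
                    "duration_s": time.perf_counter() - start,
                    "status": status,
                })
        return wrapper
    return decorator

@traced("llm.completion")
def fake_completion(prompt: str) -> str:
    # Stand-in for a real model call
    return "response to: " + prompt

fake_completion("hello")
```

A zero-code approach moves this wrapping out of application code entirely, which is what the Kubernetes operator mentioned above provides: the instrumentation is injected at deploy time instead of being written by hand.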

Recent Episodes of AI Engineering Podcast

27 Jan 2026 GPU Clouds, Aggregators, and the New Economics of AI Compute

Bruin, an open-source AI/ML data infrastructure framework, addresses GPU cloud market dynamics, technical challenges like Kubernetes portability and data gravity, and evolving trends in LLM tooling, infrastructure gaps, and hardware competition.

20 Jan 2026 The Future of Dev Experience: Spotify's Playbook for Organization-Scale AI

Spotify's approach to engineering and AI integration centers on distributed architecture, collaborative tools like Backstage, and monorepo standardization. The episode covers AI agents for code generation and operations, challenges in cross-team collaboration and reliability, and expanding AI beyond coding into product development and documentation, all while balancing innovation with rigorous testing and human oversight.
