Building a full featured AI product in 40 hours

I wanted to put #vibecoding to a true test. We often see posts showcasing smaller prototypes using Lovable, Replit, or Vercel v0—but what does it truly take to build a full-featured AI product that you'd comfortably deploy on GCP or AWS and let customers use? With all the talk about developers getting replaced, how true is that really?

Let me share my experience building BlinkSheets in around 40 hours over a few weekends. It's a Google Sheets add-on that seamlessly integrates powerful AI models from OpenAI, DeepSeek, Anthropic, Gemini, and xAI directly into Google Sheets as custom formulae. It also comes with an AI Assistant that provides analysis and enrichment help. Check out https://blinksheets.xyz/ for more details.


AI formulae in Google Sheets using BlinkSheets

As a cherry on top, without any promotion or announcements, the first user discovered BlinkSheets, used it for over an hour, exhausted their credits, and proceeded to make a payment, all in a single sitting.

Background

It all started with observing my wife, who works in academia, frequently toggling between Excel and ChatGPT to streamline operational tasks. Why not simplify her workflow, and earn some brownie points? I pondered integrating AI directly into spreadsheets. While Gemini offers AI integration, why settle for one model when you can have them all? That led to an expanded vision: a comprehensive AI assistant embedded within Google Sheets, capable of performing sophisticated data enrichment tasks. This is the first version, and if it sounds interesting, come say hello.

Architecture

BlinkSheets architecture: Rust, LangDB, Stripe, Redis, Google Sheets, OpenAI, Anthropic, DeepSeek

90% of BlinkSheets was generated using Sonnet 3.5 and Sonnet 3.7, supported by rigorous code reviews and careful prompting. And it's not just backend APIs—Sonnet helped generate the website, all the pages like usage and pricing, the embedded application, Google Sheets add-on code, and even legal documents like terms and privacy (though, of course, those were reviewed 😉).

Backend

Rust for Programming: From my personal experience, code generation in Rust is insanely effective—not only because the generated code is usually precise, but more importantly, because you get instant compilation checks. Rust + Sonnet = Love.

Serverless on GCP: Deploying Rust binaries as serverless functions provides high scalability, instantaneous performance, and efficient resource utilization, perfectly aligning with consumption-based costs.
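
To make this concrete, here is a minimal sketch of what a Rust service bound for Cloud Run can look like. The axum framework, the route, and the request shape are illustrative choices on my part rather than the actual BlinkSheets code; the only real Cloud Run contract is listening on the port passed via the PORT environment variable.

```rust
// Minimal sketch of a Rust HTTP service that Cloud Run can host.
// Assumptions: axum + tokio + serde; the route and payloads are illustrative.
use axum::{routing::post, Json, Router};
use serde::{Deserialize, Serialize};

#[derive(Deserialize)]
struct FormulaRequest {
    prompt: String,
    model: String, // e.g. "gpt-4o" or "claude-3-7-sonnet" (illustrative ids)
}

#[derive(Serialize)]
struct FormulaResponse {
    result: String,
}

async fn run_formula(Json(req): Json<FormulaRequest>) -> Json<FormulaResponse> {
    // The real service would forward this to the AI gateway; here we just echo.
    Json(FormulaResponse {
        result: format!("[{}] {}", req.model, req.prompt),
    })
}

#[tokio::main]
async fn main() {
    // Cloud Run tells the container which port to listen on via PORT.
    let port = std::env::var("PORT").unwrap_or_else(|_| "8080".into());
    let app = Router::new().route("/v1/formula", post(run_formula));
    let listener = tokio::net::TcpListener::bind(format!("0.0.0.0:{port}"))
        .await
        .expect("failed to bind");
    axum::serve(listener, app).await.expect("server error");
}
```

A small release-mode binary like this starts quickly, which is what makes the consumption-based serverless model feel instantaneous in practice.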

Distri Agent Framework: Lightweight, self-crafted framework designed specifically for agent-based workflows.

  • Utilizes the Model Context Protocol via the custom async-mcp library, ensuring structured and efficient communication between models and agents (a sketch of the message shape follows this list).
  • Note: you can use any agent framework you personally prefer. In many cases, a simple API wrapper is sufficient, especially when LLM observability is plugged into a service like LangDB.
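
If MCP is new to you, the sketch below shows the general shape of the JSON-RPC 2.0 "tools/call" request an agent sends to a tool server. The tool name and arguments are made up for illustration and are not the actual tools BlinkSheets exposes through async-mcp.

```rust
// Sketch of an MCP-style tool invocation: a JSON-RPC 2.0 "tools/call" request.
// The tool name and arguments are hypothetical, not the real BlinkSheets tools.
use serde_json::json;

fn main() {
    let request = json!({
        "jsonrpc": "2.0",
        "id": 1,
        "method": "tools/call",
        "params": {
            "name": "read_sheet_range",               // hypothetical tool name
            "arguments": { "range": "Sheet1!A1:B10" } // hypothetical arguments
        }
    });
    // The agent sends this over the MCP transport (stdio, HTTP, ...) and the
    // tool server replies with a JSON-RPC result containing the tool output.
    println!("{}", serde_json::to_string_pretty(&request).unwrap());
}
```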

Redis Storage:

  • Redis is a perfect fit here, providing the fast reads and writes needed to maintain agent state, user sessions, conversation histories, and summaries (see the sketch below).
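
As a rough sketch, storing conversation history in Redis from Rust can look like the snippet below, using the redis crate with its tokio support. The key layout and TTL are assumptions for illustration, not the actual BlinkSheets schema.

```rust
// Sketch: appending a chat message to a per-session history list in Redis.
// Assumptions: redis crate with the "tokio-comp" feature; key scheme and TTL
// are illustrative, not the real BlinkSheets layout.
use redis::AsyncCommands;

async fn append_message(
    client: &redis::Client,
    session_id: &str,
    role: &str,
    content: &str,
) -> redis::RedisResult<()> {
    let mut conn = client.get_multiplexed_async_connection().await?;
    let key = format!("session:{session_id}:history"); // hypothetical key scheme
    let entry = serde_json::json!({ "role": role, "content": content }).to_string();

    // RPUSH keeps messages in chronological order; EXPIRE caps how long an
    // idle session's history sticks around.
    let _: () = conn.rpush(&key, entry).await?;
    let _: () = conn.expire(&key, 60 * 60 * 24).await?;
    Ok(())
}
```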

Distri + Async MCP + Redis + LangDB is used to power the agentic workflow.

LangDB Integration

LangDB (https://langdb.ai) serves as a critical piece, acting as an AI gateway that streamlines access to multiple LLM providers like OpenAI, Anthropic, and DeepSeek while offering the LLM features you would not want to build yourself (a minimal call sketch follows this list):

  • Unified LLM interface simplifying integration complexity.
  • Request caching.
  • Rate-limiting for optimal cost efficiency.
  • Built-in observability to analyze interactions, aiding iterative improvements.
  • Pricing tier implementations and comprehensive cost control measures, essential for predictable business operations.
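
To give a feel for what the unified interface buys you, here is a sketch of calling different providers through a single OpenAI-compatible chat completions endpoint. The base URL and model identifiers below are placeholders; check the LangDB docs for the real values.

```rust
// Sketch: one request shape for many providers via an OpenAI-compatible gateway.
// The endpoint URL and model ids are placeholders, not the real LangDB values.
use serde_json::json;

async fn ask(model: &str, prompt: &str) -> Result<String, reqwest::Error> {
    let body = json!({
        "model": model, // e.g. "gpt-4o", "claude-3-7-sonnet", "deepseek-chat" (illustrative)
        "messages": [{ "role": "user", "content": prompt }]
    });

    let resp: serde_json::Value = reqwest::Client::new()
        .post("https://gateway.example.com/v1/chat/completions") // placeholder URL
        .bearer_auth(std::env::var("LANGDB_API_KEY").unwrap_or_default())
        .json(&body)
        .send()
        .await?
        .json()
        .await?;

    Ok(resp["choices"][0]["message"]["content"]
        .as_str()
        .unwrap_or_default()
        .to_string())
}
```

Because every provider sits behind the same request shape, swapping the model string is all it takes to compare OpenAI, Anthropic, or DeepSeek on the same formula.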

Why These Design Choices?

  • Rust for Performance: Tiny, performant binaries translate directly into rapid, responsive interactions crucial for spreadsheet integrations.
  • Serverless & CI/CD Automation: Ensures rapid iteration, simplified operations, minimal maintenance overhead, and immediate scalability as demand fluctuates; deployed on GCP Cloud Run.
  • Redis via Upstash: Fast, reliable memory and user session management, reducing latency for the frequent read/write operations inherent to conversational AI.
  • LangDB for Observability and Cost Control: Simplifies complexities associated with multi-model AI infrastructure, enhances transparency in model interactions, and significantly reduces operational burdens.

How Viable is Vibe Coding?

Vibe coding can feel magical on its own, but pairing it with thoughtful code reviews and consistent diff checks levels up the game. Compiled languages like Rust add another layer of sanity, even as people debate whether they are the "best" choice. Sonnet 3.7 is exceptional at translating vibe into reliable code, accelerating productivity while preserving clarity.

AI is making engineers a lot more efficient. This means more products, more innovation, and fierce competition. Code is cheap.

Let the future of coding be intuitive, joyful, yet precise.

Tarun Kumar Pasupuleti

Product Management | Credit Products Data Platform

Was/is Stripe always your go-to payments solution? Can you call out a few reasons why?

Vivek G.

Co-Founder at LangDB

Here is the link to Blink Sheets (https://blinksheets.xyz/)
