Val Town Release Notes

Last updated: Apr 3, 2026

  • Apr 2, 2026
    • Date parsed from source:
      Apr 2, 2026
    • First seen by Releasebot:
      Apr 3, 2026

    Val Town

    Changelog — Apr 2, 2026

    Val Town improves its AI coding experience with a better Townie prompt, fullscreen mode, clearer context controls, and MCP updates for Claude and Codex. It also adds std/oauth authentication, migrates to BetterAuth, upgrades Deno, and ships dashboard, email, and SQLite workflow improvements.

    In this changelog: improving Townie and the Val Town MCP server; a std/oauth library for wrapping vals with authentication; migrating auth providers from Clerk to BetterAuth; upgrading to Deno 2.7.5; and much more.

    Editor's note: We've been deconstructing the old Newsletter into discrete parts: this changelog (new), the Talk of the Town series highlighting community vals, Steve's public investor updates, and perhaps a public roadmap soon. The Newsletter itself is a periodical rounding up the best of what we've written each month.

    Making Val Town delightful for both you and your LLM

Val Town exists foremost for you, the developer, which for many increasingly means your agent. Much of our recent energy has gone into improving Townie and our MCP server to improve DX both in the browser and wherever you get your LLMs. We've appreciated all the positive feedback on Townie and the MCP server, but more importantly, please keep sending us bug reports and friction points so that we can improve AX while preserving great DX, so you can improve UX.

    Better Townie system prompt

    We've updated Townie's system prompt with more platform guidance and opinionated patterns on building vals, like:

    • Wrapping vals in std/oauth for authentication
    • Runtime constraints/gotchas
    • Preferring incremental progress over one-shot implementation
    • Testing vals to validate generated code
    • When to use different frontend app architectures
    • How to integrate external services (Slack, Google, Stripe, etc.)

    The system prompt doubles as a dense but high-yield guide you can read to learn how the Val Town platform works. Thanks to Michael from Sourcebot for the nudge here!

    Preventing Claude Code hallucinations with the Val Town MCP server

    Claude Code has been recommending Val Town to some users as a handy place to deploy the code it spits out (we're not doing any fancy AEO, in case you're wondering). But when using our MCP server with Claude, it's been hallucinating how to work with Val Town. We updated instructions in the Val Town MCP server with more platform/runtime context that MCP clients previously had to explicitly request.

    Bringing Val Town to OpenAI Codex

    Like Claude Code, Codex + Val Town MCP is an increasingly popular workflow for developers on Val Town. We updated our docs instructions for getting set up. Thanks Ash, Arti, and David M. for feedback on making the Codex auth flow smoother.

    Fullscreen Townie

    Townie now has two modes:

    1. Sidebar (status quo)
    2. Fullscreen (new)

    Fullscreen UX is more like Claude Code or Codex where you're not necessarily keeping a close eye on the code itself, and sidebar Townie is more like Copilot or Cursor chat where you're reviewing code changes. (Why do all the AI coding tools start with C?)

    Explicit context in Townie conversations

Previously, Townie knew which val you were currently viewing, but that wasn't obvious to you as a user; you were on the same page without knowing it. Now you can see and explicitly choose which vals to attach to Townie's context window.

    Townie can read blob content

    Townie could previously only read blob storage metadata: key, size in bytes, and last modified timestamp. Now Townie (and any MCP client) can also read blob content via an updated MCP tool.

    Townie onboarding for new users

    Onboarding for new users now routes to a Townie prompt box, which can be skipped to instead create a val manually (like the legacy onboarding flow).

    Improving Townie reliability

    We fixed a few corner cases in Townie where (1) responses manually stopped in progress by the user would disappear from the chat, (2) duplicate messages would appear, and (3) you could still approve tool calls after stopping a response in progress.

    CLI improvements

The vt CLI is a third way to use LLMs with Val Town (we recommend Townie, then MCP, then the CLI, in that order). Thanks to JP from Datasketch and Halah from Swayable for bringing a couple of vt login friction points to our attention.

    Add a secure authentication layer to your vals

Wrap any val with the std/oauth middleware from the Val Town standard library. It authorizes users based on their membership in your Val Town org (i.e. your teammates) or by their username or user ID. Or simply require that visitors have a Val Town account, to keep out spammers. It's sort of like Google OAuth meets Cloudflare Zero Trust, but for Val Town. This is a major unlock for internal tools that need to sit behind an authentication wall. Thanks to Cole Lawrence, who pushed for this feature!
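To make the wrapping pattern concrete, here's a self-contained sketch. This is not the real std/oauth API: `withAuth`, the options shape, and reading the user from a request header are illustrative assumptions; the actual middleware resolves the user from a Val Town OAuth session.

```typescript
// Conceptual sketch of auth-gating an HTTP val, in the spirit of std/oauth.
// NOT the real std/oauth API: `withAuth`, the options shape, and the
// header-based user lookup are assumptions for illustration only.
type User = { username: string };
type Handler = (req: Request, user: User) => Response | Promise<Response>;

function withAuth(handler: Handler, opts: { allowUsernames: string[] }) {
  return async (req: Request): Promise<Response> => {
    // The real middleware would resolve the user from an OAuth session;
    // here we fake it from a header to keep the sketch runnable.
    const username = req.headers.get("x-demo-user");
    if (!username || !opts.allowUsernames.includes(username)) {
      return new Response("Unauthorized", { status: 401 });
    }
    return handler(req, { username });
  };
}

// An HTTP handler, now gated to two hypothetical teammates.
const app = withAuth(
  (_req, user) => new Response(`Hello, ${user.username}!`),
  { allowUsernames: ["alice", "bob"] },
);
```

The point of the pattern is that the gate composes around an ordinary request handler, so adding or removing auth doesn't touch your app logic.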

    Migrating auth providers: from Clerk to BetterAuth

    After experiencing a few too many stability issues with Clerk, we're migrating to BetterAuth. We are optimistic about the greater control and stability we'll have with BetterAuth. If the transition goes smoothly, you should hardly notice (if at all). You may have to mark login email links as not spam or re-approve GitHub OAuth. Please email [email protected] if you run into bugs. Thanks in advance for your help with the transition.

    Opt in to receive Error notification email

The default notification setting for val errors is now website-only, instead of website + email. You can opt back into error emails at val.town/settings/notifications. Notifications can trigger on the first occurrence of an error or on every error, depending on your preference. We had trouble with other email (e.g. login links) going to spam, possibly caused by sender-reputation harm inflicted by error notifications. Error emails are often low-information and come from abandoned vals, anyway. Email sender reputation is a Hard Problem that Tom is working hard to improve.

    Dashboard redesign

The logged-in val.town dashboard now includes a switcher to toggle between orgs, namely your personal account and your Teams org. This is table stakes for more and more of you who are using Val Town both personally and at work.

    Access email headers in email trigger vals

    Email-triggered vals can now access headers (like Message-ID) in the email object passed to the val as input. With Message-ID from an inbound email, you can set In-Reply-To on your reply so that strict-threading clients like Outlook will properly group the messages. Thanks to dt for requesting this!
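As a sketch of the threading mechanics: the input shape below is an illustrative assumption, not the exact Val Town email-object type, but the header logic follows standard email threading.

```typescript
// Build reply headers from an inbound email's Message-ID so strict-threading
// clients like Outlook group the conversation. The input shape here is an
// illustrative assumption, not the exact Val Town email-object type.
interface InboundHeaders {
  "message-id"?: string;
  references?: string;
}

function replyHeaders(inbound: InboundHeaders): Record<string, string> {
  const messageId = inbound["message-id"];
  if (!messageId) return {}; // nothing to thread against
  // References carries the whole chain; In-Reply-To names the message answered.
  const references = inbound.references
    ? `${inbound.references} ${messageId}`
    : messageId;
  return { "In-Reply-To": messageId, "References": references };
}
```

Setting both headers is the safe choice: some clients thread on References, others on In-Reply-To.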

DB Pro now supports Val Town

DB Pro, a database desktop client, now supports Val Town SQLite as a database provider. It's a powerful query UI for SQLite power users, for when you graduate out of the built-in SQLite admin panel in a val's sidebar.

    Upgraded to Deno 2.7.5

Val Town code runs on Deno. We've upgraded from Deno 2.5.5 to 2.7.5.

    Alternatives page

    We wrote an Alternatives page to help prospective customers evaluate Val Town against similar services. As Steve wrote about in his February investor update, Val Town is still finding true product-market fit, which can be tricky for such a broad tool. The alternatives page is part of the effort to publicly reason about where Val Town fits in the devtools ecosystem. Thanks to Justin Duke at Buttondown for modeling what a thoughtful alternatives page looks like.

  • Mar 10, 2026
    • Date parsed from source:
      Mar 10, 2026
    • First seen by Releasebot:
      Mar 11, 2026

    Val Town

    Newsletter 26

    Val Town unveils a wave of shipped updates: MCP server and vt CLI enable building vals anywhere, Townie enhancements, per-val SQLite databases, a faster TypeScript editor with LSP, and Val Town Teams for organizations, plus new dashboards and workflows.

This is a newsletter about the goings-on in Val Town. I'm Pete, and I joined the team this year, partly to write things like this (and that, and this, and that). Steve used to write these newsletters when I read them as a Val Town user, and I know it's nice to hear from the founder. Sorry I'm not Steve! Please do still reply with your thoughts and feedback, though—what you like reading about, what you wish Val Town would support, what vals you've built or want to build, et cetera. We'll reply!

Our last newsletter, #25, was delivered in summer of 2025 when Townie was a val in user-space and "vibe coding" was only a few months old. Since then: the Val Town MCP server and vt CLI launched, so you can work on vals wherever you get your LLMs; Townie was reborn in product-space; every val gets its own SQLite database; our TypeScript editor is faster and better; we launched Teams so that you can bring Val Town to work; and much more.

    Also: we're having a party this Thursday, March 12th, at our office in downtown Brooklyn. Come say hello!

    The elephant in the town: AI

    There are three primary ways to ✨ do AI ✨ in Val Town:

    1. Townie
    2. MCP + your favorite LLM
    3. vt CLI + your favorite LLM

Before I talk about Townie, let me just say: Townie is quite good, but we encourage you to bring Val Town to whatever coding tool you prefer (Claude Code, Codex, Cursor, et al.—it's up to you). Townie and MCP are a step above the CLI because they have better access to val primitives, but regardless of your weapon of choice, you get Val Town's core loop: edit code → 100ms deploy → live URL.

Ok, without further ado: Townie is Val Town's code robot, now back in town next to your code editor. Townie uses the Claude 4.6 family and the Val Town MCP server under the hood. The MCP server can do a whole lot: create, edit, and run vals; read your SQLite, blob, and log data; configure cron, http, and email triggers; read and write environment variables; view history and switch branches; and more. Here we ask Townie to one-shot a "Val Town Square" messaging forum.

    Video at 1x speed to offer a realistic sense of Townie. Song credit: Good Time Oldies by Bosley.

    Scoped SQLite

As of January, every val gets its own SQLite database. Before that, you got one global database for your org or personal account (and you still do), but now it's a whole lot easier to manage your data. There's also a new SQL admin dashboard for poking around your val's data.

    If you've already been using the admin UI, you'll notice some new features this month: now you can create tables and add records with a GUI, and if you're writing SQL by hand (or pasting from an LLM), your query will be saved as you navigate between table, docs, and query views.

    Personally, I'm very excited about this one. Previously my vals used the same shared database for newsletter subscriber lists, wiki submissions, and whatever other data I stored. Now each val's data lives sensibly in its own silo, and the query interface lowers the friction to quickly explore data.

    Also, coming soon: val-scoped blob storage.

    TypeScript language server

    The TypeScript code editor in Val Town now runs Deno's official Language Server for faster and more accurate type information. The CodeMirror language server and full server+client+proxy Val Town LSP are open source, and Wolf wrote a great blog post about it if you're curious to learn how it works.

    Val Town Teams

    Teams launched last year so that you can bring Val Town to work. Teams comes with features for your organization like environment variable groups, val-scoped API tokens, val-scoped OAuth connections, blob management, custom domains, val ownership transfers, member management and invites, public profiles, version control/history/branching, and more. Companies like Kilo Code, Preston-Werner Ventures, and Turso are collaborating on customer support agents, deal pipelines, customer leads, and the like.

    With the explosion of agents in 2026, we're noticing Teams customers beyond the eng team editing vals. One customer noted that Val Town is a perfect platform for engineers and business people to collaborate because the biz team can easily update prompts and see changes deployed immediately without waiting on engineers, and the eng team can customize the code to their liking.

    And much more

    Here's an unordered, incomplete list of other new features in Val Town:

    • Pin vals to your profile page
    • Change your account handle up to 5 times
    • Connect to Slack or Google from a val
• .json, .css, and .txt files supported as val files
    • Download vals as ZIP files
    • Download val files individually
    • Export environment variables as .env file
    • Copy environment variables when you remix your own val
    • 3x faster SQLite, average latency down from 35ms to 10ms (thanks Turso)
    • New onboarding steps for org setup
    • Val READMEs support Mermaid diagrams
    • Docs guide for adding auth to your val
    • Docs guide for building a Slack agent
    • Go to definition for any symbol in your code
    • DenoLS resolves and type-checks imported private vals and modules
    • Search and replace in branches
    • Search results link to source line
    • Better code search (cmd+k)
    • cmd+s saves globally (not just when the editor is focused)
    • cmd+shift+f maps to project search
    • Live HTTP preview pane floating in or alongside the editor
• Markdown preview pane alongside the editor (cmd+return)
    • Full-page Logs view for a val
    • Render multi-line logs
    • Linkify stack traces in logs
    • Query val logs via API /v1/logs
    • API endpoints for managing environment variables
    • Browser-based device auth for vt CLI
    • Manage OAuth client connections in Settings
    • Include merge commits, PR links, and Townie edits in val History
    • View diffs between val versions from History
    • View Townie chat history in Settings
    • Townie model picker to choose between Haiku, Sonnet, and Opus
    • Show Townie diff view before approving/denying edits
    • Let Townie run vals
    • Townie's "4 C's": commands for /cost, /compact, /context, /clear
• As an aside, why do so many AI things start with C? Claude Code, Codex, Cursor, context...
    • Attach images to Townie chat
    • Fullscreen Townie mode (new!)

    Val Town Party

    If you live in or around NYC, swing by our headquarters this week! We're having a party at the Val Town office in downtown Brooklyn on Thursday.

    Val Town Hall

    Speaking of community hangouts, last summer we had a Val Town Hall where we opened up the floor to community members demo'ing vals they'd made (for business or pleasure) and talked about our roadmap. This spring, we're bringing it back (date TBD).

    Reply (1) if you're interested in presenting or attending, and/or (2) if you have requests or ideas about what should be on the meeting agenda.

    Talk of the Town

My favorite part of the old newsletter was the community vals section, which you'll now find in Talk of the Town. In February we highlighted:

    • vals in the atmosphere (read: atproto/Bluesky)
• a Valentine's val (Val), dubbed "Valentown" in one email reply
    • a literal laundry val
    • a Helium Hotspot val
    • a moon phase val

    and dozens of other neat vals you all wrote

    Share your vals in our Discord #showcase or on Bsky/X/HN/etc to be included in the upcoming March issue.


  • Feb 25, 2026
    • Date parsed from source:
      Feb 25, 2026
    • First seen by Releasebot:
      Feb 27, 2026

    Val Town

    Bring Your Own Agent (BYOA)

    Open source Chat SDK in public beta; Slack bot path renamed to Chatbot, signaling early shifts in Vercel AI tooling. A candid take on BYOA and staying in control of AI agents across platforms, with Val Town as a flexible deployment option.

    Yesterday morning I wrote a Slack agent in Val Town using Vercel's AI SDK, and yesterday afternoon it became obsolete.
Just after I'd written a docs guide and recorded a demo video, I saw the announcement: "Today, we're open sourcing the new Chat SDK in public beta." By this morning, I learned that the Chat SDK, which paves the happy path for Vercel's new Slack bots, has already been renamed to "Chatbot." What will happen tomorrow? Or next week? Heaven forbid you go on a family vacation.

I'd written my Val Town agent—named duck, as in rubber duck debugging—and an accompanying guide based on Vercel's now-stale Slackbot Agent Guide. That guide and its companion repo—last commit 8 months ago, an eternity in AI Land—use the AI SDK v4. But they're already on v6. If even they were two versions behind on their own framework, imagine where that leaves the rest of us.

    Just about every software vendor you use has their own AI agent with a punny, on-theme name.¹ (Not to throw stones or anything—at Val Town we have Townie.) All those bespoke agents will always be behind, using yesterday's framework or model. It's ok to be behind ("if it ain't broke..."), but you should control when you are behind and when you want to catch up. You should Bring Your Own Agent.

    BYOA—in Val Town or elsewhere

    I've found Val Town to be quite a good place to write and run code, including agents. I thought so for two years as a regular user, and I think so now that I've been working here a couple months. It's "a nice place for JavaScript," as one of our users recently put it. But before I tell you why I think it's good, I should say: you may prefer to BYOA elsewhere, which is just as well. I don't know what tools you're familiar with and what you prefer. The important thing is that your agent answers to you—not your vendor.

    Val Town is designed to have credible exit, meaning if you ever want to move your code elsewhere, you can do so easily. You can credibly exit. It's anti lock-in. Just like how when you own a domain name, you can transfer it to another registrar, or move your website hosting between providers, or change email clients. Val Town code runs on the Deno runtime and emphasizes Web-standard and mature technologies: JavaScript (or TypeScript), npm packages, SQLite, et cetera. So if you BYOA in Val Town, you can later port it over to Vercel or Netlify or AWS, or wherever.

    Now, I like building agents (and many other things, but not everything) in Val Town because it removes the deployment step, such that I don't even think about it. Vals auto-deploy on save in 100ms, and you get a live URL. The feedback loop is very fast. There's no local versus staging versus prod environment (although there is branching for safe code changes). When collaborating with coworkers or sharing code with friends and strangers, there's no local dev setup—instead, they just remix your val or create a new branch. At this point in the paragraph my salesiness barometer is reading too high, and it'll probably only get worse in the next section about Kilo Code's agent val, so I'll transition by saying: it's totally reasonable if you want to BYOA on [insert hosting provider here]. I like Val Town, but YMMV.

    Case study: Kilo Code's customer support agent

Last week, Steve (and Townie) from our team and Alex from Kilo Code set aside 8 hours to get a customer support agent² up and running on Val Town. They only needed 89 minutes.

    I watched a recording of Alex and Steve's meeting to inform my work on the aforementioned Slack agent. It really was, like, surprisingly easy. Here's some verbatim dialogue from the crux of the meeting, screenplay-style (stage directions added):

    [image: screenplay.png]

    The point this scene is supposed to illustrate is that setting up the agent was easy, even fun. But the more important, practical point that you can rehearse and repeat to your boss (or your boss's boss) in the architecture meeting next week is what Alex wrote about in his blog post:

    "Every SaaS platform you use is shipping an AI agent add-on right now. Your support tool has one. Your CRM has one. Your project management platform probably just announced one last week. They all cost $500–$1,000/month. They’re all black boxes. And if your experience is anything like ours, they mostly don’t work."

    And these black box SaaS agents won't obviously be a hot mess, not at first. They'll boil you slowly, like a frog. Of the customer support agent that Kilo Code tried for a month before BYOA'ing, Alex wrote:

    "It wasn’t bad enough to reject on day one — it was bad in the slow, corrosive way that wastes your time: almost good enough, but never quite right, and no levers to pull to fix it."

    So Kilo Code rolled their own agent with Val Town and the Kilo Code AI Gateway. Tomorrow or next week or next month, when a new model or better tools come out, they'll be in control.

    BYOA

    "Bring Your Own Agent" (not a new term, btw) feels like the right pattern for this build-don't-buy moment, and Val Town might be the right place to do it. But again, I don't want to oversell. We are pickling compute, and BYOA is the new flavor. Chances are your company is already (or will soon be) hiring agents to work alongside you: for customer support; for data analytics; for fraud detection; for code review; for lots of things. Maybe you'll build them on Val Town.

    Footnotes

    • (1) As far as punny AI agent names go, I think Gusto's Gus is pretty damn good. I like Townie, too. But my favorite robot name, hands down, is Kroger's inventory-taking robot, Tally.

    • (2) Here, and everywhere in this post, I mean "agent" as in Simon Willison's definition: an LLM running in a loop with tools—definitely not agent-as-human-replacement.

  • Jan 23, 2026
    • Date parsed from source:
      Jan 23, 2026
    • First seen by Releasebot:
      Jan 24, 2026

    Val Town

    Every val gets a database!

Val Town now gives every val its own SQLite database with a powerful UI for easy persistence, viewing, and querying. Fork projects along with their database, enjoy scoped security and a per-val API token, and benefit from Turso-powered cloud SQLite.

    Code example

    Using the std/sqlite val to access the val database:

    import sqlite from "https://esm.town/v/std/sqlite/main.ts";
    await sqlite.execute('CREATE TABLE IF NOT EXISTS "docs" (contents TEXT)');
await sqlite.execute("INSERT INTO docs (contents) VALUES ('Hello, world!')");
    

    Demo

    With the built-in SQLite browser you can:

    • Add and rename columns
    • Delete rows
    • Get the schema of any table
    • Run queries
    • Export tables as CSV
    • Dump the database as SQL

    History

    In 2023 we launched SQLite for every Val Town user, powered by our friends at Turso. Vals could use the database with zero configuration and total control. A lot of the most interesting vals are built on this - my Bluesky ThinkUp Tribute val uses SQLite to store people's social media bios, and devstats uses the database to track statistics about our repository.

But because databases were scoped to users and organizations, they contained intermixed data: it was impossible to tell a priori which data came from which val. Scoped databases fix that problem, eliminating conflicts between different vals and making it easy to browse val-specific data.

    Security

    We've designed scoped databases to be more secure, too: vals can only access their own scoped database, and can't reach across to access data from other vals. This makes it a lot safer to work on SQLite databases too, because your schema changes can't affect any other vals.

Under the hood, we've added project-scoped API tokens to make this work: each val automatically gets a temporary API token as the VAL_TOWN_API_KEY environment variable. API tokens have always belonged to a user; now a token belongs to a val as well, which gives it val-specific superpowers.
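For example, a val could use that injected token to call the Val Town API on its own behalf. The VAL_TOWN_API_KEY variable name comes from this post; the helper and the /v1/me endpoint in the comment are assumptions for illustration.

```typescript
// Sketch: using the val-scoped token injected as VAL_TOWN_API_KEY.
// The helper is illustrative; the /v1/me endpoint below is an assumption.
function authHeader(token: string): Record<string, string> {
  return { Authorization: `Bearer ${token}` };
}

// Inside a val (Deno runtime), something like:
//   const token = Deno.env.get("VAL_TOWN_API_KEY")!;
//   const res = await fetch("https://api.val.town/v1/me", {
//     headers: authHeader(token),
//   });
```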

    Existing databases will stick around

We're supporting both val-scoped and user-scoped databases for the long term: there are uses for both, and there's no reason to force everyone down a gnarly upgrade path. But val-scoped databases are definitely recommended for new projects: they make a lot of things easier, and they come with a powerful UI.

    Thanks Turso for the continued collaboration!

We've provisioned thousands of databases, and this will add even more—possible because Turso runs SQLite in the cloud. This feature also relies on some Turso tricks, like forking databases by seeding from point-in-time backups.

    We’re hiring!

    Join our team and help build the future of programming.

  • Jan 22, 2026
    • Date parsed from source:
      Jan 22, 2026
    • First seen by Releasebot:
      Jan 22, 2026

    Val Town

    Townie's back in town!

Townie v5 brings an in-browser AI agent to Val Town, supercharging coding with instant 100ms deployments, full IDE-like actions, and the Claude 4.5 model family with three operating modes. Available in public beta for Pro and Teams at $10/mo.

    Townie's back!

    Townie's back! In its 5th version, Townie, our AI agent, lives alongside your code editor, powered by the Claude 4.5 family and our carefully crafted tools.
    Townie is like Claude Code, but in the browser, optimized for Val Town's simple and instant runtime.
    Here we ask Townie to scaffold this very blog post:

    Like every edit in Val Town, Townie's changes are instantly deployed live on the internet at a URL (or a branch-preview-URL) in 100ms.

    What can Townie do?

    Townie can take almost any action that you as a user can take:

    • List, search, & create vals
    • Read, write, & run files
    • View history, create & switch branches, revert versions
    • Query your SQLite databases & Blob storage
    • Read & write environment variables
    • Read logs
    • Read & configure cron jobs

    Because Townie has so much context, you can ask it to one-shot whole features or even full-stack apps. Or remix one of our templates. Townie gets the same tight feedback loop that Val Town users get:

    1. Edit code
    2. Automatically deployed live in 100ms
    3. Run the code
    4. View the output
    5. Read the logs
    6. Keep iterating until done

    You can rely on version history and branching to move fast without breaking things, along with Townie's chat history to pick up where you left off.

    Modes

    We're using the latest models in the Claude 4.5 family — Haiku, Sonnet, and Opus — with three modes:

    1. Normal: Townie asks you to approve before writes
    2. Plan: Townie as your read-only thinking partner
    3. Allow all edits: Townie YOLOs changes to your vals

    System prompt

    Townie knows Val Town inside and out. Our system prompt is public (still), and it doubles as a great "101 intro to Val Town."

    Keyboard shortcuts & slash commands

    As a power user, you can open Townie with ⌘J and use slash commands:

    • /cost Show estimated cost and usage
    • /context Show context window usage
    • /compact Summarize older messages to save context
    • /clear Start a new chat

    Townie is writing ~50% of my code

    Before Townie v5, we didn't think we needed an in-browser AI agent. We wanted to thoughtfully craft a good MCP server and let you the developer stay where you are (in your favorite AI coding tool). And we did. And you can!
    But MCP clients just aren't good enough yet, so that experience often falls short. Our customers asked for a good in-browser AI agent. Honestly, we as users of our product wanted this too. And it's pretty great! I use Townie v5 a lot now. Sometimes I use it for vibe coding (for prototypes or simple things), and other times I keep it on a tight leash (for stuff I care about).
    It's fun and productive – give it a try!

    Try Townie

    Townie is available in public beta for Pro and Teams customers, so you can try it for $10/mo.

    Thanks to Pete for helping me write this and crafting the lovely Townie doodle at the top.

  • Nov 14, 2025
    • Date parsed from source:
      Nov 14, 2025
    • First seen by Releasebot:
      Nov 15, 2025

    Val Town

    Introducing Val Town MCP

    Val Town launches the MCP server letting you deploy JavaScript in 100ms from Claude, ChatGPT, Cursor and other LLM tools. Edits go live immediately at a public URL, boosting fast feedback for AI‑assisted development. MCP makes Val Town instantly available in your favorite coding tools.

    On Val Town and MCP server

    On Val Town, you deploy JavaScript in 100ms. Now with the Val Town MCP server, you can do that from Claude, ChatGPT, Cursor, VSCode, or wherever you do your AI coding.

    If you've been following my tweets recently – "I've gotta rant about LLMs, MCP, and tool-calling for a second", "MCP is mostly nonsense", "MCP is overhyped" – you might be surprised by this announcement. Well, how did you think I got those salty takes except by building an MCP server?

    Yes, I think MCP is dubious as a protocol. But for now, MCP is the right way for Val Town to meet developers where they are. In Cursor or Claude Code or Zed or wherever. For example, here we use Claude Code to make a blog. Every edit is immediately live and deployed on Val Town.

    We have guides for some of the popular LLMs:

    • Claude Code
    • ChatGPT
    • Claude Web/Desktop

But the Val Town MCP server should work with any MCP client. If you'd like a hand with setup, ask in our Discord server or send us an email.

    Why MCP

    MCP is not perfect (again, see tweets), but it has a few things going for it:

    • Cheaper – Don't pay us for credits. Pay your inference provider directly.
    • Better – Use whatever state-of-the-art LLM you want. We at Val Town don't have to fast-follow it.
    • Val Town everywhere – Get the best parts of Val Town – instant deployments, built-in SQLite, etc – in your favorite LLM coding tool.

    MCP also allows us to ship faster. Traditional APIs require careful versioning to prevent breaking changes, but an MCP server can change continuously because LLMs read the spec and run inference at runtime.

    Fast feedback loops

    There's a common thread running through every feature we build – AI or otherwise: enabling fast feedback loops.

    Creators need an immediate connection to what they're creating.
    If you make a change, you need to see the effect of that immediately.

    • Bret Victor, Inventing on Principle

    When you – or your LLM – make an edit on Val Town, your code is deployed in 100ms. This allows you to have insanely fast feedback loops in your production environment. No need to wait a minute or two to see how it'll actually look when deployed. Every change is immediately live, at a public URL.

    Val Town isn't an AI company – we're a developer tools company – but this always-deployed model works quite well with LLMs. Just give your favorite LLM a branch, and the code it writes will be alive and sharable by default.

    Bring Val Town MCP to your favorite LLM, and let us know what you think.

  • Sep 10, 2025
    • Date parsed from source:
      Sep 10, 2025
    • First seen by Releasebot:
      Oct 29, 2025

    Val Town

    Introducing vt, the Val Town CLI

    Val Town debuts vt, a powerful CLI that lets you code locally with your favorite editors and deploy instantly as you save. Features include live watch, instant redeploys, git-like branches, and a Chrome/Firefox extension for seamless live feedback—a major boost for local development.

    Introducing vt, the CLI for Val Town

    Introducing vt, the CLI for Val Town that lets you use your favorite editors and local tools. Now you can use VS Code, Claude Code, Codex and more with our super-fast feedback loop, deploying software instantly as you develop it.

    To get vt, install Deno, then run

    deno install -grAf jsr:@valtown/vt
    

    With vt, you can:

    • Use vt watch to watch a folder for changes, pushing updates and redeploying instantly as you save
    • Remix or create brand-new Val Town projects directly from your command line
    • Livestream logs from your Val directly to your terminal
    • Manage branches, switching between separate deployments or prod & dev branches of a project

    We designed vt to work much like git, so vt branch and vt checkout -b work just like you'd expect. But the real magic is in the vt watch command: vt can resolve deltas between Val Town and a local folder of TypeScript and text files, automatically detecting file changes like renames and modifications. As you edit in VS Code, Neovim, or your favorite editor, every time you save, the changes go live. Or, if you don't want to live on the edge, you can use vt push to explicitly push new changes.
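    To make the delta-resolution idea concrete, here is a minimal sketch of rename detection by content matching. This is illustrative, not vt's actual code: it treats a deleted-then-created pair with identical content as a rename, which is the classic trick git also uses.

```typescript
// Hypothetical sketch of detecting creates, deletes, and renames between
// a remote snapshot and a local folder, keyed path -> file content.
type FileMap = Map<string, string>;

function diff(remote: FileMap, local: FileMap) {
  const created: string[] = [];
  const deleted: string[] = [];
  const renamed: Array<{ from: string; to: string }> = [];

  // Index remote-only files by content, so a delete+create pair with
  // identical content can be reported as a single rename.
  const remoteByContent = new Map<string, string>();
  for (const [path, content] of remote) {
    if (!local.has(path)) remoteByContent.set(content, path);
  }

  for (const [path, content] of local) {
    if (remote.has(path)) continue; // unchanged or modified in place
    const from = remoteByContent.get(content);
    if (from !== undefined) {
      renamed.push({ from, to: path });
      remoteByContent.delete(content);
    } else {
      created.push(path);
    }
  }

  for (const [path] of remote) {
    if (!local.has(path) && !renamed.some((r) => r.from === path)) {
      deleted.push(path);
    }
  }

  return { created, deleted, renamed };
}
```

    A real implementation would hash contents rather than compare them directly, and would also diff modified files, but the shape of the problem is the same.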

    Bring your own editor - and LLMs!

    vt works perfectly with your favorite LLM tools: it can even initialize an AGENTS.md file that contains all of the context necessary to write code for the Val Town platform.

    People are already using vt to build cool projects, like Geoffrey Litt's Stevens project, a really cool AI personal-assistant Telegram bot built locally with Cursor and vt. We built Val Town's new Val search on Val Town itself, with Claude Code and vt.

    Use the companion browser extension

    vt also has a companion browser extension which pairs with vt watch to automatically reload the tab as you edit your Val.

    It's available for Chrome or Firefox.

    If you have vt watch running, it should "just work"! The companion communicates with vt's watcher over a local WebSocket connection.

    We want feedback!

    vt is a big leap forward in the local development experience for Val Town. But we're always looking to improve and polish the experience. If you have any feedback, we'd love to hear it. You can join our Discord server here, and contribute ideas, PRs, or issues to the val-town/vt GitHub repo.

    Original source Report a problem
  • Sep 9, 2025
    • Date parsed from source:
      Sep 9, 2025
    • First seen by Releasebot:
      Oct 29, 2025
    • Modified by Releasebot:
      Oct 29, 2025
    Val Town logo

    Val Town

    Building a better online editor for TypeScript

    Val Town debuts a revamped TypeScript editor that uses a remote Deno Language Server in cloud containers for fast, accurate in-browser feedback. The new LSP-based stack offloads work to servers and is open source as vtlsp, delivering a smoother, scalable editor experience.

    Introduction

    Val Town makes it easy to ship TypeScript automations and applications to the internet via an integrated web editor experience. We strive to offer a magical tight feedback loop, with 100ms deploys on save.

    That online editor experience should be great: we should support high-quality highlighting, autocompletion, and hover information for bits of code. But unfortunately it hasn't been: our previous editor was buggy and slow to give useful TypeScript feedback.

    But now, we've rewritten our editor's TypeScript integration from scratch. It's available to all Val Town users, is fast and accurate, and the code is open source.

    Our old system: running TypeScript in a Web Worker

    Our previous language integration was entirely client-side. We ran a TypeScript Language Service Host in a Web Worker, to isolate it from the top frame's thread, and communicated between the Web Worker and top frame using Comlink.

    The system looked like this:

    We bundled it into codemirror-ts, a CodeMirror extension, and Deno-ATA, an incomplete implementation of Deno's import resolution logic grafted onto TypeScript's capabilities.

    This solution worked great in the simplest cases, but stumbled when importing certain NPM packages, and required more and more workarounds. The two main issues we faced were these:

    1) TypeScript isn't written for Deno

    At Val Town, we run Deno, a modern JavaScript runtime that differs from standard TypeScript. Deno supports URL imports, provides server-side APIs through the Deno global (like environment variables), and introduces its own quirks. Sometimes we've been able to work around these differences; for example, we could use Deno type definitions. But other cases, like handling URL imports, required us to interpret files differently. Deno is distinct enough that it ships its own language server, built in Rust and wrapping tsserver.

    2) NPM modules can be gigantic and installing dependencies is no joke

    Huge import trees for NPM modules are nothing new, but at least when you're installing NPM modules locally, you have the brilliant minds of the package manager implementers to do module resolution: to install the minimal number of packages by comparing semver ranges. We didn't have that luxury, and often referencing an NPM module would trigger an avalanche of HTTP requests and bytes downloaded, which would overload the Web Worker and make the editor's language tools unresponsive.

    Bringing DenoLS to Val Town

    So, we redesigned our editor's TypeScript handling. Instead of running tsserver in a Web Worker, we now run the official Deno Language Server remotely in cloud containers.

    We no longer suffer writing our own workarounds to the mismatch between TypeScript and Deno, because the Deno project's Rust code that wraps around a TypeScript instance solves all those problems. Your browser doesn't struggle to download huge NPM dependency trees because a beefy server does that for you, from a faster connection.

    Now, when you visit our editor, we launch a containerized server that exposes a WebSocket and speaks the Language Server Protocol (LSP). The architecture was partially inspired by Mahmud Ridwan's great writeup of connecting CodeMirror & an LSP, with the main difference being that we directly map stdio to the WebSocket rather than serializing messages, because vscode-jsonrpc can do that for us!
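    For context on what's flowing over that pipe: LSP's base protocol frames each JSON-RPC message with a Content-Length header, much like HTTP. A minimal encode/decode sketch (not Val Town's code; it assumes ASCII JSON, where string length equals byte length):

```typescript
// Encode a JSON-RPC message with LSP base-protocol framing.
function encodeMessage(msg: object): string {
  const body = JSON.stringify(msg);
  // Note: the spec counts bytes; for ASCII payloads length === byte count.
  return `Content-Length: ${body.length}\r\n\r\n${body}`;
}

// Decode one framed message back into an object.
function decodeMessage(frame: string): object {
  const match = frame.match(/^Content-Length: (\d+)\r\n\r\n/);
  if (!match) throw new Error("malformed LSP frame");
  const len = Number(match[1]);
  const body = frame.slice(match[0].length, match[0].length + len);
  return JSON.parse(body);
}
```

    Because the framing is self-delimiting like this, a process's stdio stream can be piped byte-for-byte into a WebSocket and reassembled on the other side, which is what makes the "map stdio directly" approach workable.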

    Our open source implementation

    To tweak the language server for our unique purpose, while keeping the CodeMirror extensions LSP-generic, we took inspiration from the official VS Code LSP client library, which we couldn't use directly because of its reliance on VS Code globals. Their client lets you apply middleware and URI transforms, so you can easily tweak the language server at the client level when writing VS Code plugins. Transforming URIs makes it easy to spawn the language server from a temp directory but map file paths as if they were relative to the root, and middleware lets us adapt the language server to our use case, like automatically downloading dependencies when the server sends the client a red squiggle saying a package isn't installed. We built a similar style of system as a Language Server proxy library: it acts as a language server of its own, but can arbitrarily modify messages passing through it.
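    A URI transform of that kind is easy to picture as a recursive rewrite over a JSON-RPC message. This is a hypothetical sketch, not the proxy's actual code; the temp-directory and project-root URIs are made-up values:

```typescript
// Assumed example roots: the server runs from a temp dir, the client sees
// paths as if relative to the project root.
const SERVER_ROOT = "file:///tmp/session-abc";
const CLIENT_ROOT = "file:///project";

// Recursively rewrite every string in a message that starts with `from`
// so it starts with `to` instead. Real URI fields (textDocument.uri,
// location URIs, etc.) are just strings nested inside the message.
function transformUris(msg: unknown, from: string, to: string): unknown {
  if (typeof msg === "string") {
    return msg.startsWith(from) ? to + msg.slice(from.length) : msg;
  }
  if (Array.isArray(msg)) return msg.map((m) => transformUris(m, from, to));
  if (msg && typeof msg === "object") {
    const out: Record<string, unknown> = {};
    for (const [k, v] of Object.entries(msg)) out[k] = transformUris(v, from, to);
    return out;
  }
  return msg;
}
```

    A proxy applies one direction on messages heading to the client and the inverse on messages heading to the server, so neither side ever sees the other's paths.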

    To actually host the LSP as a WebSocket server, there were various subtleties that mattered for our use case. We want to keep connections alive even after the editor tab closes, and allow multiple clients to connect to the same language server instance (to support multi-tab, or even multi-browser/device editing). Our implementation uses a streaming WebSocket wrapper, pipes stdio directly, and manages multicast connections so many clients can talk to the same process at once.
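    The multicast piece reduces to fanning one process's output out to every connected socket. A bare-bones sketch, under the assumption of a minimal send-only socket interface (not the actual implementation):

```typescript
// Minimal socket shape assumed for illustration.
interface Socket {
  send(data: string): void;
}

// Fan one language-server process out to many connected editor clients.
class Multicaster {
  private clients = new Set<Socket>();

  add(ws: Socket) {
    this.clients.add(ws);
  }

  remove(ws: Socket) {
    this.clients.delete(ws);
  }

  // Called for each framed message the language server writes to stdout:
  // every client (every tab or device) receives the same stream.
  broadcast(data: string) {
    for (const ws of this.clients) ws.send(data);
  }
}
```

    The trickier half in practice is the reverse direction: merging requests from many clients into one stdin stream without colliding request IDs, which is one reason a proxy layer that can rewrite messages is useful.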

    Bringing it to the Browser

    Once we had the language server hosted, we needed a client: the piece that queries for hover information, displays red squiggles, and drives the rest of the language-specific tooling. The LSP specification is quite sprawling: there are many fun features to support, like code actions (buttons such as "infer return type") and method suggestions (that pop up as you call functions). Meanwhile, the client needs to keep documents synced with the language server and send document update events.

    There are some existing CodeMirror language server client implementations, which we drew from when building our own. We wrote our own so that we could support more arbitrary transports (in our case, WebSockets with message chunking), external renderers for language server UIs (so we can use libraries like React, highlight.js, or remark), and external callback inputs (so that you can implement things like go-to-definition on an external document).

    Shipping on Cloudflare Containers

    For deploying our language servers, it was important to keep user workloads isolated, because user code is private. Even though we run language server processes in temporary directories, you could still infer the types of files in other directories by importing upwards ("../../"), and possibly even jump to their definitions.

    We also wanted servers to live for as long as the user's session. Someone might be editing code for two hours, so a solution like traditional AWS Lambda would be a tough fit. Finally, we wanted to restrict users to using a limited amount of language server resources at a time.

    Initially, Fly seemed like a great option: we could spin up containers on the fly (🥁) and shut them down when not needed. The issues we saw were that we'd need to manually manage container lifecycles, route individual users to unique containers, and make sure containers shut down after some amount of time without heartbeats from the client.

    When Cloudflare announced Cloudflare Containers, they immediately seemed like a perfect choice. Cloudflare Containers fit within the Workers/Durable Objects ecosystem and are tenants of Durable Objects. This means that they are routable by an arbitrary ID, and that the Durable Object layer (a persistent, serverless JavaScript class instance) can internally manage container lifecycles. In our case, we route each user to a Durable Object whose ID is literally their user ID, and then use Cloudflare's container library to shut containers down after inactivity.

    This means that we didn't actually need to implement any stateful routing layer ourselves. When you want to connect to a Val Town language server, you simply hit our Cloudflare Worker with a signed cookie containing your user ID, which routes you directly to an already-running or brand-new Durable Object/container that boots your LSP. In the future, it will also be easy to hook into Durable Objects' built-in SQLite storage to manage utilization internally too.

    Altogether, the architecture ends up looking like this:

    A server replaced the Web Worker, and instead of communicating by postMessage (via Comlink, to a Web Worker), we now use a WebSocket. But the biggest win here is using the Deno Language Server and an isolated server for running language tooling: this lets us piggyback on the stock implementation of module resolution and keep those huge NPM dependency trees out of the browser's responsibilities.

    Try it out

    The easiest way to see this all in action is to sign up for Val Town and write some code! While we'll continue striving for perfection, it's nice to know that we've gotten a lot closer to it this summer.

    Gone is the editor that was slow, buggy, and required a lot of custom workarounds. Now every user has the full, luxurious Deno language server experience.

    Now that our editor is in production, it will only continue to improve. We have plans to add more Val Town specific language server functionality, like suggesting Val Town standard library function imports, giving useful diagnostics about aspects of Deno that behave differently on our platform, and adding more language server features.

    We've also open-sourced everything you need to ship your own cloud container WebSocket language server as vtlsp. This repo includes the client, server, and proxy, which you can see in the demo below.

    Original source Report a problem
  • Aug 12, 2025
    • Date parsed from source:
      Aug 12, 2025
    • First seen by Releasebot:
      Oct 29, 2025
    Val Town logo

    Val Town

    How we built an API for Clay

    Val Town unveils Clay API Proxy turning Clay enrichments into a single API call for real‑time enriched user data. It helps teams identify ICPs, streamline outreach, and offers a self‑hosted option with clear setup steps.

    This post is for our fellow engineers & founders doing GTM / sales.

    I used to think doing enrichment was for big companies with massive outbound sales functions. Before launching Val Town for Teams (coming this week!), we set a goal: find 10 pre‑sales customers. All of a sudden, I was spending all my time looking for teams who could be the right fit. Our best channel was obviously new users signing up for Val Town. However, scrolling through emails and hoping we'd recognize someone quickly became unsustainable. (And really, it only works for celebrities anyway.)

    Clearbit vs Clay vs Val Town

    Clearbit used to be the obvious answer, but post‑HubSpot acquisition, it's no longer an option for small new customers. Clay is great, but we wanted to use it like an API so anyone on our team could use it programmatically, without Clay knowledge or setup. We asked on X, and Clay's CEO explained how we could do it.

    Introducing: Clay API Proxy

    We turned Clay into an API / SDK on Val Town via our Clay API Proxy. The hard part was that Clay enrichments are triggered by one webhook, but you get the results from another webhook. We wanted the developer experience to be a single request that gets back the enriched data as the response. Here's how we built it:

    1. Your val imports and calls our clay() "sdk" function with an email or GitHub username
    2. Our proxy authenticates you as a Val Town user
    3. We generate a request id, and forward the payload to Clay with that id
    4. Clay enriches, and POSTs the result back using the id. We save the result to SQLite.
    5. While your original request is still open, we poll SQLite for it.
    6. When it returns, we give you the JSON back in the normal request/response pattern.
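    The core trick in steps 4 through 6 is holding the original request open while polling for the webhook result. Here is a self-contained sketch of that pattern; it is illustrative rather than the proxy's actual code, with an in-memory Map standing in for the SQLite results table:

```typescript
// In-memory stand-in for the SQLite table keyed by request id.
const results = new Map<string, unknown>();

// Simulates Clay POSTing the enrichment back to our results webhook.
function onClayWebhook(requestId: string, payload: unknown) {
  results.set(requestId, payload);
}

// Holds the original request open, polling until the result row appears
// or the timeout elapses.
async function waitForResult(
  requestId: string,
  intervalMs = 50,
  timeoutMs = 2000,
): Promise<unknown> {
  const deadline = Date.now() + timeoutMs;
  while (Date.now() < deadline) {
    const row = results.get(requestId);
    if (row !== undefined) return row;
    await new Promise((r) => setTimeout(r, intervalMs));
  }
  throw new Error(`enrichment ${requestId} timed out`);
}
```

    This long-polling shape is what lets two asynchronous webhooks present as one ordinary request/response call to the SDK user.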

    Enriching emails with Clay is now as simple as:

    import { clay } from "https://esm.town/v/charmaine/clay-proxy/sdk.ts";
    const result = await clay({ email: "[email protected]", source: "user_signup", });
    console.log(result);
    

    Which returns:

    {
     "email": "[email protected]",
     "person": "Steve Krouse",
     "linkedin": "https://www.linkedin.com/in/stevekrouse",
     "company": "Val Town",
     "funding": "Seed round",
     "totalFunding": "$7 million",
     "employeeCount": "6"
    }
    

    How this helps

    User enrichment is a big part of how we successfully found our first 10 pre-sales customers. Our users are no longer a long list of anonymous emails. We are able to more efficiently spend our time interacting with our ICP (Ideal Customer Profile) as soon as they sign up, instead of combing through endless dashboards in search of a unicorn.

    • We see new users joining in real-time, with their enriched profiles
    • Anyone can start a thread, tag the right person, or notify everyone else that they've already reached out
    • Warm intros were easy to track
    • The team then starts building intuition about: how many new users joined, are we attracting ICP, who owns follow‑ups etc.

    Then what?

    Once you have this data on Val Town, there's lots you can do. Here are some templates:

    • Send new user enrichments to Slack
    • Enrich your own or your competitors' GitHub repo stargazers
    • For fellow devtool companies, we reach out with an auto‑generated Val that runs their SDK so the first touch includes a working demo. This got some great responses!
    • Get Townie to build custom outbound demos etc. based on their profile. (Patrick Spychalski has a great example post with a similar workflow.)

    We've been loving these experiments, and would love to help more engineers scale the traditional GTM function.

    Pricing

    Clay credits aren't free, so we can't let just anyone use our proxy. We've allowlisted a couple of friends who we know are good for it, and can bill them later. If you wanted to get started, shoot me an email at [email protected] and I'll personally help you get set up. We also have a self-hosted alternative, if you'd like to create an API for your own Clay account.

    Original source Report a problem
  • Jun 11, 2025
    • Date parsed from source:
      Jun 11, 2025
    • First seen by Releasebot:
      Oct 29, 2025
    Val Town logo

    Val Town

    Introducing Townie Credits

    Val Town unveils a pay‑per‑use credits system for Townie, decoupling pricing from Pro and offering cheaper access for casual users. It explains a 50% markup on LLM costs, aims for sustainability, and clearer investment signals, with Claude 4 Sonnet powering fast, open‑source coding.

    Credits payment update

    Today, we are introducing a new credits payment system for Townie, our AI coding assistant. Townie is now pay-per-use. Here's the upside:

    • You can use Townie as much as you want – no limits
    • Townie is now priced independently from Val Town Pro, which makes it cheaper for casual Townie users
    • Townie becomes sustainable
    • We at Val Town get a clearer signal about how much to invest in making Townie better

    How credits work

    We charge a 50% markup on top of raw LLM costs. If you use $10 in Townie credits, Anthropic gets $6.67 and we get $3.33. We think this is fair, sustainable, and transparent. We don't want to be in the business of having murky limits, obfuscated credits, or unsustainable margins.

    When you pay for Townie, you're telling us in the clearest way possible to continue to invest in Townie. If you don't think Townie is worth the cost, there are many wonderful AI coding alternatives that you can use with the new Val Town CLI. We want you to choose Townie on its merits, not because it's subsidized.

    Start coding with Townie

    We think Townie is one of the best coding assistants out there. It uses agentic tool-calling, powered by Claude 4 Sonnet, to read and write your code. It can even make test HTTP calls and inspect logs & traces to see how things are working, and keep iterating from there. Paired with Val Town's instant (100ms) full-stack deploys, Townie is one of the fastest ways to create a deployed app.

    We hope you give it a shot, and are very eager to hear what you think.
    Start coding with Townie here: val.town/townie
    ps - Townie is 100% open-source & is hosted on Val Town

    Original source Report a problem

Related vendors