
Overview

When and why to use serverless code execution


Serverless code execution runs on a simple system: you send your TypeScript code and get the output back. If the code has a syntax error, you get a clear trace of the error instead. It is designed for code you only want to run once, or at most a few times.

Why Use it?

It is noticeably faster and cheaper than any VM-based system on the market because it doesn't run on VMs. Instead, it uses Freestyle's Serverless Code Execution Engine. The same technology Google Chrome uses to isolate code between tabs is what we use to isolate code between users. So while other companies start whole VMs to run your code, for us running your code is like opening a tab, except we also keep lots of tabs pre-opened to make it even faster. The fastest cold-start time we've seen from any competitor is 90ms; our average full execution time is under 150ms total.

It also lets you use arbitrary npm packages without a performance impact. List the packages you want in the nodeModules field of the API call's configuration, and we'll cache them for all future uses. This means the first run of code with new modules can take more than 10 seconds, but all subsequent runs with those modules will be near-instant.
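As a sketch, a payload with cached packages might look like the following. The nodeModules field name comes from the docs above; the surrounding payload structure and the name-to-version map shape are assumptions for illustration.

```typescript
// Illustrative execution payload. Only the "nodeModules" field name is
// taken from the docs; the rest of this shape is an assumption.
type ExecutePayload = {
  script: string;
  config: {
    // package name -> version; cached after the first run
    nodeModules: Record<string, string>;
  };
};

const payload: ExecutePayload = {
  script: `
    import { camelCase } from "lodash";
    console.log(camelCase("serverless code execution"));
  `,
  config: {
    nodeModules: { lodash: "4.17.21" },
  },
};
```

The first execution with a new package pays the install cost once; every later payload that lists the same package hits the cache.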

When Not to Use it?

For code you want to call repeatedly, you should deploy it to a Web Deployment.

Serverless code execution is great for code that doesn't need a server, but you cannot run binaries in it. You also cannot run code with persistent state: once a script finishes running, its state is lost. For code that needs to run for a long time, run binaries, keep persistent state, or generally needs a proper VM, you should use a Dev Server.

While we store executed code for some time after it has run, it can be deleted at any time. If you want to keep the code long-term, store it on your side or use a Git Repository.

How to Use it?

  • You can check out the Run Code page for a full example of how to use the API

We also provide a series of integrations with common AI agent frameworks to make it easy to run code with your AI, including the Vercel AI SDK, Mastra, LangGraphJS, LangGraphPy, the OpenAI Python SDK, and the Gemini Python SDK.