The problem with long-running code in Next serverless functions

The current design paradigm at the time of this writing is called the App Router. Next.js and Vercel provide a simple mechanism for writing and deploying cloud functions that expose HTTP endpoints for your frontend site to call. However, sometimes you want to do work asynchronously on the backend in a way that doesn’t block a frontend caller that needs to move on.
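Route handlers are the App Router’s entry point for this kind of backend work. As a minimal sketch (the file path, payload, and doLongRunningWork helper are illustrative, not from the original post):

```typescript
// app/api/process/route.ts -- an App Router route handler
import { NextResponse } from "next/server";

export async function POST(request: Request) {
  const { taskId } = await request.json();

  // The tempting pattern: kick off slow work and return immediately.
  // In a serverless environment, work started this way isn't guaranteed
  // to finish once the response has been sent.
  void doLongRunningWork(taskId);

  return NextResponse.json({ accepted: true }, { status: 202 });
}

async function doLongRunningWork(taskId: string) {
  // placeholder for slow backend work, e.g. calling a language model
}
```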
First attempt

I made an attempt to set up TypeChat to see what’s happening on the Node/TypeScript side of language model prompting. I’m less familiar with TypeScript than Python, so I expected to learn some things during the setup. The project provides example projects within the repo, so I tried to pattern my setup off one of those to get the sentiment classifier example running. I manage node with asdf. I’d like to do this with nix one day, but I’m not quite comfortable enough with that yet to keep it from becoming its own rabbit hole.
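For reference, pinning a Node version for the project with asdf looks something like this (the version number is illustrative):

```shell
asdf plugin add nodejs
asdf install nodejs 20.5.0
asdf local nodejs 20.5.0   # writes .tool-versions in the project directory
```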
I downloaded Warp today. I’ve been using iTerm2 for years. It’s worked well for me, but Warp came recommended, so I figured I should be willing to give something different a chance. Warp looks like a pretty standard terminal except you need to sign in, as with most things SaaS these days. It looks like the beta is free, but there is a paid version for teams. Warp treats “workflows” as first-class citizens of the editor experience.
promptfoo is a JavaScript library and CLI for testing and evaluating LLM output quality. It’s straightforward to install and get up and running quickly. As a first experiment, I’ve used it to compare the output of three similar prompts that specify their output structure using different modes of schema definition. To get started:

```shell
mkdir prompt_comparison
cd prompt_comparison
promptfoo init
```

The scaffold creates a prompts.txt file, and this is where I wrote a parameterized prompt to classify and extract data from a support message.
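The exact prompt is specific to that experiment, but the general shape looks something like the sketch below; promptfoo substitutes `{{...}}` variables from the test cases defined in the generated config, and the field names here are illustrative:

```text
Classify the following support message. Respond with JSON containing
"category", "sentiment", and a one-sentence "summary".

Message: {{message}}
```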

Nix Language

To broaden my knowledge of nix, I’m working through an Overview of the Nix Language. Most of the data types and structures are relatively self-explanatory in the context of modern programming languages. Double single quotes strip leading spaces: `'' s '' == "s "`. Functions are a bit unexpected visually, but simple enough with an accompanying explanation. For example, the following is a named function f with two arguments x and y.
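In Nix that is written in curried form, roughly:

```nix
let
  # f takes x and returns a function that takes y
  f = x: y: x + y;
in
  f 1 2  # => 3
```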

Zero to Nix

I started working through the Zero to Nix guide. This is a light introduction that touches on a few of the command line tools that come with nix and how they can be used to build local and remote projects and enter developer environments. While many of the examples are high-level concepts you’d apply when developing with nix, flake templates are one thing I could imagine returning to often.
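For example, a new project can be scaffolded from a template exposed by a flake; the template name below is illustrative:

```shell
# List templates exposed by the registry's `templates` flake, then scaffold from one
nix flake show templates
nix flake init --template "templates#trivial"
```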
I’ve been following the “AI engineering framework” marvin for several months now. In addition to openai_function_call, it’s currently one of my favorite abstractions built on top of a language model. The docs are quite good, but as a quick demo, I’ve ported over a simplified version of an example from an earlier post, this time using marvin.

```python
import json

import marvin
from marvin import ai_model
from pydantic import (
    BaseModel,
)
from typing import (
    List,
)

marvin.
```
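For context, the general shape of the ai_model pattern looks roughly like this; the SupportTicket model and its fields are my own illustration rather than the example from the earlier post:

```python
from typing import List

from marvin import ai_model
from pydantic import BaseModel


@ai_model
class SupportTicket(BaseModel):
    # fields the language model is asked to extract from free-form text
    product: str
    issues: List[str]
    sentiment: str


# Instantiating the decorated class with unstructured text asks the LLM
# to populate the fields.
ticket = SupportTicket("My SuperWidget keeps crashing and nobody has replied to my emails.")
print(ticket)
```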
Go introduced modules several years ago as part of its dependency management system. My Hugo site is still using git submodules to manage its theme. I attempted to migrate to Go modules but eventually ran into a snag when trying to deploy the site. To start, remove the submodule with `git submodule deinit --all`, then remove the themes folder with `git rm -r themes`. To finish the cleanup, remove the theme key from config.
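For context, the module-based replacement for a submodule-managed theme looks roughly like this; the repository paths are placeholders:

```shell
# Initialize the site itself as a module (requires Go to be installed)
hugo mod init github.com/<username>/<site-repo>
```

The theme is then declared as a module import in the site configuration instead of living under themes/:

```toml
[module]
  [[module.imports]]
    path = "github.com/<theme-author>/<theme-repo>"
```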
The threading macro in Clojure provides a more readable way to compose functions together. It’s a bit like a Bash pipeline. The following function takes a string, splits it on a `:`, and trims the whitespace from the result. The threading macro, denoted by `->`, passes the threaded value as the first argument to each form.

```clojure
(defn my-fn [s]
  ;; assumes (:require [clojure.string :as str]) in the namespace declaration
  (-> s
      (str/split #":")  ;; split by ":"
      second            ;; take the second element
      (str/trim)))      ;; remove whitespace from the string
```

There is another threading macro, denoted by `->>`, which passes the threaded value as the last argument to each form.
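A minimal sketch of the thread-last variant:

```clojure
;; ->> threads the value as the last argument of each form
(->> (range 10)
     (map inc)
     (filter even?))
;; => (2 4 6 8 10)
```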
This past week, OpenAI added function calling to their SDK. This addition is exciting because it now incorporates schema as a first-class citizen in calls to OpenAI chat models. As the example code and naming suggest, you can define a list of functions and the schema of the parameters required to call them, and the model will determine whether a function needs to be invoked in the context of the completion, then return JSON adhering to the schema defined for that function.
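A minimal sketch of the call shape, using the Python SDK as it looked at the time (pre-1.0) and a hypothetical get_current_weather function:

```python
import json

import openai

functions = [
    {
        "name": "get_current_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {
                "city": {"type": "string", "description": "City name"},
                "unit": {"type": "string", "enum": ["celsius", "fahrenheit"]},
            },
            "required": ["city"],
        },
    }
]

response = openai.ChatCompletion.create(
    model="gpt-3.5-turbo-0613",
    messages=[{"role": "user", "content": "What's the weather in Tokyo?"}],
    functions=functions,
)

message = response["choices"][0]["message"]
if message.get("function_call"):
    # The model returns the function name plus JSON-encoded arguments
    # that conform to the parameter schema above.
    print(message["function_call"]["name"])
    print(json.loads(message["function_call"]["arguments"]))
```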