As mentioned previously, I've moved this site to a custom Next.js app. Along the way I've been adding some experiments, including some AI-type things, with the assistance of Cursor.
Digging into the underlying developer tools across AI / ML / whatever-we-are-calling-it-these-days, it's clear that Python is the (current) lingua franca, so I found myself wondering if I needed a way to call Python functions somehow.
I was pleasantly surprised to find that Vercel supports Python functions, so I thought I'd give it a try.
It didn't take too long to get it working, but I hit a few speed bumps along the way.
It is possible to have Python functions and Node.js functions in the same project. Your Python functions should live in an api directory at the root of your project (not in app/api, where your Next app's Node.js functions live).
api/
  index.py
app/
  api/
    route.ts
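To make that concrete, here's a minimal sketch of what api/index.py might contain, using FastAPI (which is what the requirements.txt below pulls in). The /api/ai/hello route is just a placeholder for illustration:

from fastapi import FastAPI

app = FastAPI()

# Routes carry the full /api/ai prefix so they line up with the
# rewrite in next.config.js further down; "hello" is just a placeholder.
@app.get("/api/ai/hello")
def hello():
    return {"message": "Hello from Python"}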
You'll also want to add any dependencies to a new requirements.txt file.
fastapi==0.115.0
uvicorn[standard]==0.30.6
While Vercel magic takes care of the different runtimes when you deploy, it's slightly more work to run everything locally. First you need to initialise a Python environment (and install any dependencies).
python3 -m venv venv
source venv/bin/activate
You need to edit the dev command in your package.json to also spin up the Python server (adding the concurrently package as a dev dependency first).
"scripts": {
"fastapi-dev": "pip3 install -r requirements.txt && python3 -m uvicorn api.index:app --reload",
"next-dev": "next dev",
"dev": "concurrently \"npm run next-dev\" \"npm run fastapi-dev\"",
...
},
And you'll also need to add a rewrite in your next.config.js file so that the Next.js app can find the Python functions when running locally (this is apparently taken care of by Vercel magic when you deploy).
/** @type {import('next').NextConfig} */
const nextConfig = {
  rewrites: async () => {
    return [
      {
        source: "/api/ai/:path*",
        destination:
          process.env.NODE_ENV === "development"
            ? "http://127.0.0.1:8000/api/ai/:path*"
            : "/api/",
      },
    ];
  },
};

module.exports = nextConfig;
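Once the rewrite is in place, the Python endpoints can be called from the Next.js side like any other API route. A rough sketch, assuming the placeholder /api/ai/hello route from earlier (callPython is just an illustrative name, and a relative fetch URL like this works from a client component):

// Hypothetical helper, e.g. called from a client component.
// /api/ai/hello is the placeholder FastAPI route sketched above.
export async function callPython(): Promise<{ message: string }> {
  const res = await fetch("/api/ai/hello");
  if (!res.ok) throw new Error(`Request failed: ${res.status}`);
  return res.json();
}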
Lastly, to avoid a strange error when you deploy ("A Serverless Function has exceeded the unzipped maximum size of 250 MB"), you need to add the following to your vercel.json file so that the Python function doesn't erroneously include the Next.js app's files in its build (at least, this is my working hypothesis).
{
  "functions": {
    "api/**": {
      "excludeFiles": "{.next,*.cache,node_modules,public,app}/**"
    }
  }
}
Some of these things are obvious in retrospect (as is so often the case), so why write all this down? What was interesting to me was that, unlike a lot of the development process so far, Cursor was unable to help me resolve those issues, often getting stuck in circular suggestions. As a result I found myself trawling through Google, Stack Overflow and GitHub like it was 2022 again.
That might be a skill issue on my part, but I think it might also speak to some of the limitations in AI's ability to be of assistance - in this case I was working on some relatively recent and specific beta functionality with limited documentation. Notably this did not seem to affect the model's confidence.
In the end, Vercel's examples (Flask, FastAPI) had the answer I needed once I dug into the details, and I'm looking forward to using my newfound powers. But it was also interesting to get a sense of where the edges of Cursor's capabilities are today.
Postscript: it turns out that there is an unexpected interaction between Vercel functions and dynamic Next.js app routes, so there are still some wrinkles to be ironed out (reproduction here).