notpeter (u/notpeter)

r/ZedEditor
Zed: The social, performant, text editor. https://zed.dev

Zed agent with OpenAI: gpt-5.3 and gpt-5.3-codex missing.

notpeter
commented

gpt-5.3-codex is not yet available via API. My understanding is that Cursor and Copilot are the only third parties with access, and even they only got it yesterday (multiple days after release).






r/ZedEditor

Just got Zed: why when I use an LLM does it just paste into a text file?

notpeter
commented

In the Agent panel, do you have the “Ask” profile selected? Try toggling it to “Write”. The default (Ask) won't just start trashing your project by creating/editing your files willy-nilly.
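
If you'd rather have Write as the default, something like this in settings.json should do it (I'm going from memory on the exact key name, so double-check the Agent panel docs):

```json
{
  "agent": {
    "default_profile": "write"
  }
}
```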


r/ZedEditor

Local AI - truly awful by accident or just no money there?

notpeter
commented

Ollama and LM Studio should support automatic model discovery in Zed, but you may need to tune your config to specify a context size appropriate for your hardware (the defaults are conservative so that something works without any config).
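
As a rough sketch, the Ollama side of settings.json looks something like this (the model name and token count are placeholders for whatever fits your hardware, and the exact keys have shifted between releases, so check the current docs):

```json
{
  "language_models": {
    "ollama": {
      "api_url": "http://localhost:11434",
      "available_models": [
        {
          "name": "qwen2.5-coder:14b",
          "display_name": "Qwen 2.5 Coder 14B (32k ctx)",
          "max_tokens": 32768
        }
      ]
    }
  }
}
```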

That said, the Agent panel has a pretty long non-editable system prompt primarily tuned for Anthropic models. For local models with small context windows this is not particularly useful: a roughly 20k-token system prompt may leave little room for your own context.

If you're longing for the "old" text-only Zed Assistant experience (no tool calling), which gives you precise context control (add files, edit previous messages/responses), it's still available under "New Text Thread" and works really well with local models. It's just more manual, because it can't use tools (read_file, edit_file, running terminal commands, etc.).

Another alternative is to configure Claude Code to use local models (Ollama recently announced Anthropic API emulation support) and then use Claude Code inside Zed via the ACP integration, but that still had a 15k+ token system prompt last time I checked. The same can be done with the Qwen Code CLI connecting to local qwen-coder models via OpenAI API semantics.


r/PlaydateConsole
The community is dedicated to Playdate, a handheld gaming console by Panic!

Our first Playdate Game Jam

notpeter
commented

If folks are programming in Lua, I'd recommend you consider notpeter/playdate-luacats (shameless plug, I built and maintain it). With lua-language-server it gives you LSP completions with inline documentation in editors that support it (VS Code, Zed, Neovim, etc.) whenever you're accessing the Playdate API (playdate.*). The Playdate Lua SDK docs are quite good and this just pulls them into your editor.
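
Setup is basically: clone the repo somewhere and point lua-language-server at it, for example with a .luarc.json along these lines (the path is illustrative; the repo README has the exact recommended settings):

```json
{
  "runtime.version": "Lua 5.4",
  "workspace.library": ["/path/to/playdate-luacats"]
}
```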

For pure C, Inside Playdate with C is definitely less detailed than the Lua docs but is plenty to get things going. If you have things that are potentially slow in Lua (math heavy, hard on the Lua GC, etc.), it is quite easy to use C to expose functions into Lua. For example, playdate.graphics.generateQRCode is a pure Lua implementation which is very slow (multiple seconds to generate a long QR code). But if you've happily built everything else in Lua, you can expose a C-based QR code generator as a Lua function which is 75x faster. See notpeter/playdate-qrcode for a trivial example of how to do this.
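
Here's a minimal sketch of what exposing a C function to Lua looks like with the Playdate C API (the fastmath.sumsquares name is made up for illustration; check the SDK's pd_api_lua.h for the exact signatures):

```c
#include "pd_api.h"

static PlaydateAPI* pd = NULL;

// Called from Lua as fastmath.sumsquares(n); returns 1^2 + 2^2 + ... + n^2.
static int l_sumsquares(lua_State* L)
{
    (void)L; // arguments are read through pd->lua, not the lua_State
    int n = pd->lua->getArgInt(1);
    int sum = 0;
    for (int i = 1; i <= n; i++)
        sum += i * i;
    pd->lua->pushInt(sum);
    return 1; // number of values returned to Lua
}

int eventHandler(PlaydateAPI* playdate, PDSystemEvent event, uint32_t arg)
{
    (void)arg;
    if (event == kEventInitLua)
    {
        pd = playdate;
        const char* err;
        // A dotted name registers into a table, so Lua sees fastmath.sumsquares()
        if (!pd->lua->addFunction(l_sumsquares, "fastmath.sumsquares", &err))
            pd->system->logToConsole("addFunction failed: %s", err);
    }
    return 0;
}
```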

The C bindings can be used from a number of other languages (Rust, Swift, Zig, Nim, etc.) with varying levels of ergonomics. For a game jam, most folks should probably stick to Lua or C.

Other links: