LLM CLI 0.28 files (released 2025-12-12):
  • 0.28 source code.tar.gz (235.5 kB)
  • 0.28 source code.zip (279.0 kB)
  • README.md (1.3 kB)
  • Totals: 3 items, 515.8 kB

Release notes for 0.28:
  • New OpenAI models: gpt-5.1, gpt-5.1-chat-latest, gpt-5.2 and gpt-5.2-chat-latest; see the Python API sketch after this list. #1300, #1317
  • LLM now requires Python 3.10 or higher. Python 3.14 is now covered by the tests.
  • When fetching URLs as fragments using llm -f URL, the request now includes a custom user-agent header: llm/VERSION (https://llm.datasette.io/); a request sketch follows this list. #1309
  • Fixed a bug where fragments were not correctly registered with their source when using llm chat. Thanks, Giuseppe Rota. #1316
  • Fixed some file descriptor leak warnings. Thanks, Eric Bloch. #1313
  • Fixed a deprecation warning for asyncio.iscoroutinefunction; see the snippet after this list.
  • Added type annotations for the OpenAI Chat, AsyncChat and Completion execute() methods. Thanks, Arjan Mossel. #1315
  • The project now uses uv and dependency groups for development. See the updated contributing documentation. #1318
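
For the new models, here is a minimal sketch using LLM's Python API. The model ID is taken from the first item above, but the prompt text and the assumption that an OpenAI API key is already configured are illustrative, not from the release notes:

```python
import llm

# Look up one of the newly added models by its ID. This assumes the default
# OpenAI plugin registers "gpt-5.1" under that name and that an OpenAI API
# key has been configured (for example with `llm keys set openai`).
model = llm.get_model("gpt-5.1")

# Run a single prompt and print the response text.
response = model.prompt("Say hello in one short sentence.")
print(response.text())
```

The command-line equivalent is llm -m gpt-5.1 followed by the prompt.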
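
For the fragment user-agent change, a rough sketch of the request shape using httpx. This is not LLM's internal fetch code; the URL is a placeholder and only the header format follows the note above:

```python
import httpx

# Reproduce the header shape described above: "llm/VERSION (https://llm.datasette.io/)".
# The URL here is a placeholder; LLM's own fragment-fetching code may differ
# in other details.
headers = {"user-agent": "llm/0.28 (https://llm.datasette.io/)"}
response = httpx.get("https://example.com/fragment.md", headers=headers)
response.raise_for_status()
print(response.text[:200])
```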
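
The asyncio.iscoroutinefunction warning is raised on Python 3.14 and later, where inspect.iscoroutinefunction is the documented replacement. A small illustration with a throwaway coroutine (the function name is made up for the example):

```python
import asyncio
import inspect


async def fetch_data():
    # Stand-in coroutine used only to demonstrate the check.
    await asyncio.sleep(0)
    return "done"


# asyncio.iscoroutinefunction() emits a DeprecationWarning on Python 3.14+;
# inspect.iscoroutinefunction() is the documented replacement and returns
# the same result for an ordinary coroutine function like the one above.
assert inspect.iscoroutinefunction(fetch_data)
print("fetch_data is a coroutine function")
```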
Source: README.md, updated 2025-12-12