Chat and conversations
Talk with an AI designed to support your work: history, search, export and streaming responses depending on the model used.
Local AI
Ambre helps you summarize, write, code, document and retrieve information by working with your own sources. It keeps your context inside the workspace, without making the cloud a required step.
Ambre can use your documents, code, Markdown notes and sources opened in PANACHES, so you do not have to repeat the same context every time.
Choose generation profiles, personalities, reusable prompts and settings such as temperature, length or response style.
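As an illustration only (this is not PANACHES' or Ambre's actual API, which is not documented here), a reusable generation profile of the kind described above can be sketched as a small structure holding temperature, length and style settings:

```python
from dataclasses import dataclass

# Hypothetical sketch of a reusable generation profile. The field names
# (temperature, max_tokens, style) mirror the settings mentioned above;
# they do not reflect any real PANACHES or Ambre interface.
@dataclass
class GenerationProfile:
    name: str
    temperature: float = 0.7   # higher values give more varied output
    max_tokens: int = 512      # caps response length
    style: str = "concise"     # response style hint

    def as_prompt_header(self) -> str:
        """Render the profile as a short instruction prefix for a prompt."""
        return f"[style: {self.style}, max length: {self.max_tokens} tokens]"

profile = GenerationProfile(name="summary", temperature=0.3, style="neutral")
print(profile.as_prompt_header())  # [style: neutral, max length: 512 tokens]
```

Keeping profiles as named, reusable objects is what lets the same prompt be re-run with different temperature or style settings without retyping them.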
What it does
Ambre is not here to replace your judgment. It helps you save time, keep the thread and work with your own sources.
Local models
The AI module lets you manage compatible local models: downloads, validation, metadata and performance. The final experience depends on the installed model and your machine's hardware.
Catalog, search, favorites, GGUF, safetensors and bin formats, plus manual file selection.
Progress tracking, resume support, integrity validation, installation and local management of model files.
CPU/GPU detection, cache, asynchronous loading, fallbacks and resource settings.
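Integrity validation of a downloaded model file typically comes down to comparing a checksum against the value published in the catalog. A minimal sketch in Python (illustrative only; the module's actual verification logic is not documented here), hashing in chunks so multi-gigabyte GGUF or safetensors files do not need to fit in memory:

```python
import hashlib
from pathlib import Path

def file_sha256(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a potentially very large model file in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def is_valid(path: Path, expected_sha256: str) -> bool:
    """Compare the local file's hash against the expected catalog value."""
    return file_sha256(path) == expected_sha256.lower()

# Example: write a tiny stand-in file and validate it against its own hash.
p = Path("model.bin")
p.write_bytes(b"demo weights")
print(is_valid(p, file_sha256(p)))  # True for a matching hash
```

The same hash check also supports resume logic: after re-downloading missing bytes, a final full-file hash confirms the assembled file is intact.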
Go further
Guides
Local AI
PANACHES lets you choose the local AI engine that fits your documents, machine and creative rhythm.
Local AI
PANACHES connects documents, notes and Ambre so you can find the right information and work with source-aware answers.
Security
PANACHES keeps the desktop application at the center so files, notes and sources stay in your work environment.
Resources
Prompt
Short prompts to summarize, compare, critique and transform long sources.
Guide
Build a mini knowledge base and ask Ambre for source-aware answers.