
AI Meeting Manager
Turn every meeting into a living, searchable memory for your team
Available now
Meeting intelligence without the black box
AIMM is not just transcription. It is an AI pipeline for team memory.
The app takes an audio or video recording and moves it through controlled stages: conversion, local transcription, semantic cleanup, minutes, summaries, search, and management plugins. The result is not a wall of text. It is structured knowledge your team can inspect, reuse, and automate.
For users
Less note-taking, more presence
Record the meeting, run the pipeline, and come back to a clean transcript, readable minutes, and summary-ready context.
For teams
Meetings become a knowledge base
Decisions, risks, questions, and tasks stop living in memory. They become artifacts that can be searched, reviewed, and connected.
For buyers
A platform, not a closed app
The paid edition can grow through plugins: LLM providers, management extraction, search, integrations, prompt governance, and automation.
How the pipeline works
AIMM keeps the engineering visible instead of hiding it behind magic. Each stage has a job, each plugin has boundaries, and every useful result becomes an artifact.
Stage 01
Convert
The app prepares audio and video for reliable AI processing before any model sees the data.
- Normalizes media into a stable internal audio format
- Keeps timing and source metadata for traceability
- Uses bundled runtime tools in the release package
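The normalization step can be sketched as building an ffmpeg command that converts any input into 16 kHz mono PCM WAV, a common input format for whisper.cpp. The exact flags and folder layout AIMM uses are assumptions here, for illustration only:

```python
from pathlib import Path

def build_convert_command(source: str, target_dir: str = "work") -> list[str]:
    """Return an ffmpeg argument list producing a stable internal audio format.

    The flags shown are a plausible choice for whisper.cpp input,
    not necessarily the app's actual invocation.
    """
    out = Path(target_dir) / (Path(source).stem + ".wav")
    return [
        "ffmpeg",
        "-i", source,          # any supported audio/video container
        "-vn",                 # drop the video stream
        "-ac", "1",            # mono
        "-ar", "16000",        # 16 kHz sample rate
        "-c:a", "pcm_s16le",   # 16-bit PCM
        str(out),
    ]

cmd = build_convert_command("standup.mp4")
```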
Stage 02
Transcribe
Whisper-based local transcription turns speech into text while keeping the workflow private by default.
- Free installer includes Whisper Tiny for immediate testing
- Stronger Whisper models can be downloaded during setup
- Language auto-detect is available, with explicit language mode for safer runs
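whisper.cpp emits timestamped segments, and keeping those timings is what makes later artifacts traceable. A minimal parser for its default `[HH:MM:SS.mmm --> HH:MM:SS.mmm] text` line layout might look like this (AIMM's internal segment format is an assumption):

```python
import re

# Matches whisper.cpp's default timestamped output lines.
SEGMENT = re.compile(
    r"\[(\d{2}:\d{2}:\d{2}\.\d{3}) --> (\d{2}:\d{2}:\d{2}\.\d{3})\]\s*(.*)"
)

def parse_segments(text: str) -> list[dict]:
    """Turn raw transcriber output into segments with start/end timing."""
    segments = []
    for line in text.splitlines():
        m = SEGMENT.match(line.strip())
        if m:
            segments.append({"start": m.group(1), "end": m.group(2), "text": m.group(3)})
    return segments

sample = "[00:00:00.000 --> 00:00:04.200]  Let's review the release plan."
segs = parse_segments(sample)
```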
Stage 03
Understand
Text processing and LLM plugins refine the raw transcript into usable meeting knowledge.
- Semantic blocks make long conversations easier to scan
- Minutes extraction finds decisions, tasks, risks, and questions
- Local or cloud LLM plugins can generate richer summaries
Included in the free release
A strong local starter set
The free build is designed for first value without setup pain: transcribe locally, clean the text, extract structure, and experiment with local summaries.
Whisper Advanced Local Transcription
Private speech-to-text that starts immediately.
Runs whisper.cpp locally and ships with Whisper Tiny for a fast first run. Users can download stronger models in the installer when accuracy matters more than startup speed.
Semantic Refiner
Turns raw transcript text into meaning blocks.
Cleans transcript noise, groups the conversation into semantic blocks, and extracts important keywords so users can understand the meeting faster.
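As a rough illustration of the keyword side of this step, here is a naive frequency-based extractor with a stopword filter. It is a simplified stand-in, not the plugin's actual method:

```python
from collections import Counter

# Toy stopword list for illustration; a real refiner would use a larger,
# language-aware set.
STOPWORDS = {"the", "a", "an", "and", "or", "to", "of", "we", "is", "it", "in", "for", "on"}

def extract_keywords(text: str, top_n: int = 5) -> list[str]:
    """Return the most frequent non-stopword terms in the text."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    counts = Counter(w for w in words if w and w not in STOPWORDS)
    return [w for w, _ in counts.most_common(top_n)]

kws = extract_keywords("We ship the release on Friday. The release needs release notes.")
```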
Minutes Heuristic v2
Finds the practical shape of the conversation.
Creates structured minutes with decisions, tasks, projects, questions, and risks. It gives teams the first layer of order before any heavy LLM work.
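A heuristic of this kind can be sketched as keyword-driven classification of transcript lines. The trigger words below are illustrative assumptions; the real plugin's rules are not shown here:

```python
# Illustrative trigger phrases per minutes category.
RULES = {
    "decisions": ("we decided", "agreed to", "decision:"),
    "tasks": ("action item", "will do", "todo:", "assigned to"),
    "risks": ("risk", "blocker", "concern"),
    "questions": ("?",),
}

def extract_minutes(lines: list[str]) -> dict[str, list[str]]:
    """Assign each line to the first category whose trigger it contains."""
    minutes = {key: [] for key in RULES}
    for line in lines:
        lowered = line.lower()
        for key, triggers in RULES.items():
            if any(t in lowered for t in triggers):
                minutes[key].append(line)
                break  # first matching category wins
    return minutes

minutes = extract_minutes([
    "We decided to ship on Friday.",
    "Action item: update the release notes.",
    "Is the staging server ready?",
])
```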
Llama.cpp Local LLM
Local summaries without sending meetings away.
Uses llama.cpp and local GGUF models for summary generation. The installer can preload starter Qwen models so the first serious summary run is not stalled by a long model download.
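Before any local model runs, the summary stage has to turn minutes artifacts into a prompt. A sketch of that hand-off (the prompt wording and structure are assumptions, not the plugin's actual template):

```python
def build_summary_prompt(minutes: dict[str, list[str]], max_items: int = 10) -> str:
    """Flatten minutes sections into a single prompt for a local LLM."""
    parts = ["Summarize this meeting. Key points:"]
    for section, items in minutes.items():
        for item in items[:max_items]:
            parts.append(f"- [{section}] {item}")
    parts.append("Write a short paragraph covering decisions and open questions.")
    return "\n".join(parts)

prompt = build_summary_prompt(
    {"decisions": ["Ship on Friday."], "questions": ["Is staging ready?"]}
)
```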
Paid edition and plugin platform
The commercial value is in the plugin ecosystem
AIMM is built so advanced capabilities can be packaged as plugins. Buyers can start with the free local core and grow into search, management, cloud LLMs, integrations, and workflow automation.
Local LLM
Ollama LLM
Connects to an Ollama runtime for teams that already manage local models and want flexible offline summaries.
Best for private teams that want model choice without cloud calls.
Cloud LLM
OpenRouter Gateway
One integration for many cloud models, letting teams choose speed, cost, or quality per meeting type.
Best for buyers who want premium model quality without locking into one provider.
Cloud LLM
DeepSeek LLM
Adds DeepSeek models for cost-efficient summaries and stronger reasoning on complex analytical meetings.
Best for long, technical, legal, or strategy discussions.
Cloud LLM
Z.ai
Adds another LLM route for teams that need strong multilingual summaries and long-context processing.
Best for multilingual teams and large strategy sessions.
Management
Unified Management
Extracts tasks, projects, agendas, and follow-up candidates without silently creating clutter.
Best for teams that want AI suggestions with human approval.
Search
Smart Semantic Search
Finds meeting fragments by meaning, not just exact words, and can return answers with source context.
Best for turning the meeting archive into a usable memory layer.
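Meaning-based lookup typically reduces to comparing embedding vectors. The toy vectors below stand in for real model embeddings, and cosine similarity is one common (assumed) ranking choice:

```python
import math

def cosine(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two vectors; 0.0 for zero-length input."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def semantic_search(query_vec: list[float], fragments: dict[str, list[float]]) -> str:
    """Return the fragment whose embedding is closest to the query."""
    return max(fragments, key=lambda frag: cosine(query_vec, fragments[frag]))

best = semantic_search(
    [0.9, 0.1],  # embedding of a query like "was the budget approved?"
    {
        "budget approved for Q3": [0.8, 0.2],
        "office plants need watering": [0.1, 0.9],
    },
)
```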
Search
Simple Search
Fast full-text search across meeting artifacts for instant recall when exact terms are known.
Best for quick lookup across transcripts and summaries.
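Exact-term lookup of this kind is classically backed by an inverted index. A tiny sketch (a real implementation would add tokenization, stemming, and ranking):

```python
from collections import defaultdict

def build_index(docs: dict[str, str]) -> dict[str, set[str]]:
    """Map each lowercased term to the set of document ids containing it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word.strip(".,?!")].add(doc_id)
    return index

index = build_index({
    "2024-05-01-standup": "Release blocked on the staging server.",
    "2024-05-02-retro": "Staging server fixed, release unblocked.",
})
hits = index["staging"]
```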
Workflow
Prompt Manager
Centralizes prompt profiles for short summaries, detailed minutes, task-only extraction, and team-specific formats.
Best for consistent AI output across departments.
Workflow
Meeting Topic Suggester
Suggests clean meeting titles from AI artifacts so archives stay readable without manual naming work.
Best for teams that process many recordings.
Automation
Management Index
Creates a compact machine-readable index of tasks, projects, agendas, and decisions.
Best for downstream automation and internal systems.
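A compact index like this is easiest to consume downstream as JSON. The field names below are assumptions for illustration, not the plugin's actual schema:

```python
import json

def build_management_index(meeting_id: str, minutes: dict[str, list[str]]) -> str:
    """Serialize a compact, machine-readable index of management artifacts."""
    index = {
        "meeting": meeting_id,
        "counts": {section: len(items) for section, items in minutes.items()},
        "tasks": minutes.get("tasks", []),
        "decisions": minutes.get("decisions", []),
    }
    return json.dumps(index, separators=(",", ":"))  # compact encoding

payload = build_management_index(
    "2024-05-01-standup",
    {"tasks": ["Update release notes"], "decisions": ["Ship Friday"]},
)
```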
Automation
AI Meeting Bridge
Exports tasks and meeting outcomes into tools such as Todoist and Trello.
Best for closing the gap between discussion and execution.
Integration
Webhook Ingest Reference
Shows how meeting results can be delivered to external HTTP endpoints and custom infrastructure.
Best for custom enterprise integration paths.
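One common pattern for this kind of delivery is an HMAC-signed JSON payload, so the receiving endpoint can verify the sender. The header name and signing scheme here are assumptions for illustration:

```python
import hashlib
import hmac
import json

def sign_payload(secret: bytes, payload: dict) -> tuple[bytes, str]:
    """Encode the payload and compute an HMAC-SHA256 signature over it."""
    body = json.dumps(payload, sort_keys=True).encode()
    signature = hmac.new(secret, body, hashlib.sha256).hexdigest()
    return body, signature

body, signature = sign_payload(
    b"shared-secret",
    {"meeting": "2024-05-01-standup", "summary": "Release ships Friday."},
)
# Hypothetical header name; an HTTP client would POST `body` with these
# headers to the configured endpoint.
headers = {"Content-Type": "application/json", "X-AIMM-Signature": signature}
```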
Free
Private local core
A complete starter edition for individuals and small teams that want to test the workflow on real recordings.
- Portable Windows installer
- Whisper Tiny included for quick startup
- Local transcription, semantic cleanup, minutes, and local LLM support
- Optional model downloads during installation
Commercial
Expandable meeting intelligence
A paid edition can package premium plugins, team workflows, LLM provider choices, and integration capabilities.
- Advanced management extraction and approval workflows
- Cloud and local LLM provider packs
- Semantic search and answer-with-sources plugins
- Prompt governance, integrations, and automation hooks
Frequently asked questions
Is my data secure?
The free workflow is local-first. Recordings, transcripts, and artifacts are stored on the user's machine by default. Cloud LLM plugins are explicit add-ons, not hidden background calls.
Why does the installer download models?
The release includes a small Whisper model for immediate testing. The installer can also preload stronger Whisper, semantic, or summary models so users are not left mid-pipeline later, waiting on a long download and wondering whether it has frozen.
What formats are supported?
The pipeline is designed for common audio and video files such as WAV, MP3, M4A, MP4, and MKV. Conversion runs before AI stages to create predictable input.
Can meetings be reprocessed later?
Yes. AIMM keeps source files and artifacts so users can rerun stages with stronger models, different language settings, or new plugins.
How is this different from an AI note taker?
AIMM is a pipeline and plugin platform. Notes are only one output. The larger goal is a meeting memory system that supports search, management, integrations, and automation.
Start with the free local edition
The public download will open once server-side free licensing, the required-update policy, and a signed release manifest are connected. This lets every installation be moved to a safe version when needed.
Step 01
Free activation
First run receives a free server license and device token.
Step 02
Update policy
The app checks optional, required, and blocked updates for every installation.
Step 03
Signed release
Users download a file with checksum, manifest, and release signature.
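Verifying the downloaded file against its manifest checksum is the simplest of these checks. A sketch (the manifest layout and field names are assumptions for illustration):

```python
import hashlib

def verify_checksum(data: bytes, expected_sha256: str) -> bool:
    """Return True if the file bytes match the manifest's SHA-256 digest."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

release_bytes = b"installer-bytes"  # stand-in for the downloaded file
manifest = {"sha256": hashlib.sha256(release_bytes).hexdigest()}
ok = verify_checksum(release_bytes, manifest["sha256"])
```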
The package is intentionally not published yet, because the current build is not connected to server-side license and update control.
Downloads will go through a server endpoint that counts them, and the file itself will be published only as a protected release asset.