logconsolidator ingests your log files in real time, persists them to PostgreSQL and a ChromaDB vector store, and lets you ask natural-language questions over the result — with Grafana dashboards and auto-generated daily reports out of the box.
Two queues, two output adapters, one process. Producers can never overwhelm consumers — by design.
Built on queue.Queue, with natural backpressure. You don't choose between dashboards and natural-language search: the pipeline writes both, and the API serves both.
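The backpressure claim can be sketched with the standard library alone. A bounded `queue.Queue` makes `put()` block once the queue is full, so a fast producer is throttled to the consumer's pace; the names below are illustrative, not from the codebase:

```python
import queue
import threading

# Bounded queue: put() blocks once maxsize lines are in flight,
# so a noisy producer can never outrun the consumer.
lines: "queue.Queue[str]" = queue.Queue(maxsize=1000)

def producer(raw_lines):
    for line in raw_lines:
        lines.put(line)          # blocks while the queue is full
    lines.put(None)              # sentinel: no more input

def consumer(sink):
    while True:
        line = lines.get()
        if line is None:
            break
        sink.append(line)        # stand-in for the real output adapters
        lines.task_done()

sink: list = []
t = threading.Thread(target=consumer, args=(sink,))
t.start()
producer(f"line {i}" for i in range(5000))  # 5x the queue capacity
t.join()
```

Even though the producer emits five times the queue's capacity, nothing is dropped: the producer simply stalls until the consumer catches up.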
A polling FileWatcher tails your sources and pushes lines into a bounded queue, so a noisy producer can't take down the pipeline.
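A minimal polling tail loop in the same spirit might look like this, assuming a plain-text source and a bounded output queue; the function and parameter names are illustrative:

```python
import os
import time
import queue

def tail(path: str, out: queue.Queue, stop, poll_interval: float = 0.5):
    """Poll a file for appended lines and push them into a bounded queue."""
    with open(path, "r") as f:
        f.seek(0, os.SEEK_END)               # start at the current end of file
        while not stop.is_set():
            line = f.readline()
            if line:
                out.put(line.rstrip("\n"))   # blocks if the queue is full
            else:
                time.sleep(poll_interval)    # nothing new yet; poll again
```

Because `out.put()` blocks on a full queue, the watcher itself participates in the backpressure chain rather than buffering unboundedly.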
Every structured LogEntry is persisted via psycopg v3 — durable, queryable, and ready for your existing SQL tooling.
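Persisting an entry with psycopg v3 might look like the sketch below; the table name and columns are assumptions, not the project's actual schema, and the driver is imported lazily so the sketch reads without a live database:

```python
# Sketch only: requires `pip install psycopg` and a reachable Postgres.
# Table and column names are assumed for illustration.
INSERT_SQL = """
    INSERT INTO log_entries (ts, source_id, service, event_type, severity, raw)
    VALUES (%(ts)s, %(source_id)s, %(service)s, %(event_type)s, %(severity)s, %(raw)s)
"""

def persist_entry(dsn: str, entry: dict) -> None:
    import psycopg  # lazy import: only needed when actually persisting
    # The connection context manager commits on clean exit (psycopg v3 semantics).
    with psycopg.connect(dsn) as conn:
        conn.execute(INSERT_SQL, entry)
```

Named `%(field)s` placeholders let the same dict that represents a LogEntry be passed straight through as query parameters.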
Each entry is upserted as a vector document into a persistent local collection, enabling semantic similarity search out of the box.
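An upsert into a persistent Chroma collection could be sketched as below. The collection name, id scheme, and metadata fields are illustrative; building the payload is plain Python, and only the commented-out tail needs `chromadb` installed:

```python
def to_chroma_payload(entries: list) -> dict:
    """Turn structured entries into the parallel id/document/metadata lists Chroma expects."""
    return {
        # Stable ids (assumed scheme) make repeated upserts idempotent.
        "ids": [f'{e["source_id"]}:{e["line_no"]}' for e in entries],
        "documents": [e["raw"] for e in entries],
        "metadatas": [{"service": e["service"], "severity": e["severity"]} for e in entries],
    }

payload = to_chroma_payload([
    {"source_id": "ssh_auth", "line_no": 42, "raw": "Failed password for root",
     "service": "sshd", "severity": "medium"},
])

# Requires `pip install chromadb`; persists to ./chroma on disk (path assumed):
# import chromadb
# collection = chromadb.PersistentClient(path="./chroma").get_or_create_collection("logs")
# collection.upsert(**payload)
```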
Postgres is wired directly as a Grafana data source — build dashboards and run ad-hoc SQL on live log data without touching the app.
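Wiring Postgres in can be done with Grafana's file-based datasource provisioning. Every value below (names, host, database, credentials) is an assumption to adapt to your environment:

```yaml
# e.g. /etc/grafana/provisioning/datasources/logconsolidator.yaml
apiVersion: 1
datasources:
  - name: logconsolidator-postgres   # assumed display name
    type: postgres
    url: localhost:5432              # assumed host:port
    user: grafana_reader             # assumed read-only role
    secureJsonData:
      password: $GRAFANA_PG_PASSWORD
    jsonData:
      database: logs                 # assumed database name
      sslmode: disable               # tighten for non-local deployments
```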
Chroma retrieves the most relevant entries by semantic similarity, OpenAI streams a grounded answer back — all behind one SSE endpoint.
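On the wire, the streamed answer is ordinary Server-Sent Events framing: each chunk becomes one or more `data:` lines terminated by a blank line. A minimal formatter (names illustrative):

```python
def sse_event(chunk: str, event: str = None) -> str:
    """Frame one text chunk as a Server-Sent Event."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    # Multi-line payloads become multiple data: lines, per the SSE format.
    lines.extend(f"data: {part}" for part in chunk.split("\n"))
    return "\n".join(lines) + "\n\n"
```

A streaming endpoint would yield `sse_event(token)` for each model token, and the browser's `EventSource` reassembles them on the other side.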
A background scheduler runs the same RAG engine over the last 24 hours of Postgres data and emits a Markdown report — no setup needed.
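The report's time window is just "now minus 24 hours" in UTC. A sketch of the window computation, with the SQL shape it would feed (table and column names assumed):

```python
from datetime import datetime, timedelta, timezone

def report_window(now=None):
    """Return the [start, end) window covering the last 24 hours, in UTC."""
    end = now or datetime.now(timezone.utc)
    return end - timedelta(hours=24), end

# The scheduler would then hand this window to a query such as (schema assumed):
# SELECT * FROM log_entries WHERE ts >= %(start)s AND ts < %(end)s
```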
A single python3 main.py starts the ingest pipeline, the HTTP server, and the report scheduler, with unified signal handling so Ctrl-C drains everything cleanly.
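Clean shutdown typically means trading the default KeyboardInterrupt for an explicit stop flag that lets each stage drain its queue before exiting; a sketch under that assumption (names illustrative):

```python
import queue
import signal
import threading

stop = threading.Event()

def handle_sigint(signum, frame):
    stop.set()                       # ask every stage to finish what's queued

signal.signal(signal.SIGINT, handle_sigint)

def drain(q: queue.Queue, sink: list) -> None:
    """Consume until stop is requested AND the queue is empty."""
    while not (stop.is_set() and q.empty()):
        try:
            sink.append(q.get(timeout=0.1))
        except queue.Empty:
            continue
```

The key detail is the exit condition: a stage stops only once shutdown was requested *and* its queue is empty, so in-flight lines still reach Postgres and Chroma.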
Answers stream from /query as Server-Sent Events, alongside /reports CRUD and the static front-end. Configure your sources, point at a Postgres instance, and run a single command; the HTTP API and the daily reporter are already wired up.
```json
{
  "id": "ssh_auth",
  "path": "/var/log/secure.log",
  "parser": {
    "type": "regex",
    "patterns": {
      "timestamp": "(\\w+\\s+\\d+\\s+\\d+:\\d+:\\d+)",
      "session_id": "\\[(\\d+)\\]",
      "port": "port\\s+(\\d+)"
    }
  },
  "classify": [
    {
      "match": "failed password",
      "service": "sshd",
      "event_type": "failed_login",
      "severity": "medium",
      "is_security_relevant": true
    }
  ]
}
```
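Given a source config like the one above, ingestion reduces to running each named pattern and classify rule over a raw line. A sketch using the same patterns (the sample log line is invented):

```python
import re

# The "patterns" block from the config above, as Python raw strings.
patterns = {
    "timestamp": r"(\w+\s+\d+\s+\d+:\d+:\d+)",
    "session_id": r"\[(\d+)\]",
    "port": r"port\s+(\d+)",
}

def parse(line: str) -> dict:
    """Apply each named regex; fields that don't match simply stay absent."""
    fields = {}
    for name, pat in patterns.items():
        m = re.search(pat, line)
        if m:
            fields[name] = m.group(1)
    # The "classify" rule from the config: a substring match adds labels.
    if "failed password" in line.lower():
        fields.update(service="sshd", event_type="failed_login",
                      severity="medium", is_security_relevant=True)
    return fields

entry = parse("Jun  3 09:14:02 host sshd[4721]: Failed password for root from 10.0.0.5 port 22")
```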
```shell
# 1. install
pip install -r requirements.txt

# 2. run everything (ingest + API + scheduler)
python3 main.py

# 3. run only the RAG / HTTP API
uvicorn logconsolidator.api.server:create_app --factory \
  --host 0.0.0.0 --port 8000

# 4. run only the log formatter
python3 -m src.logconsolidator
```
Run logconsolidator alongside your existing services and turn unstructured logs into a knowledge base your whole team can query.