Troubleshooting

Gather context first

Before digging into specific symptoms, collect three things:

docker compose ps                # is the container actually running?
docker compose logs --tail=200 taka   # recent server logs
docker compose exec taka sh -c 'ls -la /data'   # database present?

The logs are almost always where the real error surfaces.

Web UI won’t load

Check: docker compose ps. Is the container Up?

If it’s restarting in a loop, the logs will show why. The most common causes are:

  • Port already in use. Change TAKA_PORT in .env to a free port (see Configuration).
  • Permission denied on /data. Happens with bind mounts when the host directory isn’t writable by the container. chown the host directory or switch to the named volume (see Data Persistence).
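For example, to track down both causes from the host (the port 7331 and the ./data path are placeholders for your TAKA_PORT and bind-mount path, and the container's runtime UID depends on the image, so check it rather than assuming):

```shell
# What is already holding the port? (adjust :7331 to your TAKA_PORT)
ss -ltnp | grep :7331

# Which UID does the container run as? Make the bind-mounted
# directory writable by that UID (./data is a placeholder path).
docker compose exec taka id -u
sudo chown -R "$(docker compose exec taka id -u)" ./data
```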

If the container is up but the browser can’t reach it:

  • Confirm the port mapping: docker compose port taka 7331.
  • If you bound to 127.0.0.1, you need a reverse proxy or SSH tunnel to reach it from another host.
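A minimal SSH tunnel for that case, assuming the server is reachable as user@scanhost and Taka is bound to 127.0.0.1:7331 (both placeholders):

```shell
# Forward local port 7331 to the loopback-bound service on the remote host
ssh -N -L 7331:127.0.0.1:7331 user@scanhost
# Then browse to http://localhost:7331 on your own machine
```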

Scans never finish / hang mid-way

  • Check the target is reachable from inside the container. docker compose exec taka sh -c 'wget -qO- https://target.example.com'.
  • Lower concurrency. If the target (or a WAF in front of it) is rate-limiting you, the scanner may slow down or stall. On the New Scan form (Advanced Options), drop Concurrency and raise Request Delay (ms).
  • Live updates stopped but the scan is still running. Usually a WebSocket timeout at the reverse proxy. The scan itself is unaffected; refresh the page to pull the latest state. Fix the proxy config to keep WebSockets alive.
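If the proxy in front of Taka is nginx, the relevant directives look roughly like this (a sketch, not taken from Taka's docs; the upstream address and timeout are examples to adapt):

```nginx
location / {
    proxy_pass http://127.0.0.1:7331;        # upstream address is an example
    proxy_http_version 1.1;                  # required for WebSocket upgrade
    proxy_set_header Upgrade $http_upgrade;  # pass the upgrade handshake through
    proxy_set_header Connection "upgrade";
    proxy_read_timeout 1h;                   # don't drop idle WebSockets
}
```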

Crawler misses JavaScript-rendered pages

The image bundles Chromium for headless crawling. Common reasons it may fail:

  • Not enough memory. Chromium needs 1–2 GB free. Give the container at least 4 GB for deep scans (see Configuration).
  • Kernel disallows user namespaces. Required by Chromium’s sandbox. Most modern Linux distributions allow them by default; check sysctl kernel.unprivileged_userns_clone on Debian/Ubuntu.
  • Running on Docker Desktop for Mac/Windows with low memory. Raise the VM memory in Docker Desktop settings.
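One way to give the container that headroom is in your compose file (the service name taka matches the commands above; the limits are illustrative, and shm_size matters because Chromium uses /dev/shm, where Docker's 64 MB default can crash tabs):

```yaml
services:
  taka:
    mem_limit: 4g      # headless Chromium is memory-hungry on deep scans
    shm_size: "1gb"    # raise Docker's small /dev/shm default for Chromium
```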

AI Verification errors

“Invalid API key”: the key saved in Settings doesn’t match the selected provider. Paste the key again and confirm it has the right prefix (sk-ant-… for Anthropic, sk-… for OpenAI).
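As a quick sanity check before pasting, you can eyeball the prefix yourself (the key below is a fake placeholder):

```shell
key="sk-ant-api03-XXXX"   # placeholder, not a real key
case "$key" in
  sk-ant-*) echo "prefix looks like an Anthropic key" ;;
  sk-*)     echo "prefix looks like an OpenAI key" ;;
  *)        echo "unrecognized key prefix" ;;
esac
```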

“Rate limit exceeded”: your provider account is throttled. Switch to a cheaper model, narrow the Findings to Verify scope, or wait for the limit to reset.

“Partial Result” or “Verification Failed” verdicts: some smaller or older models produce malformed structured output that Taka can only partially parse. Use the Re-verify button on the Finding Detail page with a stronger model (a flagship Claude Sonnet or a GPT-4-class model).

Target appears unreachable: in Active Verification mode, Taka performs a pre-flight reachability check before calling the LLM. If the target is intentionally off-network (e.g. an asset you captured traffic from but cannot hit directly), tick Skip reachability check in the AI Verification drawer and proceed.

Reports fail to download

Reports are generated on request from the database. Causes:

  • Scan still running. Wait for Completed.
  • Volume full. docker system df on the host. If /data is out of space, exports fail silently; free space, then retry.
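To check the volume from inside the container rather than host-wide (this assumes Taka's image ships sh and df, which most base images do):

```shell
docker compose exec taka df -h /data   # how full is the data volume?
docker system df                       # host-wide Docker disk usage
```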

Resetting without losing scans

If the UI is broken but you think the database is fine:

docker compose down
docker compose up -d

If that doesn’t help, pull a known-good tag (see Updating & Lifecycle); the database schema is forward-compatible.

Starting completely fresh

docker compose down -v
docker compose up -d

This destroys scan history, findings, and API keys. Back up first if any of that matters to you.
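One way to take that backup is to tar the volume from a throwaway container (the volume name taka_data is a guess; list yours with docker volume ls first):

```shell
docker volume ls                        # find the actual volume name
docker run --rm \
  -v taka_data:/data:ro \
  -v "$(pwd)":/backup \
  alpine tar czf /backup/taka-data.tgz -C /data .
```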

Still stuck?

Open an issue with the output of:

docker compose version
docker compose ps
docker compose logs --tail=500 taka

Redact URLs, tokens, and anything else sensitive before sharing.