
Feature 39: Audit log system

Author: Masahiro Aoki

Implementation date: February 20, 2026 Version: 1.0.0 Status: ✅ Implemented

Overview

EvoSpikeNet's audit logging system (Feature 39) provides tamper-evident audit logging using SHA-256 hash chains. It supports compliance requirements (SOC 2, ISO 27001, etc.) and forensic investigation of security incidents.

All HTTP requests are automatically logged by Starlette middleware.


How hash chain works

Each AuditEntry contains the following information; because each entry's hash covers the hash of the previous entry, tampering can be detected.

Entry N-1:  hash = SHA256(id + timestamp + action + ... + prev_hash)
                    │
                    ▼  prev_hash
Entry N:    hash = SHA256(id + timestamp + action + ... + prev_hash_N)
                    │
                    ▼  prev_hash
Entry N+1:  hash = SHA256(id + timestamp + action + ... + prev_hash_N+1)

If any single entry is tampered with, the hashes of all subsequent entries become inconsistent, and the break is detected by verify_chain().
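This chaining can be sketched in a few lines of Python. The field set, serialization order, and genesis value below are illustrative assumptions, not the actual AuditEntry layout:

```python
import hashlib
import json

def compute_hash(entry: dict, prev_hash: str) -> str:
    """Hash the entry's fields together with the previous entry's hash."""
    payload = json.dumps(
        {k: entry[k] for k in sorted(entry) if k not in ("prev_hash", "entry_hash")},
        sort_keys=True,
    ) + prev_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()

def verify_chain(entries: list) -> tuple:
    """Re-hash every entry in order; any mismatch breaks all later links."""
    prev = "0" * 64  # assumed genesis value for the first entry
    for i, e in enumerate(entries):
        if e["prev_hash"] != prev:
            return False, i, f"prev_hash mismatch at entry {i}"
        if compute_hash(e, prev) != e["entry_hash"]:
            return False, i, f"hash mismatch at entry {i}"
        prev = e["entry_hash"]
    return True, len(entries), None
```

Editing any field of an earlier entry changes its recomputed hash, so verification fails at that index even though the stored hashes of later entries are untouched.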


Core components

AuditEntry

A single audit log record. Integrity is ensured with a SHA-256 hash.

from evospikenet.audit_log import AuditEntry, AuditAction, AuditResult
from datetime import datetime, timezone

entry = AuditEntry(
    action=AuditAction.MODEL_LOAD,
    actor="admin_user",
    resource="/api/models/load",
    result=AuditResult.SUCCESS,
    detail={"model": "genome_v3"},
    prev_hash="d9e8f7a6...",
)
print(entry.entry_hash)  # SHA256 hash

Fields:

| Field | Type | Description |
| --- | --- | --- |
| id | str | UUID v4, auto-generated |
| timestamp | str | UTC ISO-8601 timestamp |
| action | str (AuditAction) | Action performed |
| actor | str | Operator (username, system, etc.) |
| resource | str | Target resource (path, etc.) |
| result | str (AuditResult) | success / failure / partial |
| detail | dict | Additional information (optional) |
| ip_address | str | Client IP (optional) |
| prev_hash | str | Hash of the previous entry (chain link) |
| entry_hash | str | SHA-256 hash of this entry |

AuditLogManager

The audit log management class. Entries are persisted to file in NDJSON format.

from evospikenet.audit_log import AuditLogManager, AuditAction
from datetime import datetime, timezone

# Instance creation (usually using a singleton)
log = AuditLogManager(
    log_dir="data/audit_logs",   # log directory
    max_memory=10000,            # In-memory retention count
)

# write entry
entry = log.log(
    action=AuditAction.MODEL_TRAIN,
    actor="admin_user",
    resource="/api/models/genome_v3",
    result="success",
    detail={"version": "3.2.1"},
)

# search
entries = log.query(
    actor="admin_user",
    action_prefix="model.",
    result="success",
    since=datetime(2026, 2, 1, tzinfo=timezone.utc),
    limit=100,
)

# hash chain verification
valid, checked, error = log.verify_chain()

# export
json_data = log.export_json()   # list of dicts
csv_data = log.export_csv()     # CSV string
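export_csv() flattens entries into rows; the following is a minimal stand-in using only the standard library. The helper name and column set are assumptions for illustration, not the manager's actual implementation:

```python
import csv
import io

def entries_to_csv(entries: list) -> str:
    """Serialize audit entries to a CSV string (illustrative column set)."""
    columns = ["id", "timestamp", "action", "actor", "resource", "result"]
    buf = io.StringIO()
    # extrasaction="ignore" drops fields such as detail that don't map to a column
    writer = csv.DictWriter(buf, fieldnames=columns, extrasaction="ignore")
    writer.writeheader()
    for entry in entries:
        writer.writerow(entry)
    return buf.getvalue()
```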

AuditAction list

class AuditAction(str, Enum):
    # Auth
    AUTH_LOGIN = "auth.login"
    AUTH_LOGOUT = "auth.logout"
    AUTH_FAILED = "auth.failed"
    API_KEY_CREATE = "auth.api_key.create"
    API_KEY_REVOKE = "auth.api_key.revoke"

    # Model / inference
    MODEL_LOAD = "model.load"
    MODEL_INFERENCE = "model.inference"
    CHECKPOINT_SAVE = "checkpoint.save"
    CHECKPOINT_RESTORE = "checkpoint.restore"

    # Evolution / genome
    EVOLUTION_START = "evolution.start"
    EVOLUTION_STOP = "evolution.stop"
    GENOME_SAVE = "genome.save"
    GENOME_APPLY = "genome.apply"
    FITNESS_OVERRIDE = "evolution.fitness_override"

    # Data
    DATA_READ = "data.read"
    DATA_WRITE = "data.write"
    DATA_DELETE = "data.delete"
    DATA_EXPORT = "data.export"

    # Configuration
    CONFIG_CHANGE = "config.change"
    SECURITY_CHANGE = "security.change"

    # Infrastructure
    NODE_REGISTER = "node.register"
    NODE_DEREGISTER = "node.deregister"
    FAILOVER = "infra.failover"
    SNAPSHOT_CREATE = "snapshot.create"
    SNAPSHOT_RESTORE = "snapshot.restore"
    SNAPSHOT_DELETE = "snapshot.delete"

    # Recovery
    RECOVERY_INCIDENT = "recovery.incident"
    RECOVERY_ACTION = "recovery.action"

    # Generic
    HTTP_REQUEST = "http.request"
    SYSTEM_EVENT = "system.event"
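Because action names are dotted paths, prefix filters such as action_prefix="model." reduce to simple string checks. The sketch below applies the query semantics described above to plain dicts; filter_entries is a hypothetical helper, not the library API:

```python
from datetime import datetime, timezone

def filter_entries(entries, actor=None, action_prefix=None, result=None, since=None):
    """Filter audit entries: substring match on actor, prefix match on action,
    exact match on result, and a lower bound on the timestamp."""
    out = []
    for e in entries:
        if actor is not None and actor not in e["actor"]:
            continue
        if action_prefix is not None and not e["action"].startswith(action_prefix):
            continue
        if result is not None and e["result"] != result:
            continue
        if since is not None and datetime.fromisoformat(e["timestamp"]) < since:
            continue
        out.append(e)
    return out
```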

HTTP automatic logging (middleware)

make_audit_middleware() installs a Starlette/FastAPI middleware so that every HTTP request is recorded automatically.

# Setup in api.py (run at startup)
from evospikenet.audit_log import make_audit_middleware

# audit_log is the process-wide AuditLogManager instance
app.middleware("http")(make_audit_middleware(audit_log))

Autolog format:

  • actor: Taken from the X-API-Key header, or "anonymous" if it is absent. The key is truncated to its first 12 characters with "..." appended (e.g. abcd1234efgh...).
  • action: AuditAction.HTTP_REQUEST
  • resource: Request path (request.url.path)
  • detail: Dictionary. The current implementation contains the following keys (query is the empty string when there is no query string):

```json
{
  "method": "POST",
  "path": "/api/models/train",
  "status_code": 200,
  "query": "foo=bar"
}
```

  • result: 2xx/3xx → success, anything else → failure
  • ip_address: Client IP (when request.client.host is present)
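The actor truncation and result mapping above can be sketched as two small helpers. The function names are hypothetical, and the sketch assumes truncation is always applied to a present key:

```python
def actor_from_api_key(api_key) -> str:
    """Derive the audit actor from an X-API-Key header value."""
    if not api_key:
        return "anonymous"
    # keep only the first 12 characters of the key, then mark the truncation
    return api_key[:12] + "..."

def result_from_status(status_code: int) -> str:
    """2xx/3xx responses count as success; everything else is a failure."""
    return "success" if 200 <= status_code < 400 else "failure"
```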


REST API

GET /api/audit/logs

Query parameters:

| Parameter | Type | Description |
| --- | --- | --- |
| actor | str | Filter by operators containing the given string |
| action | str | Exact or prefix match (e.g. model., auth.login) |
| resource | str | Partial match on the resource path |
| result | str | success / failure / partial |
| since | str | Start time, ISO-8601 format |
| until | str | End time, ISO-8601 format |
| limit | int | Maximum number of entries (default 100, maximum 1000) |
| offset | int | Page offset |

Response example:

```json
{
  "total": 1,
  "offset": 0,
  "limit": 100,
  "entries": [
    {
      "id": "550e8400-...",
      "timestamp": "2026-02-20T10:00:00.000Z",
      "action": "model.load",
      "actor": "admin_user",
      "resource": "/api/models/load",
      "result": "success",
      "detail": { "model": "genome_v3" },
      "ip_address": "203.0.113.5",
      "prev_hash": "d9e8f7a6...",
      "entry_hash": "a3f1b2c4..."
    }
  ]
}
```

GET /api/audit/stats

Returns audit log statistics.

Response example:

```json
{
  "total_entries_in_memory": 1500,
  "log_files": 5,
  "current_log_file": "data/audit_logs/audit_20260220T123456Z.ndjson",
  "entries_in_current_file": 1234,
  "by_action": {
    "http.request": 800,
    "model.load": 300,
    "auth.login": 50
  },
  "by_result": {
    "success": 1400,
    "failure": 80,
    "partial": 20
  }
}
```

GET /api/audit/verify

Verifies hash chain integrity. The max_entries query parameter (default 1000) specifies how many entries to inspect.

Response example (chain intact):

```json
{
  "valid": true,
  "checked": 1500,
  "broken_at": null,
  "message": "Chain intact"
}
```

Response example (chain broken):

```json
{
  "valid": false,
  "checked": 750,
  "broken_at": 750,
  "message": "Chain broken at position 750"
}
```

GET /api/audit/export

Export logs in json or csv format.

  • format: json (default) or csv
  • since, until: ISO-8601

The response is returned as plain text with a Content-Disposition header.

POST /api/audit/log

Writes an entry manually. Intended for use from external services and SDKs.

Request body example:

```json
{
  "action": "model.inference",
  "actor": "service-account",
  "resource": "/api/generate",
  "result": "success",
  "detail": { "latency_ms": 120 }
}
```

Response example:

```json
{
  "status": "logged",
  "entry_id": "550e8400-e29b-41d4-a716-446655440001",
  "entry_hash": "a3f1b2c4..."
}
```


verify_chain() checks the integrity of the hash chain and returns a (valid, checked, error) tuple.

Response (intact):

```json
{ "valid": true, "checked": 1500, "error": null }
```

Response (tampering detected):

```json
{
  "valid": false,
  "checked": 750,
  "error": "Hash mismatch at entry 750: expected abc..., got xyz..."
}
```



File structure

data/
  audit_logs/
    audit_2026-02-20.ndjson   ← daily rotation
    audit_2026-02-19.ndjson
    ...

Logs are persisted in NDJSON format (one JSON entry per line).
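A minimal sketch of NDJSON persistence with daily-rotated filenames, using only the standard library. append_entry and read_entries are illustrative helpers, not the manager's actual internals:

```python
import json
from datetime import datetime, timezone
from pathlib import Path

def append_entry(log_dir: str, entry: dict) -> Path:
    """Append one entry to today's NDJSON file (rotation by date in the filename)."""
    day = datetime.now(timezone.utc).strftime("%Y-%m-%d")
    path = Path(log_dir) / f"audit_{day}.ndjson"
    path.parent.mkdir(parents=True, exist_ok=True)
    with path.open("a", encoding="utf-8") as f:
        f.write(json.dumps(entry, ensure_ascii=False) + "\n")
    return path

def read_entries(path: Path) -> list:
    """Read back every entry: one JSON document per line."""
    with path.open(encoding="utf-8") as f:
        return [json.loads(line) for line in f if line.strip()]
```

Appending a single line per entry keeps writes cheap and crash-safe; a partially written last line can be skipped on read without losing earlier entries.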


Performance indicators

| Metric | Target | Test reference |
| --- | --- | --- |
| Write throughput | ≥ 1,000 entries/sec | test_write_throughput_1000_entries |
| Concurrent writes (8 threads) | ≥ 500 entries/sec/thread | test_concurrent_writes_throughput |
| Hash chain verification (1,000 entries) | < 5,000 ms | test_hash_chain_verify_1000_entries |
| Query latency (500 hits) | < 200 ms | test_query_latency_1000_entries |

Test

# Unit test
pytest tests/unit/test_audit_log.py -v

# Integration test
pytest tests/integration/test_features_36_39_40_integration.py::TestAuditLogEndpoints -v

# System test (parallel writing)
pytest tests/system/test_features_36_39_40_system.py::TestConcurrentAuditLog -v

# Performance test
pytest tests/performance/test_features_36_39_40_performance.py::TestAuditLogPerformance -v