Backpropagation implementation architecture overview

[!NOTE] For the latest implementation status, please refer to Functional Implementation Status (Remaining Functionality).

Target audience: Engineers responsible for gradient-based learning and SNN validation. Relevant source files: backpropagation_verification.py, surrogate.py.

This document describes the backpropagation infrastructure in EvoSpikeNet in detail, covering its concept, module structure, data flow, and execution sequence. Each diagram is accompanied by a description of how to read it and what it is intended to show, to help you follow the design.

1. Concept

  • Avoid the non-differentiability of spikes: during backpropagation, replace the derivative of the spike function with a smooth surrogate gradient, allowing PyTorch's autograd to flow gradients through the network.
  • Verify the validity and stability of gradients: automate comparison against numerical gradients computed by finite differences, measurement of the surrogate gradient's approximation error, and numerical stability checks on gradients and weights.
  • Convergence behavior and benchmarks: track the convergence rate, loss evolution, and gradient norm, and compare learning efficiency between the SNN and an ANN.
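The core surrogate-gradient idea above can be sketched in plain Python (this is an illustrative sketch, not the actual surrogate.py code): the forward pass keeps the hard threshold, while the backward pass would substitute the derivative of a fast sigmoid.

```python
# Conceptual sketch of the surrogate-gradient idea (illustrative only;
# the actual surrogate.py implementation may differ). Forward uses a hard
# threshold; backward would use the smooth fast-sigmoid derivative instead.

def spike_forward(v: float, threshold: float = 0.0) -> float:
    """Non-differentiable step function used in the forward pass."""
    return 1.0 if v >= threshold else 0.0

def fast_sigmoid_grad(v: float, beta: float = 1.0) -> float:
    """Smooth surrogate for the step's derivative: d/dv [v / (1 + beta*|v|)]."""
    return 1.0 / (1.0 + beta * abs(v)) ** 2

# The surrogate is largest at the threshold and decays away from it,
# so gradients flow mainly through neurons near their firing point.
print(spike_forward(0.5))      # 1.0: a spike is emitted
print(fast_sigmoid_grad(0.0))  # 1.0: maximal surrogate gradient at threshold
```

Because the surrogate peaks at the firing threshold, weight updates concentrate on neurons whose membrane potential is close to spiking, which is what makes gradient flow through the otherwise flat step function possible.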

2. Module configuration

  • FastSigmoid autograd function (surrogate.py): Applies a step function in the forward pass and a smooth gradient derived from a fast sigmoid in the backward pass.
  • SurrogateGradient (backpropagation_verification.py): Provides multiple surrogate gradients such as fast sigmoid, triangular, rectangular, exponential, and SuperSpike.
  • GradientVerifier (also backpropagation_verification.py): Measures finite-difference and surrogate errors and reports the maximum, average, and relative error.
  • NumericalStabilityTester (also backpropagation_verification.py): Repeatedly measures the gradient norm, NaN/Inf occurrences, and condition number to judge stability. It also validates paths that include weight updates.
  • ConvergenceAnalyzer (also backpropagation_verification.py): Trains iteratively, calculates the convergence rate from the history of loss and gradient norm, and decides early termination based on the patience value.
  • ComparativeBenchmark (also backpropagation_verification.py): Trains and evaluates the SNN and ANN under the same conditions and compares loss/accuracy/time.
  • BackpropagationVerificationSuite (also backpropagation_verification.py): Brings the above components together and performs full verification and report generation.
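A minimal sketch of how a FastSigmoid-style autograd function is commonly written in PyTorch follows. The class body and the `beta` sharpness parameter are illustrative assumptions; the actual surrogate.py implementation may differ.

```python
import torch

class FastSigmoid(torch.autograd.Function):
    """Step function forward, fast-sigmoid surrogate gradient backward.

    Illustrative sketch; the actual surrogate.py implementation may differ.
    """
    beta = 10.0  # surrogate sharpness (assumed parameter)

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 0.0).float()  # non-differentiable spike

    @staticmethod
    def backward(ctx, grad_output):
        (v,) = ctx.saved_tensors
        # d/dv [v / (1 + beta*|v|)] = 1 / (1 + beta*|v|)^2
        surrogate = 1.0 / (1.0 + FastSigmoid.beta * v.abs()) ** 2
        return grad_output * surrogate

spike_function = FastSigmoid.apply

v = torch.tensor([-0.5, 0.0, 0.5], requires_grad=True)
s = spike_function(v)
s.sum().backward()
print(s)       # tensor([0., 1., 1.])
print(v.grad)  # nonzero everywhere, peaked at v = 0
```

Note that the backward pass returns a gradient for every input element, even those whose forward output was flat; this is exactly the substitution that lets autograd train through the spike.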

3. Architecture overview diagram

```mermaid
graph LR
  subgraph SNN training infrastructure
    A[Surrogate<br/>gradient functions]
    B[GradientVerifier<br/>finite difference / surrogate]
    C[NumericalStabilityTester<br/>gradient / weight stability]
    D[ConvergenceAnalyzer<br/>convergence analysis]
    E[ComparativeBenchmark<br/>SNN vs ANN]
    F[VerificationSuite<br/>integrated orchestration]
  end
  A --> B
  A --> F
  B --> F
  C --> F
  D --> F
  E --> F
```

**Figure description:** The surrogate gradient (A) is used by gradient verification (B) and the integration suite (F). The results of stability testing (C), convergence analysis (D), and the comparative benchmark (E) are aggregated into the suite, so everything can be checked in a single verification flow.

4. Data flow (gradient validation path)

```mermaid
flowchart LR
  In[Sample inputs<br/>sample targets] --> M[Target model]
  M --> L[Loss calculation]
  L -->|backward| Gv[Gradient acquisition<br/>autograd]
  L -->|finite difference| Fd[Numerical gradient]
  Gv --> Err[Error measurement<br/>max / average / relative]
  Fd --> Err
  Err --> Rep[Verification result<br/>GradientVerificationResult]
```

**Figure description:** The model output and loss are computed from the inputs and target labels. The gradient obtained with autograd is compared against the gradient obtained by finite differences, error metrics are computed, and the outcome is summarized in a result object. This visualizes the internal processing of `GradientVerifier.verify_finite_difference`.
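The error metrics in this flow can be sketched without any framework. In the sketch below, a hypothetical scalar model f(w) = w² stands in for the target network, and its analytic gradient plays the role of the autograd gradient; the function names are illustrative, not GradientVerifier's actual internals.

```python
# Sketch of the finite-difference gradient check (illustrative only;
# GradientVerifier's actual implementation may differ).

def loss(w: float) -> float:
    """Stand-in scalar model/loss: f(w) = w^2."""
    return w * w

def analytic_grad(w: float) -> float:
    """Plays the role of the autograd gradient: f'(w) = 2w."""
    return 2.0 * w

def finite_difference(f, w: float, eps: float = 1e-5) -> float:
    """Central difference: (f(w+eps) - f(w-eps)) / (2*eps)."""
    return (f(w + eps) - f(w - eps)) / (2.0 * eps)

w = 3.0
g_autograd = analytic_grad(w)
g_numeric = finite_difference(loss, w)
abs_err = abs(g_autograd - g_numeric)
rel_err = abs_err / max(abs(g_numeric), 1e-12)
print(f"max error={abs_err:.2e}, relative error={rel_err:.2e}")
```

In the real verifier the same comparison would be applied per parameter tensor, with the maximum, average, and relative errors aggregated into the result object.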

5. Data flow (surrogate validation path)

```mermaid
flowchart LR
  X[Input samples<br/>linear space] --> SFn[Spike function<br/>Heaviside etc.]
  X --> SG[Surrogate gradient function]
  SFn --> FD[Finite difference<br/>spike difference]
  SG --> ErrS[Error measurement<br/>surrogate vs finite difference]
  FD --> ErrS
  ErrS --> RepS[Verification result<br/>surrogate]
```

**Figure description:** The output difference (finite difference) of the spike function is compared with the output of the surrogate gradient function to evaluate the approximation error. This shows the steps of `verify_surrogate_gradient`.
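A hedged sketch of this path (illustrative; not the actual `verify_surrogate_gradient` code): sample inputs on a linear grid, finite-difference the hard spike function, and compare against the surrogate. Because the step's true derivative is an impulse at the threshold, the error is necessarily large right at the threshold; away from it, the comparison checks that the surrogate decays toward zero.

```python
# Sketch of surrogate-vs-finite-difference error measurement
# (illustrative only; the actual implementation may differ).

def heaviside(v: float) -> float:
    return 1.0 if v >= 0.0 else 0.0

def surrogate_grad(v: float, beta: float = 10.0) -> float:
    return 1.0 / (1.0 + beta * abs(v)) ** 2

eps = 1e-3
xs = [i * 0.5 for i in range(-4, 5)]  # linear grid from -2.0 to 2.0
errors = []
for x in xs:
    fd = (heaviside(x + eps) - heaviside(x - eps)) / (2.0 * eps)
    errors.append(abs(surrogate_grad(x) - fd))

max_err = max(errors)              # dominated by the impulse at x = 0
mean_err = sum(errors) / len(errors)
print(f"max error={max_err:.3f}, mean error={mean_err:.3f}")
```

The large error at the threshold is expected and is precisely why a smooth surrogate is used instead of the true (distributional) derivative; the informative part of the measurement is the behavior away from the threshold.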

6. Sequence diagram (full verification suite)

```mermaid
sequenceDiagram
  participant User as User
  participant Suite as VerificationSuite
  participant GV as GradientVerifier
  participant NST as StabilityTester
  participant CA as ConvergenceAnalyzer
  participant CB as ComparativeBenchmark

  User->>Suite: run_full_verification(model, loaders, loss, opt)
  Suite->>GV: verify_finite_difference(...)
  GV-->>Suite: gradient_verification
  Suite->>NST: check_gradient_stability(...)
  NST-->>Suite: gradient_stability
  Suite->>NST: check_weight_stability(...)
  NST-->>Suite: weight_stability
  Suite->>CA: analyze_convergence(...)
  CA-->>Suite: convergence
  Suite->>CB: benchmark_training(...)
  CB-->>Suite: snn/ann metrics
  Suite-->>User: results + report
```

**Figure description:** Shows the execution order and return value of each verification module invoked from the `run_full_verification` call. Finally, the aggregated results and a report string are returned to the user.

7. Main API overview

  • spike_function = FastSigmoid.apply: Drop-in replacement for spike activation (step in the forward pass, smooth gradient in the backward pass).
  • GradientVerifier.verify_finite_difference(model, inputs, targets, loss_fn): Calculates the maximum/average/relative error between the autograd gradient and the finite-difference gradient.
  • GradientVerifier.verify_surrogate_gradient(spike_fn, surrogate_fn, ...): Compares the surrogate gradient with the finite difference and evaluates approximation quality.
  • NumericalStabilityTester.check_gradient_stability(...): Judges stability from the gradient norm distribution, NaN/Inf occurrences, and condition number.
  • NumericalStabilityTester.check_weight_stability(...): Checks the evolution of weight norms and the condition number during iterative training.
  • ConvergenceAnalyzer.analyze_convergence(...): Returns the loss and gradient-norm history and the convergence rate, with patience-based early stopping.
  • ComparativeBenchmark.benchmark_training(...): Side-by-side comparison of loss, accuracy, and training time for the SNN and ANN.
  • BackpropagationVerificationSuite.run_full_verification(...): Executes all of the above in one batch and returns the results as a dictionary.
  • BackpropagationVerificationSuite.generate_report(results): Generates a human-readable verification report string.
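As a sketch of the kind of bookkeeping `analyze_convergence` performs (function name, return keys, and defaults here are illustrative assumptions, not the actual ConvergenceAnalyzer API), a convergence rate can be estimated from the loss history, with patience-based early stopping:

```python
# Illustrative sketch of convergence-rate estimation with patience-based
# early stopping; the actual ConvergenceAnalyzer API may differ.

def analyze_convergence(losses, patience: int = 3, min_delta: float = 1e-4):
    """Return the average per-step loss ratio and the step at which
    patience-based early stopping would trigger (or None)."""
    # Average ratio of consecutive losses: < 1 means the loss is shrinking.
    ratios = [l1 / l0 for l0, l1 in zip(losses, losses[1:]) if l0 > 0]
    rate = sum(ratios) / len(ratios) if ratios else 1.0

    best, bad, stop_at = float("inf"), 0, None
    for step, loss in enumerate(losses):
        if loss < best - min_delta:      # meaningful improvement
            best, bad = loss, 0
        else:                            # stagnation
            bad += 1
            if bad >= patience:
                stop_at = step
                break
    return {"convergence_rate": rate, "early_stop_step": stop_at}

history = [1.0, 0.5, 0.25, 0.2501, 0.2502, 0.2503]
result = analyze_convergence(history)
print(result)  # rate ~0.8, early stop at step 5
```

The same loop can record the gradient norm alongside the loss at each step, which is how the suite obtains both histories reported in the verification results.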

8. Traceability

  • Surrogate gradient implementation: surrogate.py
  • Gradient verification/stability/convergence/bench: backpropagation_verification.py
  • Test suite: test_backprop_verification.py
  • PoC comparison: proof-of-concept/poc5_backprop_verification