Trident Velocity Engine
Universal Reverse Proxy & Cache for Commerce

Cache everything. Invalidate instantly. Scale infinitely.

Platform-agnostic HTTP reverse proxy and caching engine designed for backend-rendered commerce and high-throughput web stacks. Explicit configuration. Deterministic invalidation. Production-first behavior.

Works with Magento Open Source, Mage-OS, PrestaShop, WooCommerce, Sylius, and any stack built on Node.js, Python, Symfony, Django or similar runtimes and frameworks.

  • 11x memory efficiency
  • 34x faster invalidation
  • 250K requests/sec
  • 91% RAM savings

Built for Modern Commerce Delivery

  • Structured JSON configuration (version-controlled)
  • Native cache tags & deterministic invalidation
  • Request coalescing & backend protection
  • Launch mode for safer rollouts
  • Compression, HTTP/2, and security hardening

Interested in purchasing via the Partner Program? Email us to discuss terms and the implementation model.

Why agencies adopt Trident

Trident is designed for teams delivering high-traffic, backend-rendered commerce and complex catalogs. It targets the pain points that cost agencies time, margin and confidence: unpredictable caching, fragile invalidation, backend overload during campaigns, and deployment risk.

Deterministic cache behavior

Routing, caching rules and invalidation are defined explicitly in JSON. The behavior is reviewable in PRs, reproducible across environments and automation-friendly.

Invalidation you can trust

Native cache tags mean you can invalidate by entity/tag without flushing everything. This is critical for large catalogs, frequent updates and campaign-heavy operations.

Origin protection under load

Request coalescing, rate-limiting patterns, timeouts and stale strategies help keep the origin stable when traffic spikes or the upstream becomes slow.

Deployment safety

Launch Mode enables safer rollouts and controlled activation patterns, reducing the “deploy and pray” factor around caching behavior changes.

Higher cache density

Built-in compression reduces the memory footprint and improves cache hit ratios. Higher cache density typically means better TTFB and lower backend utilization.

Automation control surface

Admin API enables standardized operations: purge workflows, validations, health checks, and integration with deployment pipelines.

Installation Walkthrough on qoliber.com

A step-by-step video walkthrough is in the works: how Trident is deployed, configured (JSON), routed and validated, and how cache tags and invalidation are tested in a real environment.

Architecture & Request Lifecycle

Trident acts as a deterministic HTTP execution layer: routing, caching, invalidation and resilience live in one place. The key is that behavior is explicit and reviewable (JSON), enabling consistent delivery across projects.

Pipeline overview (diagram)
Client
  |
  v
[ Trident Edge ]
  |  (route match via JSON config)
  |-----> cache lookup (hash + tag index)
  |          | hit -> serve
  |          v miss
  |-----> request coalescing gate
  |          | leader -> upstream fetch
  |          | followers wait -> same response
  v
[ Origin / Backend Pool ]
  |
  v
Response -> store (ttl, headers, tags, compression) -> serve
            

This pipeline targets predictable performance under load: cache hits are immediate, cache misses are coalesced, and invalidation is tag-driven (no global flush by default).

Request lifecycle (step-by-step)
  1. Ingress: request arrives at Trident. Optional security gates (method rules, header validation).
  2. Routing: deterministic route match using JSON configuration (paths, methods, host rules, headers).
  3. Policy evaluation: cacheability rules, TTL, stale windows, vary logic, compression policy.
  4. Cache lookup: hash lookup combined with tag index metadata.
  5. Cache hit: response served immediately (low TTFB, minimal origin load).
  6. Cache miss: request coalescing ensures only one upstream call per identical key.
  7. Origin fetch: backend protection policies apply (timeouts, concurrency caps, retries).
  8. Store: response stored with tags + metadata; optional persistence supports warm restarts.
  9. Invalidate: targeted purge by tag (micro-invalidation) instead of global cache flush.

For agencies, the advantage is operational: you can reason about behavior, validate it during rollout, and automate it via API and CI/CD without fragile edge scripting.

Core Capabilities (Deep Technical)

Trident is not a thin cache wrapper. It is a structured execution layer for HTTP with explicit configuration, deterministic invalidation and protective behavior for origin services.

Launch Mode

Safe rollout mode enabling controlled activation patterns. Useful for high-risk deployments, major routing changes, cache policy rewrites or migrations.

  • warm-up and validation workflows
  • controlled enablement of cache behavior
  • reduced blast radius during changes

Structured JSON Configuration

All routing and caching behavior defined in JSON: PR reviewable, environment-aware, CI/CD friendly. No hidden logic, fewer surprises.

  • route rules per endpoint group
  • TTL/stale windows and cacheability
  • backend pool definitions + discovery
  • safe overrides per environment
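
As a sketch of what that looks like in practice, the fragment below reuses the keys from the ConfigMap example later on this page (routes, match, cache, backendPool, backends). The /media/ prefix, TTL values and hostname are illustrative assumptions, not recommendations.

Route rules (illustrative JSON sketch)
{
  "routes": [
    {
      "match": { "pathPrefix": "/media/", "methods": ["GET", "HEAD"] },
      "cache": { "enabled": true, "ttlSeconds": 86400 },
      "backendPool": "origin-web"
    },
    {
      "match": { "pathPrefix": "/", "methods": ["GET", "HEAD"] },
      "cache": {
        "enabled": true,
        "ttlSeconds": 300,
        "staleWhileRevalidateSeconds": 60
      },
      "backendPool": "origin-web"
    }
  ],
  "backends": {
    "origin-web": {
      "discovery": { "type": "dns", "hostname": "app.internal" }
    }
  }
}

Because the whole behavior lives in one JSON document, a per-environment override is just another reviewed change to that file.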

Native Cache Tags

Tag indexing aligned with business entities and relationships. Enables targeted invalidation without global flush. Critical for large catalogs and frequent updates.

  • multi-tag purge operations
  • high-frequency safe invalidation
  • predictable cache coherence patterns
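
As an illustration: after a price change on a single product, an integration can purge only the tags that reference it instead of flushing the whole cache. The tag names and payload shape below are assumptions made for the example, not a documented API contract.

Purge by tag (illustrative payload)
{
  "tags": ["product-1042", "category-electronics"]
}

Only entries carrying those tags are dropped; the rest of the catalog keeps serving from cache.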

Request Coalescing

Prevents thundering herd on cache misses. Identical requests collapse into a single upstream call, protecting origin during spikes and during cold cache phases.

  • stampede prevention by key
  • load smoothing under concurrency
  • predictable origin utilization

Backend Protection

Protective envelope around origin pools. Helps maintain service health during slowdowns, spikes and partial outages.

  • timeout isolation
  • concurrency caps
  • retry policies (where safe)
  • circuit-breaker style patterns
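
The exact configuration keys for these policies are not shown on this page, so the fragment below is purely illustrative of the kind of knobs the list above refers to; treat every key name and value as an assumption.

Backend protection (illustrative JSON sketch)
{
  "backends": {
    "origin-web": {
      "discovery": { "type": "dns", "hostname": "app.internal" },
      "protection": {
        "timeoutMs": 2000,
        "maxConcurrentRequests": 100,
        "retries": 1
      }
    }
  }
}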

Stale Cache Strategies

Production-safe fallbacks: serve stale while revalidating or on upstream errors. Useful for campaign traffic and upstream maintenance windows.

  • stale-while-revalidate
  • stale-if-error
  • configurable stale windows
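
Both stale windows appear in the ConfigMap example further down; in isolation, the relevant cache block looks like this (the values are examples, not recommendations).

Stale windows (cache block excerpt)
"cache": {
  "enabled": true,
  "ttlSeconds": 300,
  "staleWhileRevalidateSeconds": 60,
  "staleIfErrorSeconds": 300
}

With these values, an entry whose TTL has expired can typically still be served for 60 seconds while revalidation runs in the background, and for up to 300 seconds if the upstream responds with errors.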

Compression

Built-in compression increases cache density and reduces memory footprint. Higher cache density often translates to better hit ratios at the same RAM budget.

  • reduced RAM usage
  • improved cache throughput
  • better performance under heavy catalogs

HTTP/2 & Connection Efficiency

Optimized connection behavior for modern clients. Reduces handshake overhead and improves page-level concurrency.

  • improved multiplexing behavior
  • reduced connection churn
  • better throughput at peak load

Security & Hardening

Practical protection patterns at the edge: method restrictions, header rules, and safe caching defaults to avoid leaking sensitive responses.

  • safe cacheability gating
  • request validation hooks
  • admin endpoints isolation patterns
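
Cacheability gating can be expressed with the same route keys used elsewhere on this page: restrict cached routes to GET/HEAD and keep sensitive paths out of the cache entirely. The /customer/ prefix below is an illustrative example, not a required convention.

Cacheability gating (illustrative JSON sketch)
{
  "routes": [
    {
      "match": { "pathPrefix": "/customer/" },
      "cache": { "enabled": false },
      "backendPool": "origin-web"
    },
    {
      "match": { "pathPrefix": "/", "methods": ["GET", "HEAD"] },
      "cache": { "enabled": true, "ttlSeconds": 300 },
      "backendPool": "origin-web"
    }
  ]
}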

Admin API (Automation-Ready)

Operational control surface for purge workflows, health checks and integration with deployment pipelines. Designed to support repeatable operations: purge by tag, warm-up flows, controlled rollout checks, and monitoring hooks.

  • purge by tag / key
  • health endpoints and readiness probes
  • automation triggers for CI/CD
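
In a pipeline, the health endpoints double as post-deploy checks (the /health/ready and /health/live paths are wired as probes in the Kubernetes example below), and a targeted purge can run as a deploy step. The payload below is a sketch; the actual Admin API paths and payload format come from your Trident version, not from this page.

Post-deploy purge (illustrative payload)
{
  "tags": ["cms", "layout"]
}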

Trident vs Varnish (Engineering View)

This is not a generic comparison. For agencies the difference is usually in the operational model: how quickly you can reason about behavior, automate changes, and avoid risky cache flush cycles.

Varnish (common agency friction)

  • Complex edge logic often accumulates in VCL and becomes hard to maintain across projects.
  • Tag invalidation patterns are frequently implemented via custom hacks or plugins.
  • Purges can be operationally expensive and often turn into “flush everything” under pressure.
  • Scaling and container-native discovery typically require additional glue and conventions.
  • Memory footprint and compression handling can become a tuning project of its own.

Many agencies can run Varnish successfully. The cost is often paid in long-term maintainability and repeated custom logic per project.

Trident (production-first defaults)

  • JSON configuration makes behavior explicit and reviewable. Less “magic”, more determinism.
  • Native cache tags and fast invalidation support catalog-heavy commerce operations.
  • Request coalescing reduces origin load during cache misses and warm-up phases.
  • Backend protection and stale strategies support resilience during spikes and partial outages.
  • Built-in compression improves cache density and reduces memory costs.

For agencies this typically translates into fewer emergency fixes, fewer risky flushes, and more predictable launches.

Benchmarks (How to Read Them)

Benchmarks are relative comparisons under controlled load conditions. Each graph shows proportional difference, not marketing exaggeration.

Memory efficiency

11x (memory use: Varnish baseline 100%, Trident ~9%)

Higher cache density reduces infra cost and increases headroom during traffic spikes.

Invalidation speed

34x (invalidation after purge: baseline 1.0x, Trident 34x)

Fast invalidation matters for deployments, product updates and campaign pages.

Hit throughput

250K+ requests/sec (baseline 60%, Trident 95%)

High hit throughput enables simpler scaling and more predictable P95 under load.

Kubernetes Deployment Example

Typical agency deployments run Trident as a dedicated edge tier in front of application backends. The example below shows a minimal baseline: Deployment + Service + readiness. Adapt pools and config mounting to your environment.

deployment.yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: trident
spec:
  replicas: 3
  selector:
    matchLabels:
      app: trident
  template:
    metadata:
      labels:
        app: trident
    spec:
      containers:
        - name: trident
          image: qoliber/trident:latest
          ports:
            - containerPort: 8080
          readinessProbe:
            httpGet:
              path: /health/ready
              port: 8080
            initialDelaySeconds: 5
            periodSeconds: 5
          livenessProbe:
            httpGet:
              path: /health/live
              port: 8080
            initialDelaySeconds: 10
            periodSeconds: 10
          env:
            - name: TRIDENT_CONFIG_PATH
              value: /etc/trident/config.json
          volumeMounts:
            - name: trident-config
              mountPath: /etc/trident
      volumes:
        - name: trident-config
          configMap:
            name: trident-config
          
service.yaml
apiVersion: v1
kind: Service
metadata:
  name: trident
spec:
  selector:
    app: trident
  ports:
    - port: 80
      targetPort: 8080
      protocol: TCP
  type: ClusterIP
          
configmap.yaml (JSON config)
apiVersion: v1
kind: ConfigMap
metadata:
  name: trident-config
data:
  config.json: |
    {
      "routes": [
        {
          "match": { "pathPrefix": "/", "methods": ["GET","HEAD"] },
          "cache": {
            "enabled": true,
            "ttlSeconds": 300,
            "staleWhileRevalidateSeconds": 60,
            "staleIfErrorSeconds": 300
          },
          "backendPool": "origin-web"
        }
      ],
      "backends": {
        "origin-web": {
          "discovery": { "type": "dns", "hostname": "app.default.svc.cluster.local" }
        }
      }
    }
          

Performance, SEO and Revenue Impact (Technical framing)

This section is written for agency owners and leads responsible for delivery outcomes. The value is not “marketing”. The value is reducing risk and improving measurable performance signals.

Lower TTFB under load

Stable cache hit path reduces backend latency contribution. When campaigns spike, the hit path should remain predictable, while misses are coalesced to avoid origin meltdown.

  • fewer origin trips per burst
  • smoother P95/P99 response
  • better user-perceived speed

Crawl budget efficiency

For large catalogs, consistent response times and fewer error windows matter. Stale-if-error and controlled caching reduce periods where bots see timeouts or 5xx.

  • fewer crawl failures
  • less volatility during updates
  • stable indexing signals

Campaign stability

Agencies get blamed when paid traffic lands on slow pages. Trident reduces the probability of origin overload during bursts. Faster invalidation also helps keep pricing/stock pages consistent during frequent updates.

  • fewer emergency flushes
  • fewer incidents during spikes
  • better conversion reliability

Agency-level justification

Trident is purchased when the agency wants to standardize delivery: fewer custom cache hacks per project, fewer unknowns during launch, and a caching layer that behaves the same way across stacks. The ROI is typically found in reduced operational incidents, reduced dev time spent on cache edge cases, and improved platform confidence.

Purchase Trident via the Partner Program

If you are an agency, Trident can be acquired and delivered under the qoliber Partner Program. Conditions are discussed individually to fit your delivery model, support expectations and how you package value for clients.

Agency delivery alignment

We focus on how your agency ships and supports production: rollout patterns, escalation expectations, and maintenance boundaries.

Commercial model fit

Partner terms are designed to match agency realities: bundling, resale models, managed services, enterprise contracts.

Engineering-level support

The goal is not generic support. It’s fast alignment on real production constraints and repeatable operational practices.

Ready to standardize caching across your agency projects?

Trident is built to reduce operational risk, make invalidation deterministic, and protect origins under campaign traffic. If you deliver serious commerce or high-throughput backend-rendered systems, Trident becomes a reusable infrastructure component you can ship consistently across different stacks.

Email us to discuss purchase and Partner Program terms.