Capture
The Change Data Capture (CDC) Capture pattern in Planner is used to observe changes in DB2/400 (DB400) systems and forward those changes to modern destinations — without modifying the legacy applications.
This pattern belongs to Planner’s Decoupling Patterns, enabling non-invasive modernization by treating each data change as a trigger.
When to Use CDC Capture
Use this pattern when:
- You need real-time visibility into legacy data changes
- You want to publish change events from DB400 without touching application code
- You’re enabling downstream modernization through event-driven or data-driven architectures
Supported Sources
| Data Source | Supported? | Notes |
|---|---|---|
| DB2/400 (DB400) | ✅ | Supported via journal-based CDC tooling |
| VSAM | 🚫 | Not currently supported |
| DB2 | 🚫 | Not currently supported |
| IMS DB | 🚫 | Not currently supported |
| Flat Files | 🚫 | Not currently supported |
Planner determines CDC feasibility per source during Baseline.
How It Works
1. A legacy app writes to a DB400 table
2. A CDC connector monitors DB2/400 journals for changes
3. Detected changes (insert/update/delete) are emitted as structured events
4. Events are published to a destination such as:
   - a Kafka topic, or
   - OpenLegacy Bridge
What happens after the capture is intentionally left open — implementors may use the events however they choose (e.g., sync, audit, trigger, enrich).
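As a concrete illustration of steps 3 and 4, the sketch below shows one plausible shape for a journal-derived change event and how it might be published using the standard kafka-clients producer. The event fields, topic name, and broker address are illustrative assumptions, not Planner's generated format.

```java
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import java.util.Properties;

public class Db400ChangePublisher {

    // Illustrative event shape; Planner's actual CDC format may differ.
    record ChangeEvent(String table, String operation, String beforeJson, String afterJson) {
        String toJson() {
            return String.format(
                "{\"table\":\"%s\",\"op\":\"%s\",\"before\":%s,\"after\":%s}",
                table, operation,
                beforeJson == null ? "null" : beforeJson,
                afterJson == null ? "null" : afterJson);
        }
    }

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // assumed broker address
        props.put("key.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                  "org.apache.kafka.common.serialization.StringSerializer");

        // An UPDATE detected in the DB2/400 journal, expressed as before/after row images.
        ChangeEvent event = new ChangeEvent(
            "ORDERS", "UPDATE",
            "{\"ORDER_ID\":42,\"STATUS\":\"OPEN\"}",
            "{\"ORDER_ID\":42,\"STATUS\":\"SHIPPED\"}");

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by table name so changes to the same table stay ordered per partition.
            producer.send(new ProducerRecord<>("db400.changes", event.table(), event.toJson()));
        }
    }
}
```

Keying events by table name keeps changes to the same table ordered within a partition, which matters when downstream consumers replay inserts, updates, and deletes.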
Planner Implementation
When CDC Capture is selected as a Plan Action, Planner generates:
- A CDC configuration linked to the source table
- A capture module or connector that reads from DB400 journals
- Integration with a delivery mechanism (Kafka or API)
Example Task Sequence
- A Kafka client module generated from the CDC event format (see the consumer sketch below)
- A CDC project connected to a Kafka topic or API endpoint
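A rough sketch of what such a Kafka client module could look like on the consuming side, using the standard kafka-clients consumer; the topic name, group id, and broker address are assumptions rather than Planner output:

```java
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

public class Db400ChangeConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");  // assumed broker address
        props.put("group.id", "db400-cdc-consumers");      // assumed consumer group
        props.put("key.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                  "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("db400.changes"));  // assumed topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Each value is one structured change event (insert/update/delete).
                    System.out.printf("table=%s event=%s%n", record.key(), record.value());
                }
            }
        }
    }
}
```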
Example Plan: DB400 Journal to Kafka
This plan configures capture from db400, using journal monitoring to push change events to Kafka.
- Source: db400 (DB2/400), with journal-based change tracking
- Capture: db400-capturemodule (OpenLegacy)
- Destination: Kafka topic, exposed via Kubernetes
After delivery to Kafka, downstream behavior is up to the modernization team — such as forwarding to APIs, storing in cloud databases, or triggering workflows.
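For instance, a minimal forwarder (one possible downstream consumer, not part of the generated plan) might relay each change event to a modern REST API using Java's built-in HttpClient; the endpoint URL is a placeholder:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class ChangeEventForwarder {
    private static final HttpClient HTTP = HttpClient.newHttpClient();

    // Forward one CDC event (already read from the Kafka topic) to a REST endpoint.
    // The URL is a placeholder; a cloud database or workflow trigger would slot in the same way.
    static void forward(String changeEventJson) throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
            .uri(URI.create("https://modern-api.example.com/changes"))  // assumed endpoint
            .header("Content-Type", "application/json")
            .POST(HttpRequest.BodyPublishers.ofString(changeEventJson))
            .build();
        HttpResponse<String> response = HTTP.send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("forwarded, status=" + response.statusCode());
    }
}
```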
Benefits
- Keeps legacy apps unchanged
- Emits structured events from real DB2/400 changes
- Empowers downstream modernization and analytics
- Supports event-driven transformation with minimal coupling
Related Patterns
- Internal Decoupling – reroute legacy-to-legacy calls
- External Decoupling – modernize behind stable gateways (CTG, MQ)
- Augment – add modern services alongside legacy behavior
- Extract – for full data export (not just changes)