DM250: Data Engineering & DataOps

Participants learn to implement scalable, operable data products with a DataOps mindset: from lakehouse/streaming architectures and ELT/ETL patterns through pipeline design and orchestration to CI/CD, testing, and observability. The focus is on principles rather than tool training (version control, data contracts, monitoring, incident handling), plus governance integration (metadata/lineage, access models, quality rules, responsibilities). Capstone: an end-to-end scenario including runbooks, alerts, SLOs, and a release checklist.


Agenda:

  • Modern data architectures: warehouse vs. lakehouse, medallion/layering, domain interfaces

  • Batch & streaming: event models, basic concepts of windowing, late data, exactly-once vs. at-least-once

  • Pipeline design: ingestion patterns, transformation, partitioning, backfills, idempotency (a minimal sketch follows this agenda)

  • Orchestration: DAG design, dependencies, parameterization, scheduling, retries, SLAs (also sketched after this agenda)

  • Analytics engineering (e.g., a dbt-style approach): modeling, documentation, tests, semantic-layer concepts

  • CI/CD for data: Git flow, build/deploy pipelines, environments, secrets, artifact management

  • Automated testing: unit/integration, schema tests, data quality checks, contract tests, regression strategies

  • Observability: logs/metrics/traces mindset, pipeline health, freshness, volume, distribution, cost

  • Operations & incident response: alert design, on-call basics, runbooks, postmortems

  • Security & governance touchpoints: access patterns, masking, lineage integration, catalog integration

  • Capstone: end-to-end blueprint + operations and quality concept
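
To make the pipeline-design principles above concrete, here is a minimal sketch of an idempotent partition load in Python (standard library only; the table and column names are invented for illustration, not taken from the course materials): rerunning a day, for example during a backfill, replaces the partition instead of duplicating it.

```python
import sqlite3

def load_partition(conn: sqlite3.Connection, ds: str, rows: list[tuple]) -> None:
    """Idempotent load: atomically replace the whole `ds` partition.

    Rerunning the same day (e.g., during a backfill) converges to the
    same final state instead of appending duplicate rows.
    """
    with conn:  # one transaction: delete and insert commit together
        conn.execute("DELETE FROM events WHERE ds = ?", (ds,))
        conn.executemany(
            "INSERT INTO events (ds, user_id, amount) VALUES (?, ?, ?)",
            [(ds, user_id, amount) for (user_id, amount) in rows],
        )

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (ds TEXT, user_id TEXT, amount REAL)")

# Running the same load twice leaves exactly one copy of the partition.
load_partition(conn, "2024-01-01", [("u1", 9.99), ("u2", 4.50)])
load_partition(conn, "2024-01-01", [("u1", 9.99), ("u2", 4.50)])
assert conn.execute("SELECT COUNT(*) FROM events").fetchone()[0] == 2
```

The delete-and-rewrite pattern trades write volume for safety; it is one of several ways to achieve idempotency (merge/upsert keys are another) and keeps backfills trivially repeatable.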
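
Likewise for the orchestration topic, a toy DAG runner built on Python's standard-library graphlib shows the core ideas of dependency ordering and retries; real orchestrators add scheduling, state persistence, and SLA handling on top. The task names and retry budget are illustrative.

```python
from graphlib import TopologicalSorter
from typing import Callable

def run_dag(tasks: dict[str, Callable[[], None]],
            deps: dict[str, set[str]],
            max_retries: int = 2) -> None:
    """Run tasks in dependency order, retrying each up to max_retries times."""
    for name in TopologicalSorter(deps).static_order():
        for attempt in range(1, max_retries + 2):
            try:
                tasks[name]()
                break
            except Exception as exc:
                if attempt > max_retries:
                    raise RuntimeError(f"task {name!r} failed after {attempt} attempts") from exc
                print(f"retrying {name!r} (attempt {attempt} failed: {exc})")

# Illustrative three-step pipeline: ingest -> transform -> publish.
tasks = {
    "ingest": lambda: print("ingest raw data"),
    "transform": lambda: print("build curated tables"),
    "publish": lambda: print("refresh serving layer"),
}
run_dag(tasks, deps={"transform": {"ingest"}, "publish": {"transform"}})
```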

Objectives:

  • Select modern data architectures and outline them with clear justification (including batch/streaming classification)
  • Design data pipelines so they are robust (idempotent), scalable, and maintainable
  • Structure orchestration cleanly (DAG design, dependencies, backfills, SLAs)
  • Build a CI/CD-capable delivery structure for data projects (repo structure, environments, releases)
  • Implement a testing strategy for data (schema, contracts, data quality checks, regression; see the sketch after these objectives)
  • Define an observability concept (SLOs, alerts, monitoring metrics, cost indicators)
  • Ensure operability (runbooks, incident procedures, postmortem improvements)
  • “Build in” governance requirements technically (metadata/lineage, access, data quality as code)
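
A minimal sketch of what such a testing strategy can look like as code (pure Python; the schema and thresholds are invented for illustration): a schema check plus simple data-quality assertions of the kind a pipeline would run before publishing a batch.

```python
SCHEMA = {"order_id": str, "ds": str, "amount": float}  # expected columns and types

def check_schema(rows: list[dict]) -> list[str]:
    """Return a list of violations; an empty list means the batch conforms."""
    errors = []
    for i, row in enumerate(rows):
        if set(row) != set(SCHEMA):
            errors.append(f"row {i}: columns {sorted(row)} != {sorted(SCHEMA)}")
            continue
        for col, typ in SCHEMA.items():
            if not isinstance(row[col], typ):
                errors.append(f"row {i}: {col} has type {type(row[col]).__name__}, expected {typ.__name__}")
    return errors

def check_quality(rows: list[dict], min_rows: int = 1) -> list[str]:
    """Simple data-quality rules: minimum volume and value ranges."""
    errors = []
    if len(rows) < min_rows:
        errors.append(f"only {len(rows)} rows, expected at least {min_rows}")
    errors += [f"row {i}: negative amount" for i, r in enumerate(rows) if r.get("amount", 0) < 0]
    return errors

batch = [{"order_id": "o1", "ds": "2024-01-01", "amount": 19.99}]
assert check_schema(batch) == [] and check_quality(batch) == []
```

In practice such checks would be generated from a shared schema definition and run both in CI and at pipeline runtime; the regression part of the strategy then compares today's results against a stored baseline.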

Target audience:

  • Data Engineers, Analytics Engineers, DataOps roles, platform/cloud engineers with a data focus
  • MLOps/ML Engineers responsible for data feeds, tech leads for data platforms

Prerequisites:

Description:

The DM250 Data Engineering & DataOps course addresses the technical implementation of reliable, scalable, and operable data products—using the mindset and practices of DataOps. Participants work across the entire delivery chain: from modern data architectures (lakehouse approaches, streaming/event-driven, ELT/ETL patterns) through pipeline design and orchestration (e.g., Airflow principles, dbt workflows, or comparable tools) to CI/CD, automated testing, and observability. The focus is not on “tool training,” but on robust, transferable principles: version control, environment strategy, a data testing pyramid, data contracts, deployments, monitoring, incident handling, and continuous improvement.
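
As an illustration of the data-contract principle mentioned above (all names and field types here are invented): a contract can be a small, versioned spec that the producer's published schema is checked against in CI, so breaking changes are caught before deployment rather than in a consumer's dashboard.

```python
# Hypothetical contract: the fields a downstream consumer relies on.
CONTRACT = {
    "name": "orders_v1",
    "fields": {"order_id": "string", "ds": "string", "amount": "double"},
}

def check_contract(published: dict[str, str], contract: dict) -> list[str]:
    """A producer may add fields, but must not drop or retype contracted ones."""
    violations = []
    for field, ftype in contract["fields"].items():
        if field not in published:
            violations.append(f"{contract['name']}: missing contracted field {field!r}")
        elif published[field] != ftype:
            violations.append(
                f"{contract['name']}: {field!r} is {published[field]}, contract expects {ftype}"
            )
    return violations

# Adding a column is compatible; dropping or retyping a contracted one is not.
ok = {"order_id": "string", "ds": "string", "amount": "double", "channel": "string"}
bad = {"order_id": "string", "ds": "string", "amount": "string"}
assert check_contract(ok, CONTRACT) == []
assert check_contract(bad, CONTRACT) != []
```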

A key outcome of the DM250 Data Engineering & DataOps workshop is the ability to deliver data pipelines like product software: reproducible, measurable, secure, and cost-efficient. This also includes alignment with governance (DM100 Fundamentals of Data Management & Data Governance / DM200 Data Governance & Data Asset Management in Practice): metadata/lineage, access models, quality rules, and documented responsibilities are treated as integral parts of engineering practice—not as “afterthoughts.” In a capstone scenario (e.g., “Batch + Streaming Ingestion → Transformation → Serving Layer”), the teams design an end-to-end solution including operational artifacts (runbooks, alerts, SLOs, release checklist).
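
To make the capstone's operational artifacts tangible, here is a minimal freshness check of the kind an SLO and alert could be built on (the threshold and naming are invented for illustration): compare the newest loaded timestamp against the SLO and decide whether to alert.

```python
from datetime import datetime, timedelta, timezone

FRESHNESS_SLO = timedelta(hours=2)  # illustrative SLO: serving data at most 2h old

def freshness_alert(last_loaded_at: datetime, now: datetime) -> str | None:
    """Return an alert message if the freshness SLO is breached, else None."""
    lag = now - last_loaded_at
    if lag > FRESHNESS_SLO:
        return f"FRESHNESS BREACH: serving layer is {lag} behind (SLO: {FRESHNESS_SLO})"
    return None

now = datetime(2024, 1, 1, 12, 0, tzinfo=timezone.utc)
assert freshness_alert(now - timedelta(hours=1), now) is None       # within SLO
assert freshness_alert(now - timedelta(hours=3), now) is not None   # alert fires
```

The same pattern extends to volume and distribution checks; the point is that each monitored signal has an explicit, versioned threshold rather than an ad-hoc one.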

Guaranteed to run:

from 2 attendees

Booking information:

Duration:

4 Days

Price:

€2,950.00 plus VAT

(including lunch and drinks for on-site participation)
