Making Data Ingest and Pipelines Use Case-Driven

How can an organization ensure it has access to the best data, of the highest quality, with as little delay as possible? Getting there is a matter of policy and business priority, as well as technical architecture.

Want to adopt this outlook, one that puts data and insights at the center of attention, with technology as a facilitator? Join us for this free 1-hour webinar from GigaOm Research, featuring GigaOm analyst Andrew Brust.

In this 1-hour webinar, you will discover:

  • The correlation of business data strategies and ingest approaches
  • The challenges and strengths of streaming, CDC, and batch data processing
  • How to standardize on open source technologies without new skillset burdens

Why Attend

Streaming data processing, classic Extract-Transform-Load (ETL), and change data capture (CDC)-based data replication each involve the ingestion, inspection, and movement of data. While they are typically used in different contexts and on distinct platforms, it is possible to "factor out" their differences, with corresponding benefits. Technology stack precedent notwithstanding, organizations can view all of these technologies through the wide-angle lens of processing data in real time or in batch.
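To make the "factor out" idea concrete, here is a minimal, hypothetical sketch (not from any specific product) in which one transformation function is shared across ingest modes; a batch is simply a finite sequence of records, while a stream or CDC feed is an unbounded one:

```python
from typing import Dict, Iterable, Iterator

# Illustrative record type; real pipelines would use richer schemas.
Record = Dict[str, str]

def transform(record: Record) -> Record:
    # One transformation shared by every ingest mode:
    # normalize whitespace and capitalization in the "name" field.
    return {**record, "name": record["name"].strip().title()}

def run_pipeline(source: Iterable[Record]) -> Iterator[Record]:
    # The pipeline logic is identical whether `source` is a finite
    # batch or an unbounded stream of change events.
    for record in source:
        yield transform(record)

batch = [{"name": "  ada lovelace "}, {"name": "alan turing"}]
print(list(run_pipeline(batch)))
```

The design choice this illustrates: by keeping transformation logic independent of the delivery mechanism, switching from batch to streaming becomes a change to the source, not a rewrite of the pipeline.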

Viewed this way, the three can work together, in service of real-time data ingest and transformation, DataOps, ELT (Extract-Load-Transform), and classic ETL scenarios. Picking one of these approaches becomes a question of design rather than a choice of platform, skillset, or vendor. As requirements evolve, changing the ingest approach becomes a supported modification, avoiding re-platforming and rebuilding pipelines from scratch.

Request Free!