CDO TechVent for
Modern Data Pipelines:
Practices and Products You Need to Know
MARCH 30, 2023 11:30 AM ET
Keynote: Kevin Petrie
This 3-hour virtual event is designed to help data leaders evaluate and select data pipeline products, and learn best practices for implementing them. The event compresses the time it takes data leaders to understand an emerging technology, create a short list of products, and hear tips from experts, practitioners, and solutions providers in the field.
As enterprises democratize data consumption and invest in advanced analytics, they need ever-higher volumes of complex, fast-moving data. To meet this demand, data teams need to accelerate the development of data pipelines, automate their execution, and continuously validate the quality of the output. And along the way, they need to master the lifecycle of data, from ingestion and transformation to testing, orchestration, and monitoring.
Tools span the following categories, each of which plays a part in the data lifecycle. Vendors address these categories with both pure-play products and suites.
Data integration automates the design, development, management, monitoring, and adaptation of pipelines.
Transformation reshapes, enriches, and aggregates data to serve new use cases such as data-driven applications.
DataOps optimizes pipelines with continuous integration and continuous delivery (CI/CD), testing, and monitoring.
Data observability tracks and helps optimize pipeline performance and data quality.
Orchestration integrates pipelines with applications to kick off operational tasks.
Reverse ETL helps operationalize data by integrating data warehouses with applications.
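To make the categories above concrete, the sketch below wires a minimal, vendor-neutral pipeline through several of them: ingestion, transformation, quality validation (observability), and orchestration. Every function name and the sample data are hypothetical illustrations, not the API of any product at the event.

```python
# Minimal, vendor-neutral sketch of the pipeline lifecycle:
# ingest -> transform -> validate (quality) -> deliver.
# All names and records here are hypothetical, not any product's API.

def ingest():
    """Ingestion: pull raw records from a source system."""
    return [
        {"order_id": 1, "amount": "19.99", "region": "EU"},
        {"order_id": 2, "amount": "5.00", "region": "US"},
        {"order_id": 3, "amount": None, "region": "US"},  # bad record
    ]

def transform(records):
    """Transformation: cast types and enrich for a new use case."""
    out = []
    for r in records:
        if r["amount"] is None:
            continue  # drop records that cannot be parsed
        out.append({
            **r,
            "amount": float(r["amount"]),
            "currency": "USD" if r["region"] == "US" else "EUR",
        })
    return out

def validate(raw, cleaned):
    """Observability: track row counts and data-quality loss."""
    return {
        "rows_in": len(raw),
        "rows_out": len(cleaned),
        "rows_dropped": len(raw) - len(cleaned),
    }

def run_pipeline():
    """Orchestration: wire the stages together in order."""
    raw = ingest()
    cleaned = transform(raw)
    metrics = validate(raw, cleaned)
    return cleaned, metrics

if __name__ == "__main__":
    data, metrics = run_pipeline()
    print(metrics)
```

In a production setting, each stage would typically be a separate task in an orchestrator and the metrics would feed an observability tool rather than being printed.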
Attendees Will Learn
Innovative approaches to building and managing data pipelines
Criteria to evaluate and select the right data pipeline tools
Ways in which modern data pipelines ingest, transform, and deliver data
How these pipelines integrate with enterprise environments
How to design, manage, and adapt data pipelines to meet dynamic business requirements
ABOUT CDO TECHVENT
The CDO TechVent is an innovative, virtual event designed to help data leaders evaluate and select emerging data technologies and learn best practices for implementing them.
The event compresses the time it takes data leaders and their teams to understand the value proposition of an emerging technology, create a short list of products tailored to their organization, and learn implementation tips from experts, practitioners, and solutions providers in the field.
The NEW Way Data Leaders Evaluate Data Tools
CDO TechVent Helps CDOs:
Understand the trends driving key data technologies
Accelerate the evaluation and selection of new products
Understand key differences among products
Identify which products are best suited to their organization
Understand best practices for implementing products
Evaluate products side-by-side in a virtual bake-off
Event begins March 30, 2023 11:00 AM ET
-Taylor McGrath, Rivery
-Saket Saurabh, Nexla
-Mike Pickett, StreamSets
-Justin Mullen, DataOps.live
-Mark Van De Wiel, Fivetran
Q/A, Vendor Rooms
Product Demos, 1:1 Conversations, Resource Links
-To Code or Not to Code ELT Pipelines: That is the Question!
Presenter: Elesh Mistry, Rivery
It seems ridiculous to pay for a SaaS ETL/ELT solution when you can script a data pipeline yourself. Or is it?
In this session, we will unpack the pros and cons of coding your own data pipelines, weigh the costs of the alternatives, and offer clear guidelines for when you should, and should not, code your own pipelines.
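To give a flavor of what "coding it yourself" entails, the hypothetical sketch below shows one of the many concerns a hand-rolled pipeline must own: retrying a flaky extract with backoff. Managed ELT tools typically handle this (along with auth, schema drift, incremental state, and alerting) for you; none of the names below belong to any real product.

```python
# A taste of DIY ELT: a hand-rolled extract-and-load step with retry
# logic that a managed service would otherwise provide. The source
# and target names are hypothetical.

import time

def fetch_with_retry(fetch, attempts=3, backoff_s=1.0):
    """Retry a fetch with exponential backoff. Just one of many
    concerns (auth, schema drift, state, alerting) a DIY pipeline owns."""
    last_err = None
    for i in range(attempts):
        try:
            return fetch()
        except ConnectionError as err:
            last_err = err
            time.sleep(backoff_s * (2 ** i))  # back off before retrying
    raise last_err

def load(rows, table):
    """Load-step stub: in practice, batched INSERTs or a COPY command."""
    return f"loaded {len(rows)} rows into {table}"

rows = fetch_with_retry(lambda: [{"id": 1}, {"id": 2}])
print(load(rows, "analytics.orders"))
```

Multiply this by every source, and the build-vs-buy cost comparison the session promises becomes tangible.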
-Liberate Your Enterprise Data for Cloud Analytics
Presenter: Mike Pickett, StreamSets
Cloud platforms have revolutionized the world of analytics, but many companies still struggle to move their most valuable and comprehensive data, stored in enterprise systems, into these modern cloud data environments.
In this session, you will discover how to conquer typical obstacles to freeing up your enterprise data and learn how integrating this data can improve analytics, refine financial and regulatory reporting, streamline operations, and enhance the customer experience.
-A Data Product-Based Design Pattern for Data Integrations
Presenter: Avinash Shahdadpuri, Nexla
Taking a product-centric approach to data can fundamentally simplify how every data management task is done, including data integration. Join this session to see:
How creating and consuming data products results in a comprehensive data integration design pattern
How logical Data Products extend this design pattern to enable multi-speed data integration
Live Demo: Collaborative data integration with Data Products
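To illustrate the general idea behind the pattern this session covers, the sketch below packages a dataset with its schema and quality checks so that any consumer receives validated data. This is a hypothetical illustration of the data-product concept, not Nexla's Nexsets API or the design pattern demonstrated in the session.

```python
# Hypothetical sketch of a "data product": a dataset bundled with its
# schema and quality checks, so every consumer gets validated records.
# Not any vendor's API; purely illustrative.

class DataProduct:
    def __init__(self, name, schema, checks=()):
        self.name = name
        self.schema = schema        # field -> expected Python type
        self.checks = list(checks)  # callables: record -> bool

    def publish(self, records):
        """Expose only records that match the schema and pass all checks."""
        good = []
        for r in records:
            typed_ok = all(isinstance(r.get(f), t) for f, t in self.schema.items())
            if typed_ok and all(chk(r) for chk in self.checks):
                good.append(r)
        return good

orders = DataProduct(
    "orders",
    schema={"id": int, "amount": float},
    checks=[lambda r: r["amount"] >= 0],
)
print(orders.publish([{"id": 1, "amount": 9.5}, {"id": 2, "amount": -1.0}]))
```

Because validation travels with the product rather than with each pipeline, multiple integrations can reuse the same contract, which is the design-pattern payoff the session describes.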
Wrap Up: "Key Takeaways and Recommendations"
Rivery’s SaaS platform provides a fully-managed solution for data ingestion, data transformation, data orchestration, reverse ETL and more, with built-in support for your data operations development and deployment lifecycles. Designed to be nimble for non-technical users and with advanced capabilities for experts, Rivery enables you to manage data workflows as the foundation of a modern data stack.
At StreamSets, a Software AG company, we believe in the audacious, ambitious goal of teasing order out of the chaos of modern data. We help our customers achieve that goal by ensuring data engineering teams thrive in today’s world of constant change. StreamSets brings enterprise-proven DataOps capabilities to modern data integration, enabling continuous data for the modern data stack.
Nexla enables the automation of data engineering so that data can be ready-to-use. We do this through a unique approach of Nexsets – data products that make it easy for anyone to integrate, transform, deliver, and monitor data.
DataOps.live has been born out of the data analytics professional services firm Datalytyx, which used the DataOps.live platform in numerous client engagements during the past two years. To support key DataOps features, such as the ability to accurately develop, branch, and deploy both code and data, the DataOps.live team built its platform around Snowflake—the only data platform with zero-copy cloning. As a result, DataOps.live established a strong partnership with Snowflake and became a certified product on the Snowflake Partner Connect marketplace, where customers can instantly provision trials.
Fivetran is the automated data movement platform moving data out of, into, and across your cloud data platforms. We’ve automated the most time-consuming parts of the ELT process – from automated extracts to schema drift handling to transformations – so your data engineers can focus on higher-impact projects with total pipeline peace of mind.