Data Integration and Ingestion for AI & LLMs, Architecting Data Flows | changelog 3 S1E14



Duration: 14:53


In this episode, Kirk Marple, CEO and founder of Graphlit, shares his expertise on building efficient data integrations.

Kirk breaks down his approach using relatable concepts:

  1. The "Two-Sided Funnel": This model streamlines data flow by converting every data source into a standard, canonical format before distributing it downstream (see the first sketch after this list).
  2. Universal Data Streams: Kirk explains how he transforms diverse data into a single, manageable stream of information.
  3. Parallel Processing: Learn about the "competing consumer model," in which multiple workers pull from the same queue for faster data handling (see the second sketch after this list).
  4. Building Blocks for Success: Discover how well-defined interfaces and actor models help in creating robust data systems (the third sketch after this list shows the actor idea).
  5. Tech Talk: Kirk discusses data normalization techniques and the potential shift towards a more streamlined "Kappa architecture."
  6. Reusable Patterns: Find out how Kirk's methods can speed up the integration of new data sources.
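
To make the funnel concrete, here is a minimal Python sketch, not code from the episode or from Graphlit, of the two-sided funnel: source-specific adapters converge on one canonical "ingestible object," and a single fan-out loop feeds every downstream sink from that stream. The IngestibleObject fields and the from_rss_entry adapter are illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import Any, Callable, Iterable

@dataclass
class IngestibleObject:
    """Hypothetical canonical record every connector converges on (left side of the funnel)."""
    source: str                     # e.g. "rss", "slack", "s3"
    uri: str                        # stable identifier of the original item
    mime_type: str
    payload: bytes
    metadata: dict[str, Any] = field(default_factory=dict)

def from_rss_entry(entry: dict) -> IngestibleObject:
    """Connector-specific adapter: raw RSS entry -> canonical object."""
    return IngestibleObject(
        source="rss",
        uri=entry["link"],
        mime_type="text/html",
        payload=entry.get("summary", "").encode("utf-8"),
        metadata={"title": entry.get("title")},
    )

def fan_out(stream: Iterable[IngestibleObject],
            sinks: list[Callable[[IngestibleObject], None]]) -> None:
    """Right side of the funnel: every sink sees the same canonical stream."""
    for obj in stream:
        for sink in sinks:
            sink(obj)
```

The value of the shape is that adding a new source only means writing another adapter; nothing downstream has to change.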
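
The "competing consumer" idea is a standard messaging pattern: several workers pull from one shared queue, so ingestion is parallel and naturally load-balanced. Below is a minimal sketch using Python's standard library; the queue size, worker count, and process step are placeholders, not details from the episode.

```python
import queue
import threading

def process(item) -> None:
    """Placeholder ingestion step (extract, chunk, embed, index, ...)."""
    print(f"processed {item}")

def worker(q: queue.Queue) -> None:
    """Each worker competes for the next item on the shared queue."""
    while True:
        item = q.get()
        if item is None:        # sentinel: no more work for this worker
            q.task_done()
            return
        process(item)
        q.task_done()

def run_competing_consumers(items, num_workers: int = 4) -> None:
    q: queue.Queue = queue.Queue(maxsize=100)
    workers = [threading.Thread(target=worker, args=(q,)) for _ in range(num_workers)]
    for t in workers:
        t.start()
    for item in items:
        q.put(item)             # producer and consumers overlap in time
    for _ in workers:
        q.put(None)             # one sentinel per worker
    q.join()
    for t in workers:
        t.join()

run_competing_consumers([f"object-{i}" for i in range(10)])
```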
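
Kirk describes managing feeds and their sync state with actor models. The sketch below shows the general idea in plain Python rather than an actor framework: each feed gets one actor with a private mailbox, so its cursor is only ever touched by one thread. fetch_since and enqueue_for_ingestion are hypothetical stand-ins for the connector call and the hand-off into the ingestion stream.

```python
import queue
import threading

def fetch_since(feed_id: str, cursor):
    """Hypothetical connector call: return (new_items, new_cursor)."""
    return [], cursor

def enqueue_for_ingestion(item) -> None:
    """Hypothetical hand-off into the shared ingestion stream."""
    pass

class FeedActor:
    """One actor per feed; messages are handled one at a time,
    so the feed's cursor needs no locking."""

    def __init__(self, feed_id: str):
        self.feed_id = feed_id
        self.cursor = None                      # e.g. last-seen timestamp or page token
        self._mailbox: queue.Queue = queue.Queue()
        threading.Thread(target=self._run, daemon=True).start()

    def tell(self, message: dict) -> None:
        """Fire-and-forget: drop a message into the actor's mailbox."""
        self._mailbox.put(message)

    def _run(self) -> None:
        while True:
            message = self._mailbox.get()
            if message.get("type") == "poll":
                items, self.cursor = fetch_since(self.feed_id, self.cursor)
                for item in items:
                    enqueue_for_ingestion(item)

# Usage: one actor per feed, periodically told to poll.
actor = FeedActor("rss:example-feed")
actor.tell({"type": "poll"})
```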

Kirk Marple: LinkedIn · X (Twitter) · Graphlit · Graphlit Docs

Nicolay Gerold: LinkedIn · X (Twitter)

Chapters

00:00 Building Integrations into Different Tools

00:44 The Two-Sided Funnel Model for Data Flow

04:07 Using Well-Defined Interfaces for Faster Integration

04:36 Managing Feeds and State with Actor Models

06:05 The Importance of Data Normalization

10:54 Tech Stack for Data Flow

11:52 Progression towards a Kappa Architecture

13:45 Reusability of Patterns for Faster Integration

Keywords

data integration, data sources, data flow, two-sided funnel model, canonical format, stream of ingestible objects, competing consumer model, well-defined interfaces, actor model, data normalization, tech stack, Kappa architecture, reusability of patterns

