# Tag1 Consulting: Migrating Your Data from D7 to D10: The migration process pipeline

Series Overview & ToC | Previous Article | Next Article – coming July 24th

Our last article explored the syntax and structure of migration files. Today, we are diving deeper into the most important part of a migration: the process pipeline. This determines how source data will be processed and transformed to match the expected destination structure. We will learn how to configure and chain process plugins, how to set subfields and deltas for multi-value fields, and how to work with source constants and pseudo-fields. Let’s get started.

## From source to destination

The process section in a migration is responsible for transforming data as extracted from the source into a format that the destination expects. The collection of all those data transformations is known as the migration process pipeline. The Migrate API is a generic ETL framework. This means the source data can come from different types of sources, like a database table; a CSV, JSON, or XML file; a remote API using JSON:API or GraphQL; or something else. The destinations can be just as diverse, including databases, text files, and remote APIs. Because the series focuses on migrating from Drupal 7 to 10, most of our discussion will revolve…
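To make the pipeline concrete before we dig in, here is a minimal, hypothetical migration definition sketching the ideas introduced above: a constant declared in the source section, a pseudo-field that exists only inside the pipeline, and a destination field built by chaining two process plugins. The plugin names (`embedded_data`, `get`, `concat`, `callback`) come from Drupal core's Migrate API; the migration ID and field names are made up for illustration and are not part of any real project.

```yaml
# Hypothetical migration definition, used only to illustrate the process pipeline.
id: example_article_migration
label: 'Example article migration'
source:
  # embedded_data defines source rows inline; a real D7 migration would use a
  # source plugin that reads from the Drupal 7 database instead.
  plugin: embedded_data
  data_rows:
    - legacy_id: 1
      headline: 'hello world'
  ids:
    legacy_id:
      type: integer
  # Source constants: fixed values made available to the process pipeline.
  constants:
    PREFIX: 'Imported:'
process:
  # Plain mapping: copy the source value as-is (an implicit "get" plugin).
  title: headline
  # Pseudo-field: not a destination field, only an intermediate value in the pipeline.
  prefixed_headline:
    plugin: concat
    source:
      - constants/PREFIX
      - headline
    delimiter: ' '
  # Chained plugins: the output of each plugin becomes the input of the next.
  field_summary:
    - plugin: get
      source: '@prefixed_headline'
    - plugin: callback
      callable: ucwords
destination:
  plugin: 'entity:node'
  default_bundle: article
```

Each key under `process` defines one pipeline for one destination property. When the value is a list of plugins, as with `field_summary`, the output of one plugin is handed to the next; pseudo-fields such as `prefixed_headline` are referenced later with the `@` prefix. This chaining behaviour is what the rest of the article unpacks.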

By mauricio | Wed, 07/17/2024 – 08:23