Efficiently Transferring a 30-Million-Row Table Between Fabric Tenants in Different Organizations

Likitha Bommasani 60 Reputation points
2025-03-25T20:50:13.73+00:00

What are the best practices or strategies to efficiently copy a table with 30 million rows from Fabric Tenant A to Fabric Tenant B, especially when these tenants are part of different organizations?

Azure Data Share
An Azure service that is used to share data from multiple sources with other organizations.

1 answer

  1. Chandra Boorla 15,455 Reputation points Microsoft External Staff Moderator
    2025-03-25T21:17:54.94+00:00

    @Likitha Bommasani

    To efficiently copy a 30-million-row table from Fabric Tenant A to Fabric Tenant B (especially when the tenants belong to different organizations), here are some best practices and strategies:

    Utilize Azure Data Factory (ADF) or Synapse Pipelines

    Leverage ADF to create scalable and secure data pipelines.

    Use Copy Activity for straightforward data copying and Data Flows for any necessary transformations.

    Break Data into Manageable Chunks

    Instead of transferring all rows at once, split the data into smaller chunks based on logical partitions (e.g., date ranges or row IDs).

    This approach helps reduce load and enhances performance.
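    As a minimal sketch (assuming the table has a numeric surrogate key, here called `id`), the row-ID ranges for chunked copies can be generated like this:

    ```python
    def id_chunks(min_id, max_id, chunk_size):
        """Yield inclusive (start, end) ID ranges covering [min_id, max_id]."""
        start = min_id
        while start <= max_id:
            end = min(start + chunk_size - 1, max_id)
            yield start, end
            start = end + 1

    # 30 million rows in 1-million-row chunks -> 30 ranges,
    # each of which can be copied (and retried) independently.
    ranges = list(id_chunks(1, 30_000_000, 1_000_000))
    ```

    Each range then maps to one copy run filtered by `WHERE id BETWEEN start AND end`, so a failed chunk can be re-run without repeating the whole transfer.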

    Adopt Efficient Data Formats

    Use compressed formats like Parquet or ORC to minimize transfer time and storage costs.

    These formats are optimized for performance and can significantly reduce the size of the data being transferred.
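    For illustration, staging a chunk as compressed Parquet with `pyarrow` looks like the sketch below (in a pipeline this write is usually handled by the sink settings rather than hand-written code; the column names here are made up):

    ```python
    import os
    import tempfile

    import pyarrow as pa
    import pyarrow.parquet as pq

    # Build a small sample table; in practice this is one chunk of the 30M-row source.
    table = pa.table({
        "id": list(range(10_000)),
        "status": ["active"] * 10_000,  # low-cardinality columns compress very well
    })

    path = os.path.join(tempfile.mkdtemp(), "chunk.parquet")
    pq.write_table(table, path, compression="snappy")  # columnar + compressed

    roundtrip = pq.read_table(path)
    ```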

    Implement Secure Authentication

    Use a service principal for secure authentication between tenants. Because managed identities are bound to a single Microsoft Entra tenant, cross-tenant scenarios typically require a multi-tenant app registration (service principal) that Tenant B grants consent to.

    This method eliminates manual credential management and enhances security.
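    Under the hood, a service principal obtains a token via the OAuth2 client-credentials flow. Here is a stdlib-only sketch of building that token request (the tenant, client ID, and secret values are placeholders, and in practice a library such as `azure-identity` handles this for you):

    ```python
    from urllib.parse import urlencode

    def build_token_request(tenant_id, client_id, client_secret,
                            scope="https://api.fabric.microsoft.com/.default"):
        """Build the Microsoft Entra client-credentials token request.

        Returns the token endpoint URL and the form-encoded body; the caller
        POSTs the body to the URL and reads the access token from the response.
        """
        url = f"https://login.microsoftonline.com/{tenant_id}/oauth2/v2.0/token"
        body = urlencode({
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": scope,
        })
        return url, body
    ```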

    Consider Delta Loads or Change Data Capture (CDC)

    For frequently updated data, use CDC or delta load strategies to transfer only the changes since the last load.

    This reduces the volume of data movement and optimizes transfer efficiency.
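    A watermark-based delta load can be sketched as follows (assuming each row carries a `modified` timestamp column; ISO date strings compare correctly as text):

    ```python
    def delta_rows(rows, last_watermark):
        """Return rows modified after last_watermark, plus the new watermark."""
        changed = [r for r in rows if r["modified"] > last_watermark]
        new_watermark = max((r["modified"] for r in changed), default=last_watermark)
        return changed, new_watermark

    # Example: only rows changed since the previous run are transferred.
    rows = [
        {"id": 1, "modified": "2025-03-20"},
        {"id": 2, "modified": "2025-03-24"},
        {"id": 3, "modified": "2025-03-25"},
    ]
    changed, watermark = delta_rows(rows, "2025-03-23")
    ```

    The returned watermark is persisted and passed to the next run, so each load moves only the rows that changed in between.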

    Ensure Network Security with a Data Gateway

    If there are network restrictions, employ an on-premises data gateway (or a virtual network data gateway) to facilitate secure data transfer without exposing endpoints publicly.

    Optimize Performance

    Use the Azure Integration Runtime for high-performance data transfers.

    Ensure both tenants are located in nearby Azure regions to minimize latency.

    Monitor and Implement Retry Logic

    Set up monitoring to track the progress and identify any failures during the transfer.

    Implement retry logic to handle transient errors automatically.
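    A simple retry wrapper with exponential backoff can be sketched as below (a generic pattern; pipeline activities in ADF and Fabric also expose built-in retry count and interval settings):

    ```python
    import time

    def with_retries(fn, max_attempts=4, base_delay=1.0):
        """Call fn(), retrying transient failures with exponential backoff."""
        for attempt in range(1, max_attempts + 1):
            try:
                return fn()
            except Exception:
                if attempt == max_attempts:
                    raise  # give up after the final attempt
                time.sleep(base_delay * 2 ** (attempt - 1))

    # Example: a flaky operation that succeeds on the third try.
    calls = {"n": 0}
    def flaky_copy():
        calls["n"] += 1
        if calls["n"] < 3:
            raise ConnectionError("transient failure")
        return "copied"
    ```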

    Conduct Post-transfer Validation

    After the transfer, validate the data integrity by comparing row counts or performing data checks.

    This step ensures that all data has been accurately transferred.
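    Beyond row counts, an order-independent fingerprint catches rows that were dropped or altered in transit even when counts match. A sketch:

    ```python
    import hashlib

    def table_fingerprint(rows):
        """Row count plus an order-independent XOR of per-row hashes."""
        count, acc = 0, 0
        for row in rows:
            count += 1
            digest = hashlib.sha256(repr(sorted(row.items())).encode()).digest()
            acc ^= int.from_bytes(digest[:8], "big")
        return count, acc

    source = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
    target = [{"id": 2, "v": "b"}, {"id": 1, "v": "a"}]  # same rows, different order
    ```

    Because XOR is commutative, the fingerprint is insensitive to row order, so source and target can be compared without sorting 30 million rows.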

    By following these strategies, you can ensure that the transfer process is efficient, secure, and reliable.

    I hope this information helps. Please do let us know if you have any further queries.



