This article outlines how to use the Copy Activity in Azure Data Factory to copy data to and from a file system. It builds on the copy activity overview article, which presents a general overview of the copy activity.
This article applies to version 2 of Data Factory, which is currently in preview. You can copy data from a file system to any supported sink data store, or copy data from any supported source data store to a file system. See the Self-hosted Integration Runtime article for details.
You can create a pipeline with the copy activity by using one of the following tools or SDKs. Select a link to go to a tutorial with step-by-step instructions for creating a pipeline with a copy activity. The following sections provide details about the properties that are used to define Data Factory entities specific to the file system. For a full list of sections and properties available for defining datasets, see the datasets article. This section provides a list of properties supported by the file system dataset.
The following properties are supported. For a full list of sections and properties available for defining activities, see the Pipelines article. This section provides a list of properties supported by the file system source and sink. To copy data from a file system, set the source type in the copy activity to FileSystemSource.
The following properties are supported in the copy activity source section. To copy data to a file system, set the sink type in the copy activity to FileSystemSink. The following properties are supported in the sink section. This section describes the resulting behavior of the Copy operation for different combinations of recursive and copyBehavior values.
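As an illustrative sketch, a copy activity that reads from and writes to a file system might be defined as follows (the activity name and the referenced dataset names are hypothetical placeholders):

```json
{
    "name": "CopyFromAndToFileSystem",
    "type": "Copy",
    "inputs": [
        { "referenceName": "<input file system dataset name>", "type": "DatasetReference" }
    ],
    "outputs": [
        { "referenceName": "<output file system dataset name>", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": {
            "type": "FileSystemSource",
            "recursive": true
        },
        "sink": {
            "type": "FileSystemSink",
            "copyBehavior": "PreserveHierarchy"
        }
    }
}
```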
For a list of data stores supported as sources and sinks by the copy activity in Azure Data Factory, see supported data stores.
The following linked service properties are supported:
- host: Specifies the root path of the folder that you want to copy. Use the escape character "\" for special characters in the string. See Sample linked service and dataset definitions for examples.
- password: Specifies the password for the user (userid).
- connectVia: The Integration Runtime to be used to connect to the data store. If not specified, it uses the default Azure Integration Runtime.
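Putting these properties together, a file system linked service might look like the following sketch (the linked service type FileServer, the name, and all placeholder values are assumptions based on this property list):

```json
{
    "name": "FileSystemLinkedService",
    "properties": {
        "type": "FileServer",
        "typeProperties": {
            "host": "<host or root folder path>",
            "userid": "<domain>\\<user>",
            "password": {
                "type": "SecureString",
                "value": "<password>"
            }
        },
        "connectVia": {
            "referenceName": "<self-hosted Integration Runtime name>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```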
The type property of the dataset must be set to FileShare. The folderPath property specifies the path to the folder, for example a local folder on the Integration Runtime machine. The fileName property specifies the name of the file under folderPath; if you do not specify any value for it, the dataset points to all files in the folder when used as a source, and the file name is automatically generated when used as a sink.
The fileFilter property specifies a filter to be used to select a subset of files in the folderPath rather than all files. It applies only when fileName is not specified. If you want to copy files as-is between file-based stores (binary copy), skip the format section in both input and output dataset definitions.
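For example, a dataset for a binary (as-is) copy that omits the format section might be sketched as follows (the dataset name, linked service name, folder path, and filter are hypothetical placeholders):

```json
{
    "name": "FileSystemDataset",
    "properties": {
        "type": "FileShare",
        "linkedServiceName": {
            "referenceName": "FileSystemLinkedService",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "folderPath": "<folder path>",
            "fileFilter": "*.csv"
        }
    }
}
```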
If you want to parse or generate files with a specific format, the following file format types are supported; set the type property under format to one of these values. Use the compression property to specify the type and level of compression for the data. For more information, see Supported file formats and compression codecs. The type property of the copy activity source must be set to FileSystemSource. The recursive property indicates whether the data is read recursively from the sub-folders or only from the specified folder.
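Based on these properties, a minimal source section sketch that reads sub-folders recursively:

```json
"source": {
    "type": "FileSystemSource",
    "recursive": true
}
```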
The type property of the copy activity sink must be set to FileSystemSink. The copyBehavior property defines the copy behavior when the source is files from a file-based data store. With PreserveHierarchy (the default), the relative path of a source file to the source folder is identical to the relative path of the target file to the target folder. With FlattenHierarchy, all source files are placed in the first level of the target folder, and the target files have auto-generated names. For example, with recursive set to true and copyBehavior set to PreserveHierarchy, the target folder Folder1 is created with the same structure as the source; with copyBehavior set to FlattenHierarchy, the target Folder1 is created with a flattened structure.
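Based on these properties, a minimal sink section sketch that preserves the source folder hierarchy in the target:

```json
"sink": {
    "type": "FileSystemSink",
    "copyBehavior": "PreserveHierarchy"
}
```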