Transfer speeds are affected by factors including source location and provider,
file sizes, and number of files.
If your transfer is progressing more slowly than expected, refer to the
information on this page for possible reasons and resolutions.
Agentless transfers
For transfers to Cloud Storage from Amazon S3, Microsoft Azure, URL lists,
or another Cloud Storage bucket, Storage Transfer Service manages the transfer
for you; you don't need to deploy or host transfer agents.
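For example, a minimal agentless job from an Amazon S3 bucket to a Cloud Storage
bucket can be created with the gcloud CLI. This is a sketch only: the bucket
names are placeholders, and it assumes your AWS credentials are provided in a
JSON file passed with the --source-creds-file flag.

  gcloud transfer jobs create s3://SOURCE_S3_BUCKET gs://DESTINATION_BUCKET \
      --source-creds-file=path/to/aws-creds.json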
Create multiple parallel transfers
Storage Transfer Service enforces a maximum number of queries per second (QPS)
per transfer job. If your job involves a large number of relatively small files,
its transfer speed is limited by this QPS cap. Transferring an object can
trigger list, read, and write operations, each of which counts against the
maximum QPS.
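As an illustration with hypothetical numbers: if transferring each object costs
roughly three operations (a list, a read, and a write), a job capped at 1,000
QPS could move at most around 330 objects per second, no matter how small the
objects are or how much bandwidth is available.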
To work around the QPS limit, split your large transfer into multiple transfer
jobs. Use include and exclude prefixes to create transfer jobs containing
fewer files. You can create:
[[["Easy to understand","easyToUnderstand","thumb-up"],["Solved my problem","solvedMyProblem","thumb-up"],["Other","otherUp","thumb-up"]],[["Hard to understand","hardToUnderstand","thumb-down"],["Incorrect information or sample code","incorrectInformationOrSampleCode","thumb-down"],["Missing the information/samples I need","missingTheInformationSamplesINeed","thumb-down"],["Other","otherDown","thumb-down"]],["Last updated 2025-08-07 UTC."],[],[],null,["# Improve transfer speeds\n\nTransfer speeds are affected by factors including source location and provider,\nfile sizes, and number of files.\n\nIf your transfer is progressing more slowly than expected, refer to the\ninformation on this page for possible reasons and resolutions.\n\nAgentless transfers\n-------------------\n\nFor transfers to Cloud Storage from Amazon S3, Microsoft Azure, URL lists,\nor Cloud Storage, Storage Transfer Service manages the transfer without the need\nfor hosted transfer agents.\n\n### Create multiple parallel transfers\n\nStorage Transfer Service has a maximum number\nof allowed queries per second (QPS) per transfer job. If your job involves\na large number of relatively small files, its transfer speed is limited by this\nQPS cap. Transferring an object can trigger list, read, and write operations,\neach of which count against the maximum QPS.\n\nTo work around the QPS limit, split your large transfer into multiple transfer\njobs. Use [include and exclude prefixes](/storage-transfer/docs/filtering-objects-from-transfers) to create transfer jobs containing\nfewer files. You can create:\n\n- up to [5000 jobs per day](/storage-transfer/quotas#rate-quotas)\n- with [200 jobs running](/storage-transfer/quotas#transferOperations) at any point in time\n\nFor example, to transfer only files whose filename or path begin with the\nletters `a` through `e`: \n\n### gcloud CLI\n\n gcloud transfer jobs create \u003cvar translate=\"no\"\u003eSOURCE\u003c/var\u003e \u003cvar translate=\"no\"\u003eDESTINATION\u003c/var\u003e \\\n --include-prefixes=\"a,b,c,d,e\"\n\n### REST\n\n {\n \"description\": \"YOUR DESCRIPTION\",\n \"status\": \"ENABLED\",\n \"projectId\": \"PROJECT_ID\",\n \"schedule\": {\n \"scheduleStartDate\": {\n \"day\": 1,\n \"month\": 1,\n \"year\": 2015\n },\n \"startTimeOfDay\": {\n \"hours\": 1,\n \"minutes\": 1\n }\n },\n \"transferSpec\": {\n \"gcsDataSource\": {\n \"bucketName\": \"GCS_SOURCE_NAME\"\n },\n \"gcsDataSink\": {\n \"bucketName\": \"GCS_SINK_NAME\"\n },\n \"transferOptions\": {\n \"deleteObjectsFromSourceAfterTransfer\": true\n },\n \"objectConditions\": {\n \"includePrefixes\": [\n \"a\",\"b\",\"c\",\"d\",\"e\"\n ],\n \"excludePrefixes\": [\n \"path_1/subpath_2/object_5\"\n ]\n }\n }\n }\n\nAgent-based transfers\n---------------------\n\nFor tips on speeding up agent-based transfers, refer to\n[Best practices for file system transfers](/storage-transfer/docs/on-prem-agent-best-practices)."]]