To make the local directory "data" the same as the contents of gs://mybucket/data:

  gsutil rsync -d -r gs://mybucket/data data

Note: If you are synchronizing a large amount of data between clouds, you might consider setting up a Google Compute Engine account and running gsutil there. Since cross-provider gsutil data transfers flow through the machine where gsutil is running, doing this can make your transfer run significantly faster than running gsutil on your local workstation.
If the source and destination have matching checksums and only the source has an mtime, gsutil rsync will copy the mtime to the destination.
If neither mtime nor checksums are available, gsutil rsync will resort to comparing file sizes.
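The fallback order described above — checksums first, then mtime, then file size as a last resort — can be sketched as follows. This is an illustrative Python model, not gsutil's actual internals; the function and dict keys are hypothetical.

```python
# Hypothetical sketch of gsutil rsync's comparison fallback order.
# Names and structure here are illustrative, not the real gsutil code.

def files_match(src, dst):
    """Decide whether src and dst appear to be in sync.

    src and dst are dicts that may carry 'checksum', 'mtime',
    and 'size' keys, depending on what metadata is available.
    """
    # Prefer checksums when both sides have them.
    if "checksum" in src and "checksum" in dst:
        return src["checksum"] == dst["checksum"]
    # Fall back to mtime (plus size) when available on both sides.
    if "mtime" in src and "mtime" in dst:
        return src["mtime"] == dst["mtime"] and src["size"] == dst["size"]
    # Last resort: compare file sizes only.
    return src["size"] == dst["size"]
```

Note that the size-only comparison is the weakest check: two different files of equal length would be treated as matching, which is why gsutil only resorts to it when no better metadata exists.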
You can also cause large amounts of data to be lost quickly by specifying a subdirectory of the destination as the source of an rsync.
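To see why this is destructive, consider a simulation of rsync-with-delete semantics over plain dicts (this is an illustrative Python model, not gsutil code): when the source is a subdirectory of the destination, everything in the destination outside that subdirectory is deleted.

```python
# Simulation of rsync -d ("make destination match source") semantics,
# showing why using a subdirectory of the destination as the source
# is destructive. Plain Python sketch; not gsutil's implementation.

def rsync_delete(source, dest):
    """Make dict `dest` identical to dict `source`, deleting extras."""
    for name in list(dest):
        if name not in source:
            del dest[name]  # present in dest but not source: deleted
    dest.update(source)

dest = {"a.txt": 1, "b.txt": 2, "subdir/c.txt": 3}
# Using the destination's own subdirectory as the source:
source = {"c.txt": 3}  # the contents of dest's "subdir", re-rooted
rsync_delete(source, dest)
# Everything outside the subdirectory has now been deleted from dest.
```

After the run, only the subdirectory's contents survive in the destination; the rest of the data is gone.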
At the end of the synchronization run, if any failures were not successfully retried, the rsync command reports the count of failures and exits with non-zero status.