Once the `rclone config` command is complete, the following command dumps the configuration:

```bash
rclone config dump
```

## Some useful `rclone` commands

### List files from a bucket

```bash
rclone ls cubbit:${BUCKET_NAME}
```

Example result:

```text
gmacario@hw2228:~$ rclone ls cubbit:baroloteam
4985990 IMG_20230826_124002_1.jpg
4421972 IMG_20230826_124009_1.jpg
64546 newplot.png
gmacario@hw2228:~$
```

An error is returned if the bucket does not exist or you do not have access to it:

```text
gmacario@hw2228:~$ rclone ls cubbit:newbucket
2023/11/05 11:05:33 Failed to ls: LambdaRuntimeError: Forbidden
status code: 403, request id: , host id:
gmacario@hw2228:~$
```

**NOTE**: The `rclone ls` command has a number of useful options, which can be discovered with `rclone ls --help`:

```text
gmacario@hw2228:~$ rclone ls --help
Lists the objects in the source path to standard output in a human
readable format with size and path. Recurses by default.
Eg
$ rclone ls swift:bucket
60295 bevajer5jef
90613 canole
94467 diwogej7
37600 fubuwic
Any of the filtering options can be applied to this command.
There are several related list commands
* `ls` to list size and path of objects only
* `lsl` to list modification time, size and path of objects only
* `lsd` to list directories only
* `lsf` to list objects and directories in easy to parse format
* `lsjson` to list objects and directories in JSON format
`ls`,`lsl`,`lsd` are designed to be human-readable.
`lsf` is designed to be human and machine-readable.
`lsjson` is designed to be machine-readable.
Note that `ls` and `lsl` recurse by default - use `--max-depth 1` to stop the recursion.
The other list commands `lsd`,`lsf`,`lsjson` do not recurse by default - use `-R` to make them recurse.
Listing a nonexistent directory will produce an error except for
remotes which can't have empty directories (e.g. s3, swift, or gcs -
the bucket-based remotes).
Usage:
rclone ls remote:path [flags]
Flags:
-h, --help help for ls
# Filter Flags
Flags for filtering directory listings.
--delete-excluded Delete files on dest excluded from sync
--exclude stringArray Exclude files matching pattern
--exclude-from stringArray Read file exclude patterns from file (use - to read from stdin)
--exclude-if-present stringArray Exclude directories if filename is present
--files-from stringArray Read list of source-file names from file (use - to read from stdin)
--files-from-raw stringArray Read list of source-file names from file without any processing of lines (use - to read from stdin)
-f, --filter stringArray Add a file filtering rule
--filter-from stringArray Read file filtering patterns from a file (use - to read from stdin)
--ignore-case Ignore case in filters (case insensitive)
--include stringArray Include files matching pattern
--include-from stringArray Read file include patterns from file (use - to read from stdin)
--max-age Duration Only transfer files younger than this in s or suffix ms|s|m|h|d|w|M|y (default off)
--max-depth int If set limits the recursion depth to this (default -1)
--max-size SizeSuffix Only transfer files smaller than this in KiB or suffix B|K|M|G|T|P (default off)
--metadata-exclude stringArray Exclude metadatas matching pattern
--metadata-exclude-from stringArray Read metadata exclude patterns from file (use - to read from stdin)
--metadata-filter stringArray Add a metadata filtering rule
--metadata-filter-from stringArray Read metadata filtering patterns from a file (use - to read from stdin)
--metadata-include stringArray Include metadatas matching pattern
--metadata-include-from stringArray Read metadata include patterns from file (use - to read from stdin)
--min-age Duration Only transfer files older than this in s or suffix ms|s|m|h|d|w|M|y (default off)
--min-size SizeSuffix Only transfer files bigger than this in KiB or suffix B|K|M|G|T|P (default off)
# Listing Flags
Flags for listing directories.
--default-time Time Time to show if modtime is unknown for files and directories (default 2000-01-01T00:00:00Z)
--fast-list Use recursive list if available; uses more memory but fewer transactions
Additional help topics:
Use "rclone [command] --help" for more information about a command.
Use "rclone help flags" for to see the global flags.
Use "rclone help backends" for a list of supported services.
gmacario@hw2228:~$
```
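
For example, the filtering and listing flags shown above can be combined with `rclone ls`. A minimal sketch (the file pattern and depth are only illustrative):

```bash
# List only JPEG files at the top level of the bucket, without recursing
rclone ls cubbit:${BUCKET_NAME} --include "*.jpg" --max-depth 1
```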

### Sync folder to a bucket on Cubbit DS3

```bash
rclone sync -P ./backup-folder cubbit:${BUCKET_NAME}
```

Note that the bucket `BUCKET_NAME` must already exist, otherwise the command will fail (it may be created from <https://console.cubbit.eu/>).
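
Alternatively, the bucket can be created with rclone itself, provided the API key is allowed to create buckets. A minimal sketch:

```bash
# Create the bucket on the Cubbit DS3 remote
rclone mkdir cubbit:${BUCKET_NAME}
```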

On the other hand, if you sync files to a subfolder of a bucket, the subfolder will be created automatically if it does not exist.
Example:

```text
gmacario@gmpowerhorse:~ $ rclone sync -P ~/Downloads cubbit:bk-gmpowerhorse/test02
Transferred: 146.829M / 146.829 MBytes, 100%, 7.582 MBytes/s, ETA 0s
Errors: 0
Checks: 0 / 0, -
Transferred: 17 / 17, 100%
Elapsed time: 19.3s
gmacario@gmpowerhorse:~ $
```
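
Keep in mind that `rclone sync` makes the destination identical to the source, deleting files on the destination that are not present locally. To preview what would be transferred or deleted without changing anything, add the `--dry-run` flag. A minimal sketch using the same folder and bucket as above:

```bash
# Show what sync would transfer or delete, without touching the bucket
rclone sync --dry-run -P ./backup-folder cubbit:${BUCKET_NAME}
```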

### Backup files from gmpowerhorse (Ubuntu 20.04.6 LTS)

Prerequisites:

* Bucket already created from <https://console.cubbit.eu/>
* Bucket name: `bk-gmpowerhorse`
* Bucket versioning: Versioning disabled
* Object Lock: Object Lock disabled
* Ownership Control: Object writer
* Cubbit DS3 API key saved in a `.csv` file

Install rclone using `apt`:

```bash
sudo apt install rclone
```

Check the installed version:

```text
gmacario@gmpowerhorse:~ $ rclone --version
rclone v1.50.2
- os/arch: linux/amd64
- go version: go1.13.8
gmacario@gmpowerhorse:~ $
```
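
If the `cubbit` remote has not yet been configured on this host, repeat the `rclone config` steps described earlier. As a quick check, list the configured remotes (this assumes the remote is named `cubbit` as in the previous sections):

```bash
# Print the names of all configured remotes; "cubbit:" should appear in the output
rclone listremotes
```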

Type the following command to make sure you can access bucket `bk-gmpowerhorse`:

```bash
rclone ls cubbit:bk-gmpowerhorse
```

Result:

```text
gmacario@gmpowerhorse:~ $ rclone ls cubbit:bk-gmpowerhorse
gmacario@gmpowerhorse:~ $
```

Now use rclone to synchronize the contents of folder `~/Downloads` to bucket `bk-gmpowerhorse` on Cubbit DS3:

```bash
rclone sync -P ~/Downloads cubbit:bk-gmpowerhorse
```

Result:

```text
gmacario@gmpowerhorse:~ $ rclone sync -P ~/Downloads cubbit:bk-gmpowerhorse
Transferred: 146.829M / 146.829 MBytes, 100%, 7.824 MBytes/s, ETA 0s
Errors: 0
Checks: 0 / 0, -
Transferred: 17 / 17, 100%
Elapsed time: 18.7s
gmacario@gmpowerhorse:~ $
```

Now check from `gmacario@hw2228` that all the files have been transferred:

```text
gmacario@hw2228:~$ rclone ls cubbit:bk-gmpowerhorse
80609819 CLI_Linux_Debian_5.5.2.zip
53237226 FingKit_CLI_Linux_Debian.zip
1572864 bios-gmpowerhorse/BIOS_CD/7F5_0146.iso
1048576 bios-gmpowerhorse/DOS_Flash/7F5_0146.bin
27660 bios-gmpowerhorse/DOS_Flash/ASSIGNPW.EXE
2841 bios-gmpowerhorse/DOS_Flash/DOSFM.txt
54441 bios-gmpowerhorse/DOS_Flash/FLASHBIN.EXE
3102 bios-gmpowerhorse/DOS_Flash/Flashbin.txt
1003 bios-gmpowerhorse/DOS_Flash/README.TXT
3388 bios-gmpowerhorse/DOS_Flash/flsh.cpu
2957 bios-gmpowerhorse/README
18746 bios-gmpowerhorse/hp-lxbios-1.5-1.i386.rpm
13894 bios-gmpowerhorse/hp-lxbios-mod-1.5-1_2.6.9.67.ELsmp.src.rpm
48836 bios-gmpowerhorse/lxbios_readme.pdf
411 iottly-device-agent.service
16079382 iottlyagent_1.6.4_linux_AMD64.tar.gz
1235710 sp59252.tgz
gmacario@hw2228:~$
```
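
The backup can later be restored in the opposite direction with `rclone copy`. A minimal sketch (the local destination folder `~/restore-gmpowerhorse` is only an illustrative name):

```bash
# Copy everything from the bucket into a local folder; nothing is deleted locally
rclone copy -P cubbit:bk-gmpowerhorse ~/restore-gmpowerhorse
```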

TODO

<!-- EOF -->
