Implement rsync like include and exclude - fixes #27

* Implement include/exclude
  * Implement rsync compatible file globbing
  * Implement command line filtering flags
    * --delete-excluded - Delete files on dest excluded from sync
    * --filter - Add a file-filtering rule
    * --filter-from - Read filtering patterns from a file
    * --exclude - Exclude files matching pattern
    * --exclude-from - Read exclude patterns from file
    * --include - Include files matching pattern
    * --include-from - Read include patterns from file
    * --files-from - Read list of source-file names
    * --min-size - Don't transfer any file smaller than this in k or suffix k|M|G
    * --max-size - Don't transfer any file larger than this in k or suffix k|M|G
  * Document
This commit is contained in:
Nick Craig-Wood 2015-09-27 16:13:20 +01:00
parent d04c21b198
commit a91bcaaeb0
11 changed files with 1048 additions and 6 deletions


@@ -288,3 +288,22 @@ here which are used for testing. These start with remote name eg
### --cpuprofile=FILE ###
Write cpu profile to file. This can be analysed with `go tool pprof`.
Filtering
---------
For the filtering options
* `--delete-excluded`
* `--filter`
* `--filter-from`
* `--exclude`
* `--exclude-from`
* `--include`
* `--include-from`
* `--files-from`
* `--min-size`
* `--max-size`
* `--dump-filters`
See the [filtering section](/filtering/).

docs/content/filtering.md Normal file

@@ -0,0 +1,273 @@
---
title: "Filtering"
description: "Filtering, includes and excludes"
date: "2015-09-27"
---
# Filtering, includes and excludes #
Rclone has a sophisticated set of include and exclude rules. Some of
these are based on patterns and some on other things like file size.
Each path as it passes through rclone is matched against the include
and exclude rules. The paths are matched without a leading `/`.
For example the files might be passed to the matching engine like this
* `file1.jpg`
* `file2.jpg`
* `directory/file3.jpg`
## Patterns ##
The patterns used to match files for inclusion or exclusion are based
on "file globs" as used by the unix shell.
If the pattern starts with a `/` then it only matches at the top level
of the directory tree. If it doesn't start with `/` then it is
matched starting at the end of the path, but it will only match a
complete path element.
    file.jpg  - matches "file.jpg"
              - matches "directory/file.jpg"
              - doesn't match "afile.jpg"
              - doesn't match "directory/afile.jpg"
    /file.jpg - matches "file.jpg"
              - doesn't match "afile.jpg"
              - doesn't match "directory/file.jpg"
A `*` matches anything but not a `/`.
    *.jpg  - matches "file.jpg"
           - matches "directory/file.jpg"
           - doesn't match "file.jpg/anotherfile.jpg"
Use `**` to match anything, including slashes.
    dir/** - matches "dir/file.jpg"
           - matches "dir/dir1/dir2/file.jpg"
           - doesn't match "directory/file.jpg"
           - doesn't match "adir/file.jpg"
A `?` matches any character except a slash `/`.
    l?ss  - matches "less"
          - matches "lass"
          - doesn't match "floss"
A `[` and `]` together make a character class, such as `[a-z]` or
`[aeiou]` or `[[:alpha:]]`. See the [go regexp
docs](https://golang.org/pkg/regexp/syntax/) for more info on these.
    h[ae]llo - matches "hello"
             - matches "hallo"
             - doesn't match "hullo"
A `{` and `}` define a choice between elements. It should contain a
comma-separated list of patterns, any of which might match. These
patterns can contain wildcards.
    {one,two}_potato - matches "one_potato"
                     - matches "two_potato"
                     - doesn't match "three_potato"
                     - doesn't match "_potato"
Special characters can be escaped with a `\` before them.
    \*.jpg      - matches "*.jpg"
    \\.jpg      - matches "\.jpg"
    \[one\].jpg - matches "[one].jpg"
### Differences between rsync and rclone patterns ###
Rclone implements bash style `{a,b,c}` glob matching which rsync doesn't.
Rclone ignores `/` at the end of a pattern.
Rclone always does a wildcard match so `\` must always escape a `\`.
## How the rules are used ##
Rclone maintains a list of include rules and exclude rules.
Each file is matched in order against the list until it finds a match.
The file is then included or excluded according to the rule type.
If the matcher falls off the bottom of the list then the path is
included.
For example given the following rules, `+` being include, `-` being
exclude,
    - secret*.jpg
    + *.jpg
    + *.png
    + file2.avi
    - *
This would include
* `file1.jpg`
* `file3.png`
* `file2.avi`
This would exclude
* `secret17.jpg`
* any other file not matching `*.jpg`, `*.png` or `file2.avi`
## Adding filtering rules ##
Filtering rules are added with the following command line flags.
### `--exclude` - Exclude files matching pattern ###
Add a single exclude rule with `--exclude`.
Eg `--exclude *.bak` to exclude all bak files from the sync.
### `--exclude-from` - Read exclude patterns from file ###
Add exclude rules from a file.
Prepare a file like this `exclude-file.txt`
    # a sample exclude rule file
    *.bak
    file2.jpg
Then use as `--exclude-from exclude-file.txt`. This will sync all
files except those ending in `bak` and `file2.jpg`.
This is useful if you have a lot of rules.
### `--include` - Include files matching pattern ###
Add a single include rule with `--include`.
Eg `--include *.{png,jpg}` to include all `png` and `jpg` files in the
backup and no others.
This adds an implicit `--exclude *` at the end of the filter list.
### `--include-from` - Read include patterns from file ###
Add include rules from a file.
Prepare a file like this `include-file.txt`
    # a sample include rule file
    *.jpg
    *.png
    file2.avi
Then use as `--include-from include-file.txt`. This will sync all
`jpg`, `png` files and `file2.avi`.
This is useful if you have a lot of rules.
This adds an implicit `--exclude *` at the end of the filter list.
### `--filter` - Add a file-filtering rule ###
This can be used to add a single include or exclude rule. Include
rules start with `+ ` and exclude rules start with `- `. A special
rule called `!` can be used to clear the existing rules.
Eg `--filter "- *.bak"` to exclude all bak files from the sync.
### `--filter-from` - Read filtering patterns from a file ###
Add include/exclude rules from a file.
Prepare a file like this `filter-file.txt`
    # a sample filter rule file
    - secret*.jpg
    + *.jpg
    + *.png
    + file2.avi
    # exclude everything else
    - *
Then use as `--filter-from filter-file.txt`. The rules are processed
in the order that they are defined.
This example will include all `jpg` and `png` files, exclude any files
matching `secret*.jpg` and include `file2.avi`. Everything else will
be excluded from the sync.
### `--files-from` - Read list of source-file names ###
This reads a list of file names from the file passed in and **only**
these files are transferred. The filtering rules are ignored
completely if you use this option.
Prepare a file like this `files-from.txt`
    # comment
    file1.jpg
    file2.jpg
Then use as `--files-from files-from.txt`. This will only transfer
`file1.jpg` and `file2.jpg` providing they exist.
### `--min-size` - Don't transfer any file smaller than this ###
This option controls the minimum size file which will be transferred.
This defaults to `kBytes` but a suffix of `k`, `M`, or `G` can be
used.
For example `--min-size 50k` means no files smaller than 50kByte will be
transferred.
### `--max-size` - Don't transfer any file larger than this ###
This option controls the maximum size file which will be transferred.
This defaults to `kBytes` but a suffix of `k`, `M`, or `G` can be
used.
For example `--max-size 1G` means no files larger than 1GByte will be
transferred.
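The size parsing can be sketched like this. Note `parseSize` is a hypothetical stand-in for rclone's `SizeSuffix` type, which is defined elsewhere in the codebase; the only behaviour taken from this document is that a bare number means kBytes and that `k`, `M` and `G` suffixes are accepted:

```go
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseSize converts "50k", "1G" or a bare number (kBytes) to bytes.
// This is an illustrative sketch, not rclone's actual SizeSuffix code.
func parseSize(s string) (int64, error) {
	mult := int64(1024) // bare numbers default to kBytes
	switch {
	case strings.HasSuffix(s, "k"):
		mult, s = 1024, strings.TrimSuffix(s, "k")
	case strings.HasSuffix(s, "M"):
		mult, s = 1024*1024, strings.TrimSuffix(s, "M")
	case strings.HasSuffix(s, "G"):
		mult, s = 1024*1024*1024, strings.TrimSuffix(s, "G")
	}
	n, err := strconv.ParseInt(s, 10, 64)
	if err != nil {
		return 0, err
	}
	return n * mult, nil
}

func main() {
	for _, s := range []string{"50k", "1G", "100"} {
		n, _ := parseSize(s)
		fmt.Println(s, "=", n, "bytes")
	}
}
```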
### `--delete-excluded` - Delete files on dest excluded from sync ###
**Important** this flag is dangerous - use with `--dry-run` and `-v` first.
When doing `rclone sync` this will delete any files which are excluded
from the sync on the destination.
If for example you did a sync from `A` to `B` without the `--min-size 50k` flag
    rclone sync A: B:
Then you repeated it like this with the `--delete-excluded`
    rclone --min-size 50k --delete-excluded sync A: B:
This would delete all files on `B` which are less than 50 kBytes as
these are now excluded from the sync.
Always test first with `--dry-run` and `-v` before using this flag.
### `--dump-filters` - dump the filters to the output ###
This dumps the defined filters to the output as regular expressions.
Useful for debugging.
## Quoting shell metacharacters ##
The examples above may not work verbatim in your shell as they have
shell metacharacters in them (eg `*`), and may require quoting.
Eg linux, OSX
* `--include \*.jpg`
* `--include '*.jpg'`
* `--include='*.jpg'`
In Windows the expansion is done by the command not the shell so this
should work fine
* `--include *.jpg`


@@ -17,6 +17,7 @@
<ul class="dropdown-menu">
<li><a href="/install/"><i class="fa fa-book"></i> Installation</a></li>
<li><a href="/docs/"><i class="fa fa-book"></i> Usage</a></li>
<li><a href="/filtering/"><i class="fa fa-book"></i> Filtering</a></li>
<li><a href="/changelog/"><i class="fa fa-book"></i> Changelog</a></li>
<li><a href="/bugs/"><i class="fa fa-book"></i> Bugs</a></li>
<li><a href="/faq/"><i class="fa fa-book"></i> FAQ</a></li>


@@ -160,6 +160,7 @@ type ConfigInfo struct {
Timeout time.Duration // Data channel timeout
DumpHeaders bool
DumpBodies bool
Filter *Filter
}
// Transport returns an http.RoundTripper with the correct timeouts
@@ -252,6 +253,12 @@ func LoadConfig() {
}
}
// Load filters
Config.Filter, err = NewFilter()
if err != nil {
log.Fatalf("Failed to load filters: %v", err)
}
// Start the token bucket limiter
startTokenBucket()
}

fs/filter.go Normal file

@@ -0,0 +1,248 @@
// Control the filtering of files
package fs
import (
"bufio"
"fmt"
"os"
"regexp"
"strings"
"github.com/spf13/pflag"
)
// Global
var (
// Flags
deleteExcluded = pflag.BoolP("delete-excluded", "", false, "Delete files on dest excluded from sync")
filterRule = pflag.StringP("filter", "f", "", "Add a file-filtering rule")
filterFrom = pflag.StringP("filter-from", "", "", "Read filtering patterns from a file")
excludeRule = pflag.StringP("exclude", "", "", "Exclude files matching pattern")
excludeFrom = pflag.StringP("exclude-from", "", "", "Read exclude patterns from file")
includeRule = pflag.StringP("include", "", "", "Include files matching pattern")
includeFrom = pflag.StringP("include-from", "", "", "Read include patterns from file")
filesFrom = pflag.StringP("files-from", "", "", "Read list of source-file names from file")
minSize SizeSuffix
maxSize SizeSuffix
dumpFilters = pflag.BoolP("dump-filters", "", false, "Dump the filters to the output")
//cvsExclude = pflag.BoolP("cvs-exclude", "C", false, "Exclude files in the same way CVS does")
)
func init() {
pflag.VarP(&minSize, "min-size", "", "Don't transfer any file smaller than this in k or suffix k|M|G")
pflag.VarP(&maxSize, "max-size", "", "Don't transfer any file larger than this in k or suffix k|M|G")
}
// rule is one filter rule
type rule struct {
Include bool
Regexp *regexp.Regexp
}
// Match returns true if rule matches path
func (r *rule) Match(path string) bool {
return r.Regexp.MatchString(path)
}
// String the rule
func (r *rule) String() string {
c := "-"
if r.Include {
c = "+"
}
return fmt.Sprintf("%s %s", c, r.Regexp.String())
}
// filesMap describes the map of files to transfer
type filesMap map[string]struct{}
// Filter describes any filtering in operation
type Filter struct {
DeleteExcluded bool
MinSize int64
MaxSize int64
rules []rule
files filesMap
}
// NewFilter parses the command line options and creates a Filter object
func NewFilter() (f *Filter, err error) {
f = &Filter{
DeleteExcluded: *deleteExcluded,
MinSize: int64(minSize),
MaxSize: int64(maxSize),
}
if *includeRule != "" {
err = f.Add(true, *includeRule)
if err != nil {
return nil, err
}
// Add implicit exclude
err = f.Add(false, "*")
if err != nil {
return nil, err
}
}
if *includeFrom != "" {
err := forEachLine(*includeFrom, func(line string) error {
return f.Add(true, line)
})
if err != nil {
return nil, err
}
// Add implicit exclude
err = f.Add(false, "*")
if err != nil {
return nil, err
}
}
if *excludeRule != "" {
err = f.Add(false, *excludeRule)
if err != nil {
return nil, err
}
}
if *excludeFrom != "" {
err := forEachLine(*excludeFrom, func(line string) error {
return f.Add(false, line)
})
if err != nil {
return nil, err
}
}
if *filterRule != "" {
err = f.AddRule(*filterRule)
if err != nil {
return nil, err
}
}
if *filterFrom != "" {
err := forEachLine(*filterFrom, f.AddRule)
if err != nil {
return nil, err
}
}
if *filesFrom != "" {
err := forEachLine(*filesFrom, func(line string) error {
return f.AddFile(line)
})
if err != nil {
return nil, err
}
}
if *dumpFilters {
fmt.Println("--- start filters ---")
fmt.Println(f.DumpFilters())
fmt.Println("--- end filters ---")
}
return f, nil
}
// Add adds a filter rule with include or exclude status indicated
func (f *Filter) Add(Include bool, glob string) error {
re, err := globToRegexp(glob)
if err != nil {
return err
}
rule := rule{
Include: Include,
Regexp: re,
}
f.rules = append(f.rules, rule)
return nil
}
// AddRule adds a filter rule with include/exclude indicated by the prefix
//
// These are
//
// + glob
// - glob
// !
//
// '+' includes the glob, '-' excludes it and '!' resets the filter list
//
// Line comments may be introduced with '#' or ';'
func (f *Filter) AddRule(rule string) error {
switch {
case rule == "!":
f.Clear()
return nil
case strings.HasPrefix(rule, "- "):
return f.Add(false, rule[2:])
case strings.HasPrefix(rule, "+ "):
return f.Add(true, rule[2:])
}
return fmt.Errorf("Malformed rule %q", rule)
}
// AddFile adds a single file to the files from list
func (f *Filter) AddFile(file string) error {
if f.files == nil {
f.files = make(filesMap)
}
file = strings.Trim(file, "/")
f.files[file] = struct{}{}
return nil
}
// Clear clears all the filter rules
func (f *Filter) Clear() {
f.rules = nil
}
// Include returns whether this object should be included into the
// sync or not
func (f *Filter) Include(remote string, size int64) bool {
// filesFrom takes precedence
if f.files != nil {
_, include := f.files[remote]
return include
}
if f.MinSize != 0 && size < f.MinSize {
return false
}
if f.MaxSize != 0 && size > f.MaxSize {
return false
}
for _, rule := range f.rules {
if rule.Match(remote) {
return rule.Include
}
}
return true
}
// forEachLine calls fn on every line in the file pointed to by path
//
// It ignores empty lines and lines starting with '#' or ';'
func forEachLine(path string, fn func(string) error) (err error) {
in, err := os.Open(path)
if err != nil {
return err
}
defer checkClose(in, &err)
scanner := bufio.NewScanner(in)
for scanner.Scan() {
line := scanner.Text()
line = strings.TrimSpace(line)
if len(line) == 0 || line[0] == '#' || line[0] == ';' {
continue
}
err := fn(line)
if err != nil {
return err
}
}
return scanner.Err()
}
// DumpFilters dumps the filters in textual form, 1 per line
func (f *Filter) DumpFilters() string {
rules := []string{}
for _, rule := range f.rules {
rules = append(rules, rule.String())
}
return strings.Join(rules, "\n")
}

fs/filter_test.go Normal file

@@ -0,0 +1,252 @@
package fs
import (
"io/ioutil"
"os"
"strings"
"testing"
)
func TestNewFilterDefault(t *testing.T) {
f, err := NewFilter()
if err != nil {
t.Fatal(err)
}
if f.DeleteExcluded != false {
t.Errorf("DeleteExcluded want false got %v", f.DeleteExcluded)
}
if f.MinSize != 0 {
t.Errorf("MinSize want 0 got %v", f.MinSize)
}
if f.MaxSize != 0 {
t.Errorf("MaxSize want 0 got %v", f.MaxSize)
}
if len(f.rules) != 0 {
t.Errorf("rules want none got %v", f.rules)
}
if f.files != nil {
t.Errorf("files want none got %v", f.files)
}
}
// return a pointer to the string
func stringP(s string) *string {
return &s
}
// testFile creates a temp file with the contents
func testFile(t *testing.T, contents string) *string {
out, err := ioutil.TempFile("", "filter_test")
if err != nil {
t.Fatal(err)
}
defer out.Close()
_, err = out.Write([]byte(contents))
if err != nil {
t.Fatal(err)
}
s := out.Name()
return &s
}
func TestNewFilterFull(t *testing.T) {
mins := int64(100 * 1024)
maxs := int64(1000 * 1024)
emptyString := ""
isFalse := false
isTrue := true
// Set up the input
deleteExcluded = &isTrue
filterRule = stringP("- filter1")
filterFrom = testFile(t, "#comment\n+ filter2\n- filter3\n")
excludeRule = stringP("exclude1")
excludeFrom = testFile(t, "#comment\nexclude2\nexclude3\n")
includeRule = stringP("include1")
includeFrom = testFile(t, "#comment\ninclude2\ninclude3\n")
filesFrom = testFile(t, "#comment\nfiles1\nfiles2\n")
minSize = SizeSuffix(mins)
maxSize = SizeSuffix(maxs)
rm := func(p string) {
err := os.Remove(p)
if err != nil {
t.Logf("error removing %q: %v", p, err)
}
}
// Reset the input
defer func() {
rm(*filterFrom)
rm(*excludeFrom)
rm(*includeFrom)
rm(*filesFrom)
minSize = 0
maxSize = 0
deleteExcluded = &isFalse
filterRule = &emptyString
filterFrom = &emptyString
excludeRule = &emptyString
excludeFrom = &emptyString
includeRule = &emptyString
includeFrom = &emptyString
filesFrom = &emptyString
}()
f, err := NewFilter()
if err != nil {
t.Fatal(err)
}
if f.DeleteExcluded != true {
t.Errorf("DeleteExcluded want true got %v", f.DeleteExcluded)
}
if f.MinSize != mins {
t.Errorf("MinSize want %v got %v", mins, f.MinSize)
}
if f.MaxSize != maxs {
t.Errorf("MaxSize want %v got %v", maxs, f.MaxSize)
}
got := f.DumpFilters()
want := `+ (^|/)include1$
- (^|/)[^/]*$
+ (^|/)include2$
+ (^|/)include3$
- (^|/)[^/]*$
- (^|/)exclude1$
- (^|/)exclude2$
- (^|/)exclude3$
- (^|/)filter1$
+ (^|/)filter2$
- (^|/)filter3$`
if got != want {
t.Errorf("rules want %s got %s", want, got)
}
if len(f.files) != 2 {
t.Errorf("files want 2 got %v", f.files)
}
for _, name := range []string{"files1", "files2"} {
_, ok := f.files[name]
if !ok {
t.Errorf("Didn't find file %q in f.files", name)
}
}
}
type includeTest struct {
in string
size int64
want bool
}
func testInclude(t *testing.T, f *Filter, tests []includeTest) {
for _, test := range tests {
got := f.Include(test.in, test.size)
if test.want != got {
t.Errorf("%q,%d: want %v got %v", test.in, test.size, test.want, got)
}
}
}
func TestNewFilterIncludeFiles(t *testing.T) {
f, err := NewFilter()
if err != nil {
t.Fatal(err)
}
f.AddFile("file1.jpg")
f.AddFile("/file2.jpg")
testInclude(t, f, []includeTest{
{"file1.jpg", 0, true},
{"file2.jpg", 1, true},
{"potato/file2.jpg", 2, false},
{"file3.jpg", 3, false},
})
}
func TestNewFilterMinSize(t *testing.T) {
f, err := NewFilter()
if err != nil {
t.Fatal(err)
}
f.MinSize = 100
testInclude(t, f, []includeTest{
{"file1.jpg", 100, true},
{"file2.jpg", 101, true},
{"potato/file2.jpg", 99, false},
})
}
func TestNewFilterMaxSize(t *testing.T) {
f, err := NewFilter()
if err != nil {
t.Fatal(err)
}
f.MaxSize = 100
testInclude(t, f, []includeTest{
{"file1.jpg", 100, true},
{"file2.jpg", 101, false},
{"potato/file2.jpg", 99, true},
})
}
func TestNewFilterMatches(t *testing.T) {
f, err := NewFilter()
if err != nil {
t.Fatal(err)
}
add := func(s string) {
err := f.AddRule(s)
if err != nil {
t.Fatal(err)
}
}
add("+ cleared")
add("!")
add("- file1.jpg")
add("+ file2.png")
add("+ *.jpg")
add("- *.png")
add("- /potato")
add("+ /sausage1")
add("+ /sausage2*")
add("+ /sausage3**")
add("- *")
testInclude(t, f, []includeTest{
{"cleared", 100, false},
{"file1.jpg", 100, false},
{"file2.png", 100, true},
{"afile2.png", 100, false},
{"file3.jpg", 101, true},
{"file4.png", 101, false},
{"potato", 101, false},
{"sausage1", 101, true},
{"sausage1/potato", 101, false},
{"sausage2potato", 101, true},
{"sausage2/potato", 101, false},
{"sausage3/potato", 101, true},
{"unicorn", 99, false},
})
}
func TestFilterForEachLine(t *testing.T) {
file := testFile(t, `; comment
one
# another comment
two
# indented comment
three
four
five
six `)
defer os.Remove(*file)
lines := []string{}
forEachLine(*file, func(s string) error {
lines = append(lines, s)
return nil
})
got := strings.Join(lines, ",")
want := "one,two,three,four,five,six"
if want != got {
t.Errorf("want %q got %q", want, got)
}
}

fs/glob.go Normal file

@@ -0,0 +1,117 @@
// rsync style glob parser
package fs
import (
"bytes"
"fmt"
"regexp"
"strings"
)
// globToRegexp converts an rsync style glob to a regexp
//
// documented in filtering.md
func globToRegexp(glob string) (*regexp.Regexp, error) {
var re bytes.Buffer
if strings.HasPrefix(glob, "/") {
glob = glob[1:]
_, _ = re.WriteRune('^')
} else {
_, _ = re.WriteString("(^|/)")
}
consecutiveStars := 0
insertStars := func() error {
if consecutiveStars > 0 {
switch consecutiveStars {
case 1:
_, _ = re.WriteString(`[^/]*`)
case 2:
_, _ = re.WriteString(`.*`)
default:
return fmt.Errorf("too many stars in %q", glob)
}
}
consecutiveStars = 0
return nil
}
inBraces := false
inBrackets := 0
slashed := false
for _, c := range glob {
if slashed {
_, _ = re.WriteRune(c)
slashed = false
continue
}
if c != '*' {
err := insertStars()
if err != nil {
return nil, err
}
}
if inBrackets > 0 {
_, _ = re.WriteRune(c)
if c == '[' {
inBrackets++
}
if c == ']' {
inBrackets--
}
continue
}
switch c {
case '\\':
_, _ = re.WriteRune(c)
slashed = true
case '*':
consecutiveStars++
case '?':
_, _ = re.WriteString(`[^/]`)
case '[':
_, _ = re.WriteRune(c)
inBrackets++
case ']':
return nil, fmt.Errorf("mismatched ']' in glob %q", glob)
case '{':
if inBraces {
return nil, fmt.Errorf("can't nest '{' '}' in glob %q", glob)
}
inBraces = true
_, _ = re.WriteRune('(')
case '}':
if !inBraces {
return nil, fmt.Errorf("mismatched '{' and '}' in glob %q", glob)
}
_, _ = re.WriteRune(')')
inBraces = false
case ',':
if inBraces {
_, _ = re.WriteRune('|')
} else {
_, _ = re.WriteRune(c)
}
case '.', '+', '(', ')', '|', '^', '$': // regexp meta characters not dealt with above
_, _ = re.WriteRune('\\')
_, _ = re.WriteRune(c)
default:
_, _ = re.WriteRune(c)
}
}
err := insertStars()
if err != nil {
return nil, err
}
if inBrackets > 0 {
return nil, fmt.Errorf("mismatched '[' and ']' in glob %q", glob)
}
if inBraces {
return nil, fmt.Errorf("mismatched '{' and '}' in glob %q", glob)
}
_, _ = re.WriteRune('$')
result, err := regexp.Compile(re.String())
if err != nil {
return nil, fmt.Errorf("Bad glob pattern %q: %v (%q)", glob, err, re.String())
}
return result, nil
}

fs/glob_test.go Normal file

@@ -0,0 +1,64 @@
package fs
import (
"strings"
"testing"
)
func TestGlobToRegexp(t *testing.T) {
for _, test := range []struct {
in string
want string
error string
}{
{``, `(^|/)$`, ``},
{`potato`, `(^|/)potato$`, ``},
{`potato,sausage`, `(^|/)potato,sausage$`, ``},
{`/potato`, `^potato$`, ``},
{`potato?sausage`, `(^|/)potato[^/]sausage$`, ``},
{`potat[oa]`, `(^|/)potat[oa]$`, ``},
{`potat[a-z]or`, `(^|/)potat[a-z]or$`, ``},
{`potat[[:alpha:]]or`, `(^|/)potat[[:alpha:]]or$`, ``},
{`'.' '+' '(' ')' '|' '^' '$'`, `(^|/)'\.' '\+' '\(' '\)' '\|' '\^' '\$'$`, ``},
{`*.jpg`, `(^|/)[^/]*\.jpg$`, ``},
{`a{b,c,d}e`, `(^|/)a(b|c|d)e$`, ``},
{`potato**`, `(^|/)potato.*$`, ``},
{`potato**sausage`, `(^|/)potato.*sausage$`, ``},
{`*.p[lm]`, `(^|/)[^/]*\.p[lm]$`, ``},
{`[\[\]]`, `(^|/)[\[\]]$`, ``},
{`***potato`, `(^|/)`, `too many stars`},
{`***`, `(^|/)`, `too many stars`},
{`ab]c`, `(^|/)`, `mismatched ']'`},
{`ab[c`, `(^|/)`, `mismatched '[' and ']'`},
{`ab{{cd`, `(^|/)`, `can't nest`},
{`ab{}}cd`, `(^|/)`, `mismatched '{' and '}'`},
{`ab}c`, `(^|/)`, `mismatched '{' and '}'`},
{`ab{c`, `(^|/)`, `mismatched '{' and '}'`},
{`*.{jpg,png,gif}`, `(^|/)[^/]*\.(jpg|png|gif)$`, ``},
{`[a--b]`, `(^|/)`, `Bad glob pattern`},
{`a\*b`, `(^|/)a\*b$`, ``},
{`a\\b`, `(^|/)a\\b$`, ``},
} {
gotRe, err := globToRegexp(test.in)
if test.error == "" {
if err != nil {
t.Errorf("%q: not expecting error: %v", test.in, err)
} else {
got := gotRe.String()
if test.want != got {
t.Errorf("%q: want %q got %q", test.in, test.want, got)
}
}
} else {
if err == nil {
t.Errorf("%q: expecting error but didn't get one", test.in)
} else {
got := err.Error()
if !strings.Contains(got, test.error) {
t.Errorf("%q: want error %q got %q", test.in, test.error, got)
}
}
}
}
}


@@ -438,13 +438,20 @@ func syncCopyMove(fdst, fsrc Fs, Delete bool, DoMove bool) error {
go func() {
for src := range fsrc.List() {
remote := src.Remote()
dst, found := delFiles[remote]
if found {
delete(delFiles, remote)
toBeChecked <- ObjectPair{src, dst}
dst, dstFound := delFiles[remote]
if !Config.Filter.Include(remote, src.Size()) {
Debug(src, "Excluding from sync")
if dstFound && !Config.Filter.DeleteExcluded {
delete(delFiles, remote)
}
} else {
// No need to check since doesn't exist
toBeUploaded <- ObjectPair{src, nil}
if dstFound {
delete(delFiles, remote)
toBeChecked <- ObjectPair{src, dst}
} else {
// No need to check since doesn't exist
toBeUploaded <- ObjectPair{src, nil}
}
}
}
close(toBeChecked)


@@ -407,6 +407,59 @@ func TestSyncAfterRemovingAFileAndAddingAFile(t *testing.T) {
fstest.CheckListingWithPrecision(t, fremote, items, fs.Config.ModifyWindow)
}
// Test with exclude
func TestSyncWithExclude(t *testing.T) {
WriteFile("enormous", "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA", t1) // 100 bytes
fs.Config.Filter.MaxSize = 80
defer func() {
fs.Config.Filter.MaxSize = 0
}()
err := fs.Sync(fremote, flocal)
if err != nil {
t.Fatalf("Sync failed: %v", err)
}
items := []fstest.Item{
{Path: "empty space", Size: 0, ModTime: t2, Md5sum: "d41d8cd98f00b204e9800998ecf8427e"},
{Path: "potato2", Size: 60, ModTime: t1, Md5sum: "d6548b156ea68a4e003e786df99eee76"},
}
fstest.CheckListingWithPrecision(t, fremote, items, fs.Config.ModifyWindow)
}
// Test with exclude and delete excluded
func TestSyncWithExcludeAndDeleteExcluded(t *testing.T) {
fs.Config.Filter.MaxSize = 40
fs.Config.Filter.DeleteExcluded = true
reset := func() {
fs.Config.Filter.MaxSize = 0
fs.Config.Filter.DeleteExcluded = false
}
defer reset()
err := fs.Sync(fremote, flocal)
if err != nil {
t.Fatalf("Sync failed: %v", err)
}
items := []fstest.Item{
{Path: "empty space", Size: 0, ModTime: t2, Md5sum: "d41d8cd98f00b204e9800998ecf8427e"},
}
fstest.CheckListingWithPrecision(t, fremote, items, fs.Config.ModifyWindow)
// Tidy up
reset()
err = os.Remove(localName + "/enormous")
if err != nil {
t.Fatalf("Remove failed: %v", err)
}
err = fs.Sync(fremote, flocal)
if err != nil {
t.Fatalf("Sync failed: %v", err)
}
items = []fstest.Item{
{Path: "empty space", Size: 0, ModTime: t2, Md5sum: "d41d8cd98f00b204e9800998ecf8427e"},
{Path: "potato2", Size: 60, ModTime: t1, Md5sum: "d6548b156ea68a4e003e786df99eee76"},
}
fstest.CheckListingWithPrecision(t, fremote, items, fs.Config.ModifyWindow)
}
// Test a server side move if possible, or the backup path if not
func TestServerSideMove(t *testing.T) {
fremoteMove, finaliseMove, err := fstest.RandomRemote(*RemoteName, *SubDir)


@@ -16,6 +16,7 @@ docs = [
"about.md",
"install.md",
"docs.md",
"filtering.md",
"overview.md",
"drive.md",
"s3.md",