Syncthing user testing

Vladyslav Chaban, chabavla(at)fel.cvut.cz

Table of Contents
1. Abstract
2. Goal
3. Target group
4. Pre-screener
5. Pre-test questionnaire
6. Testing method
   6.1. Platform
   6.2. Testing setup
   6.3. Tasks
   6.4. Post-test questionnaire
7. Data gathering
   7.1. Participants
      7.1.1. Participant #1
      7.1.2. Participant #2
      7.1.3. Participant #3
   7.2. Result analysis
8. Outline

1. Abstract
Syncthing (https://www.syncthing.net/, https://github.com/syncthing/syncthing) is a popular open-source P2P folder synchronization tool, mostly used to synchronize folder changes between multiple devices, provide backups and rudimentary revision control. It is written in Go, uses a web interface and is available for Linux, OS X, Windows, Android and BSD. The most common use case is to sync a work folder between personal and corporate notebooks and a NAS or some other long-term data storage to back data up. According to live statistics (https://data.syncthing.net/), the most popular setup is Syncthing v0.12.24 running on linux-amd64, so this is the one we will be using. The most popular browser among advanced users and computer professionals is Google Chrome, thus Chromium 49.0.2623.108 will be used to access the web UI.

2. Goal
User testing should help us discover UI deficiencies from the user's point of view.

3. Target group
Software developers with a Computer Science/Software Engineering college education.

The product's target audience is rather specific and mostly includes “computer geeks”, so omitting laymen from our selection seems logical.

4. Pre-screener

Question                                     2 points                        1 point                      0 points
How often do you make backups?               Weekly                          Monthly                      Sporadically or never
Do you use any automated tools for this?     Yes, actively                   Yes, occasionally            No
Do you use synchronized folders?             Yes, across multiple devices    Yes, between two devices     No
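Presumably the pre-screener score is simply the sum of the points earned for the three answers. A minimal Python sketch of how the answers could be totalled (the mapping mirrors the table above; all names are illustrative and not part of the study's actual tooling):

# Points per answer, copied from the pre-screener table above.
PRE_SCREENER_POINTS = {
    "How often do you make backups?": {
        "Weekly": 2, "Monthly": 1, "Sporadically or never": 0,
    },
    "Do you use any automated tools for this?": {
        "Yes, actively": 2, "Yes, occasionally": 1, "No": 0,
    },
    "Do you use synchronized folders?": {
        "Yes, across multiple devices": 2, "Yes, between two devices": 1, "No": 0,
    },
}

def prescreener_score(answers):
    """Sum the points a participant earns for their answers."""
    return sum(PRE_SCREENER_POINTS[question][answer]
               for question, answer in answers.items())

# Example: monthly backups, no tools, sync between two devices -> 1 + 0 + 1 = 2.
print(prescreener_score({
    "How often do you make backups?": "Monthly",
    "Do you use any automated tools for this?": "No",
    "Do you use synchronized folders?": "Yes, between two devices",
}))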

5. Pre-test questionnaire
The pre-test questionnaire is used to get a more fine-grained division of test subjects to enhance the results analysis.

Question                                             1 point    0 points
Do you know what delta encoding is?                  Yes        No
Do you know how DHT works?                           Yes        No
Do you know what the acronyms TURN and STUN mean?    Yes        No

6. Testing method

6.1. Platform

Hardware                      Software
15.6” 1366x768 display        Cinnamon DE
Logitech Anywhere Mouse MX    Linux 3.13.0-24
                              Chromium 49.0.2623.108

6.2. Testing setup
This is the weakest spot of the whole experiment, since testing was carried out in a regular office kitchen with continuous interruptions.

6.3. Tasks
1. Enable automatic synchronization for two directories.

2. Force synchronization of a chosen directory.

3. Add a remote node.

Tasks related to installation/maintenance were deliberately skipped because the tested application is open-source and multi-platform, and thus there are dozens of ways to install it under different circumstances, e.g. via PPA, as a manually installed .deb/.rpm package, or built from source.

6.4. Post-test questionnaire
Post-test questionnaires are used to funnel vague user thoughts and feelings into something more accessible and usable in the final analysis.

1. Did you feel lost at any point in the testing process? If yes, when and why?

2. Was there any unexpected behavior, and if yes, where and what were you expecting instead?

3. Do you think there is something missing?

7. Data gathering

7.1. Participants
Three of the author's co-workers, enterprise back-end Java developers in the telco field with 1 to 8 years of commercial development experience. One was vaguely aware of what automated synchronization and backups look like, the second had experience with BitTorrent Sync, and the third, the most advanced in the field, was using plain rsync with a bunch of cron tasks.

7.1.1. Participant #1

7.1.1.1 Participant background as determined by pre-screener and pre-test questionnaire
Junior developer, does not use automatic backup and directory sync tools, not particularly well-versed in distributed systems internals, NAT traversal and delta encoding.

7.1.1.2 Tasks 1. The user did stumble on this task, because web UI does not use default file chooser, instead going for a simple combo box. Moreover, the combo box shows only the first 10 entries in the Syncthing folder, which tend to be hidden folders, with names starting with dot symbol, since sorting is lexicographic.

2. There is a big dedicated button for this task, so it was not a problem.

3. Adding a remote node was also fairly simple for the user, though he was visibly bored by having to manually type in the 56-character node ID.
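To illustrate the sorting issue from task 1: under plain lexicographic ordering, dot-prefixed (hidden) folder names sort before regular ones, so they fill the first entries of the combo box. A minimal Python sketch with made-up folder names:

# Folder names are made up purely for illustration.
entries = ["Documents", ".config", "Music", ".stversions", ".cache", "work"]

# Lexicographic (ASCII) sorting puts "."-prefixed names first, because "."
# sorts before letters and digits, so hidden folders crowd out visible ones.
print(sorted(entries))
# ['.cache', '.config', '.stversions', 'Documents', 'Music', 'work']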

7.1.1.3 Post-test questionnaire analysis
All in all, the user was satisfied with the application. His biggest problems were the need to manually type in the remote node ID (although it can be exported as a QR code, desktop versions cannot read one) and the odd widget choice in the directory addition dialog.

7.1.2. Participant #2

7.1.2.1 Participant background as determined by pre-screener and pre-test questionnaire
Junior developer, does use automated syncing tools occasionally, but does not know much about their internals.

7.1.2.2 Tasks
1. This user did not have the aforementioned directory choosing problem, as he just started typing in the absolute path immediately.

2. Again, initiating an update ahead of schedule was not a problem.

3. The same problem with the long ID.

7.1.2.3 Post-test questionnaire analysis
This participant had a better experience with the tested application, perhaps because he is more used to this class of utilities. However, the long ID is still a major pain point.

7.1.3. Participant #3

7.1.3.1 Participant background as determined by pre-screener and pre-test questionnaire
Mid-level developer, does use automated backups (although with really old-school tools) and knows something about how such things work.

7.1.3.2 Tasks
1. Again, the user simply typed the path in, without using any aids.

2. This task seems to be the simplest of the three, and this user did not have any issues with it.

3. Yet again, the whole process is relatively simple and logical, but typos in the long node ID can infuriate the user, and that was the case here.

7.1.3.3 Post-test questionnaire analysis
This user had the most positive reaction to Syncthing, but adding a node was a pain for him too.

7.2. Result analysis
By and large, the application UI is okay, but there is still room for improvement.

1. Enable users to use a regular file chooser.

2. Add something to bypass the requirement to manually enter the node identifier, perhaps QR code reading from a webcam (see the sketch below).
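A minimal sketch of what recommendation 2 could look like on the desktop side: grab frames from a webcam and decode the first QR code found, which would carry the node ID. This assumes the opencv-python and pyzbar packages purely for illustration; it is not part of Syncthing and not a concrete implementation proposal.

# Illustrative only: read frames from the default webcam and print the first
# QR code payload found (e.g. a node ID exported as a QR code).
# Assumes the opencv-python and pyzbar packages.
import cv2
from pyzbar import pyzbar

cap = cv2.VideoCapture(0)  # default webcam
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        codes = pyzbar.decode(gray)
        if codes:
            print(codes[0].data.decode("utf-8"))
            break
finally:
    cap.release()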

8. Outline
While the application does its job, is not too complex and has great documentation available from its main page, it does seem to favor experienced users. While they are the current audience, it may be worth spending some time to make it more accessible to non-specialists. Also, every participant mentioned the tedious node ID typing, which seems to be the biggest UX issue.