Novi Survey Software Run-Through

Contents

Conceptual
  Login
  User Interface Basics
  Novi offers Help
Constructing a Survey
  A New Survey
  Survey Options
  Incorporating Ethics
  Not Answering the Question
  Setting up a Page
  Adding Questions
  A Yes/No Question
  A Categorical Question
  An Open-Ended Question
  A Matrix Question
  Advanced Survey Logic
  Key Points
Deploying a Survey
  Response Control
  Invitations
  Contacts
  Styling
  Alerts
  Final Deployment
  Key Points
Analysing a Survey
  Responses
  Reports
  Exports
  Key Points
Summary
UI things to note
Bugs Found


Conceptual

This guide describes how to build and manage an online survey using the Novi Survey software. It is not a guide to formulating questions, deciding who the target audience should be, or analysing and reporting the responses afterwards; all of that is outside the scope of this document. Its purpose is simply to help map a set of questions to an online survey via the software, get the survey out to selected respondent populations, and then gather the results, which pretty much describes the three stages in the survey process that Novi supports:

1. Constructing a Survey
2. Deploying a Survey
3. Reporting Survey Results

After a brief description of the basics of the NOVI user interface, the rest of the guide is split into three main sections that cover each of these consecutive phases in a survey life cycle in turn.

Login

Login using the link and instructions found at http://staff.napier.ac.uk/services/cit/AcademicApplications/Pages/CreatingSurveys.aspx

The survey tool has been tested on these browsers without any browser-based problems so far: IE9+, Opera 19+, Firefox 27+, Chrome 31+.

User Interface Basics

Novi’s top-level GUI is constructed around a visual device known as a ribbon, as used in newer versions of Microsoft products such as Office. Essentially, high-level tasks and functionality are grouped into tabs such as “Surveys”, “Contacts” etc., as seen in the screenshot in Figure 1. Clicking one of these tabs populates the horizontal ribbon below it with functions for that particular group, which may be split into logical subgroups by thin vertical dividers.

In the top right corner, separate from the ribbon, are four ever-present buttons: to control some global settings for your Novi account (e.g. changing the default language to en-GB from en-US), to contact Novi in the event of an issue (i.e. an error in their software), to read some software versioning blurb, and finally to log out.

Figure 1. Top-level Novi Survey interface elements (annotated: Tabs, Buttons, Ribbon).

Selecting any function in the ribbon may in turn re-populate the ribbon with a different set of options (related to the selected function) or, if the function allows the survey designer to change settings, will display either a dialog or a form*. In a dialog – an example of which is shown in Figure 2 – the survey designer makes some choices and then acts upon them by selecting “OK” or, if they decide not to go ahead, selects “Cancel”. A form is used when the choices to be made or the information to be changed is more complex or wide-ranging, and is formed, naturally enough, of a group of side-tabs and associated panels containing controls, an example of which is shown in Figure 3. In a form, the top-level tabs disappear and the ribbon changes to hold at least these two options: “Save Changes” and “Cancel Changes”. These control whether to keep or discard any changes made to the settings on that form, mirroring the “OK” and “Cancel” buttons in a dialog. Choosing either will also return the interface to the point where the form was entered.

It is recommended to use the Cancel buttons to undo or clear any changes rather than the browser back button; in fact, don’t use the browser back button for any kind of navigation within Novi. Even though it may appear to work, it’s just not playing ‘nice’, and using the back button will definitely not save any changes.

Figure 2. Dialogs are used for capturing a few user-selected options; asterisks mark fields that must be completed.

Figure 3. Individual GUI components (a tabless ribbon, panels, list builders, side-tabs, textboxes and drop-down boxes) combine to make forms*.


* The standard GUI terminology is actually “page”, but since Novi also describes surveys in terms of pages, it’s probably best not to confuse the two uses, so the term “form” is used instead.

Novi offers Help

Novi does have a lot of online help, both for particular functions (how do I set a question to be a certain style?) and for performing particular tasks (how do I start a new survey?). The trouble is these are all listed alphabetically by parts rather than being composed into an overall example. Still, if a particular issue can be described in the terms in which the help system is couched, the online help is a useful resource. The main online help feature is always found at the far right of the ribbon (immediately to the left of the “Ribbon” arrow annotation in Figure 1).


Constructing a Survey

The fundamental principle to understand when building a survey in Novi is that it is designed around grouping questions onto pages. This is not only for visual presentation but is also the basis for its skipping/branching logic. Whole pages can be skipped, and thus the groups of questions on those pages, but individual questions on pages can’t be: an individual question can be set to not require an answer, but the respondent will still see it. Further, pages can be gathered into sections. These form sub-surveys within the main survey, and can be given their own introductions and even timers.
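This page-level granularity can be pictured as data. The sketch below (Python, with illustrative names such as `Page` and `skip_if` that are not part of Novi itself) models the rule that skip logic hides whole pages, never individual questions:

```python
# Sketch: Novi skips whole pages, never individual questions.
# A page groups questions; a skip rule decides page visibility per respondent.

from dataclasses import dataclass, field


@dataclass
class Page:
    name: str
    questions: list = field(default_factory=list)
    # Predicate over answers-so-far; True means hide this whole page.
    skip_if: callable = None


def pages_to_show(pages, answers):
    """Return the pages a respondent will actually see."""
    shown = []
    for page in pages:
        if page.skip_if and page.skip_if(answers):
            continue  # the whole page (and all its questions) is skipped
        shown.append(page)
    return shown


survey = [
    Page("Consent", ["I agree"]),
    Page("Demographics", ["Age range", "Nearest city"]),
    # Only show the driving page if the respondent said they drive.
    Page("Driving", ["Commute length"], skip_if=lambda a: a.get("drives") != "yes"),
]

visible = pages_to_show(survey, {"drives": "no"})
```

Under this model a question can only be hidden by hiding the page it lives on, which is exactly why planning the grouping of questions onto pages on paper first matters so much.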

This essentially is a task that can be 90% done on paper before trying to do anything with the survey software. In fact, it’s probably best to sit down and work out the types of questions and the logical flow of a survey on paper beforehand, otherwise there is the risk of compounding uncertainties/difficulties with the survey with uncertainties/difficulties in using the Novi software.

A New Survey

The first action is to set up a new survey. This option is found by selecting the Surveys tab, and appears on the ribbon’s far left. It should be easy to spot given it’s one of the few options in the ribbon that is not greyed out (greying out indicates a function is currently unavailable, and most functions will be until a survey is active). Novi’s online help has this covered under Tutorials > Create a Survey – with slightly different language used (New vs. Create), which makes it a bit less obvious to find than it should be. Anyway, the instructions are there along with annotated screenshots, so name the survey, select a default language and then select Save.

The ribbon options and screen will now change to what is later discovered to be the “Manage Questions” interface, but for the moment this is getting a bit ahead of ourselves, as other settings for the survey need to be set up first. Select the “Back To Surveys” option in the ribbon to return to the main Surveys panel. Here, the newly named survey now appears in the survey list. At the moment, two elements are of interest. Firstly, there is the value in the drop-down box in the “Status” column. It should be “Design”, indicating the survey is currently under development. Once the survey is fully developed, this will be changed to “Open”, at which point the survey is accessible through the link in the “Deployment URL” column. The third alternative is “Closed”, i.e. an old survey that has served its purpose. Secondly, there is the checkbox in the unlabelled column on the far left. This checkbox indicates whether the survey is the focus of the functions in the ribbon bar. Try un-ticking it and see that most of the options are greyed out, leaving only the “New Survey” function available. Then tick it again.

Survey Options

Now that a range of actions are possible with the new survey, it is best to set some aspects of the survey before setting up pages and questions. The third option along the survey ribbon is “Survey Options”; selecting it gives access to a form which allows overriding some of the default settings for this particular survey. As seen in Figure 4, the form is divided into five side-tabs, and selecting any of these brings up an associated panel of controls.


Figure 4. Survey settings view showing the five side-tabs, with the panel for the “Main” tab currently displayed.

Comprehensive Novi Help files for this view are found under “Survey Options” in the Online Help section. But settings which may be of particular interest at the beginning of a survey are:

Main Tab > Template (standard): Templates control the look of the survey presented to respondents, and the default template is quite plain. There are numerous other options, the Napier ones including the Edinburgh Napier logo and branding.

Main Tab > Privileges: You are the owner of the survey, and if it is just you setting up and running the survey then read no further on this point. Other NOVI users can be selected as shared owners, designers or analysts of the survey. In practice, a shared owner has the same rights as the original owner (including deleting the survey: CHECK), a designer can set up questions, and an analyst can access respondents’ answers. Setting up Invited Respondents is also a choice here, but is a task better approached through the “Invitations” function on the Survey ribbon.

Return to the top-level Surveys interface again by either saving or cancelling any changes made in the Survey Options form.

Incorporating Ethics

One major issue to pay attention to is ethics. Napier has an Informed Consent Form (ICF) template available at http://www.ethics.napier.ac.uk which should be seen and accepted by respondents (the next sections show how to work it into an actual survey), and there are several points within it that affect how each and every question should be set up.

For example, point 3 of the ICF states: “I have been told that my responses will be anonymised. My name will not be linked with the research materials, and I will not be identified or identifiable in any report subsequently produced by the researcher”. The easiest way to satisfy this is to set the survey to be either anonymous or semi-anonymous as stated in the “Survey Options” section, as this is the only way to anonymise the metadata associated with the survey (it could be done by hand afterwards, but that runs the risk of not being foolproof). The second point, about not being identifiable in any reports, will be covered later on.
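To see why by-hand anonymisation “runs the risk of not being foolproof”, consider a sketch of what it involves: every identifying column in an exported response set has to be remembered and removed manually. The column names below are illustrative, not Novi’s actual export schema.

```python
# Sketch: post-hoc anonymisation of an export means manually dropping
# every identifying column. Miss one (or one appears in a later export)
# and respondents become identifiable again. Column names are illustrative.

IDENTIFYING = {"name", "email", "ip_address", "respondent_id"}


def strip_identifiers(rows):
    """Remove known identifying columns from exported response rows."""
    return [{k: v for k, v in row.items() if k not in IDENTIFYING}
            for row in rows]


cleaned = strip_identifiers([
    {"name": "A. Respondent", "email": "a@example.com", "age_band": "25-34"},
])
```

The fragility is in the `IDENTIFYING` set: it must be complete and kept up to date by hand, whereas setting the survey itself to anonymous means the identifiers are never stored in the first place.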

Further, Point 5 of the ICF states: “In addition, should I not wish to answer any particular question or questions, I am free to decline”. In effect every question in the survey needs a mechanism whereby the respondent can choose to ignore that question, and the different ways to do this are discussed later on in this section.


These two points are salient given Novi’s “Demographic” question type (see Figure 8). This is a misnomer, as it doesn’t ask for impersonal, shared attributes such as race, age, gender etc., but rather name and address, which make respondents uniquely identifiable. What’s more, it makes name and email required fields, so those questions must be answered, violating point 5 of the ICF. For this reason we recommend eschewing Novi’s built-in demographic question type.

Personal data: From EU directive 95/46/EC, Article 2a: “'personal data' shall mean any information relating to an identified or identifiable natural person ('data subject'); an identifiable person is one who can be identified, directly or indirectly, in particular by reference to an identification number or to one or more factors specific to his physical, physiological, mental, economic, cultural or social identity;” In layman’s terms, if I say I’m 40, male and six foot tall, that doesn’t become personal information until it can be attached specifically to me, as would happen if my NI number or name and address were gathered at the same time (or could be gathered through some other key such as an email address or an ID returned by the Novi software as metadata).

Not Answering the Question

See http://vcu.visioncritical.com/wp-content/uploads/2012/01/VCU_BPBriefing_Other_DK_None_FINAL.pdf for the background to this.

The previous section brought up the disinclination of a respondent to answer a question. Given any question there may be several reasons why a respondent thinks they cannot answer, the most common being:

- Not Applicable (NA) – the question does not apply to them.
- Prefer Not To Say – the respondent does not wish to answer.
- Don’t Know – the respondent does not know the answer.
- Other – the respondent has an answer, but it’s not in the choices they’ve been presented with.

Not Applicable (NA) – The respondent feels the question is not pertinent to them. Logically, the survey should be constructed with the use of pages, sections and logic ‘jumps’ (that are described later on) such that most non-applicable questions are never seen by a respondent.

Prefer Not To Say – The respondent does not feel like disclosing an answer to this particular question. This needs to be dealt with as a) under the ICF respondents should be allowed to skip questions and b) if respondents are forced to answer, but don’t want to give that info, they will either quit the survey or respond with random answers, neither of which is good for analysis. Think of the times you’ve been asked for email addresses you haven’t wanted to give and typed in nonsense just to continue, e.g. [email protected].

Don’t Know – the respondent genuinely doesn’t know. Forcing them to guess isn’t a great solution, so there needs to be an option for this.

Other – this can occur if a list of possible answers is not exhaustive. For instance, in the age question set up later in this guide, a respondent under 16 (though of course, being a child, that is a completely different can of ethical worms) has no suitable category to select. This may be acceptable if the survey is aimed only at people within those categories, i.e. those 16 and over.


The problem here is how far Novi Survey can support all these different exceptions to the rule, both in question construction and in the subsequent analysis. Whatever solution is found needs to be applied consistently in the survey, otherwise respondents will just become confused. Novi Survey offers an “N/A” option for all questions, and the “Other” option for categorical questions, both singly and as part of a matrix-style question. However, it has nothing specific to capture “don’t know” or “prefer not to say”. While all questions have a “required” setting that could be left unchecked to let respondents ignore questions, this gives rise to the issue of not knowing whether they have deliberately not answered the question or have missed it accidentally.

The least satisfactory solution is to add “don’t know”, “n/a” and “prefer not to say” as regular choices to questions, as Novi then has no mechanism for differentiating these answers from the “regular” ones. However, in some circumstances a “none” choice may be added as a regular choice where it would be an expected answer, e.g. “which of these washing powders do you regularly buy?”

There are two basic patterns, depending on how determined you are to record the types of non-response. The first pattern is to use the “Not Applicable answer” option both for non-applicable questions and to indicate where the respondent does not wish to answer, and to use the “Other” option where available to indicate an alternative answer or a ‘don’t know’. This distinction makes sense as an “other” or a “don’t know” answer still contains a definite answer to a question, whereas not answering due to perceived non-applicability or reluctance results in the absence of an answer.

Figure 5. Editing standard messages to indicate to respondents how to prefer not to answer.

To make this distinction clearer to respondents, the standard text messages displayed in the survey can be changed. In the Survey ribbon, select the “Text Messages” option, and a form will appear showing tables of default labels for each language the survey uses. Changing the “Not Applicable” message text, as ringed in Figure 5, to indicate that a refusal to answer can also be accommodated here will make it clearer to respondents how to proceed in these cases. The same can be done with the message for “Other”.

The second pattern is quite simply to not set any question to “Required” and to skip all the “Not Applicable” settings mentioned in the next section (though “other” and “don’t know” as possible answers should still be accommodated). Here we just accept the risk of respondents inadvertently missing out questions, but at the same time present them with fewer options, reducing clutter and possibly also confusion. The two approaches are summarised in Table 1.


Table 1. Two approaches to handling non-answers in NOVI survey.

Approach | Always set Q’s to required and allow NA option (and allow respondent to state reason why) | Set Q’s to not required
Good     | Collects data on why a question was not answered. Respondent cannot inadvertently miss out questions. | Less clutter on screen. Less data entry for respondent.
Bad      | Increases amount of instructions on screen. | Respondent can miss questions by accident.
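For analysis, the practical difference between the two approaches is how each kind of non-answer appears in the exported data. A sketch of one way to classify exported cells follows; the cell values shown ("N/A", "Other:" prefixes) are illustrative placeholders, not Novi’s exact export format.

```python
# Sketch: classifying exported answer cells under the two patterns.
# Under pattern 1, "N/A" doubles as "prefer not to say"; under pattern 2,
# a blank cell is the only trace of a non-answer (deliberate or accidental).

def classify_response(raw):
    """Map a raw exported answer cell to an analysis category."""
    if raw is None or raw == "":
        return "missing"            # pattern 2: not required, left blank
    if raw == "N/A":
        return "declined_or_na"     # pattern 1: NA covers refusal too
    if raw.startswith("Other:"):
        return "other"              # a definite, if unlisted, answer
    return "substantive"
```

The weakness of pattern 2 is visible here: "missing" cannot be split into deliberate refusal and accidental omission, which is exactly the trade-off summarised in Table 1.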

Setting up a Page

Our survey at the moment, as seen in the Manage Questions form (and not immediately obvious from Novi’s interface), consists of one existing page with no questions on it, as seen in Figure 6. Here, though, Novi’s design of questions being divided onto ‘pages’ starts to emerge.

Figure 6. An empty page with no questions asked.

Choose the “Edit Page” option from the ribbon, which launches a dialog for setting up this page. This has options for giving this empty first page a name other than “Undefined”, along with other options such as adding a suitable page introduction. The introduction is limited to 2,000 characters, which is a pain as it makes any reasonably comprehensive introduction to a survey difficult, though one solution is to chop up an introduction over multiple, consecutive questionless pages.
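The chopping-up workaround can be sketched as a small helper that splits a long introduction into chunks of at most 2,000 characters at paragraph boundaries, one chunk per questionless page (a sketch only: a single paragraph longer than the limit is simply truncated here).

```python
# Sketch: splitting a long survey introduction into <=2000-character
# chunks, one per questionless page, breaking at paragraph boundaries.

LIMIT = 2000


def split_intro(text, limit=LIMIT):
    chunks, current = [], ""
    for para in text.split("\n\n"):
        candidate = (current + "\n\n" + para) if current else para
        if len(candidate) <= limit:
            current = candidate          # paragraph still fits on this page
        else:
            if current:
                chunks.append(current)   # start a new page's worth of text
            current = para[:limit]       # an over-long paragraph is truncated
    if current:
        chunks.append(current)
    return chunks
```

Each returned chunk can then be pasted into the Page Introduction field of a consecutive questionless page.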

One thing to do on a first page is to add the ICF text as an introduction, and (covered in the next section) then add a question asking respondents if they accept the terms of the consent form. So start by setting the page name to “Informed Consent Form”, and then amend, cut and paste the template ICF at the end of this guide into the Page Introduction field. It includes HTML-based tags (“<br>”) which ensure the text isn’t presented as one paragraph. Further, make sure the options to show the page name and introduction are both set, otherwise respondents will have no idea what they’re being asked to agree with. Finally, select “Save” to close the dialog and save the changes. Figure 7 shows how the dialog should appear.

Most of the rest of the page-based functions in the ribbon are self-explanatory: copy, move or merge pages, delete pages, etc. To make a new page, select the “New Page” function in the ribbon and fill in the details in the same way as for the first page. Pages can then be navigated using the control ringed in red in Figure 6. The more involved functionality starts at “Page Show & Hide” and “Page Skip Logic”, but until some questions have been added it’s not worth going into those functions in detail.


Figure 7. Setting up the first page to act as the informed consent agreement.

Adding Questions

A survey with no questions is pretty redundant, so the “Insert Question” function will become a familiar face. It’s the leftmost of the three-button group at the bottom of the screenshot in Figure 6. Selecting it reveals a large dialog with a similarly large number of choices, as shown in Figure 8.

Figure 8. The choose question dialog. Many options are just slight variations on a theme.


However, many of these choices are just offering slightly different visual variations on a common theme, or are convenience functions for variations on a particular response type. For instance the choices bounded in red in Figure 8 all offer the same type of question, but in slightly different presentations (horizontal, vertical etc.) Also, the open ended text, number and date options bounded by the green box all subsequently (after pressing OK) reveal the same dialog, but with different pre-set values. What is perhaps more useful is to match the type of question to the type of control that question will present itself as, and a rough guide is shown in Table 2.

Table 2. Rough and ready matcher between question type and the widget (‘control’) to ask it.

Question ‘type of reply’                                                                 | Novi Survey question ‘control’
Choose one and only one from a list of alternatives or categories                        | Radio buttons; drop-down list
Choose one or more from a list of alternatives or categories                             | Checkboxes
Open-ended, or where the possible choices exceed a sensible limit for the above options  | Open-Ended Text, Number, Date
A multiple of any of the above (e.g. for a Likert-type scale)                            | Matrix

The insert question dialog offers checkboxes as a choice for both single and multiple selections. These again seem to be convenience functions, as the subsequent dialog offers choices to change all these settings again. Sometimes it is simply a visual preference: while a radio button is the preferred GUI element for a single selection, people often just like the look of a tick in a checkbox better than the dot in a circle that radio buttons present.

Radio buttons always show the selected value in the context of alternative values; drop-down lists only show the value that’s been selected. Drop-down lists are preferable when there are large numbers of possible values, radio buttons when there are only a few options. The most commonly accepted cut-off between the two is seven or fewer options for radio buttons.
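The guidance in Table 2 and the note above boils down to a simple decision rule, sketched here (the seven-option cut-off is a rule of thumb, not a hard Novi limit, and the function name is illustrative):

```python
# Sketch: choosing a question control from the reply type and option count.
# Encodes Table 2 plus the rule-of-thumb cut-off of seven options.

RADIO_CUTOFF = 7  # seven or fewer options: radio buttons; more: drop-down


def pick_control(reply_type, n_options=0, multiple=False):
    """Suggest a Novi question control for a given kind of reply."""
    if reply_type == "open":
        return "open-ended text/number/date"
    if multiple:
        return "checkboxes"
    # Single selection from a fixed list:
    return "radio buttons" if n_options <= RADIO_CUTOFF else "drop-down list"
```

For example, a single-choice question with five options maps to radio buttons, and the same question with twelve options maps to a drop-down list.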

A Yes/No Question

The most basic type of response that can be given is a yes/no answer, for which radio buttons or checkboxes are the most appropriate widget – considering the previous table, the respondent is selecting one value from a list of one. Perform the following steps to construct this question:

- In the first page (the one with the informed consent introduction), select the “Insert Question” button to enter the question choice dialog.
- In this dialog, select the “Checkboxes, vertical” option under the “Multiple choice, single selection” heading in the table.
- Select OK at the bottom of the dialog. A form will appear as in Figure 9.
- Here, in the panel connected to the Main side-tab, tick the “Required” checkbox.
- In the panel connected to the Answers side-tab, delete all the answers apart from one, and then name that answer (the “Response Text”) as “I Agree”.
- Select “Save Changes” from the ribbon.

The question is now presented as in Figure 10. In essence it is a group of checkboxes from which the respondent is required to pick one, but since this is a ‘group of one’ the respondent has no choice which to pick. Checking this box is then interpreted as them giving their consent to the ICF terms. If they try to continue to subsequent pages without selecting the checkbox they will be prevented from doing so, and an error message will display (as will happen whenever a respondent tries to proceed without answering a ‘required’ question).

However, this view does not show the question as it would be presented to a respondent; most importantly, it doesn’t show the introduction. To see the page as a respondent would see it, the “Preview Page” option is available in the ribbon when in the “Manage Questions” section (circled in purple back in Figure 6), and when at the top-level Survey page, there is a “Preview Survey” option in the ribbon. Figure 11 demonstrates the results of previewing the survey page and question just constructed.

Figure 9. Setting up the options to make a single checkbox question.

Figure 10. The completed consent page with a single question.


Figure 11. How a respondent will see the consent page and the initial question. Remember to fill in the bits between square brackets though.

A Categorical Question

Let’s set up a new page to collect some demographic information (call the page “Demographics”), while avoiding Novi’s built-in demographic question type. Begin by collecting the rough age of respondents, if they are willing to give that information. The first thing to do is to pick the correct question control for this task. Deciding to collect age data as ranges rather than as exact numbers (which helps anonymisation) means:

- There is a set of categories as possible choices.
- People are only one age at any moment in time, so there will be only one choice made.
- The number of age categories will be seven or fewer.

Looking at the rough guide in Table 2, it appears a radio button-based control is best for this question. So on the new page, select the “Insert Question” option, select one of the radio button options in the top left corner of the popup (see Figure 8), then press OK. A form similar to that in Figure 12 will then present itself.

Figure 12. Radio button setup form for a question.

The “Main” tab controls settings for everything apart from the answer categories for this particular question, and the first task is to set the question text to something suitable (and a short name). If following the first pattern of accommodating non-answers, check both the “Required” and “Allow not applicable answers” fields. Further, allow respondents to input a reason why they haven’t divulged this information; Figure 12 shows these three options checked. This could be useful in differentiating between genuine non-applicability and reluctance to disclose information, but as it is a free-text entry it is at the mercy of respondents. This mechanism lets respondents acknowledge they have read the question and not missed it by accident. As said previously, the second pattern of handling non-answers is simply to leave “Required” unchecked and hope respondents don’t miss the question by accident.

As an experiment, changing the min or max number of answers from ‘1’ will remove radio buttons and drop-down lists as possible control choices for this question – this is in fact a common control dialog for all the checkbox and radio button options shown in Figure 8. To reset the original values, change the answer value back to ‘1’ and re-select one of the radio button options.

Next, go to the “Answers” tab. Here, fill in typical age ranges in the Response Text column, adding answers for a reasonable number of choices (see Figure 13 for an example). Make sure they are in age order, as mixing them up will make measures like the median age range much more difficult to calculate once the information is gathered. Do not use an “Other” option for this question: the ranges cover all the ages of the respondents expected to take this survey, and people are also expected to know their age, so “don’t know” isn’t needed as an option either.
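To see why keeping the age ranges in order matters, here is a sketch of computing the median age band from ordered categorical responses once the data is collected (the bands are illustrative, and NA or blank responses are dropped first):

```python
# Sketch: the median of an ordered categorical variable. This only works
# if the answer categories were entered in age order in the first place.

BANDS = ["16-24", "25-34", "35-44", "45-54", "55-64", "65+"]


def median_band(responses, bands=BANDS):
    """Median band of ordered categories; NA/blank responses are dropped."""
    ranks = sorted(bands.index(r) for r in responses if r in bands)
    if not ranks:
        return None
    return bands[ranks[len(ranks) // 2]]
```

If the bands had been entered out of order, their positions in the list would no longer encode age, and this kind of ordinal summary would silently give the wrong answer.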


One thing to note is the “Predefined answers” dropdown: this offers a quick way of setting up commonly used response sets and ranges such as “easy-hard”. While there isn’t such a choice for age ranges, there are options here for many common response types asked in Likert-type scales etc. Underneath is also a check-boxed option to grab other predefined answer sets from a database, but that is way out of scope for this guide!

Figure 13. The Answers tab for a categorical question.

Finally, the form should start to look like the one in Figure 13, and selecting the “Save Changes” option in the ribbon should show the finished question as it appears in Figure 14. Note that each question appears in a box with options to Edit, Copy, Delete and Move the question in a bar at the top of each box. Now the survey contains a question that respects a respondent’s decision not to give an answer.

Figure 14. How the finished question appears in the “Manage Questions” view. Note the insert question button group occurs before and after every question.

A similar process could be used to add more categorical questions such as gender, marital status etc.; just remember to add a consistent choice of “non-answer” opt-outs to each question, and in the case of gender / marital status, activate the catch-all “other” answer option.


An Open-Ended Question

Often, there is a need to grab some data that doesn’t fit neatly into a categorical style of question. Either the response is truly open-ended, e.g. “What’s your name?”, or there are so many possibilities that listing them all is laborious for the survey designer to construct and for the survey respondent to wade through, e.g. “When did you last go to the dentist?”

Figure 15. Typical dialog for setting up an "open-ended" question, in this case the text version. The warning at the top appears because survey responses have already been collected; it warns (rightly) that changing the question could affect that collected data.

In this case the option to choose when constructing a question is one of the Open-Ended options shown in Figure 8, which cover text, numbers and dates. Numbers and dates can be limited to a range (though this functionality isn’t working for date types in Napier’s current version of the software). Text responses can be limited to a certain number of characters, and to single or multiple lines of response. These question types have no “Other” option; it’s presumed the question form has enough flexibility to take in all the answers that may occur. A default answer for all types of open-ended question can be supplied, but this disappears once overwritten by a respondent, and it doesn’t return on deleting the overriding answer. Figure 15 shows a typical form for an open-ended text question. The ‘Answer’ side-tab can be used to set whether this question covers just one or several sub-questions, with the default being three sub-parts to a question.
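The limits an open-ended question can impose amount to a small validation rule per type, sketched here (parameter names are illustrative, and the broken date-range limiting mentioned above is left out):

```python
# Sketch: the constraints an open-ended question can place on a reply.
# Text: a character limit. Number: an optional low/high range.

def validate_open(value, kind="text", max_chars=255, lo=None, hi=None):
    """Return True if a reply satisfies the question's configured limits."""
    if kind == "text":
        return isinstance(value, str) and len(value) <= max_chars
    if kind == "number":
        ok = isinstance(value, (int, float))
        if ok and lo is not None:
            ok = value >= lo
        if ok and hi is not None:
            ok = value <= hi
        return ok
    return False
```

A "What is your city" question would use the text form with a short character limit, while a year-of-visit question would use the number form with a sensible range.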

A fitting open-ended question to ask on the demographics page would be the respondent’s nearest city, which should leave them untraceable for most purposes, and it is also one that can be used later on in the survey. The following procedure will construct such a question:


- Go to the “Manage Questions” form and select the page the age-based question is on.
- Select the “Insert Question” button in the last group of buttons on the page, which means the question will be inserted at the bottom of the page.
- From the “Insert Question” dialog choose the Open-Ended Text > One Line option and select “OK”.
- Within the resulting question form fill in the details as they appear in Figure 15 for the Main side-tab (or simply turn off the required checkbox if allowing respondents to omit responses that way), and within the Answer side-tab enter the value “What is your city” as the answer label and “None” as the default.
- Finally, select “Save Changes” from the ribbon.

Figure 16. How the finished open-ended text question appears in the "Manage Questions" form.

The question will appear in the “Manage Questions” view as in Figure 16, and again we have designed a question that allows a respondent to choose not to offer a reply.

A Matrix Question

The matrix options among the question-type choices look daunting at first, but essentially they are simply shortcuts for gathering replies to multiple issues within a single question, using the same rating scale, rather than spreading them out over multiple questions. The “matrix” is just the combination of a set of sub-parts to a question (the rows) with the possible set of answers (the columns). Think of it as the same setup as a multiple-choice test (answer A, B, C, or D) over a set of questions; matrices are then obviously best suited to situations such as Likert-type scales, where feedback to a group of related questions using the same standard set of replies is required.

Firstly, make a new page to put the matrix question on, and call it “Services”. Then insert a question on this new page, picking from the popup dialog in Figure 8 the “Numeric rating (dual)” option under the Matrix heading, and then select “OK”.

A form similar to that in Figure 17 now appears. It has three side-tabs: “Main”, “Matrix Rows”, and “Matrix Segments”. The “Main” side-tab’s options are self-explanatory: the name of the question and its short name, and whether the question as a whole can be declared non-applicable or declined (if using the first pattern of allowing non-answers, set this so it is possible). The “Matrix Rows” side-tab again is fairly straightforward: what is the name of each row (i.e. sub-question) in the matrix? For example, in a survey of what respondents think of the public services in Edinburgh, a public service would be named in each row, e.g. Public Transport, Street Lighting and Waste Collection.

The meat of the question set-up is in the “Matrix Segments” side-tab. Here, define the number of segments to the question (how many different things to ask per row) and the type of responses wanted from these questions. Within each segment, since one of the “Numeric rating” options was originally picked, the column type defaults to “Numeric rating”, with a default range of 1 to 5 for responses. Descriptive labels can be added, e.g. “Poor” for the low value and “Excellent” for the high value. These can be left blank, but the question then needs some other way, perhaps in the question text or sub-text fields, of indicating which end of the scale maps to which meaning (in fact the preferred approach is to supply both sets of cues, but make sure they agree with each other).

Figure 17. The “Matrix Segments” side-tab of the Matrix question setup form.

For handling non-answers in each individual row, either A) select the “Include NA option”, give it an appropriate text label, then select the “Answer count per row” option and set the minimum number of answers per row to 1; this means the respondent must select a value per row, but that value can be the NA option. Alternatively, B) leave the “Answer count per row” and “Include NA option” fields unchecked; this is the equivalent of not ticking the required checkbox in the previous types of question.

Since the “Numeric rating (dual)” option was originally chosen, there are two existing segments within this question. If one is set up to be about “punctuality” for services, the other can be set up to be about “effectiveness”, and the different segments can be set up with different answers such as different labels, larger numeric ranges etc. (though for this example, make everything exactly the same apart from the segment name). This makes the question a “multi matrix”, which is useful when needing to measure different dimensions of the same subjects at the same time.


Figure 18. What the multi-matrix question looks like.

After selecting “Save Changes” in the ribbon, the interface returns to the view presented in Figure 18, which shows an example of a multi-matrix question. Looking back at Figure 17 and the second field down in the panel, “Column Type”, it can be seen that both parts of the matrix are of the type “Numeric Rating”. Essentially each row captures a single value from one to five for each matrix; the separate columns within each matrix represent different possible values rather than different types of information.

Matrices can also be set up so possible choices are compressed into single columns, rather than spread across multiple columns as with the radio button/checkbox selections. Select the “Edit” option in the top left corner of the question in Figure 18 to return to the question set-up form. Then select the “Matrix Segments” side-tab and, after scrolling to the bottom of the panel, select the “Add Segment” button; a new segment should present itself to be filled in.

Here, using Figure 19 as a guide, enter the segment heading as “Accountability” and set the “Column Type” to “Label”. Now a type of control for the cells in the column needs to be picked (the “Cell Value” field): the radio button and checkbox options have been seen already and offer the choice of being selected or not, while the number, text and date options offer open-ended entry in each cell for a respondent; the most complex option to set up is the “Drop-down list”, so choose that. The list of column descriptors in the column table now expands to include a “Values” element (see the bottom right of Figure 19).

To tidy things up, delete all the columns (which are, confusingly, the rows) in the column table except for the first one, using the red X icons. Then in the one remaining column, set the name to “Accountability”, and select the cell under the “Values” header (the one with “0 values” and the pencil icon); a dialog presents itself (shown in Figure 20) that gives the option of defining the values to be used in a drop-down list. In this case enter “Full”, “Some” and “None” as possible choices.

Adding an “N/A”-style choice is slightly more problematic here: it can be added as an alternative value in the drop-down list, but it will not be recorded by NOVI as a special case. Back in the column table there is the option to set an answer as “Required” or not, and leaving it unrequired allows the respondent to select a blank value for the question, though now it is again unknown whether the respondent purposely ignored the question or missed it by mistake. One alternative would be to set one of the values as a default so the answers are initially populated and the respondent can blank them out if they want to, but in that case if the respondent misses the question by mistake the response will stay populated with that value. It’s swings and roundabouts again.

Also, it’s worth noting that this way of constructing matrices offers the possibility of columns having different sets of choices within the same matrix. For instance, a new column could be added to this segment, called “Contactability”, and given the values “Hard”, “Medium” and “Easy” for a drop-down list.

Figure 19. Setting up a matrix to use drop-down lists of defined values rather than simple radio buttons.

Figure 20. The Edit values dialog is where the options for the drop-down list are set.


Finally, select Save to return to the question form, and there in turn select “Save Changes” in the ribbon to save the changes made to the question. Figure 21 shows the appearance of the final matrix question, now incorporating the different types of response mechanisms; along with the single-part selection and open-ended questions, it completes the basic question templates that can be used to construct a survey.

Figure 21. The expanded multi-matrix with different types of response mechanism in the same question.

Advanced Survey Logic So far, a fairly inglorious survey full of avoidable questions can be constructed. However, anyone who has ever filled in a complicated paper survey or form will recognise the likes of the “If you answered NO to this question, jump to Section 5”-type instructions that regularly pop up. The advantage with online survey software such as NOVI is that this kind of logic can be applied automatically, so if a respondent says they are male, the rest of the survey can skip any questions aimed solely at women.

NOVI has two main methods of achieving this type of logic and both are page-based, so this is where arranging questions onto pages logically comes into play. The two methods are “Page Conditions” and “Skip Logic”, and they can be found next to each other in the “Manage Questions” form on the ribbon (though Page Conditions is labelled as “Page Show/Hide”). There is a detailed tutorial on the difference between the two in Novi’s Online Help under “Tutorials > Work with page conditions and skip logic” and it is not repeated here. Simply put, Skip Logic is calculated when a page is completed (and the next arrow pressed) and can push the survey forwards over one or more pages, whereas Page Conditions are calculated when a page is entered and decide whether that particular page is shown or whether the survey should instead leapfrog to the next page (where that page’s conditions are calculated in turn). In practice, the only difference between calculating skip logic on finishing a page and page conditions on entering the next is that skip logic can pass over more than one page. A follow-on effect is that if multiple skip-logic conditions are set, the first one that matches is acted upon, so the ordering of conditions matters: respondents can be sent to different parts of the survey if, say, the first condition jumps to page three but the second jumps to page seven.
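The practical difference between the two mechanisms can be sketched in Python. This is illustrative pseudocode only: the function names, rule fields and page numbering are invented for the sketch, not NOVI’s actual API.

```python
def next_page_skip_logic(current_page, answers, skip_rules):
    """Skip logic: evaluated when a page is COMPLETED.
    The first matching rule wins (so rule ordering matters),
    and a rule may jump forward over several pages at once."""
    for rule in skip_rules.get(current_page, []):
        if rule["condition"](answers):
            return rule["jump_to"]      # e.g. straight to page 7
    return current_page + 1             # default: the next page

def next_page_conditions(current_page, answers, page_conditions):
    """Page conditions: evaluated when a page is ENTERED.
    A hidden page is leapfrogged, and the next page's own
    conditions are then checked in turn."""
    page = current_page + 1
    while page in page_conditions and not page_conditions[page](answers):
        page += 1                       # this page is hidden; try the next
    return page
```

For example, with a skip rule on page 1 that jumps non-Edinburgh respondents to page 7, `next_page_skip_logic` returns 2 for an Edinburgh answer and 7 otherwise, while page conditions would only hide pages one at a time on entry.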

Both options lead to a similar form, so set the current page to the one with the matrix question about services, and then select the “Page Show & Hide” option from the Survey ribbon; a form similar to the one in Figure 22 should appear. There are five expandable sections (this layout is called an ‘accordion’, where only one section can be expanded at a time), currently all blank, which can be populated with “conditions” that decide whether this page is presented to a respondent.

Of the five sections, ignore “Participants”. This is populated with data taken from the special demographics-type question seen in Figure 8, and as explained back then it’s a dubious set of data to collect.

Figure 22. Initial form for setting page conditions (show/hide pages).

“Scores” haven’t been discussed yet. They are numbers attached to answers for questions which are totalled up per page / section / survey, and this guide doesn’t explore that section further.

Similarly undiscussed, “Parameters” are fields attached to URLs that can be used to differentiate survey respondents (and for that reason should be treated with caution), commonly seen as “?key=value”-type syntax in web browser address bars. The most likely use for these is gathering feedback from known individuals on specific issues, e.g. “have we resolved issue 10123 with Novi Survey?”, where “?issue=10123” would be appended to the URL of a survey. They’re a specialist feature which the NOVI help covers if needed.
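As a concrete illustration of the “?key=value” syntax, a query string of this shape can be parsed with Python’s standard library. The URL below is a made-up example (only the “issue” parameter mirrors the hypothetical case above):

```python
from urllib.parse import urlparse, parse_qs

# Hypothetical survey URL with an "issue" parameter appended.
url = "https://surveys.example.ac.uk/take?surveyid=42&issue=10123"

# parse_qs maps each key to a list of values.
params = parse_qs(urlparse(url).query)
print(params["issue"][0])     # -> 10123
print(params["surveyid"][0])  # -> 42
```

A page condition keyed on the parameter could then branch the survey for that respondent.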

“Responses” is related to the initial setting up of the survey, and this section allows page logic to be controlled by when the survey was filled in, useful if time-sensitive data is to be collected. “Response culture” is a euphemism for language.

This leaves only the “Questions” section as being of real interest, so expand that section and then select “New Condition” from the top of the empty table (termed a “Condition Group”) in that section, as shown in Figure 23.


Figure 23. Expanded Question accordion section for controlling page flow logic.

A dialog appears with three initial parts to set. The first two set the page and question whose answers are being used to decide whether to show this current page or not. Obviously, this can only be a previous page and question; in this case it is the Demographics page and the “which city do you live nearest to?” question. Setting these reveals an extra sub-question field to fill in, which, as there is only one part to the “which city?” question, is easy enough to set. The last part decides what answer(s) and logic in particular will be used to decide the hide/show behaviour of the page. With an open-ended text question the options are looking for text that is present or absent in the answer, or for the presence or absence of an answer altogether. In this case select “Contains all of”, revealing a fifth setting which asks for the value: in here, type “Edinburgh”. At the end, the dialog should appear similar to that in Figure 24.

Figure 24. Setting a condition on a question.


Select “OK” to save the condition, and it should now be visible in the table (“Condition Group”) in the Questions section. Currently there is just the one condition, but in the case of multiple conditions, how would they all combine to produce a final yes-or-no decision?

Multiple conditions can be combined within each such group using either the “all the following conditions apply” (all conditions are needed) or “any of the following conditions apply” (just one is enough) options in the drop-down box in each group’s header. With just one condition in a group it doesn’t make a difference which of those rules is selected.

Within each section, the “New Condition Group” button can be used to set up a new group of rules with a different logic (any or all), and then that group can be combined with the first using another logical operator (“And” or “Or”, see below) found as a drop-down list sitting between the condition groups. Adding new condition groups can be repeated pretty much indefinitely.

Finally, combining logic between the different types of sections is done using the drop-down box in the section header: “And” means the same as “All”, and “Or” means the same as “Any”. Again, when no conditions exist within a section, it doesn’t matter what the setting is, but evidently quite complicated and nested chains of logic can be set up. A very simple example is shown in Figure 25, after adding a new empty condition group to “Questions”. In its current state the survey should now skip the page with the Services questions unless a respondent answered “Edinburgh” as the nearest city in the earlier question.
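The nesting described above (conditions within a group, groups within a section, sections combined) maps directly onto Boolean any/all logic. A sketch, with made-up answers and conditions standing in for the drop-down settings:

```python
# Answers collected so far from a hypothetical respondent.
answers = {"city": "Edinburgh", "age": "25-34"}

# Within a condition group: "all the following apply" vs
# "any of the following apply".
group_1 = all([answers["city"] == "Edinburgh"])
group_2 = any([answers["age"] == "18-24",
               answers["age"] == "25-34"])

# Between groups in a section: the "And"/"Or" drop-down.
questions_section = group_1 and group_2

# Between sections: "And" behaves like all(), "Or" like any().
show_page = all([questions_section])   # only one section used here

print(show_page)  # -> True
```

Seen this way, each layer of the accordion is just another level of `all()`/`any()` nesting.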

Figure 25. Setting conditions at different levels. a) In green: combining conditions within the same group. b) In purple: combining the results of multiple groups within the same section. c) In orange: combining each section’s overall result.


The main thing to point out about all the questions and skip logic is that they should be thoroughly tested, even to the point of setting up bogus invites to yourself and testing them as the respondent would see them. It saves a lot of hassle to find errors before the survey is deployed, and any ‘test’ responses can always be deleted or filtered out (as shown in the responses section later on in this guide).

Key Points
• Map out your survey questions and flow before putting them into the NOVI software.
• It is mandatory to anonymise responses through the Survey Options.
• It is mandatory to allow respondents to choose not to answer questions: either make questions non-required or allow N/A responses.
• Avoid NOVI’s Demographic question type for the above reason.
• Test your questions.


Deploying a Survey A lot of the functionality to learn in Novi revolves around settings that are nothing to do with questions. Many of these options concern deployment issues and the visual presentation of the survey to respondents.

Response Control Conceptually, deployment/response control is about restricting who can answer the survey, how many times they can respond, from where, and what leeway they have to change their answers. Beware that some features, depending on their settings, may appear to control access per respondent but actually control access per computer. These pitfalls are described in the following sections where necessary.

In terms of controlling responses, there are a number of settings to control who can respond to the survey, how often they can respond, whether completed or half-filled surveys can be changed or resumed, and whether IP restrictions are to be imposed. All these settings, and others, are controlled from the “Survey Options” choice in the Survey tab’s ribbon bar. A screenshot of the Survey Options form was shown back in Figure 4, but this time it is the functions contained in the side-tabs other than “Main” that are of concern, the most important of which are described next:

Responses Side-Tab > Access Control: Decide whether survey respondents are limited to anyone with a login to the survey system, and/or those especially invited to respond. If neither box is ticked then access is public when the survey is set as “open”.

Responses Side-Tab > Tracking of respondents: IMPORTANT: Note the anonymisation of responses. Anonymising respondent metadata returned with the survey is NOT the same as anonymising data in the survey. For instance, the tracking of respondents can be set to “anonymous” in this dialog, which blocks associated info such as their email and survey login being passed back with their survey. It is recommended that this setting be set to anonymous or semi-anonymous to comply with the terms of the ICF. However, responses to demographic and identification questions in the actual survey will still be there.

Figure 26. Survey anonymisation only affects metadata sent along with the survey (respondent name, IP address, email), not responses in the survey itself: an answer such as “I am John Smith” to the question “Who are you?” remains visible even when tracking is anonymised.

Responses Side-Tab > Maximum number of responses per respondent / Resumable Survey / Allow resuming a completed response: All these options are about controlling the flexibility of survey filling, and require respondent tracking to be either semi-anonymous or identifying. If respondents are anonymous there is no way of restricting the number of surveys they complete, of letting them resume a part-filled survey, or of changing a completed survey. If tracking is set to semi-anonymous then resumption and/or multiple responses depend on the conditions listed in Table 3.

Table 3. Resumption of survey by “tracking respondents” setting

Survey resumes or is changeable when using…
- Anonymous: never.
- Semi-Anonymous: the same browser, computer and user account (i.e. Napier student login); any difference in that set of three will resume a different survey or start a new one.
- Identifying: the same NOVI login.

Multiple responses are measured by…
- Anonymous: never.
- Semi-Anonymous: as above.
- Identifying: as above.

It is not entirely clear how resuming completed surveys interacts with the option to complete multiple surveys, i.e. how does NOVI tell whether a respondent wants to edit an old response or fill in a new one? The only logical behaviour seems to be that once a set time limit for resuming a completed survey has passed, a new response is started.

Thus, the responses tab is full of options that all need to be considered before a survey is opened up to the world. Changing the settings when responses are already being received is likely to confuse respondents and subsequent analysis.

IP Addresses > Decide whether there’s a need to record respondents’ IP addresses and to filter in or out certain IP ranges. This is a blunt instrument: IP addresses cover computers (and dynamic IPs often mean this is not even a constant 1-to-1 mapping), not people, so they cannot be considered a useful proxy for limiting who can access the survey; different people may use devices with the same IP, and one person may use several devices with different IPs. IP filtering can be used to limit responses to devices connected to a certain sub-domain such as, say, Napier University or Sky Broadband (if the IP blocks allocated to them are known). However, that is not the same as restricting respondents.
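The kind of address-block check described above can be illustrated with Python’s `ipaddress` module. The /16 network below is a made-up example range, not an actual institutional allocation:

```python
import ipaddress

# Hypothetical allocated block for an institution.
allowed_network = ipaddress.ip_network("146.176.0.0/16")

def ip_allowed(ip_string):
    """True if the respondent's IP falls inside the allowed block.
    Note this restricts devices/networks, not people."""
    return ipaddress.ip_address(ip_string) in allowed_network

print(ip_allowed("146.176.144.144"))  # -> True
print(ip_allowed("8.8.8.8"))          # -> False
```

Two colleagues sharing one campus machine would both pass this check, and one person on home broadband would fail it, which is exactly why IP filtering is a poor proxy for restricting respondents.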

Deployment > Another place to set the status of a survey {Design, Open, Closed}, to control when an open survey is available, and to decide what additional functionality the respondent sees in the survey, e.g. does each page have a back button (to change earlier questions), a print button etc.

Completion > Decide what happens when the survey is complete, in terms of what the respondent sees and receives.

When satisfied with the changes, select “Save Changes” in the ribbon and the interface will return to the main Survey tab and associated ribbon.

Invitations To make a survey exclusive to a select group of respondents, set up access to the survey as invitation-only. On the other hand, if it is to be open to the general public or only to existing Novi Survey account holders, then this section can be skipped (this guide’s paper version of page logic!). Firstly, the appropriate choices need to be set in Responses > Access Control as outlined in the previous section. Then select the “Invitations” option that sits on the Surveys ribbon bar (at the bottom of the column that has Alerts at the top), which will switch the interface to the Invitations form.

The conceptual design of Invitations is that groups of people can be invited separately to the survey (and at different times too). In the event that some groups overlap in terms of members there are settings to decide whether people get invited just the once, or multiple times.

Figure 27. Initial empty dialog for "Invitations".

The initial form will look empty apart from a skeletal framework of “Groups of Invitations”, as shown in Figure 27, which is to be expected given no-one has been invited to this survey yet. Also, given no invitations have been issued yet, the options are restricted in the ribbon to “New Invitations” and “Copy Invitations” (for reusing invitation lists from another survey). Selecting “New Invitations” launches another form with a set of side-tabs for constructing a new group to invite to the survey, as shown in Figure 28.

Probably the first thing to set in a new invitation group is the scheduling of invitations. By default, NOVI schedules new invitations to be sent at the next quarter-hour interval in the immediate future. Beware that if for some reason the survey is set as “Open” (which it really shouldn’t be before it’s ready to deploy), this immediate default can end up firing off emails to invitees before it’s expected, probably halfway through constructing invitee lists too, and adding new contacts to this invitee group will then be frozen. So first things first: go into the “Scheduling” side-tab and set the invitation time to when the invitations should actually be sent. Other options, such as if/when to send reminders for unanswered invitations, can also be set here. The at-first-glance nonsensical option to not send invitation emails is there in case another mechanism to send invitations has been arranged.

There is actually a checkbox in the Main side-tab called “Paused” which stops invitations being sent out while it is checked. Bizarrely it is not checked by default, which would avoid the problem outlined above, though it can be checked manually when setting up a new invitation group.


Figure 28. Main side-tab of the 'New Invitations' form.

The meat of invitations is in setting up and amending invitee lists, which is only partially tackled under the “Contacts to invite” side-tab shown in Figure 29 (more on invitee/distribution lists under the Contacts section later on). This panel allows the addition of individuals and pre-determined lists of individuals to this invitee group, and is also where the options to re-invite people who have not replied to earlier invites, or to invite people more than once, are encountered.

Figure 29. Contact invitee side-tab in "New Invitations" form.

The “Add invitees” option in the “invited respondents” section in turn launches a dialog, shown in Figure 30, which asks to either select a pre-determined list of invitees to add to this group, or to input the details of an individual person, and then choose a list of invitees to add this person to. Whichever list is eventually chosen in either case is added to the invited respondents table in Figure 29.


Figure 30. Adding individuals to invitation groups is done by adding their details to lists.

At this point, several levels deep into the NOVI interface, a diagram to recap the current position in the interface may help allay confusion, as shown in Figure 31:

Figure 31. The interface path from "Invitations" in the survey ribbon to adding an individual to an invitation group.


Multiple invitation groups can be added, with the rules for multiple invitations, re-invitations etc. then taking effect.

Contacts A pertinent question to ask now is how these pre-existing lists of potential invitees get set up: adding individuals repeatedly would get tiresome very quickly, and if lists could be reused between surveys and also shared between survey designers then a lot of repetitive work could be avoided.

In summary, each user account in NOVI has a default list of contacts that can be added to, and others can be generated and accessed through a separate part of the NOVI interface called the Address Book (unsurprisingly enough, it’s accessed via the “Contacts” tab at the top level). Selecting this tab reveals all the contact lists that a survey designer has access to in NOVI, as shown in Figure 32.

Figure 32. Contact group list interface.

Figure 33. Setting up and editing a contact list is done via this dialog.

Setting up a new contact list is performed by selecting “New List” in the ribbon. The options are fairly basic and accessed via the “Main” and “Contacts” side-tabs (see Figure 33): give the list a name, decide if any other owners should be added, and add contacts individually. The “Edit List” option in the ribbon shows essentially the same set of controls, and “Delete List” does what it implies. Editing, deleting, activating or deactivating a contact list can only be done by one of that list’s owners.

Bulk importing of contacts into a list can be done in one of two ways, via the “Import Contacts” and “Import Directory” options in the ribbon in Figure 32. Import Contacts asks for a CSV file of contacts to import; the precise format needed can be seen by downloading and opening the example CSV file NOVI offers when this option is selected. Of more relevance is the “Import Directory” option: the drop-down list in this dialog contains contact lists for most Napier courses/offices/departments within the university. Selecting one of these groups, picking which members to exclude/include (if any) via the checkboxes, and selecting “Save” adds all those contacts to the contact list, as shown in Figure 34. Be careful: the dialog is paged, and there are often hundreds of people in a group, not just the few shown first. This is a much quicker way of setting up contacts than adding individuals, especially if targeting a specific group. Of course, it all depends on the groups having been set up and populated by someone else first.
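The general shape of such a contact CSV can be sketched as below. The column headings here are placeholders; download NOVI’s own example file for the authoritative format.

```python
import csv
import io

# Placeholder CSV in the general shape of a contact import file.
# Treat the column names as assumptions, not NOVI's actual format.
csv_text = """Email,FirstName,LastName
[email protected],John,Smith
[email protected],Jane,Doe
"""

contacts = list(csv.DictReader(io.StringIO(csv_text)))
print(len(contacts))         # -> 2
print(contacts[0]["Email"])  # -> [email protected]
```

Building the file in a spreadsheet and exporting as CSV produces the same structure: one header row, then one contact per line.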

Figure 34. Selecting a directory group shows a page-based view of its members along with checkboxes to include/exclude individuals from being added to a contact list.

The last piece of contact-list functionality is “Merge Lists”. Select one list (call this the first list) from the Address Book and select “Merge Lists” in the ribbon. A dialog appears, as shown in Figure 35, asking which other list it should be merged with (call this the second list) and how the merging is to be performed (see Table 4). The important point is that the merged list always overwrites the first list (the ticked list in the address book); it does not generate a new separate list (and I’ve found a bug where other people’s lists can be changed with it too), so be careful.


Table 4. Merging list possibilities

- Union: add everyone in the second list to the first list.
- Intersection: keep only people in the first list who are also in the second list.
- Minus: remove anyone in the first list who is also in the second list (the opposite of intersection).
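The three merge operations correspond exactly to the standard set operations, which Python’s built-in sets demonstrate directly (the names below are made-up example contacts):

```python
first = {"ann", "bob", "carol"}   # the ticked list, which gets overwritten
second = {"bob", "dave"}          # the list chosen in the merge dialog

# Union: add everyone in the second list to the first.
print(sorted(first | second))   # -> ['ann', 'bob', 'carol', 'dave']

# Intersection: keep only people present in both lists.
print(sorted(first & second))   # -> ['bob']

# Minus: remove from the first list anyone also in the second.
print(sorted(first - second))   # -> ['ann', 'carol']
```

Whichever operation is chosen, the result replaces the first list in the Address Book.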

Figure 35. Merging lists via a pop-up dialog. Be aware the first list chosen will be overwritten with the merged list.

To summarise, invitations to restricted access surveys are best set up using contact lists. Contact Lists can be set up and shared between NOVI users, and in turn the Import Directory function allows specific groups of people to be quickly added to Contact Lists.

Styling Now that a survey has been designed and an invitation list constructed, it is time to consider what the respondent will see/receive at their end. This can be divided into three ‘chunks’ which are controlled from different places in the NOVI interface:

1. What the respondent sees when invited.
2. How the respondent sees the survey.
3. What the respondent sees after they complete the survey.

The first point is controlled back in the invitation group forms (again, skip along to the second point if not explicitly inviting people). Invitations were set up via contacts, but what will invitees see upon invitation? In Figure 28 one of the side-tabs is labelled “Email”: select this side-tab to get the panel shown in Figure 36. This controls the formatting and contents of the invitation email sent out to people when the survey isn’t public access or just accessible to any NOVI user. Make sure any email sent is polite and always offers the choice of not taking part; Figure 37 gives an example of what a possible invitation email looks like to the recipient.


Figure 36. Setting up an invitation email.

Figure 37. What an invitation looks like to a potential respondent.

The second styling issue to consider is how the survey actually looks in the flesh. Before a survey is formally launched it may be wise to spend a bit of time polishing its appearance. Under the Survey tab at the top level, there is a “Manage Templates” option in the ribbon. Selecting this will display a paged list of standard templates (see Figure 38) that control the look of a survey, including logos, question alignments, background colours, progress bars etc.


Figure 38. The template ribbon and list.

In the case that none of the templates pass muster, a new one can be built by selecting “New Template” (or more wisely, by selecting the nearest template to what is desired and then using the “Copy Template” option) and tweaking individual settings – see Figure 39 for a flavour of the detail this involves. It is way beyond the scope of this document to give advice on visual design, but a simple rule is not to use too many different types and sizes of font in what will be a 99% textual document. One major issue to bear in mind is what devices will respondents be filling in the survey on? If a survey is designed on a 21” monitor, check to see if it is still usable on a 7” tablet or even on a mobile phone screen. The “Preview Template” option in the ribbon will show a survey in all its glory in the chosen template.

Figure 39. Setting various options in the template affects the look and feel of the survey.

The final consideration is what if anything does the respondent see once they’ve completed a survey? This is controlled through the Survey Options functionality. Select the top level Survey tab and select Survey Options in the ribbon, and the interface returns to the form first met way back in Figure 4. Here, select the “Completion” side-tab, where two important options can be decided: firstly, does the respondent get an email with various links in after completing the survey, and secondly, do they see a final page in the survey, thanking them for their time and a link to their particular responses etc.


(basically the same things as would be sent in the email)? Figure 40 shows the options available at this point. It seems logical that completion emails are more suited to people invited by email: anonymous respondents won’t get them, and NOVI users are better served by a completion page at the end of the survey. There is also the option (“Complete Action”) to send respondents elsewhere after the survey is exited.

Figure 40. The panel connected to the Completion side-tab decides what a respondent sees after finishing the survey.

Alerts Collected responses are indicated under the top-level Survey tab: each survey has columns reporting its status and how many responses it has received, and these update automatically. However, if you do feel like being spammed to death, the “Alerts” option in the ribbon will send an email every time a condition is met in the survey, via the same set of conditional operators the Page Condition logic uses. Usually the only sensible condition is “Response Status: Completed”.

Figure 41. Deployment options. Set everything else up first before setting the status to "open".

Final Deployment

Ok, time to launch. If inviting people, go back to the Invitations dialog, select the required groups and schedule the invitations to start firing off sometime in the near future (but not immediately). Then go to the Deployment side-tab in the Survey Options form, as shown in Figure 41. Set the times the survey is open for completion if it is to be time-limited (if not, its availability is decided by the survey status), decide whether respondents have a back button / print options, and then finally set the status to “Open”. Now await the responses; the next section describes what can be done with them.

Key Points

• Under the terms of the ICF, response tracking should be anonymised or semi-anonymised in the Survey Options settings.
• IP filtering is a flawed way of controlling access.
• If restricting survey access, contact lists are a much quicker way to build potential respondent groups than individual invitations.
• Test the survey with the Preview options.


Analysing a Survey

The final stage, after constructing and deploying a survey, is of course to analyse the answers. NOVI has three top-level tabs that cover this: “Responses”, “Reports” and “Exports”. The Responses tab allows analysis (and editing) of individual responses, the Reports tab shows basic statistics and charts for chosen parts of the survey, and the Exports tab outputs the data to file for use in more sophisticated statistical and charting tools.

Responses

The “Responses” tab at the top level opens to show a paged table of all the responses to a selected survey. Set the browsing mode to “Survey” rather than “Respondents”, select the survey of interest from the drop-down list, and the responses are listed in a table beneath, showing basic data such as completion times, dates, languages etc., as in Figure 42. Choosing to browse by Respondents instead cuts the data the other way, showing which surveys a particular respondent has completed.

Figure 42. The top level of the Responses tab.

Selecting a particular response allows it to be probed in more detail via the ribbon, and even edited or deleted (when the survey is closed). Test responses may need to be deleted (or can be filtered out of reports etc. by other means), or in more exceptional circumstances a respondent may request that their response is deleted – which is only possible if their response is uniquely identifiable through a UserID or a survey ResponseID. In some cases, Napier C&IT will need to grant you extra deletion privileges. Beyond that, this tab is not particularly exciting, so move on to the next two options – Reports and Exports.


Reports

Selecting the top-level “Reports” tab displays a new ribbon with various options to construct a report based on the responses received to a survey so far. The obvious starting point is “New Report”: selecting this option shows a dialog that asks for a name for the report and the survey it is based on.

Once this dialog is completed, a new form is presented, as seen in Figure 43; it is very similar to the one used for the design of survey questions and pages. This is the “Design Report” page (and returning to the top-level Reports section will reveal a Design Report button now added to the report ribbon). A default section (called a report element), “Questions”, has been added, as it is assumed the report should at least cover the answers to the survey.

Figure 43. Initial “Design Report” set-up.

In essence there are two basic tasks in designing a report: 1) deciding what gets included or excluded, and 2) deciding what it looks like if it is included.

Different types of report element, such as Respondents and Contingency Tables, can be included via the “Insert Report Element” button – see the bottom of Figure 43. The Contingency Table option plots the answers to two different questions against each other, but this is about as statistically advanced as the reports get.
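If something more advanced is needed, the same cross-tabulation is easy to reproduce outside NOVI on exported response data. The sketch below (plain Python; the question names and answer values are illustrative, not NOVI’s actual column headers) counts paired answers to two questions:

```python
from collections import Counter

# Hypothetical exported responses: one dict per respondent.
# Question names here are assumptions for illustration only.
responses = [
    {"Gender": "Male",   "OwnsSmartphone": "Yes"},
    {"Gender": "Female", "OwnsSmartphone": "Yes"},
    {"Gender": "Female", "OwnsSmartphone": "No"},
    {"Gender": "Male",   "OwnsSmartphone": "Yes"},
]

def contingency_table(rows, q1, q2):
    """Count how often each pair of answers to q1 and q2 occurs together."""
    return Counter((r[q1], r[q2]) for r in rows)

table = contingency_table(responses, "Gender", "OwnsSmartphone")
# e.g. table[("Male", "Yes")] gives the count of male smartphone owners
```

From there the counts can be fed into any statistical test or charting library, which is rather more than the built-in report element offers.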

Selecting the Edit option in a report element header – circled in purple in Figure 43 – switches to a form for controlling both the appearance and coverage of that element. A set of side-tabs (Main, Options and Charts) provide the options, those in Main being self-explanatory topping and tailing texts for the report element. Within the Options side-tab are settings to format and, more importantly, decide which sub-elements are included within a report element. For instance, within a question report element, it supplies mechanisms for deciding which individual questions are included in the report – handy for including only certain parts of a survey, or for splitting the questions into two or more separate elements within the report. Figure 44 shows this functionality being used for a survey about mobile phones. The Options side-tab also has settings that decide whether responses are itemised individually, whether non-answers are reported etc., and for anonymising participants – which under the terms of the ICF must be selected if the responses are not already anonymised. The Charting option is left for later.

If the Edit functionality decides which questions are included in a report, what decides which responses are included? Luckily, two levels of filter can be set on a report: those that apply to the entire report (“Report Filters”) and those that apply to individual sections of a report (“Response Filters”) – circled in red in Figure 43. Unluckily, however, setting report filters currently leads to an error within Napier’s version of NOVI. So, looking just at setting filters within report elements: these are all done through interfaces and options similar to those used in setting up page-skipping logic etc. An example is shown in Figure 45, where after selecting “Response Filters” for the question report element, a new response filter has been set up via a dialog such that only responses from males are reported.
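The effect of such a filter is also easy to mirror on exported data. A minimal sketch (plain Python; the “Gender” and “Rating” columns are assumed names, not NOVI’s real export headers) keeps only the responses that satisfy the condition, and shows how the interface’s “all”/“any” condition grouping maps onto AND/OR logic:

```python
# Hypothetical exported rows; column names are assumptions for illustration.
responses = [
    {"RespondentID": 1, "Gender": "Male",   "Rating": "4"},
    {"RespondentID": 2, "Gender": "Female", "Rating": "5"},
    {"RespondentID": 3, "Gender": "Male",   "Rating": "2"},
]

# Equivalent of a single "Gender equals Male" response filter.
males_only = [r for r in responses if r["Gender"] == "Male"]

# Combining conditions: "all" in the dialog is AND, "any" is OR.
conditions = [
    lambda r: r["Gender"] == "Male",
    lambda r: int(r["Rating"]) >= 4,
]
matches_all = [r for r in responses if all(c(r) for c in conditions)]
matches_any = [r for r in responses if any(c(r) for c in conditions)]
```

This can be a useful sanity check that a filtered report is selecting the responses intended, given the report-filter bug noted elsewhere in this guide.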

Figure 44. Report Element > Edit > Options can decide which questions are included within the question reporting element.

Figure 45. Setting a filter condition on report generation.


The “Setup Report” option, available in the top Reports ribbon (with a report selected in the list) or in the ribbon for an individual report, controls global options for the entire report. Selecting it gives a dialog with three side-tabs: “Main”, “Charting” and “PDF”.

“Main” controls layout options such as introductory graphics, title pages, subtitles, whether sections begin on new pages etc. Of particular note at the bottom of this section, shown in Figure 46, is a powerful widget called a “list builder” that allows extra surveys to be added to the report if required – in this way a report can include questions and responses from more than one survey. Simply click the arrows to move surveys between the “Available” and “Selected” lists.

Figure 46. The bottom widget can be used to add responses from multiple surveys to a report.

The “PDF” side-tab offers basic settings if the report is to be converted into a PDF – mostly margins and page sizes. More interesting is the “Charting” side-tab, which allows the customisation of the four basic graph types that can be used for questions in the report – Bar charts, Pie charts, Line graphs and Radial charts.

The type of chart being customised is controlled from the top drop-down list within the Charting panel, as shown in Figure 47. Changing this value changes the fields and options that appear below it, as many fields are particular to a given chart type – e.g. while a Bar chart can have a stacked-bars option, that is meaningless for a Pie chart. A few fields are common to all charts, such as title elements and whether to show a border around the chart. The NOVI help function adequately covers the functionality and meaning of each individual setting for the different charts, so there is no need to repeat it here.


Figure 47. Options for a bar chart: some are specific to the chart type, others are universal to all chart types.

While these options control chart appearance for the entire document, they can be overridden for individual report elements. Remember back in the “Design Report” section, the “Charting” side-tab under the “Edit” functionality was ignored? That is where styling decisions for charts in individual report elements can be made, using the same options as above.

Finally, selecting “Run Report” from the ribbon generates an HTML document with the collated answers to all the questions, along with associated charts showing the numbers graphically. An example of this default report style is shown in Figure 48. Small icons in the top right corner of the report offer options to save it as a PDF or to print it.
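If the figures in a generated chart ever need checking, the counts and percentages behind it can be recomputed from exported data in a few lines. A small sketch (plain Python; the answer values are hypothetical):

```python
from collections import Counter

# Hypothetical answers to a single categorical question.
answers = ["Yes", "No", "Yes", "Yes", "No answer", "No"]

counts = Counter(answers)
total = sum(counts.values())

# Percentage breakdown, as shown beside each bar in the default report.
percentages = {a: round(100 * n / total, 1) for a, n in counts.items()}
```

This is only a cross-check, of course – for normal use the report’s own numbers can be trusted.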


Figure 48. Default report output.

Exports

The final part of the puzzle is the ability to export surveys to external file formats for use in more powerful analysis and charting software. Selecting the top-level “Exports” tab brings up functionality in the ribbon to export different data sets: survey responses, report responses (remembering these can come from multiple surveys), contact lists and survey definitions (the questions and possible answers – essentially the survey schema, though it lacks range definitions: is a matrix scored 1–5? It doesn’t say). When one of these options is selected, there are options to select the survey or report to export data from and which format to export the data in (except for survey definitions, which are always exported as an Excel file), as shown in Table 5, along with a host of options for how to represent missing values etc., as shown in Figure 49.

Table 5. Exported data types and file formats

Exported Data       Exported File Formats
Survey Responses    CSV, Excel (.xlsx), SPSS
Report Responses    CSV, Excel (.xlsx), SPSS
Contacts            CSV, Excel (.xlsx)
Survey Definition   Excel (.xlsx)
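Once downloaded, a CSV export can be loaded straight into analysis code. The sketch below (Python standard library only; the column names and the “N/A” missing-value marker are assumptions standing in for whatever was chosen in the export options) reads responses and normalises missing answers to None:

```python
import csv
import io

# A fragment of a hypothetical CSV export. "N/A" stands in for whatever
# missing-value marker was chosen when setting up the export.
raw = """RespondentID,Q1_Age,Q2_Satisfaction
1,34,5
2,N/A,3
3,29,N/A
"""

MISSING = "N/A"

def load_responses(text, missing=MISSING):
    """Parse a CSV export, turning the missing-value marker into None."""
    reader = csv.DictReader(io.StringIO(text))
    return [
        {k: (None if v == missing else v) for k, v in row.items()}
        for row in reader
    ]

rows = load_responses(raw)
```

Normalising the missing-value marker on the way in saves having to special-case it in every subsequent calculation.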


Figure 49. Export data options for the report data type.

The next step in each case isn’t entirely obvious: select the “Save Changes” option from the ribbon when the settings are done. This returns to the top-level Exports interface, where the newly constructed export will be shown as being generated in the export list beneath the ribbon. Once the export has finished generating, a link will be available in the “Download” column; selecting it downloads the exported data to your computer, usually into a browser downloads folder. Existing exports can be re-downloaded at any time, or deleted using the “Cancel Export” option in the ribbon.

Key Points

• The Responses tab records all completed responses.
• Reports can be filtered by question and respondent, and can also include data from multiple surveys.
• Report charts come in four basic types – bar, pie, line and radial.
• Response and question data can be exported for external use: always in Excel format, often in CSV, sometimes in SPSS format.


Summary

This guide has covered the three main activities supported by the NOVI Survey software – the setting up, distribution and reporting of surveys.


UI things to note

1. The “Insert Question”, “Import Questions”, “Break Page” group occurs before the first question and after every question, and the actions work at the particular point they are clicked. So clicking the first “Insert Question” inserts a question at the start of the page. Beware.
2. Re-entering a survey. From the screen listing all surveys it’s not clear how to re-enter a survey: highlight the survey in the list and then choose Manage Questions from the Survey tab at the top.
3. Page introduction. Cannot exceed 2,000 characters but gives no indication of how many characters have been used when typing.
4. The “Page Show & Hide” action uses different terminology from the “Page Conditions” it controls.
5. “Break Page” has the same effect as “New Page” when selected after the final question on a page. Should it? It should either be greyed out or renamed “New Page” in those circumstances IMO.
6. Page Conditions / Skip Logic – single condition groups term the logic as “all” (AND) or “any” (OR). Combining groups and sections reverts to plain “AND” and “OR” text labels.
7. Why is “Paused” in the New Invitations group not checked by default? That would stop invitations being sent out by mistake. Paused not checked (default) + invitations set for the next quarter hour (default) + survey set to open (user action) = invitations sent, possibly when not intended or only half set up.

Bugs Found

As reported in NOVI v.6.6.5746

1. On Open-Ended Date question set-ups, the minimum and maximum range controls do not work. NOVI support says upgrade to version 6.8.
2. Setting a report filter (on a question at least) for the entire report causes an unexpected error when I try and save the changes. This appears to be fixed in 6.7, looking at NOVI’s version blog.
3. I can wreck other people’s contact lists via the merge lists functionality. NOVI are releasing an update for this (May 2014).
4. Selecting a blank export type and then reselecting an existing export type loses all the optional fields that were present at first. Reported to NOVI – confirmed as a bug; they will release an update (June 2014).


[ TITLE OF STUDY ]

Edinburgh Napier University requires that all persons who participate in research studies give their written consent to do so. Please read the following and sign it if you agree with what it says.

1. I freely and voluntarily consent to be a participant in the research project on the topic of [some words of explanation] to be conducted by [your name], who is an undergraduate/postgraduate student in the Edinburgh Napier School of Computing.

2. The broad goal of this research study is to explore [broad description of study only - to avoid premature shaping of participant’s responses]. Specifically, I have been asked to [brief overview of procedure], which should take no longer than [estimated length of study] to complete.

3. I have been told that my responses will be anonymised. My name will not be linked with the research materials, and I will not be identified or identifiable in any report subsequently produced by the researcher.

4. I also understand that if at any time during the [survey/interview/session/other] I feel unable or unwilling to continue, I am free to leave. That is, my participation in this study is completely voluntary, and I may withdraw from it at any time without negative consequences.

5. In addition, should I not wish to answer any particular question or questions, I am free to decline.

6. I have been given the opportunity to ask questions regarding the [interview/survey/procedure] and my questions have been answered to my satisfaction.

7. I have read and understand the above and consent to participate in this study. My signature is not a waiver of any legal rights. Furthermore, I understand that I will be able to keep a copy of the informed consent form for my records.
