TOWARDS AN ORCHESTRATION OF FORECASTING METHODS

TO DEVISE STRATEGIES FOR DESIGN

by

Priyanka Sharma

APPROVED BY SUPERVISORY COMMITTEE:

______Dr. Maximilian Schich, Chair

______Dr. Kimberly Knight

______Dr. Matt Brown

______Dr. Frank Dufour

Copyright 2018

Priyanka Sharma

All Rights Reserved

TOWARDS AN ORCHESTRATION OF FORECASTING METHODS

TO DEVISE STRATEGIES FOR DESIGN

by

PRIYANKA SHARMA, B.Des, MA

DISSERTATION

Presented to the Faculty of

The University of Texas at Dallas

in Partial Fulfillment

of the Requirements

for the Degree of

DOCTOR OF PHILOSOPHY IN

ARTS AND TECHNOLOGY

THE UNIVERSITY OF TEXAS AT DALLAS

August 2018

ACKNOWLEDGMENTS

I am greatly appreciative of the many individuals who provided support and encouragement for my work through the process of writing this dissertation. Above all, I would like to thank my doctoral committee for their continuous guidance, critical feedback, and timely advice over the past few years. I am especially indebted to my advisor Dr. Maximilian Schich, whose invaluable insight, unique perspective, and insistence on perseverance provided me with the inspiration and motivation to work towards this dissertation. I thank Dr. Dufour for being a relentless source of optimism and for his faith in my efforts. His support and guidance gave me hope of being capable of ingenuity and left me intellectually stimulated. I express my sincere gratitude toward Dr. Kim Knight for her trust in me and for enabling me to continue this journey. I am indebted to Dr. Matthew Brown, who during the entire course of writing this dissertation helped me immensely with its structural aspects and guided me towards practical and critical milestones in my research.

Dr. Schich’s multidisciplinary approach to art, architecture, and cultural history continues to influence my way of thinking and of questioning the status quo. His work integrating visual hermeneutics with methods of computation, natural science, and information design provided a most pertinent perspective on how insights into the past may prove crucial in postulating the future. Dr. Dufour’s expertise regarding the epistemology of creating novel and timely methodological solutions helped me iterate and re-iterate my research process.

My heartfelt appreciation goes to Dr. Matt Brown, whose expert guidance in understanding the nuances of interpretative methods through the lens of cognition and interaction was instrumental in enabling me to craft the structure of the solution through my research. I feel extremely fortunate to have Dr. Kim Knight as a member of my committee. Her trust and generosity allowed me to keep working towards my goal, and I am deeply grateful to her.

There have been many others with whom I interacted, worked, and learned, and who have guided me through the process with their friendship, knowledge, and wisdom.

August 2018

TOWARDS AN ORCHESTRATION OF FORECASTING METHODS

TO DEVISE STRATEGIES FOR DESIGN

Priyanka Sharma, PhD

The University of Texas at Dallas, 2018

ABSTRACT

Supervising Professor: Dr. Maximilian Schich

Social, technological, political and environmental paradigms are changing globally at a pace never seen before, giving way to rapid changes in social patterns, culture, and human behavior.

Designers and creative industries are constantly trying to develop solutions that are relevant to the end user and in tune with the current times. Since design itself is a process of synthesis, it cannot be carried out in isolation. It needs to take into account changes happening in the surroundings and changing consumer preferences. Designing solutions without considering the overarching changes underway and their future impact runs the risk of losing the large monetary investments that go into the process. Hence, it becomes extremely important that the design process is informed by relevant forecasts that address the changes emerging in the overall landscape of society and lifestyle.

Forecasters have grappled with ambiguity when devising forecasts for the design of consumer goods, durables, services, and other lifestyle products. Multiple factors and distinct needs are involved in design projects. With the plethora of forecasting research methods available, selecting the right forecasting strategy is crucial to designing meaningful and successful products. The forecasting literature provides multiple ways of categorizing forecasting methods, including qualitative, quantitative, normative, and exploratory methods. Yet any classification into such categories focuses on the methods rather than on the need and context for forecasting. This dissertation emphasizes the importance of a strategic forecasting approach to holistically inform design processes. In particular, it aims to facilitate a modular orchestration of methods from the ecology of forecasting approaches through a ‘Composite Framework of Forecasting Methods and Applications’. The composite forecasting framework illustrated in this dissertation gleans from three distinct contexts or needs for forecasting (predict, prefer, and paradigm-shift) and utilizes operational caveats (availability of data, lead time, nature of forecast, and forecasting range) to guide the selection of forecasting methods.

The strategic selection of methods thus obtained enables relevant and accurate future insights, constructively informing design processes geared towards specific forecasting needs and contexts. Applying the composite forecasting framework to real-world scenarios opens future conversations for an interdisciplinary perspective on design processes and a richer amalgamation of forecasting techniques.


TABLE OF CONTENTS

ACKNOWLEDGMENTS...... iv
ABSTRACT...... vi
LIST OF FIGURES...... ix
CHAPTER 1 INTRODUCTION...... 1
CHAPTER 2 FORECASTING METHODS: AN ANALYSIS...... 14
A. METHODS OF OBSERVATION...... 14
Environmental Scanning...... 23
Nystrom’s Framework...... 29
Design Trends Analysis...... 36
B. METHODS OF ESTIMATION...... 44
Statistical Modeling...... 44
Cross Impact Analysis...... 62
Decision Analysis...... 68
Machine Learning Methods...... 73
Morphological Analysis...... 81
C. METHODS OF INTERVENTION...... 89
Delphi...... 89
Ethnography...... 94
Design Thinking...... 107
Other Methods...... 118
CHAPTER 3 CASE STUDIES: AN EVALUATION...... 123
CHAPTER 4 FORECASTING NEEDS AND ATTRIBUTES...... 145
CHAPTER 5 APPLYING THE COMPOSITE FORECASTING FRAMEWORK...... 181
CHAPTER 6 CONCLUSION...... 201
REFERENCES...... 207
BIOGRAPHICAL SKETCH...... 230
CURRICULUM VITAE...... 231


LIST OF FIGURES

1.1: Tracking the frequency of ‘trends’ and ‘trend forecasting’ from 1800 to 2000 in English books, using the Google Ngram Viewer…………………………………………………………..6

1.1A: The idea of “trend” began gaining prominence around mid 19th century…………………..6

1.1B: The term “trend forecasting” does not show much prominence as compared to the general idea of “trends” …………………………………………………………………………..6

1.1C: When viewed on Ngram viewer by itself “trend forecasting” grows since 1970s……...... 6

2A.1: A representative Trend Curve.…………………………………………………………………..15

2B.1: Galton’s experiment showing regression in children’s heights…………………………...47

2B.2: A linear regression plot……………………………………………………...…………….49

2B.3: Plots showing correlations between dependent and independent variable………………..52

2B.4: Examples of structures of variables……………………………………...………………..55

2B.5: Example of cycle identified by pattern discrimination algorithm…………………………57

2B.6: Schematic representation of Cross Impact Analysis…………………………………...... 65

2B.7: Representation of a basic Decision model………..….…………………...…………...... 68

2B.8: Framework for Morphological Analysis...…………………………………………………83

2C.1: Figure summarizing the forecasting methods …………………………………………...122

3.1: Pontiac Bonneville in 1956………………………………………………………………...136

3.2: Pontiac Aztec’s “bold” design …………………………………………………………….137

4.1: Forecasting methods and respective dependence on availability of existing data.………...147

4.2: Forecasting methods and respective Forecasting ranges…………………………………..149

4.3: Cone of Uncertainty………………………………….…………………………………….150


4.4: Forecasting methods and respective lead times ……………………………………………151

4.5: Methods and respective nature of forecast…..…………………………………………….152

4.6: Forecasting Methods and their respective attributes ……………………………………...154

4.7: Forecasting methods and their respective attributes ………………………………………155

4.8: Attributes by their relevance towards the need for ‘Predictability’ …………………...... 158

4.9: Arranging attributes by their relevance towards the need for ‘Predictability’.……………161

4.10: Graph representing a paradigm shift …………………………………...... 164

4.11: Arranging attributes by their relevance towards the need for ‘Paradigm shift’...………..167

4.12: The composite forecasting framework…………………………………………………...169

4.13: Arranging attributes by their relevance towards the forecasting needs….…………...... 170

4.14A: Forecasting framework towards the forecasting need for “preferability”……………...171

4.14B: Forecasting Methods and their respective attributes focused on the need to predict...... 173

4.14C: Forecasting Methods and their respective attributes focused on paradigm-shift...... 175

4.15: Functional Summary of the Forecasting frameworks…………………………………….178

4.16: Example of forecasting framework when taking the forecasting needs for ‘preferability’ and ‘paradigm-shift’ into consideration simultaneously…………………...... 180

5.1: Manifestation of forecasting framework for the case of ………………….183

5.2: Manifestation of forecasting framework for the case of Britannica Encyclopedia…...... 185

5.3: Manifestation of forecasting framework for the case of Pontiac Aztec………..…………187

5.4: Graph representing percentage change in Tiffany & Co.’s holiday season sales year over year since 2008…………………………………………………………………...... 193

5.5A: Application of forecasting framework per scenario 1 for Tiffany & Co………………..198

5.5B: Application of forecasting framework per scenario 2 for Tiffany & Co…………...... 199


CHAPTER 1

INTRODUCTION

We are living in a dynamic society and ever-changing environment. Social, technological, political and environmental paradigms are changing globally at a pace never seen before, giving way to rapid changes in social patterns, culture, and human behavior. The swift developments and unknown consequences necessitate a deeper understanding of the changes. To grow and sustain their capital and conceptual value, designers and creative firms strive to stay relevant to our times.1 Methods that facilitate a deeper understanding of changes in the environment can be instrumental in designing future solutions, products, and experiences that are of value to the user and relevant to the times. As such, forecasting research plays a crucial part in the process of creating products, environments, and experiences for a better future.

“Towards an orchestration of forecasting methods to devise strategies for design” also comes from a place of personal motivation. Having worked in the consumer goods design and forecasting industries for over 11 years, I have witnessed various occasions where designers feel at a loss when initiating a design project. This may happen due to a missing or poorly defined design brief. Oftentimes companies do not understand the value of forecasting research and think of it as an unnecessary expense, making the designers entirely responsible for the success or failure of a future product. In some cases, even a design that is the right fit for the market and consumer needs does not make the cut due to personal biases of the leadership. In all

1 An example of failure to deliver on relevance is the decline of the Finnish cellular phone company Nokia. One of the major reasons (besides internal organizational issues) for Nokia’s decline was its failure to realize the importance of the change being brought about by lifestyle products like the iPhone, and its refusal to switch to the Android platform (Cord 2014), as the company was afraid to compete in that ecosystem.


such scenarios, not only can the involved company face commercial losses and the designers miss a creative opportunity; the consumers also miss out on the opportunity of embracing a product that may have the potential to define our times. Thinking about a world without the iPhone makes this sentiment starkly clear. Through this thesis, I aim to provide forecasters with a tool to pursue systematic and holistic forecasting studies. Such forecasting studies can, in turn, provide a well-informed starting point that aligns designers with the current needs and preferences of consumers and also helps to keep the personal interests and biases of other stakeholders in check.

The concept of “trends”

The topic of trends is often interpreted differently by different disciplines. According to the Oxford English Dictionary, the word ‘trend’ is derived from the word ‘trendan’, of Germanic origin, meaning to revolve or rotate.2

Today, beyond its general meaning of “direction in which something is developing or changing,” the word also signifies styles in fashion, popular topics on social media, prominent patterns in economics and commerce, etc. Since the word “trend” is interpreted in multiple ways in different groups and contexts, it is crucial that we establish the sense in which we will discuss ‘trends’ in this dissertation. William Higham (Higham 2009), the author of “The Next Big Thing”, defines trends as a “change in consumer attitudes and behaviors”. The social, technological, economic, environmental, and political paradigms around us change constantly, creating a shift in people’s needs, preferences, expectations, behavior, and lifestyles.

2 Oxford Dictionaries. “Trend.” en.oxforddictionaries.com/definition/trend, 2018.


The trends that outline the kinds of products and experiences people will choose to consume are therefore also referred to as “consumer trends” (Business Dictionary 2018). Industries associated with the creation of products for final consumption by consumers, such as fast moving consumer goods,3 nondurables,4 durables,5 services, and other lifestyle products, can benefit immensely from incorporating forecasting for consumer trends into their new product design and development processes.

In our discussion, we focus on the forecasting of consumer trends. Forecasting here implies identifying prominent patterns in how choices and behaviors are changing today and how they will evolve in the future. The changing profiles of products in the area of consumer goods (fast-moving consumer goods, durables, automobiles, fashion and design products, buildings and architecture, electronics) and of experiences (social space, transportation, e-commerce, recreation, exhibition, branding, interior and exterior spatial design, etc.) are merely indicators of changing consumer trends.

It is also important to state that this discussion does not refer to predicting exact numerical values pertaining to sales or market demand for a new product. The objective of consumer trends forecasting is to help decision-makers and stakeholders take informed action today in order to create relevant and meaningful products in the future.

3 Fast moving consumer goods include products which have a short shelf life, are consumed quickly, and are relatively inexpensive, like packaged foods, drinks, and everyday consumables.

4 Nondurables are products which are consumed frequently but have a life span of less than three years, such as clothing, shoes, small gadgets and equipment, medication, etc.

5 Durables are products which have a longer life span, usually more than three years, and are relatively more expensive, such as cars, appliances, furniture, etc.


The concept of lead times

One of the reasons forecasting consumer trends is critical to the process of creating new products is the lead time of the product category. Lead time is the time between the inception and the final manufacturing of a product. For areas such as fashion and apparel, lead times vary between six and 18 months (Sproles 1981); for other industries, such as automobiles and large appliances, the duration can be three to five years (Unger and Eppinger 2009) (Johnson and Kirchain 2011). This implies that design and new product development teams need to think today about the products their consumers will be able to purchase only farther out. Consumer trends forecasting helps companies understand the triggers that will determine what consumers will be looking for in the future.
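To make the arithmetic of lead times concrete, the back-calculation that a planning team performs can be sketched as below. This is purely an illustrative sketch: the function name and the whole-month simplification are my own, not a method from the forecasting literature.

```python
from datetime import date

def latest_design_start(launch: date, lead_time_months: int) -> date:
    """Back-compute the latest month in which design work can begin,
    given a target launch date and the category's lead time in months."""
    # Count months since year 0, subtract the lead time, convert back.
    months = launch.year * 12 + (launch.month - 1) - lead_time_months
    return date(months // 12, months % 12 + 1, 1)

# Apparel at the long end of its 6-18 month lead time (Sproles 1981):
print(latest_design_start(date(2020, 3, 1), 18))   # 2018-09-01
# An automobile program with a five-year (60-month) lead time:
print(latest_design_start(date(2023, 6, 1), 60))   # 2018-06-01
```

The point of the sketch is simply that the longer the lead time, the further ahead the forecast must reach for the product to remain relevant at launch.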

Bearing on design

How does the forecasting of trends in the consumer goods and services domain relate to design? If a designer is tasked with creating new products, they should be asking some basic questions before the ideation phase: Who will be using this product? What is its function? When will the product launch? Is there an existing product with the same function? What will differentiate this product? Why will people buy it? Whether the task is to design a disruptively new product or an incrementally better one, employing trend forecasting insights can inform the design process and can be decisive in the commercial success of the product. Trend forecasting is a step that should initiate the design process.

A design effort taking place in isolation entails the enormous risk of creating something with a heavy investment that may be a hit or a miss. Trend forecasters should not be expected to dictate the exact design elements with which to design a product. They can, however, lay out guidelines of form, function, expectation, and emotion, based on their understanding of changing consumer preferences and behavior, for designers to harness. Hence, it would be incorrect to state that the study of trends and forecasting has no bearing on the field of design. On the contrary, the application of trends forecasting can be the difference between a successful and a failed design endeavor targeted at the future.

Precedents in consumer trends forecasting

Trend forecasting is by no means a recent phenomenon. In our discussion of the forecasting of consumer trends, we strictly focus on observing patterns in social, technological, economic, environmental, and political areas that drive changes in people’s behavior and choices over the long term. We limit ourselves to the 19th century onwards when referring to precedents in trends forecasting practice, i.e., instances where attempts were made to forecast consumer trends.

Running a Google Ngram Viewer6 query for the word ‘trends’ returns the graph in Fig. 1.1A (Books.google.com 2017a). This shows that the use of the word ‘trend’, at least according to English books in Google Ngrams, started to gain prominence and became more and more commonplace around the mid 19th century. In comparison with ‘trends’, the use of ‘trends forecasting’ was not as prominent (Fig. 1.1B) (Books.google.com 2017c). Viewing the Ngram for ‘trends forecasting’ alone shows a similar curve, spiking after the 1950s, despite there being far fewer instances of ‘trends forecasting’ (Fig. 1.1C) (Books.google.com 2017b).

6 Google Ngram Viewer is an online tool from Google that searches for the appearance of one or more specified phrases in a corpus of published books over a selected time range and displays the resulting frequencies as a graph.



Fig. 1.1: Tracking the frequency of ‘trends’ and ‘trend forecasting’ from 1800 to 2000 in English books, using the Google Ngram Viewer. (A) The idea of “trend” began gaining prominence around the mid 19th century (Books.google.com 2017a). (B) The term “trend forecasting” does not show much prominence compared to the general idea of “trends” (Books.google.com 2017c). (C) When viewed on the Ngram Viewer by itself, “trend forecasting” grows from the 1970s onward (Books.google.com 2017b).
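The kind of reading made from the Ngram curves above can also be probed programmatically. The sketch below is a minimal, hypothetical illustration (the function and the toy data are my own, not Google’s API): given (year, relative frequency) pairs such as those exported from the Ngram Viewer, it dates when a term ‘gains prominence’ as the first year its frequency reaches a chosen share of its peak.

```python
from typing import List, Tuple

def prominence_year(series: List[Tuple[int, float]], share: float = 0.1) -> int:
    """First year in which the term's relative frequency reaches
    `share` (e.g. 10%) of its peak over the whole series."""
    if not series:
        raise ValueError("empty series")
    peak = max(freq for _, freq in series)
    for year, freq in sorted(series):
        if freq >= share * peak:
            return year

# Toy frequencies standing in for an Ngram export of 'trend':
toy = [(1800, 0.0001), (1820, 0.0002), (1840, 0.0010),
       (1860, 0.0060), (1900, 0.0200), (1950, 0.0500)]
print(prominence_year(toy))  # 1860: a mid-19th-century rise
```

The 10% threshold is arbitrary; any such operationalization of “gaining prominence” involves a judgment call, which is precisely why a qualitative reading of the curves accompanies the figures here.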

In the first half of the 20th century, there were some developments which may be considered precedents to consumer trends forecasting, as they attempted to understand the impact of contemporary changes on people’s choices. One of the prominent areas where such developments happened was that of color. During the early 1920s, the US saw a period when people had purchasing power and industry players competed to best meet the demands of style-conscious consumers (Person 1922). Consumers were looking for new products, from day-to-day and social-use products such as clothes, utensils, furniture, food, and drinks to durables such as automobiles and refrigerators. As companies competed for higher share and more dollars, manufacturers and retailers had to estimate demand correctly not just quantity-wise but also taste-wise. This gave rise to the profession of “fashion intermediaries” (Blaszczyk 2006). These were people with the profiles of an art director, advertising executive, consultant designer, or color expert, and their main goal was to inform business decisions regarding aesthetic risks (Blaszczyk 2006). Companies wanted to differentiate their products against the competition and used different marketing schemes, advertising methods, model variations, and “color merchandising” (E. H. Brown, Gudis, and Moskowitz 2006). The status of color was elevated from that of a mere aesthetic differentiator to a business-generating factor, so much so that Fortune magazine in the 1930s described how companies were using color to sell products in a feature called “the color revolution” (Blaszczyk 2012). Automobile manufacturers, fabric mills, and furniture makers worked with colorists to design color schemes and even product designs. The chemicals giant Du Pont had diversified into the paint and lacquer business and established agencies to consult companies on upcoming color trends by preempting customers’ aesthetic choices. Du Pont’s H. Ledyard Towle (Towle 1941) was among the first to set up color and design departments within such firms.

Established in 1838, Cheney Brothers Silk Manufacturing Company became the most popular and profitable silk manufacturer in the US by the late 1880s (Manchester and Brothers 1916). Not only did the company have innovative manufacturing techniques, but through strategic marketing, artistic photoshoots, and imagery, it associated its fabrics with desirability, aspiration, and a premium lifestyle. Gradually the company also began to forecast fashion trends. Due to its popularity, it eventually garnered the authority to set fashion trends across the industry, especially in color.

Similar instances occurred in the mainstream fashion and advertising arenas as well. As the distance between consumers and dressmakers increased, large apparel manufacturers and retailers began to use inspirational imagery created with help from fashion intermediaries. These imageries became more and more influential in driving sales. Images and advertisements from the fashion scene in Paris and other European cities were also widely admired (Shin and Cassidy 2015). As fashion cycles started setting in (E. Kim, Fiore, and Kim 2013), apparel manufacturers realized that forecasting fashion trends helped streamline manufacturing and styling processes and drove sales. During the mid 20th century, as the fashion industry became entirely synched with the periodic trend-led business model, a few individuals established themselves as fashion trend consultants. Among them, Tobé Coller Davis and Edward Bernays are considered two pioneers (Pouillard 2013b), who in their own ways contributed immensely to the field of fashion forecasting as we know it today. While Tobé’s methodology focused on observing trendsetters internationally and developing general guidelines for upcoming aesthetics, colors, etc., Bernays, a public relations pioneer, believed that by strategically diffusing its messaging, any product could be made ‘trendy’ (Pouillard 2013b).

Outside the fashion, color, and advertising scenes, none of which helped to formalize the process of forecasting despite making it popular, research was happening in other areas as well.

Post-WWII, the US government helped set up agencies like the RAND Corporation for the purpose of long-range military planning. The expert-consensus method ‘Delphi’ (Helmer 1967) was one of the many future-planning methods developed at RAND during the 1950s and 1960s. A symposium organized in 1966 by the US Air Force discussed the “place of technological forecasting” in military research and development planning (Long-Range Forecasting and Planning: A Symposium Held at the U.S. Air Force Academy, Colorado, 16-17 August 1966, 1967). Primarily quantitative demand forecasting techniques for industrial products were also being written about around the late 1960s. The 1960s was also a time when various new materials for everyday consumption were being invented and launched commercially (Market Trends for Selected Chemical Products ... and Prospects To ... 1960). Soon, decision makers in business and government agencies also started relying on technical future planning, utilizing existing data, to guide decisions regarding education, infrastructure, healthcare, etc. (Burton 1969).

By the end of the 20th century, agencies like Nelly Rodi, Peclers Paris, and Faith Popcorn were producing trend reports with a ‘trend analysis for strategic marketing’ spin. In a more recent example, ‘Trend Tablet’, owned by Lidewij Edelkoort, provides consulting services to well-known fashion, lifestyle, and consumer goods brands on what they think will be the next trend. Websites like trendwatching.com rely on a global network of ‘trend-spotters’ who constantly observe and report on happenings around them.

Forecasting has hence become a multifaceted yet ambiguous practice with little formalization and standardization of processes. By the end of the 20th century, it was clear that many industries relied on it for planning, marketing, and even innovating. However, different groups still had their own interpretations of trend forecasting and different methods of carrying it out, making its application ambiguous and difficult to adopt for design practices.

From the research in the area of trend forecasting in the past decade (Malhotra, Das, and Chariar 2015) (M. Evans 2011), we see that awareness of the relevance of forecasting methods in the field of design has increased. However, an approach that furthers systematic investigation by strategically leveraging forecasting methods for designing solutions still remains to be formulated. This dissertation situates itself at the intersection of the forecasting and design domains. Taking into account real-world challenges, it addresses holistic design needs through a strategic and systematic forecasting framework.

Consumer Needs

When exploring the topic of consumer trends forecasting, it is inevitable to address the subject of consumer needs itself. What are consumer needs, how are they changing, and what role does forecasting play in defining them are all pertinent questions that contribute to the discussion around forecasting and design.

When we as humans engage in any kind of consumption of products, services, experiences, etc., we become consumers. Other than our fundamental needs, the things that we consume shape our expectations and inform our needs. Marketers have traditionally described a need as the consumer’s perception of the difference between their current condition and what it might be (Smit and Bruce Archer 1973). As discussed in the previous section, public relations strategists such as Bernays knew how to use communication strategy to make a product appear ‘trendy’ or aspirational (Pouillard 2013b), thus creating the impression of a need in consumers’ minds. Cannon and Hasty explained that while the function of marketing is to identify a need, studies in human factors can define a consumer’s functional needs through observation (Cannon and Hasty 1978). In a 2002 study focused on Asian consumers, the authors examined the types of needs and concluded that the three types are experiential, social, and functional (Kim et al. 2002). Further, they stated that consumers’ experiential and social aspects actually


determined their purchases (Kim et al. 2002). How do forecasters and designers then discern the actual needs and determine which needs to solve for? Herbert Marcuse’s views on the “advanced industrial society” may provide a perspective (Marcuse 1964). Marcuse believed that with the advancement of industrial society there has been a loss of individuality, of originality of thought, and of the ability to ask questions (Marcuse 1964). He believed that with the advancement of industrial society, people become part of established societies and surrender to the pre-established norms of those societies (Marcuse 1964). Marcuse termed such a society a “one-dimensional society”, in which human beings act like others rather than acting autonomously (Marcuse 1964). He asked how, in such a society where individuals cannot think and act independently, they can understand what their real needs are (Marcuse 1964). He proposed that only when human beings are able to truly differentiate between “existence and essence, fact and potential, and appearance and reality” (Marcuse 1964) will they be able to define their own needs, and such needs will indeed allow them to question and break existing social patterns of conformity (Marcuse 1964). It is thus the shared responsibility of today’s designers and forecasters to search for a larger direction while defining needs, one that is ethical and questions the status quo.

Ambiguity in meaning and application of forecasting

As described above, trends forecasting is still an amorphous concept, and the practice itself goes by various different names. Practitioners are also referred to by varied titles such as trend analysts, trend forecasters, futurists, trend strategists, etc. The practice is applied with considerable variation in approaches and methodologies. As such, there remains a gap in identifying and establishing a baseline framework that is effective in forecasting the future of a trend, irrespective of the industry of application.


Even the literature on the topic of trends forecasting reflects this gap: authors have either given personal accounts of trend spotting and consulting without exactly describing the process on which forecasting hinges, or have written about methods applicable in totally disparate contexts and referred to each as trend forecasting.

Marketers, futurists, financial analysts, urban planners, product managers, and manufacturers employ a variety of forecasting methods to postulate future possibilities in their respective fields. J.S. Armstrong, in his book “Principles of Forecasting: A Handbook for Researchers and Practitioners” (Jon Scott Armstrong 2001), writes about eleven forecasting methods, the majority of them statistical. Futures Research Methodology, the book by J.C. Glenn, which includes contributions from many experts in different fields, incorporates thirty-nine forecasting methods of various kinds (Glenn and Gordon 2009). But the perspective on how forecasting methods can inform the process of design to create relevant products and solutions for the future still represents a gap.

Research Question and Methodology

The key research question that this dissertation addresses is: “How can a modular orchestration of forecasting methods enable their strategic selection to constructively inform design processes?” Alongside the core research question, there are also sub-questions that determine the structure of the dissertation, such as:

Q 1: How does a trend forecaster select from the myriad forecasting methods available?

Q 2: What attributes facilitate selection of forecasting methods towards informing the design process?

Q 3: What determines the order or combination of forecasting methods towards a design process?


Q 4: What warrants the need for forecasting studies to be the starting point of a design process?

The dissertation explores processes to determine attributes and needs towards a modular orchestration of forecasting methods. The subsequent chapters focus on addressing the attributes (Q 1 and Q 2), the order or combination of different forecasting methods (Q 3), and the need for a forecasting inquiry (Q 4). Chapter 2 starts by collecting, reviewing and analyzing methods and investigates whether any tentative categories emerge among them. Chapter 3 utilizes case studies as a method of qualitative sampling of scenarios to test whether a forecasting framework could have profitably informed the design process and decision making in these cases. The sample case studies are analyzed to yield (Q 3) the properties, attributes, or factors indicating the formation of a basic framework. Chapter 4 addresses Q 3 and Q 4 and proposes arranging the methods and attributes in a framework that may address the key research question. Chapter 5 applies the framework hypothesis derived in Chapter 4 to the sample cases of Chapter 3, and further discusses whether the application of the framework hypothesis satisfies the core research question along with the sub-questions. Chapter 6 provides the final discussion, the limitations of this study, and opportunities for future work.


CHAPTER 2

FORECASTING METHODS: AN ANALYSIS

A. METHODS OF OBSERVATION

A multitude of foresight research and prediction methods have been used by researchers over the years, and new ones are emerging every day. Today, more futurists, scientists, business planners and others globally are trying to postulate future prospects for society than ever before. While the practice of trends forecasting is being embraced by some groups, its significance still remains underrated in most discussions, or is left to an external agency or expert. Often, this is because decision-makers in organizations do not entirely comprehend the potential opportunities that utilizing forecasting methods can unlock. This chapter discusses the different forecasting methods which can lay the foundation of a composite forecasting framework.

Consumer trends forecasting needs to go beyond reporting on current trends and extrapolating past data to project future values. Forecasting methods need to not only identify the germination of new trends based on emerging events and happenings, but they also need to outline how these trends will develop in the future. They need to take into account how the overall ecosystem around today’s trends will shift in order to determine the trajectory of change.

Designers and creative firms strive to stay relevant to the times in order to grow and sustain their capital and conceptual value. Forecasters of fashion and design trends may rely on “spotting” them – on the streets, in design tradeshows, in fashion weeks etc. Print and digital media start broadcasting the spotted trends as the ‘next big thing’7 and manufacturers all over start churning out productions. The ‘next big thing’ gradually becomes mainstream and forecasters move on to spotting the new ‘next big thing’. But forecasting trends actually runs deeper than that.

[Fig. 2A.1: A representative trend curve, after the trend curve depicted by Brannon and Divita (Brannon and Divita 2015), fig. 2.2, showing the stages of a trend as it goes through introduction, adoption and becoming obsolete. Points A and B on the time axis mark origination and spotting, respectively.]

When trends are noticeable enough to be ‘spotted’, they have already started climbing up the trend curve (point B, Fig. 2A.1) and can likely become mainstream in the future. But that is not where they originate. They have perhaps originated much before that (at point A, Fig. 2A.1). And this origination of trends is a result of events happening in all areas – social, cultural, technological, political, environmental – far outside of the design shows and fashion weeks. Hence, the practice of forecasting needs to look at the whole ecosystem, which gives creative decision making credibility and context.
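To make the geometry of the trend curve concrete, it is often idealized as an S-shaped (logistic) adoption curve. The sketch below is purely illustrative: the logistic shape, the visibility thresholds for points A and B, and all numeric values are assumptions for illustration, not values from the literature.

```python
import math

def trend_level(t, k=1.0, t_mid=10.0):
    """Logistic adoption curve: one common idealization of a trend's
    rise from origination to mainstream saturation (level in [0, 1])."""
    return 1.0 / (1.0 + math.exp(-k * (t - t_mid)))

# Point A: origination -- the trend exists but is barely perceptible.
# Point B: spotting -- the trend is visible enough for media to notice.
ORIGINATION = 0.02   # assumed visibility threshold at point A
SPOTTING = 0.20      # assumed visibility threshold at point B

def first_crossing(threshold, k=1.0, t_mid=10.0, horizon=40):
    """Return the first (integer) time step at which the curve
    exceeds the given visibility threshold."""
    for t in range(horizon):
        if trend_level(t, k, t_mid) >= threshold:
            return t
    return None

t_a = first_crossing(ORIGINATION)
t_b = first_crossing(SPOTTING)
# The lag between A and B is the window a forecaster gains by scanning
# the wider ecosystem instead of waiting for the trend to be "spotted".
lead_time = t_b - t_a
```

Under these toy parameters the curve crosses the origination threshold a few steps before the spotting threshold; that lag is precisely the forecaster's potential lead time.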

So, where does the forecasting process really begin? Experts from different fields suggest different theories. Targeting managers and sales forecasts, Operations Research experts John C. Chambers, Satinder K. Mullick and Donald D. Smith (Chambers, Mullick, and Smith 1971b) recommended starting by answering the following three questions:

7 Here the phrase ‘next big thing’ refers to an idea, product, technology or design that is highly popular at a point in time, and not the book ‘Next Big Thing’ (Higham 2009).

• “What is the purpose of the forecast—how is it to be used?”

• “What are the dynamics and components of the system for which the forecast will be made?”

• “How important is the past in estimating the future?”

Asking these questions may prove extremely important before embarking on forecasting research, albeit they remain focused on commercial sales and operations. Another framework that attempted to evaluate the impact of current events on the development of trends was created by Paul Nystrom in 1928 (Brannon 2005). Nystrom, an American economist who connected fashion with retailing and distribution, created a framework to understand the spirit of the times, or Zeitgeist (Brannon and Divita 2015) (Brannon 2005). His framework listed factors that should be accounted for in order to capture the direction of fashion at a certain point in time.

His framework listed the following factors (Brannon 2005):

• Dominating events

• Dominating ideals

• Dominating social groups

• Dominating attitude, and

• Dominating technology

This framework was extremely helpful for the task of ‘observation’, but did not follow through with the next steps of analyzing trends or postulating the future of identified trends.


Once again, for trends forecasting to yield actionable insights, it is not enough to observe; one must also postulate changes in the behavior of people and in society at large, since these changes shape the evolution of trends.

Often, researchers dealing with just the observation and reporting of happenings around them claim to be trend experts. While the observation part of the process8 may be easier, it is the latter part, requiring the ability to postulate the future of current patterns, that is critical to trends forecasting. Writing about “future shock” (Toffler 1970), Toffler explained that while the importance of forecasting has long been acknowledged in a larger social context, it is the measurement of social change that poses the real challenge (Evered 1977).

Since the latter part of the 20th century, there has also been a great deal of research in the direction of measuring or estimating change, based on past and current information. J.S. Armstrong in his book “Principles of Forecasting: A Handbook for Researchers and Practitioners” (Jon Scott Armstrong 2001) writes about eleven forecasting methods, the majority of them statistical. Futures Research Methodology, the book by J.C. Glenn, which includes contributions from many experts in different fields, incorporates thirty-nine forecasting methods of various kinds (Glenn and Gordon 2009). With such a plethora of forecasting methods now available, researchers and scholars identify some as quantitative and some as qualitative (Makridakis, Wheelwright, and Hyndman 2008) (Green 2001). The term ‘technological’ methods has also been used for some qualitative methods (Ayres 1969). A novel array of modern computational techniques (Choi and Varian 2012b; LeCun, Bengio, and Hinton 2015a) has now been added to this existing list of methods. Computational machine learning methods leveraging search engine data are being used to forecast near-term trends in fashion (Zimmer and Horwitz 2015) and consumer goods (Choi and Varian 2012b). Contrary to this, many forecasters in the field of design and fashion still rely on their intuition for the task of ‘measuring’ the change in trends, based on the primary step of continuous observation through individual trend-spotters (ISEG Marketing & Communication School 2016).9

These examples show that researchers have attempted to address trends forecasting in different ways. Nystrom (Nystrom 1928) attempted to understand trends by observing the dominant themes of the day. Chambers, Mullick and Smith (Chambers, Mullick, and Smith 1971b) focused on the operational aspects of a system. Quantitative methods have helped estimate or calculate future values based on past patterns. In other words, some methods focus on the observation of current trends, while others focus on calculating future values from past patterns. However, with regard to trends forecasting, there may be elements that change the course of a future postulated from past patterns. John Kineman (Kineman and Kumar 2007) explained the importance of this aspect through a “relational model”. Referring to Robert Rosen’s work in “Anticipatory Systems” (Rosen 1985), Kineman states that his “relational model” is a model of epistemology, or “how science works”, the process of knowing itself. Kineman’s model shows that the natural system is observed and encoded by our senses and then decoded back to check against a descriptive system in order to test hypotheses or make forecasts. The model is considered successful if the encoding and decoding are aligned and the descriptive system mimics what we have seen in a natural system.

8 Schich in his paper ‘Figuring Out Art History’ (Schich 2015) refers to the Ciceronian method of collecting material, arranging it in a specific order that creates a story, and then communicating the said story. Trend forecasters have practiced this approach with events that are either current or have occurred in the past, but may not necessarily call it so.

9 Lidewij Edelkoort, through this video, describes her method of selecting from a collection as purely based on intuition. Schich’s (Schich 2015) explanation of the Ciceronian method of collection and feedback closing the hermeneutic circle may be relevant to the creation of trend stories by trend strategists like Edelkoort, but here the reference is to the fact that Edelkoort herself describes her method of selection as one based on intuition.
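Kineman's encoding-decoding loop can be caricatured in a few lines of code. Everything below is an invented toy: the 'natural system' is a simple doubling process and the 'descriptive system' merely infers a constant growth ratio. It is meant only to make the observe-encode-infer-decode-check cycle tangible, not to represent Rosen's or Kineman's formalism.

```python
# A toy rendering of Rosen's modeling relation as summarized by Kineman:
# a natural system is encoded into a descriptive (formal) system, an
# inference is made there, and the result is decoded back and checked
# against what the natural system actually does.

def natural_system(t):
    """Stand-in for the observed world: a simple growth process."""
    return 3 * 2 ** t

def encode(ts):
    """Our 'senses': take measurements of the natural system."""
    return [natural_system(t) for t in ts]

def formal_inference(observations):
    """Descriptive system: infer a rule (here, a constant growth ratio)."""
    ratios = [b / a for a, b in zip(observations, observations[1:])]
    return ratios[0] if all(r == ratios[0] for r in ratios) else None

def decode_and_check(rule, last_obs, t_next):
    """Decode the formal prediction back and test it against nature."""
    prediction = last_obs * rule
    return prediction == natural_system(t_next)

obs = encode([0, 1, 2, 3])               # encoding by observation
rule = formal_inference(obs)             # inference in the descriptive system
ok = decode_and_check(rule, obs[-1], 4)  # decoding and hypothesis test
```

The check succeeds here precisely because the toy system is 'simple' in Kineman's sense; consumer-trend ecosystems, as discussed in this chapter, are not.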

But Kineman also warns that such models are valid only in the case of a “simple system” (Kineman and Kumar 2007). Consumer trends, however, shaped by the triangulated and cumulative impact of social, technological, cultural, environmental and economic events, keep the consumer space changing continuously. Therefore, in addition to methods which help with the observation and estimation of trends, trend forecasters must also take into account methods which address the impact of additional, newer elements. In 1967, Herman Kahn and Anthony Wiener published their book “The Year 2000: A Framework for Speculation on the Next Thirty-Three Years” (Kahn and Wiener 1967), listing one hundred innovations that would take place during the following thirty-three years, until the year 2000. These innovations were based on contemporary trends, with the assumption that they would sustain and evolve due to increasing technological capability and the decreasing cost of technology. A review of this work by Richard Albright in 2002 revealed that fewer than fifty percent of the forecasted innovations came true (R. E. Albright 2002). The prime reason, as highlighted by Albright, was that the forecasts came true only in areas concerning computers and communication, where the increasing sophistication and declining cost of technology could be ‘calculated’. Shortly after Albright’s review, in 2004, Paap and Katz illustrated that in order to truly sense the future it is essential to take into account the changing customer and their needs (Paap and Katz 2004). They claimed that corporations have to maneuver the “dualism” (Paap and Katz 2004) of incremental innovations as well as disruptive ones.


They represented three distinct cases of interaction between changing needs, drivers10 and technology:

a) Old technology matures relative to the dominant driver.

This happens when an existing technology is replaced by a newer technology that meets the customer’s unmet needs better (Paap and Katz 2004). For trend forecasters, it is extremely important to make note of the newest technologies, as well as to cross-analyze their utility against customer needs. The evolution of trends based on such technologies is tightly linked to how well they meet existing needs. Tiny storage drives replacing re-writable DVDs and CDs are an example of such interaction.

b) The previous driver matures, a new driver emerges, and the old technology is unable to meet the needs of the new dominant driver.

This is a case where observing and collecting past and current information will only help to a certain extent; postulating the evolution of trends based on just what exists today will be erroneous and can prove commercially disastrous. It may, however, happen that customers cannot yet articulate the new need, or stipulate the exact difference between an existing solution and what they would rather have as a new solution, until they actually see it. In such circumstances, it is critical for the forecaster to know that products based on the incumbent trend will not work and that the trend will not sustain beyond the leverage limit11 of the driver. If at the current time there are multiple variants of the same product/solution/technology catering to that need, they will all become redundant at the same time. It will only take one product, created by rightly sensing the pulse of change, to make consumers realize an inherent need and render a number of existing products redundant.

Following the aforementioned two-step process of observation and estimation, the forecaster's role would be to monitor the emergence of new solutions and technologies on one hand, and to calculate the rate of change in factors associated with the existing needs on the other. If the existing needs show a decline, forecasters need to look for trends in ancillary aspects.

c) The environment changes.

In this case, due to changes in social, cultural, economic and other factors, customers’ needs go through a paradigm shift (Kuhn 1962) and an entirely new need emerges. The emergence of a new need may or may not necessitate new technology. But satisfying the new consumer need does require a disruptive innovation, be it in the form of a new invention or a new model of facilitation, distribution or sales.

With observation and estimation being the primary steps, such a situation warrants sleuthing for disruptive or interventional trends. Such trends may have been present but remained imperceptible; a change in external factors can lead such dormant trends to surface prominently. Hence, the facet of looking out for such interventional trends must be included, in addition to observing past and current trends and postulating their future evolution, to holistically formulate the trend forecasting framework.

10 Paap and Katz (Paap and Katz 2004) define ‘driver’ as the performance characteristic with the greatest leverage for customer decision making.

11 The leverage limit is the point beyond which the customer no longer values or desires incremental enhancements in performance (Paap and Katz 2004).


Observation, Estimation and Intervention

Part of the hypothesis to address the core research question stated in Chapter 1 was to identify emerging tentative categories among the forecasting methods. The three categories identified in the above discussion will be used to classify and review the forecasting methods: (i) methods for observation of trends are discussed in the next section of this chapter, (ii) methods for estimation of trends are discussed in Chapter 2B, and (iii) interventional methods are discussed in Chapter 2C.

For a detailed review of each method we will look at:

(A) A short introduction of the method

(B) Known history of the practice of the method

(C) Some of the method’s convincing applications

(D) Full description

(E) Strengths & Weaknesses

(F) Conclusion about the applicability of the method

Methods for Observation of trends

The following methods are discussed as representative of the process of observing patterns in events across different fields and formulating trends. We take a detailed look at Environmental Scanning (K. S. Albright 2004), or STEEP analysis (Cronje 2016); Nystrom’s framework to capture the Zeitgeist (Brannon and Divita 2015); a design trend forecaster’s approach to observing trends; and Machine Learning.


Environmental Scanning

Method Introduction:

Environmental scanning looks for beacons of change in the social, cultural, technological, environmental, economic and policy spheres to identify future trends and patterns.

History:

The term “environmental scanning” gained popularity particularly in the 1960s and 70s, as more and more organizations started focusing on building a picture of the future (Aguilar 1967). Aguilar explained ‘environmental scanning’ as a process of gathering information (Aguilar 1967), and of systematically scanning the collection of external and internal information to decrease randomness, make patterns visible, and provide managers with early indications of external changes. This notion was reinforced by many others in the following decades. Bauer, in 1984, equated environmental scanning to a conceptual radar which constantly scans world events to highlight anything new or unexpected, something with a strong impact, or even something seemingly trivial (Bauer 1984). Fahey et al. believed that continued environmental scanning better equips decision makers to understand current and potential changes in their institutions' external environments, in order to boost the strategic intelligence that informs organizational strategies (Fahey, King, and Narayanan 1981). Kendra Albright considers environmental scanning a vital tool for organizations’ strategic planning and a radar for success (K. S. Albright 2004). Most recently, the approach has also been applied in conjunction with other methods such as systems thinking12 (Haines 2016) and studies based on the so-called Delphi method (see below). Haines (Haines 2016) stresses the benefits of incorporating environmental scanning as part of systems thinking when planning to deliver a solution by a certain time in the future.

12 Systems thinking is a holistic approach to understanding how the constituent parts of a system interrelate and function together, in the context of larger systems (Checkland 1999).

The area of application has also expanded to a broad variety of industries ranging from business and marketing to fashion and tourism. The following section provides specific examples of the application of environmental analysis to various fields.

Applications:

While environmental scanning was initially considered a tool for business and organizational strategy, with shrinking lead times decision makers across domains have come to realize the merit environmental scanning has in assessing changing external patterns. Most empirical methods prove useful in short-term prediction, and long-range planning models, with their reliance on historical data, do not take into account the possibility of change brought about by external factors. This is where environmental scanning brings value, by bringing together a picture of the larger external environment, recent dynamics within the concerned domain, and its interpretation of the internal systems. Perhaps this is the reason that this method is finding wider acceptance both where improvements are sought and where ground-breaking innovation is the need. Recently, Evans suggested directions for successful social media campaigns of the future (W. D. Evans 2016), while the indications for the hotel industry of China were explored through environmental scanning by Tavitiyaman et al. (Tavitiyaman et al. 2016). Aremu et al. conducted a strategic marketing exercise for the Nigerian insurance industry (Aremu, Gbadeyan, and Aremu 2016), while Durst et al. presented a holistic strategic foresight approach for the German Federal Armed Forces (Durst et al. 2015) based on environmental scanning studies. Other areas where environmental scanning is frequently used include new product development, innovation strategy, marketing, advertising, e-commerce, social media, fashion and lifestyle products, and the automotive industry.

Method Description:

According to Cronje, environmental scanning involves continuous study and interpretation of the STEEP13 factors: social, technological, environmental, economic and political events and trends (Cronje 2016). The key idea is that even subtle changes in these factors create patterns or ‘trends’. The prime objective of environmental scanning is to sense the pulse of these trends and forecast the impact they are likely to have on human behavior and other areas, thereby saving lead time. Continued scanning of the STEEP factors, and of other factors pertinent to specific industries, can enable decision makers to stay attuned to the latest developments in their field and take preemptive action for maximum benefit. Environmental scanning not only pertains to external factors in the international or macro-level environment but also enables deeper inquiry into the dynamic and structural elements of these factors (Kroon 1990). Since environmental scanning is an ongoing process, it not only elicits preemptive action; it also surveys current conditions to find competitive advantages in business, new markets, upselling opportunities, changing customer preferences, and upcoming technological tools and platforms. Hence, the approach has the ability to forewarn about critical changes likely in the future, identify signals of big events while they are still nascent, and suggest a strategic course of action to ensure the least disturbance or loss.

13 The STEEP acronym is commonly used to refer to the stage in new product development or innovation strategy discussions for assessing relevant external factors. Such discussions often involve cross-functional teams including but not limited to design, business, marketing, development, engineering etc. Other commonly known versions of STEEP are PEST, PESTLE, PESTEL etc. http://pestleanalysis.com/what-is-steep-analysis/.

Scanning systems can distinguish between what is constant, what changes, and what constantly changes (Glenn 2003). Generally speaking, environmental scanners, which could be individuals or trained groups using collective cognition, follow these steps (K. S. Albright 2004):

a) Identification of the current needs and issues within the business or microenvironment. This step sets the stage by defining the scope and direction of the scanning activity. It outlines the participants, duration, focal points and other resources available for the scanning process.

b) Albright specifies that once the scope is established, the next step entails gathering information spanning a wide variety of sources (K. S. Albright 2004). These sources commonly include, but are not limited to, journals and magazines, books, newspapers, professional conferences, television, the internet, customer interviews, commercial databases, internal reports, conference papers, sales teams and reports, and internal databases (K. S. Albright 2004).

c) Information gathering is followed by analysis of all the information the business has collected. As the analysts look for patterns and leading indicators in the wide range of information collected, the organization is also made aware of any influencing trends or issues.

d) This analysis is interpreted by experts and decision makers and translated into strategic, actionable insights for the business. The information and analysis are reported and presented to the stakeholders, cross-functional teams and decision makers.

e) Based on the insights, directions and recommendations are made to the management, in order to project the business onto a favorable trajectory with respect to the future business environment.
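As a loose illustration only, Albright's steps can be mirrored in a minimal data pipeline. The signal items, the sources, and the frequency-count 'analysis' below are hypothetical stand-ins; a real scanning effort would involve trained analysts interpreting a far wider range of sources.

```python
from collections import Counter
from dataclasses import dataclass

# STEEP categories used to tag each piece of gathered information.
STEEP = ("social", "technological", "environmental", "economic", "political")

@dataclass
class Signal:
    source: str      # e.g. a journal, newspaper, or internal report
    category: str    # one of the STEEP categories
    summary: str

def scan(signals):
    """Steps (c)-(d): analyze gathered signals and report which STEEP
    categories show the most activity, as a crude proxy for patterns."""
    counts = Counter(s.category for s in signals if s.category in STEEP)
    # Rank categories by how many independent signals point at them.
    return counts.most_common()

# Step (b): hypothetical gathered information.
signals = [
    Signal("trade journal", "technological", "low-cost sensors spreading"),
    Signal("newspaper", "technological", "on-device AI in consumer goods"),
    Signal("internal report", "economic", "component costs falling"),
]

report = scan(signals)  # step (e): feed ranked categories to management
```

Ranking STEEP categories by signal count is, of course, a crude proxy for the expert pattern-finding that the analysis step actually requires.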

Strengths & Weaknesses:

Strengths: Traditional long-range planning models and short-term statistical models based on past data often function under the ‘assumption of constancy’ (Makridakis, Wheelwright, and Hyndman 2008), where future changes are thought of as a mere continuation of the current direction. The rate and kind of change in present social, technological, economic, environmental and political trends is not accounted for. While such models might work for a ‘surprise-free’ future, that tends not to be the case. This is where environmental scanning adds value.

Saxby et al. assert that environmental scanning not only provides an assessment of the external factors, but can also guide organizations to induce strategic change from within to better align with the changing environment (Saxby et al. 2002). Insights from environmental scanning may result in successful strategies that help an organization attain "a favorable position in an industry" and a "competitive advantage" (Porter 1985). Nokia, for example, was able to foresee the upcoming slump in the cellular phone industry around 2007-08. As a result, it was able to plan for a slowdown in growth and managed to stay comfortably afloat through the recession. Some of its competitors, on the other hand, taking their cue from previous year-on-year performances, predicted 30-40% growth (Direction 2007). Eventually, when demand in the European markets turned out to be much lower than expected, giants like Sony Ericsson were led to issue profit warnings.

Kotler identifies the detection of new opportunities and beaconing demand as one of the key benefits of environmental scanning (Kotler 2009). Another merit of environmental scanning is that it keeps executive decision makers aligned with real market conditions, threats and opportunities, and forces strategic decisions to be vetted by data rather than by their own opinions (Fahey, King, and Narayanan 1981). Environmental scanning also increases sensitivity to customers’ changing needs and urges companies to be proactive in monitoring, predicting and responding to market trends (Morrison, Wilson, and Didsbury 1996).

The insights revealed by environmental scanning can be compared and verified with quantitative methods in order to make decisions. At the same time, it is a powerful tool when used in combination with other methods such as Delphi, decision analysis and cross-impact matrices.

Weaknesses: While gathering information through the scanning process, selection bias may be involved; some important information can end up being overlooked or simply missed. Interpreting the information depends to a large extent on the expertise of the analysts and researchers, and may take time. Due to the ever-changing environment, information may be valid only for a certain period of time before it changes due to external factors (K. S. Albright 2004).


Conclusion:

Environmental scanning is an interpretative method that is not usually hard-coded in a model. What it does provide toward foresight research is a framework and guidelines for evaluating the impact of various factors pertinent to the research domain. The accuracy of a forecast depends on the level of proficiency of the experts carrying out the collection and evaluation of the gathered information. At the very least, this method provides a starting point for inquiry into changing patterns and trends at a social, cultural, technological, economic and policy level.

Nystrom’s Framework for Zeitgeist

Method Introduction:

Nystrom’s Framework for Zeitgeist denotes areas of influence that shape aesthetic sensibilities for the fashion of the current times (Brannon and Divita 2015).

History:

Paul Henry Nystrom regarded ‘Fashion’ as one of the key propelling forces behind businesses (Nystrom 1928). His expertise in economics and experience in the marketing world led to a discourse on the pragmatic utility of fashion for economics. Nystrom’s framework for the Zeitgeist first appeared in his book “Economics of Fashion” in 1928, as the set of factors that cause and determine the evolution of fashion (Nystrom 1928). At the time of writing “Economics of Fashion”, Nystrom already had twenty years of experience as a business executive in fashion-related industries. He also served as the Director of the Retail Research Association for seven straight years, from 1921 to 1927. During this tenure, his observations and interactions with some of the best retailers informed his thinking about these dynamics. Nystrom argued that even if goods had qualities such as durability, functionality, longevity etc., but were not in sync with contemporary fashion, they would not sell (Nystrom 1928). Therefore, in order to make pragmatic business decisions, companies needed to consider the movement of fashion. Nystrom had initially listed only three main factors for identifying the Zeitgeist: dominating events, dominating ideals and dominating social groups (Nystrom 1928). The other two, dominating attitudes and dominating technology, were added later in recognition of the advancing complexity in the way fashion changed (Brannon 2005).

Over the years, Nystrom’s framework has acted as a foundation on which forecasters base their nuanced studies of changing times and changing consumer preferences. In 1974, George Sproles outlined a conceptual framework for fashion theory based on consumer behavior in fashion (Sproles 1974). Annette Ames built long-term scenario projections of fashion, 10 to 20 years into the future, leveraging Nystrom’s framework to elicit the socio-cultural factors which play a dominant role in defining the aesthetic sensibilities of the projected future scenario (Ames 2008). Mackinney-Valentin co-authored the “Driver Analysis-Reading Trends” (DART) approach for organizing intuition towards the conceptualization of design (Mackinney-Valentin 2011). The DART methodology organizes intuition by looking holistically at the different mechanisms that drive creative trends (Mackinney-Valentin 2011); the trend mechanisms referred to in DART include social identity and market logic as well as the formation of the Zeitgeist (Mackinney-Valentin 2011). Mackinney-Valentin draws upon Nystrom’s framework to structure the mechanism of the formation of the Zeitgeist, despite it originally being created in 1928. Thus, Nystrom’s Zeitgeist framework serves as a fundamental tool that helps garner a sense of the current times to fuel creative processes.

Applications:

Nystrom presented the framework for the Zeitgeist (Brannon 2005) based on a thorough analysis of the post WW I state of industrial and commercial practices (Nystrom 1928).

Nystrom stated that the factors he listed as dominant factors driving aesthetic movement impacts more than just apparel and textile industries. The framework was stated to be hold practical applicability for a wide range of industries such as interiors and décor, automobile, kitchen ware and appliances, upholstery, music and instruments, in addition to fashion apparel, accessories, footwear and cosmetics (Nystrom 1928). Nystrom’s framework does not only evoke an organic discussion about the state of things around but it further a very pragmatic purpose of establishing commercial validity for the design of consumer goods.

Speaking of the concept of the Zeitgeist, or the “spirit of the times”, it is also important to mention the concept of ‘modernity’. The nineteenth-century French poet Charles Baudelaire explained the concept of modernity in “The Painter of Modern Life” (Baudelaire 1964). Baudelaire defined modernity as “the transitory, the fugitive, the contingent which make up one half of art, the other being the eternal and the immutable” (Baudelaire 1964). Baudelaire wrote that every artist tries to capture the ‘modernity’ of their time in their work (Baudelaire 1964), and that it is important to do so, as by not taking into account the contemporary sense of modernity, the idea of beauty and aesthetics can quickly become “abstract and indeterminate” (Baudelaire 1964). He further cautions against the misrepresentation of what was modern during a particular era, as that can distort the view for future generations (Baudelaire 1964). It thus becomes important to understand the value of methods that inform forecasters of what defines aesthetics and newness in the current times.

Full description:

Ninety years ago, in 1928, the American economist and marketing professor Paul Nystrom created a framework to sense the pulse of the times, called the Zeitgeist, literally the ‘spirit of time’ in German. He identified a list of social and cultural factors that shaped the aesthetic sensibilities of the time. Nystrom researched how changing demographics were transforming social beliefs and values, how the development of trade and markets changed the economy, how technological developments evolved society, and how political events shaped the moods and morale of the masses. Being able to map the spirit of the times was important, as it transcends all consumer product categories and connects the links in consumer experiences. Since consumers tend to be influenced by the social and cultural environments around them, they are likely to choose products that reflect the same spirit14. Nystrom’s framework consists of a list of five key factors, each of which contains deeper classification. Nystrom’s list of factors includes (Brannon 2005):

o Dominating Events – The 1920s were a decade of historic events in the U.S. The unpopular 18th Amendment, banning the sale and consumption of alcohol, was passed. Women were allowed to vote, following the passing of the 19th Amendment. The Great Gatsby was published in 1925. Nystrom categorized dominant events into three categories – significant events, art vogue and accidental events (Brannon 2005). At the time when Nystrom drafted the framework, “significant events” were described as events having impact over a very wide audience, and even worldwide in some cases, such as wars, deaths of world leaders or world fairs (Nystrom 1928). Over the years the fundamental definition of “significant events” has not changed, even if the nature of world events might have. Academy Awards ceremonies, held every year and attended by influencers and celebrities, have a far-reaching impact not just for the fashion sported, but also for social messages and themes. In 2018, at the 75th Golden Globe Awards, Hollywood celebrities registered protest by wearing black in support of the #TimesUp movement (“TimesUp” 2017). This created widespread awareness through different media channels, and ordinary people came out with their own stories of sex-based inequality. Hence, such events are not only a reflection of the spirit of our times but also part of creating it (Brannon and Divita 2015). Other examples of significant events in today’s age may include stock market or other widespread economic happenings, terrorist attacks, or natural disasters. Flappers and jazz music defined the art vogue of the 1920s, and the discovery of Tutankhamun’s tomb was an accidental event that created an increased interest in Egyptian culture worldwide.

o Dominating Ideals – The next factor Nystrom considered important for influencing the Zeitgeist was identifying the predominant ideals in society. At the time of writing “Economics of Fashion”, Nystrom picked up on ideals of patriotism, religion, and Hindu ideals of contemplation rather than consumption (Nystrom 1928). Having provided a fair description of what he meant by dominating ideals, Nystrom claimed that such ideals have never been so strong on their own as to cause lifestyle changes for society (Nystrom 1928).

o Dominating Social Groups – Nystrom described dominating social groups as individuals in positions of power, money or leadership whom the rest would aspire to emulate (Nystrom 1928). The times of nobility and royalty may not be as they were at the time of the formulation of the framework, but it is still true that a handful of people, by virtue of their celebrity status, have the power of popularizing fashion. Achievers in the mainstream media industry have always been at the forefront of starting fashion conversations for their patrons. Today there is also a huge following of social media influencers, who appear more relatable yet aspirational. When social media influencers endorse brands and share experiences they seem more realistic, and hence are paid handsomely by companies.

o Dominating Attitudes – The attitudes dominant during an era were not originally penned by Nystrom as a factor influencing the Zeitgeist. Fashion scholars believe that the attitudes during a certain era determine the overall fashion sense of the times (Brannon and Divita 2015). Brannon describes the two phases of dominant attitude as the desire to fit in vs. the desire to stand out (Brannon 2005). Further, when the attitude is to fit in, it gives rise to only incremental innovation and conformity in fashion, such as that seen in the 1930s and 1950s in the U.S. (Brannon 2005). On the other hand, when the overall attitude is to stand out, it gives rise to fashion revolutions such as those seen in the 1920s and 1960s (Brannon 2005). Not just fashion but other creative fields such as music, art and film-making also reflect and participate in the dominant attitude.

o Dominating Technology – Technology is another factor that Nystrom did not originally conceive as part of the framework. Today, since the state of technology has become an essential characteristic of modern-day living, it is undoubtedly one of the key factors influencing the spirit of our times. Technology is not only changing how goods are produced and made accessible to the average consumer, but it is also changing how we live. Our lifestyle, behavior, and experience have all become habituated to an expected use of technology. Our perception of ourselves and the technology we use have also come to be interlinked. Whether it is convenience or creativity, technology manifests itself in every sphere of modern life.

14 ‘Zeitgeist’ likely alludes to the ancient concept of ‘spirits’, such as the Roman “Lares” being “spirits of the house” or the Christian “Holy Ghost”, called “Heiliger Geist” in German.

The five pillars of Nystrom’s framework – events, ideals, social groups, attitudes, and technology – though conceived originally for determining the fashion trends of the time, actually convey deeper and more holistic ideas about the state of behavior, choices, and sensibilities of the time. For this reason, not only is the Nystrom framework still relevant, but it is also applicable in the larger context of consumer trends.

Strengths & Weaknesses:

It is both a strength and a weakness of the method that it collectively addresses the factors that cause fashion movements as well as the factors that participate in them. For instance, trends in music or technological advancements are both reflections of the state of society and drivers of new and upcoming trends in lifestyle choices. The method therefore converges the roles of drivers and indicators of the Zeitgeist.

Nystrom’s framework provides a holistic and over-arching view of the developments in different spheres of society. However, it is imperative that researchers and scholars adjust the implied meaning to the day and age in which they are applying the framework. For instance, Nystrom’s reference to nobility and royalty as dominant social groups is still valid today, but their implied meaning has expanded to include actors, musicians, entrepreneurs, even First Ladies of states.

Just as the factors of dominant attitudes and dominant technology were appended to the three dominant factors of events, ideals and social groups originally conceived by Nystrom, the framework may see further inclusions in future years, since the core methodology stays valid even with new developments.

Conclusion

Nystrom’s framework indeed presents a thoughtful way of observing and assimilating important pointers for gauging the drift of what is considered “in vogue”. It is a subjective and interpretative method in which the researcher’s familiarity and expertise need to be ensured for a thorough analysis of dominant factors. Nystrom’s framework has been called upon for mid- and long-range forecasting (Ames 2008) of creative directions.

Design Trends Analysis

Creative industries across the world look for visual, physical and functional cues for creating consumer products in spheres such as fashion, lifestyle, architecture, technology and luxury, inspired by events and issues in the world at large.


A distinct case of classic interpretative methods of forecasting is forecasting in the area of design trends or fashion trends. It deserves detailed mention since it is not widely discussed or academically written about, yet it is a practice on which many creative industries rely for future directions. Trend forecaster Lidewij Edelkoort presents an interesting and artistic perspective on the process of forecasting research. Edelkoort is a trend forecaster of international repute. She has consulted for multiple global brands such as Coca-Cola, Nissan, and Camper on product features, product identity, and overall conceptualization and development strategy. Many of her clients are from the luxury and beauty space, such as Estée Lauder, Lancôme, L'Oréal, Dim, and Gucci (“Lidewij Edelkoort” 2016). In her own words, Edelkoort describes the process of forecasting as:

“Seeing into the future a few years or few seasons ahead to help industry correctly construct a new collection, a new idea, a new philosophy, a new vision, because you know beforehand what will happen. The way you know it is by using your intuition more than anything else. Based on trained intuition15 and logic16 you interpret what you are catching from the zeitgeist. As a profession (trend forecasters), catch on to small fragments of society every day. When you have enough of them and when you have a major one, then suddenly you have ‘aha!’ moment. You figure - this is what it is, and this is what it means. Once you have the word and the meaning then you can start trying to understand what’s happening and then to intellectualize, to prepare a reason for why this would happen. Because you need to show your clients a reason and not just a whim” (ISEG Marketing & Communication School 2016).

15 Edelkoort’s concept of trained intuition recalls Zwicky’s directed intuition, as discussed further below. Zwicky’s predictions in the field of astronomy were made using what he referred to as the morphological method of directed intuition. His morphological approach gave way to forecasting methods whose validity could not be proven, but Zwicky recollected the methods as being very successful during the 1930s–1960s (F. Zwicky 1971) in predicting the existence of yet unknown cosmic objects. The method of “directed intuition” involved directing the researcher’s intuition along known guidelines under the process of morphological analysis. Edelkoort’s terminology of “trained intuition” is based on the forecaster’s interpretation of the Zeitgeist, thus highlighting the conceptual synergies between the informal process of design trends forecasting and scientific and technological methods based on the morphological approach.

16 Logical reasoning, with roots in philosophy, means that Edelkoort’s method relates to measurement as data modelling relates to data analysis.

As with any kind of interpretative method, data collection is an important part for Edelkoort as well. She describes data collection for forecasting as a continuous process. From this huge array of data, she then intuitively selects elements. This particular step is considered crucial, despite the fact that this intuition is not scientifically questioned at this point (Edelkoort 1997), (Edelkoort 1999). Talking about Edelkoort, Diane and Cassidy state that she “promotes the idea that forecasting methodology is artistic in nature and that it is not possible to apply any objective investigation into it” (Diane and Cassidy 2009). Diane and Cassidy recount Edelkoort’s seminar at the Briggait Centre, Glasgow, in 1999, where Edelkoort explained the methodology with respect to color forecasting as follows (Diane and Cassidy 2009):

• Edelkoort collects, on a continuing basis, any representation of color, available in any shape or form, that personally appeals to her.

• These samples are arranged and stored based on their hues. For six-month seasonal color cycles, color representations are intuitively picked from the collection.

• When the occasion arises, Edelkoort instinctively picks color samples from her collection, synthesizing stories or themes from the selected palette of colors and connecting them based on subconscious influences.

• Based on such narratives, story boards and color boards are created, which are discussed with clients or partners to develop product ideas.

The process described above involves four key steps - collecting materials, arranging them in a certain order, creating stories, and communicating them to a relevant audience. This process is described by Schich (Schich 2015) as the classical Ciceronian method of developing a speech.

The feedback that Edelkoort provides to her clients and audience through the narration of story boards completes the hermeneutic circle, where detailed insight and a general big picture depend mutually on each other (Copeland 1995). Thus, similar processes practiced by trend analysts for the observation and analysis of contemporary happenings can be said to follow the classical Ciceronian process of rhetoric invention (Fortenbaugh 1998).

Based on a personal meeting with Edelkoort in 2012, I can recount an anecdote she narrated – “Once me and my partner were at a shop. My partner happened to see some copper containers, which he showed to me. I instantly knew, that copper is going to be all around.”

Factually speaking, between 2012 and 2017 we have seen a sharp rise in the popularity of copper as a color option for home products, gadgets (iPhone Rose Gold), material finishes, patinas, wall finishes etc. This anecdote outlines two possibilities – first, that forecasters like Edelkoort can intuitively estimate which design trends will become popular in the future, and second that forecasters can also be the agency influencing the propagation and spread of certain design trends. Above we have seen examples of both such possibilities, where Tobé Coller Davis exemplified forecasters intuitively ‘picking’ trends, and Edward Bernays exemplified forecasters acting as influencers, propagating the spread of certain design trends (Pouillard 2013a).


The most crucial component of forecasting in this case is the ‘trained intuition’ informed by a periodically and consistently created collection of data, sources, materials or inspiration, which is assimilated at a subconscious level. The trends that emerge from such assimilation provide references and guidelines for designers in different areas of consumer experience and orientation such as apparel, lifestyle goods, automobiles, accessories, cosmetics, fashion, design, art, cuisine, films, music, etc. Trend forecasters often propagate this information through mood boards comprising thought-provoking photography, drawings and text, art and artifacts, exhibitions and even samples of materials and colors. Creative professionals and designers then interpret these in a way that is innately personal and influenced by personal environment, culture and experiences.

The way in which Edelkoort, Davis and Bernays project trends can be viewed as a shared or co-construction17 of trends between the consumers and the forecasters, and even the designers in some cases. Roll mentions the example of the fashion brand Zara, which prides itself in two areas – understanding its customers’ needs and bringing products to them in the fastest turnaround time possible (Roll 2016). Citing a specific incident in 2015, Roll states that when a customer in one of the brand’s stores asked the staff for a pink scarf, similar instances were recorded by store staff in Toronto, San Francisco and Frankfurt at almost the same time (Roll 2016). Roll mentions that similar demands were recorded at many other Zara stores across the world over the next few days (Roll 2016). Zara, being the leader of fast fashion, shipped 500,000 scarves to more than 2,000 stores globally within seven days, and they sold out in three days (Roll 2016). Advocating a blurring of the boundaries between ‘for-profit’ and ‘non-profit’, Porter and Kramer urge companies to invest in the development of new products that create shared value, not just for the company but also for society (Porter and Kramer 2011).

17 In the Trend Report for the year 2011, I specified the trend of ‘Co-creation’, whereby companies were explicitly inviting customers to send their ideas, preferences or stories to them. If the company utilized their idea or contribution, it would acknowledge them through some form of its branding. While this put the company directly in touch with its customers’ thinking, the customers felt a sense of pride for making a contribution and would feel loyal towards the brand for a longer time.

Once regarded as one of the worst industrial polluters, GE put its Ecomagination strategy in place in 2005 to provide resources while reducing environmental impact globally (General Electric 2015). Porter and Kramer, citing the example of GE’s Ecomagination products, which reached $18 billion in sales by 2009, emphasize the value of creating something whose value is shared between companies and society. Writing about the phenomenon of “co-creation”, Degnegaard indicates the strong connection between innovation and co-creation with users (Degnegaard 2014). Degnegaard notes that what is critical about bringing together users, who act as innovators, and producers or designers for the purpose of co-creation is that certain boundaries or limits are put in place to enable the right kind of innovation (Degnegaard 2014). He further states that such co-creation necessitates that the solutions attained are sustainable and work towards solving problems for the community and society at large (Degnegaard 2014). It then becomes the key responsibility of forecasters and designers to be ethical while transcribing trends, always considering the impact of such trends and upholding the larger social benefit.

After Bijker (Bijker 1995), Oudshoorn and Pinch describe the idea of co-construction as the “mutual shaping of social groups and technologies” (Oudshoorn and Pinch 2003). In their discussion, Oudshoorn and Pinch refer to “technology” in general, with the key focus on users rather than producers (Oudshoorn and Pinch 2003). Their premise is that the way users interact with, adopt, reconfigure, or adapt to technology represents a “co-construction” of users as well as technologies. Oudshoorn and Pinch consider the phase of testing a technology with users to be extremely important, as it not only checks the definitive pre-supposed purpose of the technology but also explores users’ behaviors, thus testing the range of the technology’s capabilities as well as its limitations (Oudshoorn and Pinch 2003). They also illustrate the possibilities of conflicts, or the politics, in the context of co-construction of users and technology due to cultural factors or simply disparities in power (Oudshoorn and Pinch 2003).

The topic of co-construction has always been relevant for defining society. In 1964, in his book, “One-Dimensional Man: The Ideology of Advanced Industrial Society”, referring to the “industrialized” society, Marcuse said that society has changed so much that a new model is required in order to truly define a “free society” (Marcuse 1964). He argues that beyond biological needs, all other needs of human beings are pre-conditioned by the existing norms of society, making it important to distinguish between true and “false needs” (Marcuse 1964). Marcuse defines “false needs” as those which an individual adheres to in order to conform with the social setup of what one should and should not need to have (Marcuse 1964). Marcuse believes that this framework of needs prescribed by society is repressive and that, no matter how much one believes them to be one’s own, they remain “false needs” (Marcuse 1964). In the context of products, technologies and trends it is interesting to consider which needs are being addressed and whether these contribute to Marcuse’s definition of a free society or add to societal norms of repression. He suggests that neither capitalism nor socialism but the construction of society “from below” is the better approach (Marcuse 1968). By “co-construction from below”, he meant “a socialism of cooperation and solidarity, where men and women determine collectively their needs and goals, their priorities and the method and pace of modernization” (Marcuse 1968). With multiple perspectives from which co-construction with respect to trend forecasting can be explored, my argument in this regard is pending further investigation.


B. METHODS OF ESTIMATION

Once the observation of current events and existing environments leads to the identification of trends, the next step is to estimate how these trends may unfold in the future – whether they will prevail over the long range or the short term, how they impact consumer preferences, and how they manifest in the form of new products and solutions. Here, we review methods of statistical modeling, cross-impact analysis, decision analysis and morphological analysis to gain insight into methods of estimation.

Statistical Modeling

We proceed with a general introduction to the area of statistical modeling, followed by a specific focus on Regression and Time Series Analysis. There are many different kinds of statistical models, all of which have their base built on statistics of observations. Statistical modeling techniques rely on using a mathematical model as a substitute for reality. Burnham and Anderson speak of models as simplified representations of reality, which may not convey everything about the actual situation (Burnham and Anderson 2003). Statisticians may formulate the degree to which the model deviates from reality.

Statistical modeling represents a set of assumptions, through certain variables, about a sample of data from a larger population. The model may involve random variables, which can take any value from a given distribution as a result of the defined function and the different values of non-random variables. The sets of assumptions represented by the statistical model define a set of probability distributions over the sample space. The embodiment of such probability distributions differentiates statistical models from other mathematical models (Kelley 1947). A statistical model is, as such, a theoretical representation of the data-generating process.

Strengths: Regarding the use of statistical modeling for futures research, its strength stems from the fact that the assumptions for modeling are rooted in existing, often historical, data. Such models provide an understanding of the forces that shaped historical events. Hence, they can be relied upon to provide a baseline forecast for future happenings (provided no changes occur to the factors in the historical data) within confidence intervals that can be calculated. As our focus here is on using statistical modeling for forecasting, we will take a deeper look at methods such as Regression Analysis and Time Series Analysis. These models ‘forecast’ by deriving a relationship based on how changes in the values of independent variables impact the values of dependent variables (Guerard Jr 2013). Dependent variables are the output the model delivers based on the different values of independent variables that have been input into the model. Methods such as Regression Analysis and Time Series Analysis, which explain the impact of time as an independent variable on other, dependent variables in a model, are of particular significance for forecasting research.
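The relationship between independent and dependent variables that such models derive can be made concrete with a minimal sketch. The code below is an illustration rather than any specific method from the literature: the function name `least_squares_fit` and the five data points are invented for this example. It fits the linear model y = a + bx by ordinary least squares, using only the Python standard library.

```python
# Minimal sketch: fitting the linear model y = a + b*x by ordinary
# least squares. The data points are invented for illustration.

def least_squares_fit(xs, ys):
    """Return (intercept, slope) minimizing the sum of squared errors."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # slope = covariance(x, y) / variance(x); intercept follows from the means
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var = sum((x - mean_x) ** 2 for x in xs)
    slope = cov / var
    return mean_y - slope * mean_x, slope

xs = [1, 2, 3, 4, 5]              # independent variable (model input)
ys = [2.1, 3.9, 6.2, 7.8, 10.1]   # dependent variable (observed output)

intercept, slope = least_squares_fit(xs, ys)
print(round(intercept, 2), round(slope, 2))   # approximately 0.05 and 1.99
```

The slope and intercept come from the closed-form least-squares solution; a real analysis would additionally examine residuals and goodness of fit before trusting the fitted line.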

Weakness: Regarding the use of statistical modeling for futures research, the assumptions embodied in the models may prove limiting in some cases. For instance, the model may assume that all the information needed to forecast is contained in the historical data, that the model faithfully represents the system being studied, and that this system will not change in the future. There is also a temptation to infer causality from statistical models when, most often, all they establish is correlation (G. A. Jones et al. 2001).

Further, two different manifestations of statistical modeling, Regression Analysis and Time Series Analysis, are discussed in the following sections.

Regression Analysis

Method Introduction:

“Regression Analysis is a statistical modeling method that estimates relationships among dependent and independent variables.” (Burnham and Anderson 2003)

History:

The earliest forms of Regression Analysis go back to the “method of least squares” in the early 1800s. French mathematician Adrien-Marie Legendre published his work on “least squares” in 1805 and German mathematician Carl Friedrich Gauss in 1809; both used the method to predict astronomical phenomena concerning the orbits of celestial bodies around the sun (J. Scott Armstrong 2011). A further development of least squares was published by Gauss in 1821 (Plackett 1950). The method eventually entered the social sciences in the 1870s, when Francis Galton coined the term “regression” (Stigler 1989). Galton illustrated “regression towards the mean” through studies of the heights of descendants of tall parents (Galton 1886). Based on his experiments, which found that the heights of children of tall parents are likely to regress to a normal mean (Galton 1886), Galton’s law of regression implied that hereditary traits from parents do not get completely transferred to the children (Fig. 2B.1).

This phenomenon initially had only a biological reference. Subsequent work by Karl Pearson (Pearson et al. 1903) and Udny Yule (Yule 1897) on the “interpretation of correlations between indices or ratios” (Yule 1910) established regression as a method with broader statistical applications in physical and economic fields (Yule 1910).

Fig. 2B.1: Image source: (Galton 1886). Values from Galton’s experiment showing regression of children’s heights towards mediocrity. Heights of children of tall (taller than mediocrity) parents tend to regress to a shorter height, while heights of children of short (shorter than mediocrity) parents tend to be taller than those of their parents.

Around 1944, Milton Friedman used regression to assess how long different alloy samples would take to fail under varying degrees of pressure, temperature and other metallurgical variables (J. Scott Armstrong 2011). In those days a large computer needed forty hours to perform the calculations, excluding the data input time. Today, with computational advancements and more sophisticated regression methods, a similar calculation can be performed in about a second. More complex regression analysis methods include so-called non-parametric models, Bayesian models, models where variables need to be represented through complex data objects such as graphs or curves (Gatsi 2016), and time series models (as discussed in the next section). As a result of the availability of such more sophisticated models, regression analysis has come to be relied upon for the purpose of prediction. Freedman (Freedman 1997) showed through his model that regression models’ ability to correlate can be evolved to infer causality between variables. With the increasing volume of variables and parameters involved in prediction, machine learning methods have entered the domain of prediction and forecasting (Trafalis and Ince 2000). Increasing overlaps between regression models and machine learning techniques are enabling wider application of regression models in analytics (Rasmussen 2004) (Nasrabadi 2007).

Applications:

Since regression techniques enable the estimation of possible correlations between variables of interest and the factors influencing their behavior, they have found extended use across a range of industries such as financial services, retail, telecom, pharmaceuticals, and medicine.


Fig. 2B.2: Hyndman and Athanasopoulos (Hyndman and Athanasopoulos 2014) present a linear regression plot from 2010, projecting the number of incoming tourists to Australia between 2011 and 2015 based on the past twenty years of data.

Companies also employ regression techniques to measure the effectiveness of their marketing strategies through various channels over a given time period. They use regression models to analyze and study the impact of decisions such as changes in pricing, merchandise offering, product architecture, and deals and offers on consumer sales, loyalty, and acceptance among customers. Fig. 2B.2 from (Hyndman and Athanasopoulos 2014) represents a linear regression plot from 2010 of international tourists visiting Australia that utilizes past data since 1980 and predicts estimates for the number of international tourists for the next five years, 2011–2015.
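Trend extrapolation of this kind can be sketched in a few lines of code. The yearly visitor counts below are invented for illustration and are not the Australian tourism data; the sketch simply fits a linear trend to ten hypothetical observations and extends the fitted line five years ahead, which is the essence of a regression-based baseline forecast.

```python
# Hedged sketch of trend extrapolation by simple linear regression.
# The visitor counts (in millions) are invented, NOT the data plotted
# by Hyndman and Athanasopoulos.

years = list(range(2001, 2011))   # ten observed years
visits = [4.4, 4.6, 4.7, 5.0, 5.2, 5.1, 5.4, 5.6, 5.5, 5.9]

n = len(years)
mx = sum(years) / n
my = sum(visits) / n
# Closed-form least-squares slope and intercept.
slope = (sum((x - mx) * (y - my) for x, y in zip(years, visits))
         / sum((x - mx) ** 2 for x in years))
intercept = my - slope * mx

# Forecasting = extending the fitted line past the observed period.
forecast = {yr: intercept + slope * yr for yr in range(2011, 2016)}
for yr, value in sorted(forecast.items()):
    print(yr, round(value, 2))
```

Such a projection holds only under the assumption, noted above, that the forces behind the historical data do not change over the forecast horizon.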

Since regression techniques can help in understanding historical data and the relationships between variables to assess the effectiveness of prediction, they are widely used for a variety of business needs, and several companies today rely on predictive analytics. In 2012, in a joint venture with software services giant Accenture, GE Aviation used shadow technology to predict potential failures before they actually occur (I. Murphy 2016). The model uses whole-aircraft data to create a virtual version of the components (I. Murphy 2016). Based on data about how different components are expected to perform in different conditions, how often they are to be serviced or changed, etc., this model tracks the virtual engine in real time to predict when a certain part needs attention or repair. If the aviation company had otherwise waited to inspect the components at a stipulated mileage or period, a mishap could have resulted.

Predictive analytics can help avert such incidents by continuous monitoring and analysis.

This technology has been used by Rolls-Royce for many years as a part of their high service standards.

The financial services and insurance domain is another area where linear regression methods find wide application. Insurance companies, such as those insuring vehicles, houses, and other assets, use linear regression methods to calculate premiums based on insured declared values. The financial risks associated with insurance, mortgages, and loans are analyzed based on past information about the applicant and attributes associated with the property.

A financial services company in the credit card domain can minimize portfolio risk by understanding, from existing data, the factors that cause customers to default the majority of the time. This information can help them evaluate specific options for EMIs so that the default and associated risk among such customers may be minimized. Some other areas of application of regression analysis include (but are not limited to) studies in biostatistics, such as clinical trials and epidemiology; engineering statistics, such as probabilistic design and quality control; social statistics, such as census, demographics, and population studies; as well as environmental and geographic statistics.

Linear regression models find wide application in business situations with continuous dependent variables. However, their applicability becomes limited in situations subject to complex non-linear behavior in the system, unless the non-linear characteristics of such a system can be transformed under certain assumptions to make a linear model applicable.

Traditional forecasting methods based on time series and regression could once consider only a handful of demand attributes. Today, with the advent of machine learning, machine-learning-based forecasting can combine learning algorithms, big data, and cloud computing to analyze hundreds of products and highly varied causal attributes simultaneously.

Method description:

Regression analysis facilitates statistical inference, forecasting, hypothesis testing, and the determination of causality (Glenn and Gordon 2009) by defining a formal statistical relation between variables. A regression model uses a representative sample set and certain assumptions to determine how changes in the values of independent variables impact the values of dependent variables (Guerard Jr 2013). The form of the equation that typically relates the dependent to the independent variables is:

y(t) = c0 + c1x1(t) + c2x2(t) + … + cnxn(t) + u(t)

where y(t) is the estimate of the real response variable, and c0, c1, ..., cn are the constants (sometimes called coefficients) (Glenn and Gordon 2009). The variables x1 to xn are the explanatory (linearly independent) variables, and the error term u(t) represents a random variable that concentrates all the unknowns that exist between an ideal model and the real system in question (Glenn 2003) (Fig. 2B.3).
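The coefficients c0, ..., cn are typically estimated by ordinary least squares. A minimal sketch in Python follows, using NumPy and entirely invented data (the "true" coefficients 2.0, 1.5, and -0.8 are assumptions chosen for illustration, not values from any study cited here):

```python
import numpy as np

# Illustrative sketch: estimating the coefficients c0, c1, c2 of the linear
# model y = c0 + c1*x1 + c2*x2 + u by ordinary least squares. All data invented.
rng = np.random.default_rng(0)
n = 200
x1 = rng.uniform(0, 10, n)           # first explanatory variable
x2 = rng.uniform(0, 5, n)            # second explanatory variable
u = rng.normal(0, 0.5, n)            # error term concentrating the unknowns
y = 2.0 + 1.5 * x1 - 0.8 * x2 + u    # invented "true" system generating the data

X = np.column_stack([np.ones(n), x1, x2])     # design matrix with intercept column
coef, *_ = np.linalg.lstsq(X, y, rcond=None)  # least-squares estimates of c0, c1, c2
y_hat = X @ coef                              # fitted values

print(coef)   # estimates should lie close to the assumed (2.0, 1.5, -0.8)
```

With a sample this size the recovered coefficients land close to the generating values, which is the sense in which the sample set and assumptions determine the model's accuracy.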

Regression analysis models can be parametric or non-parametric, depending on how the correlation between the dependent and independent variables is defined (Erilli and Alakus 2014).

Parametric models can evaluate a future value based on a finite number of parameters of the given data, such as the intercept and coefficient for a single-variable model (Hardle and Mammen 1993). Since the future value is predicted from these parameters, such models are termed parametric. Some of the most commonly practiced parametric regressions are linear regression models and ordinary least squares regression models (Gatsi 2016). Non-parametric models take into account potentially infinite dimensions of the parameters associated with the data set concerned. While such models have more freedom and flexibility to represent finer nuances of the data, and may better define the future value, overfitting may become an issue (Hardle and Mammen 1993).

Fig. 2B.3: Correlations between the dependent variable represented on the Y axis and the independent variable represented on the X axis, as shown by (Sullivan, Dukes, and Losina 1999) in panels (a)–(d): (a) strong and positive correlation; (b) weak but still positive correlation; (c) no correlation; (d) negative correlation.
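The contrast can be made concrete with a small sketch: a straight-line fit (parametric, reducing the relationship to two numbers) against a k-nearest-neighbour smoother (non-parametric, with flexibility growing with the data). The sine-shaped data and the choice of k are invented for illustration:

```python
import numpy as np

# Sketch contrasting a parametric fit (two parameters: slope and intercept)
# with a simple non-parametric k-nearest-neighbour smoother. Data are invented.
rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 10, 100))
y = np.sin(x) + rng.normal(0, 0.3, 100)   # a non-linear system plus noise

# Parametric: the whole relationship is reduced to two numbers.
slope, intercept = np.polyfit(x, y, 1)

def knn_smooth(x0, k=10):
    """Non-parametric estimate: average of the k observations nearest to x0."""
    idx = np.argsort(np.abs(x - x0))[:k]
    return y[idx].mean()

x0 = 4.0
parametric = intercept + slope * x0
nonparametric = knn_smooth(x0)
# The k-NN estimate tracks the sine curve far better than the straight line;
# with k=1 it would reproduce the noise exactly -- the overfitting risk.
print(parametric, nonparametric, np.sin(x0))
```

The non-parametric estimate follows the curvature the linear model cannot express, while shrinking k toward 1 illustrates how the extra flexibility shades into overfitting.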

Strengths & Weaknesses:

Strengths: Regression models are the go-to approach for standard business planning activities such as weekly, monthly, quarterly, or annual estimates of sales, revenues, market share, top lines, and bottom lines (Studenmund 2000). They are relatively simple to implement and are easily accessible through various software packages such as SPSS, Minitab, SAS, Stata, and R. Many specialized software packages are also available for specific applications such as real estate valuation and medical and clinical testing. Some relatively simple estimates can be carried out using common applications such as Microsoft Excel. Business decisions involving larger stakes may benefit from the use of sophisticated non-parametric models such as Bayesian models, which can take into account a greater variety of factors regarding upcoming changes and their impact on future estimates (Erilli and Alakus 2014).

Review of purchase data and other relevant information including demographics may reveal insightful patterns for decision making. Systems integration and project management functions can course correct project plans based on estimates obtained from regression models.

Weakness: One of the problems with regression analysis is finding appropriate independent variables that promise a relationship with the dependent variable (Guerard Jr 2013). This can be approached using multivariate correlations, i.e., identifying the correlation coefficients that exist between multiple potential variables. Inferences based on relationships between variables must be drawn with prudence and diligence, as correlation can be confused with causation and lead to erroneous results. Parametric methods have limited ability in handling complex evaluations, as they assume a finite set of parameters; parametric models are limited in their complexity even if the data is not limited. Non-parametric models have a higher degree of freedom and flexibility in tackling complex problems, as they are not limited by the amount of data that can be captured through the parameters.

Still, the range of prediction is short to mid-term, for specific values of variables, as these methods are powered by past data and patterns. They are not suited to providing a general, directional long-term view of the future landscape, and they may not take into account additional factors that emerge in the future and alter past patterns. Regression models have to make certain assumptions in order to represent reality. The measure of a good fit is that the values predicted through a regression model are close to the past values on which the model is based. Since information from the past feeds into the models, the sample set that analysts choose also determines the accuracy of the model. Overall, the success of the model depends on factors such as fit, sample set, and the assumptions embedded in the model.

Conclusion:

Regression analysis in itself can be said to be explanatory in nature, as it can explain the relationship between the factors to be forecasted and the independent variables in that environment or problem space (Makridakis, Wheelwright, and Hyndman 2008). As for its role in the area of futures research, regression analysis can be seen as a predictive method, as it estimates a future value based on historical data. Different models of regression analysis may vary in terms of the data required for modelling, cost, time needed to run, and the accuracy and precision of prediction, but their approach is predictive, as they forge a way of prediction based on the behavior of variables in the past (Kestel 2013).

Time Series Analysis

Method Introduction:

Time series models reveal the internal relational structure of variables – such as a seasonal or cyclic pattern, a trend (see Fig. 2B.4), or autocorrelation – via a periodically ordered sequence of outcomes based on analysis of past data (Jon Scott Armstrong 2001).

Fig. 2B.4: Kestel provides examples of structures of variables (Kestel 2013) in panels (a)–(d): (a) cyclical variations; (b) random variables; (c) seasonal variations; (d) trend variable.


History:

The very first actual application of time series in the form of autoregressive models, i.e., regression of time windows within a time series to a temporal pattern, can be traced back to the 1920s-30s through the works of Yule and Walker (Walker 1931). During this time, researchers introduced the method of the moving average, where the average is measured in a sliding time window, with the aim of removing periodic fluctuations in time series. Herman Wold introduced Auto Regressive Moving Average (ARMA) models for stationary series in the 1930s (Wold 1939). Fisher introduced the estimation of “maximum likelihood”, whereby the most desired values of the parameters are those that make the probability of the observed data most likely (Fisher 1925). In 1970, G. E. P. Box and G. M. Jenkins explained the procedures of modeling for the purpose of forecasting, running diagnoses, and estimating future values (Box and Jenkins 1970). In the following decades, the Box-Jenkins models became one of the most commonly used techniques for forecasting and seasonal adjustment. VAR (Vector Auto Regressive) models, among the subspecies of the autoregressive moving average or ARMA models, have also become popular, but they are only applicable to stationary time series, where stationary means that statistical properties such as the mean and variance remain constant over time (Hyndman and Athanasopoulos 2014); a rising trend exhibited by an economic time series would be an example of non-stationarity (Nelson and Plosser 1982). Tests for non-stationarity developed mainly during the 1980s, when co-integrated time series were introduced in “error-correction” models (Granger 1986).

Today, time series methods have evolved, and more sophisticated methods such as Recurrent Neural Networks (RNNs) (Connor, Martin, and Atlas 1994) enable predicting future values in such time series faster and more accurately. RNNs are models with internal memory, which they can use to process arbitrary sequences of inputs (see Fig. 2B.5), enabling them to perform sophisticated tasks of pattern recognition in highly complex cases, such as handwriting (Graves et al. 2009) or speech recognition (Sak, Senior, and Beaufays 2014).

Fig. 2B.5: Guh et al. show an example of a cycle identified by an RNN-based pattern discrimination algorithm (Guh et al. 1999).

With the advent of various statistical analysis software packages, the use of time series methods has become widespread, covering both simple and the most recent techniques. Some of the commercial and open source tools include “Time Series Objects” in MATLAB, the “Statistics” and “Modeler” packages in SPSS, the Windows statistical package “EViews”, and others.

Applications:

The areas of application of time series models are varied and their purposes manifold. Since its introduction, time series analysis has been used for different cases, including forecasting periodic sales, inventory planning, and weather forecasting (R. B. Miller and Hickman 1973). Time series models have been found to be effective under the assumption that past conditions of periodicity will continue to prevail, thus providing at least a baseline forecast. The need for these forecasts encouraged organizations to develop better forecasting techniques by combining time series models with other methods. Time series models applied to massive data sets, for example, utilize methods of data mining to reveal temporal patterns in the data for more precise prediction of future values.

Some areas where the application of time series analysis is commonplace are the stock market, values of goods, econometrics, financial planning, inventory and logistical planning, and marketing and communications planning. Despite this wide variety of applications, time series analysis remains a method better suited to periodic, shorter-term prediction, as it relies on past data and as external factors are less likely to change and have an impact over a shorter period of time. Hence, such methods are best suited to relatively stable situations (Makridakis, Hyndman, and Wheelwright 1998). In cases of substantial fluctuation, where underlying conditions are subject to extreme change, time series methods using regression in time may not be that reliable (Makridakis, Hyndman, and Wheelwright 1998).

Also, time series models require past data that is available in sufficient quantity, reliable, and truly representative, for the forecasts to be dependable.

Full description:

Time-series analysis uses simple or complex mathematical methods to derive an equation that best fits a given set of historical data points. These methods can “range from simply drawing a curve through the historical data points of a variable in a way that appears to minimize the error between the curve and the data to analyses that involve deriving equations that mathematically minimize the cumulative error between the given and reconstructed data. The equation can be a straight line or a curved line, a static average of past data or smoothed using a moving average or exponential smoothing (allowing more recent data to have a greater effect on the smoothing process). If the fit is good, the plot can be extended into the future to produce a forecast.” (Jon Scott Armstrong 2001) (Theodore Jay Gordon 1992)
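The two smoothers named in the quote can be sketched in a few lines. This is a minimal illustration on invented data; the window length and smoothing weight alpha are assumed values, not recommendations:

```python
import numpy as np

# Sketch of the two smoothers mentioned above, on invented data: a moving
# average over a sliding window, and simple exponential smoothing, where
# the weight alpha lets more recent data dominate the smoothed value.
rng = np.random.default_rng(3)
data = 10 + np.cumsum(rng.normal(0, 1, 60))   # an invented noisy series

def moving_average(x, window=5):
    """Average over a sliding time window of the given length."""
    return np.convolve(x, np.ones(window) / window, mode="valid")

def exponential_smoothing(x, alpha=0.3):
    """s_t = alpha*x_t + (1-alpha)*s_{t-1}; higher alpha weights recent data more."""
    s = np.empty(len(x), dtype=float)
    s[0] = x[0]
    for t in range(1, len(x)):
        s[t] = alpha * x[t] + (1 - alpha) * s[t - 1]
    return s

ma = moving_average(data)
es = exponential_smoothing(data)
# The last smoothed value can serve as a naive one-step-ahead forecast.
print(ma[-1], es[-1])
```

If the fit is good, extending the smoothed curve forward gives the baseline forecast the quote describes.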

A time series defines a sequence of numerical values, spaced at even or uneven time intervals, representing a measurable quantity of a system of interest. The numerical values correspond to a unique characteristic of the system of interest. For the purpose of forecasting, the numerical values help in understanding the structure of the system – the various constants, variables, disturbances, and noises. This understanding, together with how these components affect the model and the observed data, can be used to create forecasts within the desired accuracy. The key aspect of time series modeling that differentiates it from other mechanical approaches is the treatment of ‘uncertainties’. Uncertainties need to be accounted for while designing the model (to select the correct technique to build the model) as well as in its use (Glenn 2003). Wherever possible, the model should indicate the degree of confidence or reliability of its predictions by adding either bounds or confidence intervals. The model should enable understanding not only of how a certain phenomenon happened, but should also be able to predict future behavior within confidence intervals that can be calculated.

Identifying the types of patterns in historical data is an important aspect of selecting the appropriate time series method. In fitting a set of data for time series modeling, a computer software program may be used to examine the historical data to determine how well an equation or series of equations can fit or duplicate those data (Jon Scott Armstrong 2001). The equations can be either linear or nonlinear, with the points in a plot lining up on a straight line or not. Nonlinear equations include quadratic or higher-order, sinusoidal, or S-shaped curves (represented, for example, by a Gompertz curve (Winsor 1932) (Jarne, Sánchez-Chóliz, and Fatás-Villafranca 2005) or a logistic function), or more complicated wavelets as used in machine learning. Based on the data observed, there are a number of typical data patterns: a) horizontal, b) seasonal, c) cyclical, and d) trend (Chatfield 2016) (Makridakis, Wheelwright, and Hyndman 2008):

• Horizontal pattern (Fig. 2B.4b): appears when data values fluctuate around a constant mean. If sales of a product do not change much over a period, the sales data would exhibit a horizontal pattern (Makridakis, Hyndman, and Wheelwright 1998).

• Trend (Fig. 2B.4d): a consistent increase or decrease in values over the long term. The increase or decrease does not have to be linear. A trend may be said to have changed direction when it goes from an increasing trend to a decreasing one (Hyndman and Athanasopoulos 2014). Economic indicators of various kinds provide examples of a trend pattern over a period of time; the exponential growth of the stock market would be a non-linear long-term trend.

• Seasonality (Fig. 2B.4c): said to exist when the values of a time series are affected by seasonal factors such as a certain time during the year or a certain day of the week – for example, sales estimation of certain products during different seasons, or planning for weekly marketing deals. Since it depends on seasonal factors, it is always exhibited over a certain known time period (Hyndman and Athanasopoulos 2014).

• Cyclic pattern (Fig. 2B.4a): data are said to exhibit a cyclic pattern when increases and decreases in values occur over varying periods of time. A cyclic pattern is observed when the data are influenced by longer-term fluctuations such as those associated with business cycles (Zarnowitz 1994). Sales of houses, major appliances, automobiles, or industrial output may exhibit a cyclic pattern. A cyclic pattern differs from a seasonal pattern in that a seasonal pattern occurs over a fixed duration of time related to seasonality, while cyclic movements can occur over different lengths of time, and the gradual increases or decreases become less predictable each time (Makridakis, Hyndman, and Wheelwright 1998).
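A seasonal pattern with a known period can be made visible by averaging detrended observations by their position within the cycle. The sketch below uses invented monthly data with an assumed 12-month cycle, slow trend, and noise:

```python
import numpy as np

# Sketch: exposing a seasonal pattern by averaging observations by their
# position within the known period (here 12 "months"). Data are invented.
rng = np.random.default_rng(4)
months = np.arange(120)                              # ten years of monthly data
seasonal = 5 * np.sin(2 * np.pi * months / 12)       # fixed yearly cycle
trend = 0.1 * months                                 # slow upward trend
series = 20 + trend + seasonal + rng.normal(0, 1, 120)

# Remove the linear trend, then average each calendar month across the years.
fit = np.poly1d(np.polyfit(months, series, 1))
detrended = series - fit(months)
seasonal_means = detrended.reshape(10, 12).mean(axis=0)   # one value per month

# The month-by-month averages recover the sine-shaped seasonal profile,
# peaking near month index 3 and bottoming near month index 9.
print(np.round(seasonal_means, 1))
```

The same averaging logic underlies classical seasonal decomposition; a cyclic pattern, by contrast, would not line up when folded at a fixed period.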

Strengths & Weaknesses:

Strengths: If futures research needs to be conducted under the assumption of constancy, i.e., that things will keep going as they have in the past, statistical modeling can also provide a surprise-free baseline forecast.

Weakness: Statistical time series analysis methods as presented here typically assume that all of the information needed to produce a forecast is contained in historical data. Models based on historical data often do not entirely capture the real life intricacies of the system.

The system that generated the past data could also have evolved and changed over time. If this change is not accounted for in the model, it may lead to erroneous results.

Conclusion:

Time series analysis is exploratory or explanatory depending on the nature of the inquiry: it can be used to explore data or to confirm a hypothesis. In exploratory models, the researcher may not necessarily know what they are looking for. In an explanatory time series analysis, on the other hand, there is a hypothesis that needs to be tested. For our purpose, time series predictive analysis is most reliable when applied to shorter terms (monthly, quarterly, or annually in the case of design trend forecasting) and stable situations.

Cross Impact Analysis

Method Introduction:

The Cross Impact Analysis method estimates the probability that the occurrence of any of the potential future developments will have an effect on the likelihood of occurrence of other future events.

History:

Cross-impact analysis was first developed in 1966 by T. J. Gordon and Olaf Helmer (Helmer, Brown, and Gordon 1966) in the form of a game about creating forecasts based on how future events may interact. In 1975, John Stover created a simulation to conduct cross-impact analysis for the economy of Uruguay, based on an approach that combined time dependence and systems dynamics. Shortly after that, a simulation method incorporating cross-impact concepts, ‘Interax’, was developed at the University of Southern California (Glenn and Gordon 2009). In the past decades, the application of the approach has expanded to building scenarios (Fontela and Rueda-Cantuche 2004), understanding causes of floods (A. Winterscheid 2007), geopolitical evolutions (M. Godet 1993), and much more.

Applications:

Various applications of cross-impact analysis have involved combination with other methods, the most promising being simulation modeling, Real-Time Delphi, and questionnaires. Cross-impact analysis applied through gaming and simulation has also proven valuable for strategic decision making.

Cross-impact analysis is rooted in the fact that the occurrence of most events and developments is in one way or another related18 to other events or developments (Makridakis, Hyndman, and Wheelwright 1998). A single event at any given point in time is made possible by a combination of events preceding it and then, in turn, becomes a precedent to future events. In the later part of the 20th century and the early part of the 21st century, technology enabled faster production of goods, and trade policies favored internationalization, which in turn promoted a globalized economy. The interdependence between different happenings that results in a certain event, as an effect of another eventuality, is referred to as the ‘cross-impact’ between them. In 1993, Godet, in his book ‘From Anticipation to Action’, wrote about the application of cross-impact analysis to a diverse set of areas – from corporate scenarios to the nuclear industry, and from aircraft construction to geopolitical evolution (M. Godet 1993). Winterscheid used cross-impact analysis in the study of flood risks in a portion of the Rhine, with 25 experts completing independent cross-impact matrices linking flood causes and effects (Winterscheid 2007).

The group decision support system developed by Brent Vickers is a software application built on the concept of cross-impact analysis (Vickers 1992). Vickers set the parameters of cross-impact through the Delphi method (introduced further below) and provided a computerized and interactive way to aid decision making. This study, conducted in 1990, focused on estimating the business conditions of the European automobile industry in 2000, making it a long-term, future-scenario projection study. Vickers’ application expedited not only the learning for the panelists but also the process of achieving the potential outcomes.

18 The relational impact of one event on another can also be modelled as a complex system, where the networked nature of the system is taken into account more explicitly in the form of network topology (Mitchell 2009).

Method Description:

Gordon stated that the cross-impact method provides an analytical approach to postulating the probability of occurrence of a certain event in a forecasted set within a specified time period (Theodore J. Gordon and Hayward 1968). The process of conducting a cross-impact analysis, as explained by Gordon, begins with listing the events to be included, based on an extensive review of the literature and expert interviews in the area of research (Glenn and Gordon 2009). Upon careful review of this list, more events considered to be closely related can be added, and those found to be unrelated can be taken off. If the events are independent of each other, the analysis becomes much simpler. Determining the set of events is followed by calculating the initial independent probability of each event. Explaining the workings, Glenn and Gordon (Glenn and Gordon 2009) and Makridakis et al. (Makridakis, Wheelwright, and Hyndman 2008) write that the probability of each event is judged first in isolation, and then cross-impact analysis adjusts the initial probabilities for the influences of the other events: i.e., what would be the initial probability of ‘event 1’, and what would be the conditional probability of the same ‘event 1’ if ‘event 2’ occurs (Fig. 2B.6). Rochberg (Rochberg 1970) defined the factors for assessing interactions between different events as:

- Direction or mode of interaction

- Strength of interaction

- The time delay of the effect of one event on another.

Fig. 2B.6: Makridakis et al. represent cross-impact analysis through a schematic five-by-five event matrix (Makridakis, Wheelwright, and Hyndman 2008), a non-symmetric matrix in which each cell indicates an asymmetric relationship: how much the event in a row impacts the event in a column.

Once the cross-impact matrix is created in this manner, it can be analyzed using a computer and simulation techniques. The computer-aided analysis of the matrix involves the random selection of an event and deciding its occurrence or non-occurrence based on its assigned probability. The probabilities of the remaining events in the forecasting set are then adjusted according to the interaction factors discussed above. The same process of deciding occurrence and non-occurrence is repeated with all remaining events, using the new probabilities. Cross impacts are normally calculated using odds ratios: the ratio of the new odds to the initial odds defines the event’s cross impact (Glenn and Gordon 2009).
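The simulation loop just described can be sketched in a few lines. The three events, their initial probabilities, and the odds-ratio cross-impacts below are all invented for illustration:

```python
import random

# Minimal sketch of the cross-impact simulation loop described above.
# Initial probabilities and odds-ratio cross-impacts are invented.
P0 = {"E1": 0.5, "E2": 0.4, "E3": 0.7}
# odds_ratio[a][b]: factor applied to the odds of b if a occurs
# (>1 enhances b, <1 inhibits b, 1.0 means no cross-impact).
odds_ratio = {
    "E1": {"E2": 2.0, "E3": 0.5},
    "E2": {"E1": 1.5, "E3": 1.0},
    "E3": {"E1": 1.0, "E2": 0.8},
}

def run_once(rng):
    p = dict(P0)
    occurred = {}
    events = list(p)
    rng.shuffle(events)                     # random selection order
    for e in events:
        occurred[e] = rng.random() < p[e]   # decide occurrence from current probability
        if occurred[e]:
            for other, r in odds_ratio[e].items():
                if other not in occurred:   # adjust only still-undecided events
                    odds = p[other] / (1 - p[other]) * r
                    p[other] = odds / (1 + odds)
    return occurred

rng = random.Random(42)
runs = [run_once(rng) for _ in range(10000)]
freq = {e: sum(r[e] for r in runs) / len(runs) for e in P0}
print(freq)   # adjusted occurrence frequencies, shaped by the cross-impacts
```

Repeating the run many times yields occurrence frequencies that fold the cross-impacts into the initial probabilities, which is what the matrix analysis aims to reveal.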

Strengths & Weaknesses:

Strengths: The cross-impact method focuses on causal relations between events (Helmer 1977). As such, the cross-impact analysis method emphasizes the causality between different events, while keeping one event at the central perspective at a time. If the result of a cross-impact analysis is unexpected, then the view of expected reality and the assumed interactions between events can be reexamined.

Implementation of a cross-impact matrix questionnaire derives the interdependencies and dynamics between two events. Integrating the cross-impact matrix approach with other models strengthens the explorative aspect of the research by taking into account any external events which may impact the postulation of future results, thus eliciting an update of the calculation model. This provides an opportunity to incorporate ‘sensitivity’ to changes as an important parameter of the forecasting process.

Weaknesses: One obvious demerit of manual processing of the analysis is that collecting the data can be tedious and exhausting; for instance, a ten-by-ten matrix would require 90 conditional probability judgments to be made by the analyst or analysts (Theodore J. Gordon and Hayward 1968). For a matrix of size n, the number of conditional probability judgments required would be n times (n−1). The number of probability judgments increases with higher-order matrices, which necessitates machine computation.

A cross-impact analysis proceeds on the assumption that conditional probabilities are more accurate than a priori ones (Glenn 2003). This can be misleading, as it may not be the case for every research project. Since the impact of various developments on each other depends on their sequence of occurrence (Eymard 1977), the probability of the impact of one development on another changes with a change in the sequence of occurrence. The research hence needs to ensure that all possibilities of events and sequences have been thoroughly studied and accounted for.

Gordon and Greenspan (T. Gordon and Greenspan 1994) caution that it may also be difficult to understand the consistency of cross impacts, especially in the case of complex systems. Gordon and Glenn (Glenn and Gordon 2009) also warn that some researchers believe the technique has not been appropriately vetted. For instance, Centola and Macy (Centola and Macy 2007), in their experiment on how weak but long ties affect complex contagions such as high-risk social movements and avant-garde fashion, show that long ties impede diffusion, as the minimum number of participants required to hit the adoption target increases. And lastly, when dependent only on expert opinion, the process invariably picks up inherent biases and remains restricted to the level of expert knowledge.

Conclusion:

Cross-impact analysis may be considered interpretative in approach, as it is mainly used in technological forecasting studies and relies on the use of experts. From typically being used in combination with the Delphi method through simulations, it has now found its own place as a method of significance. It is also relied upon as a complementary method to other techniques, to explore new perspectives on research questions and broaden the research scope. Bañuls and Turoff (Bañuls and Turoff 2011) validate cross-impact analysis as a method that is valuable for strategic decision making in myriad fields such as manufacturing, consumer goods, and market strategies. This approach is also relied upon by business strategists, industry experts, academia, research, and government. When combined with methods of observing current trends, cross-impact analysis can help channel a new design and development discussion based on the impact of pertinent factors and their sequence of occurrence.


Decision Analysis

Method Introduction:

Decision analysis is a technique to systematically handle the entire range of possible outcomes for a specific forecast query (Njeri 2017).

Fig. 2B.7: Representation of a basic decision model. Source: (Njeri 2017).

History:

The history of decision analysis is often traced back to Pascal’s exploration of decision making under a high degree of uncertainty in the 17th century. Bernoulli introduced the concepts of risk and utility in the decision-making process in the 18th century. However, most of this work had inbuilt biases, as the decision making incorporated the rationality of the experts carrying out the process. By 1979, Tversky and Kahneman had conducted psychological experiments highlighting the workings of human decision making and its inherent biases and irrationalities (Kahneman and Tversky 1979). Michel Godet improved the method and introduced the MULTIPOL technique. MULTIPOL assumes that there could be more than one future outcome for a decision taken at present (Lennon, Pearse, and Godet 1979).

Nowadays, decision analysis is used by big businesses and major corporations to plan for multi-million and, in some cases, even multi-billion-dollar capital investments. Oil and natural gas giant Chevron swears by its award-winning decision analysis process (Chevron 2010).

Applications:

Decision analysis can be said to be a prescriptive approach, focusing on dealing with uncertainties quantitatively. Such decision analysis methods are used in a wide variety of fields, ranging from energy exploration, medicine and health care, and organizational management strategy to the various branches of business (Glenn and Gordon 2009). Some real-world examples of decision analysis are:

• Linkov and team conducted an environmental study in 2004 regarding pollution and contamination of ecological sites. The study included multiple criteria as parameters of the decision analysis, as well as the role of risk assessment and stakeholder participation (Linkov et al. 2004).

• Decision modeling was employed to study risks associated with natural gas pipelines in Brazil (Brito and de Almeida 2009). The researchers used a utility matrix approach based on various hazardous scenarios, which resulted in a hierarchical list of pipelines ordered by the associated risks.

Full Description:

Decision analysis systematically evaluates all possible outcomes of a forecast query by specifying a range of outcomes for any uncertain quantity (also referred to as a random variable in the statistical literature), and thereafter determining the expected outcome based on all of the relevant uncertain events. Makridakis et al. (Makridakis, Hyndman, and Wheelwright 1998) explain that decision analysis can combine different forecasts, with a range of possible outcomes and varying degrees of variability, into a systematic framework for making decisions. Thus, this probability-based methodology enables managers and analysts to make decisions by explicitly accounting for uncertainties. For a given forecasting query, the possible actions available can be determined and related graphically by drawing a decision tree (Makridakis, Wheelwright, and Hyndman 2008), exemplified in Fig. 2B.7. In this decision tree diagram, a square indicates a decision point and a circle indicates an uncertain event. The distinction between decisions and uncertain events is important and necessary in this type of analysis. In the case of statistical probability, varying degrees of probability are possible, with the probability of any specific event lying between zero and one. In the case of multiple event outcomes, the sum of the probabilities of the individual outcomes equals one.

However, when using subjective rather than statistical probability for decision making, the decision-making process can be described in the following steps (Glenn and Gordon 2009):

a) The pertinent events and decision points must be identified and placed in proper sequence through a decision tree diagram.

b) A decision criterion must be determined.

c) The next step involves evaluating each of the possible paths in the decision tree in terms

of the selected decision criterion. This step may be deemed complete when an end point

value has been obtained for all of the end points in the decision tree.


d) With all end points defined, probabilities must be assigned to all the uncertain events. These probabilities could be based on subjective estimates made by the decision makers. Alternatively, they could be derived from quantitative methods, in keeping with the properties of probabilities.

e) Next, analysts work from the ends of the tree backward, computing the expected value at each event point and then, at each action point, selecting the action branch with the highest expected value.

f) The final step involves specifying the sequence of decisions that will be made given

various sequences of event outcomes.
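Steps (a) through (f) can be sketched in a few lines of code. The following is a minimal illustration of backward induction on a small decision tree; the scenario (launching versus holding a product), the payoffs, and the subjective probabilities are invented for illustration and do not come from the sources cited above.

```python
# A minimal sketch of decision-tree evaluation by backward induction.
# Squares (decisions) pick the best branch; circles (uncertain events)
# weight branches by their assigned probabilities; end points carry payoffs.

def expected_value(node):
    """Recursively evaluate a node of the decision tree."""
    kind = node["type"]
    if kind == "end":                      # end point: payoff already known
        return node["value"]
    if kind == "event":                    # circle: weight branches by probability
        return sum(p * expected_value(child)
                   for p, child in node["branches"])
    if kind == "decision":                 # square: pick the best action branch
        return max(expected_value(child) for child in node["options"])
    raise ValueError(f"unknown node type: {kind}")

# Hypothetical tree: launch (uncertain demand) vs. hold (certain small payoff).
tree = {
    "type": "decision",
    "options": [
        {"type": "event", "branches": [
            (0.6, {"type": "end", "value": 100}),   # strong demand
            (0.4, {"type": "end", "value": -40}),   # weak demand
        ]},
        {"type": "end", "value": 10},
    ],
}

# Launch branch: 0.6 * 100 + 0.4 * (-40) = 44, which beats holding (10).
print(expected_value(tree))
```

Note that the probabilities on each uncertain event sum to one, as required in step (d), and that the maximization at the decision node implements step (e).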

Some researchers prefer Classification and Regression Trees (CART or C&RT) as an alternative to decision trees, but the core technique remains the same. Rokach and Maimon (Rokach and Maimon 2014) describe decision trees as a data-mining tool based on the description of data. More specifically, decision trees are also used in machine learning as predictive models, drawing conclusions about an item or query's future value through a classification mechanism. Finally, and notably, decision trees are helpful in visually representing decision analysis within a decision-making process.

Strengths & Weaknesses:

Strengths: Decision trees implicitly perform classification, filtering by variables or by features. When a decision tree is fit to a training data set, the algorithm identifies the most important variables within the data, splits the tree on the top few nodes accordingly, and thereby completes feature selection automatically. While decision trees accomplish classification based on training data, the training data set can also be a source of bias.

Decision trees do not require much extra effort in data preparation, as they are agnostic of the units of measurement of the parameters, whereas some other methods, such as regression, need the data to be normalized first. Additionally, decision trees do not stop classifying data when data values are missing. When utilized for classification, the structure of a decision tree does not change when the data is normalized or scaled, since the splitting of data happens only within the portion of data samples inside the split ranges; anomalies therefore have little impact on the tree. Finally, decision trees are easy to interpret and self-explanatory, which comes in handy when communicating an executive-level view.

Weaknesses: Apte and Weiss (Apté and Weiss 1997) caution against over-fitting the training data, as it may lead to ambiguous classifications and erroneous predictions. To account for over-fitting, they recommend a subsequent “pruning” step, in which the tree is generalized by removing sub-trees.

Conclusion:

Decision analysis is about formulating a process to make choices from the available alternatives, based on how well an alternative meets the selection criteria and on the relative significance of those criteria. However, the criteria and their relative importance in the decision-making process are likely to change, either over time or due to external factors not part of the decision tree.

A simulation model of the system under decision analysis provides a better understanding of the forces at play and the expected behavior. Decision analysis matrices can also provide clarity for complicated short- and mid-term decision making. With multiple moving parts, this tool can certainly clarify the immediate best course of action.

Machine Learning Methods

Method introduction:

Bengio, Hinton, and LeCun, three outstanding proponents of the field, describe machine learning as a way to train computers with existing data and observations, enabling them to classify new information as belonging to a particular category (LeCun, Bengio, and Hinton 2015a). With its origins in the field of artificial intelligence, machine learning borrows from statistics and computer science: algorithms learn from data and present new modeling approaches.

History:

Statistical methods for probabilistic decision making had inherent problems of data gathering and representation, and they required restrictive assumptions, which led researchers to explore the possibility of machines learning from data (Russel and Norvig 2003). Alan Turing was among the first to propose an example of artificial intelligence (Hodges 2012), in the form of a ‘learning machine’ that could learn and become artificially intelligent (Turing 1948), building on his abstract computing machine, the Turing Machine, first introduced in 1936. This development spurred interest in exploring neural networks inspired by biology, for example, the perceptron (Rosenblatt 1958). Other algorithms were eventually developed over the years, such as artificial neural networks, association rules, decision trees, reinforcement learning and


Bayesian networks (Uysal 2004). Different algorithms varied in their approach to decision making, their data requirements, the accuracy of their results, and the time they took.

The concept of Explanation-Based Learning was introduced by Gerald DeJong in 1981: based on an analysis of training data sets, a computer creates a general rule to follow in decision making (DeJong and Mooney 1986). The research was taken further in the 1990s, as scientists created programs that enabled computers to ‘learn’ by training them with large volumes of data to analyze and draw conclusions from. As computational power continued to advance, the capabilities of machine learning were greatly increased by the automatic application of complex mathematical calculations to big data at much faster speeds. In 2006, Geoffrey Hinton coined the term “deep learning” (G. E. Hinton, Osindero, and Teh 2006) for programs that enable computers to recognize patterns in text, images, or videos (LeCun, Bengio, and Hinton 2015b). Google’s X Lab’s machine learning algorithm autonomously browsed YouTube videos and identified the videos containing cats (N. Jones 2014). Facebook’s ‘DeepFace’ algorithm recognized individuals in photos almost as well as humans (Taigman et al. 2014). As recently as 2016, Google’s artificial intelligence program ‘AlphaGo’ beat the world champion at the Chinese board game Go (Gibney 2016).

Applications:

The rapidly advancing and expanding field of machine learning is being utilized not just for conventional planning but also toward providing a better experience, be it in retail, e-commerce, travel and tourism, or cybersecurity. Retailers are leveraging machine learning for a broad range of tasks, from demand forecasting, merchandise planning, financial planning, pricing, lifecycle management, assortment planning, inventory management, and replenishment estimates to identifying potential customers and providing personalized recommendations (Sun et al. 2008). Healthcare and medical science is another area which has benefitted immensely from advanced machine learning methods, in the prognosis of diseases (Kononenko 2001), the prediction of seasonal flu and epidemics (Choi and Varian 2012a), and highly specialized cases of cancer prognosis (Cruz and Wishart 2006). Branches of environmental science and ecological modeling are also finding machine learning invaluable for establishing predictive models and simulations (Recknagel 2001), (Olden, Lawler, and Poff 2008). Business and financial forecasting have long employed data mining processes for continued monitoring and predictive analytics (Bose and Mahapatra 2001), (K. Kim 2003).

Method Description:

In the last decade, advances in methods have been made that are characterized by unprecedented amounts of data and entirely new computational processes that are hard to classify into the crude dualism of quantitative vs. interpretative.

The conventional parametric methods continue to become more refined, more sophisticated, and less affected by biases thanks to better technology and practices. In addition to those advances, the last decade has witnessed the rise of methods using machine learning, i.e., methods that are different in nature (R. N. Anderson et al. 2015), (Kalmár and Nilsson 2016). Sophisticated algorithms mine large volumes of data (Leonard 2014) for a better understanding of cultural patterns. Google allows for the estimation of near-future trends based on current search patterns (Choi and Varian 2012a). Computational models empowered by machine learning approaches such as deep learning and reinforcement learning have resulted in a great deal of improvement in various kinds of pattern recognition (LeCun, Bengio, and Hinton 2015c). Traditional quantitative forecasts based on econometric models dating back to the 1960s can consider only a handful of attributes. Machine-learning-based forecasting, on the other hand, combines learning algorithms, massive amounts of data and cloud computing to analyze from hundreds to millions of products and highly varied causal attributes simultaneously.

Machine learning methods are being used to generate highly accurate predictive models by identifying structure even in complex non-linear datasets (Shmueli and others 2010). With advances in the ease and volume of data collection, large and high-resolution datasets are available to work with.

Machine learning methods have a diverse taxonomy of models corresponding to the nature of the desired outcome, broadly classified into supervised learning, unsupervised learning, reinforcement learning and recommendation systems (Jordan and Mitchell 2015). Olden et al. explain supervised learning methods as those which can model the relationship between inputs and a known set of outputs (Olden, Lawler, and Poff 2008). Supervised learning methods include decision trees, decision forests, logistic regression, support vector machines and Bayesian classifiers (Uysal 2004). Supervised learning methods are used to train models to identify or classify new information based on a training data set, such as computers learning to recognize facial features from large numbers of images of faces.
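The essence of supervised learning, modeling the relationship between inputs and known outputs, can be illustrated with one of its simplest instances, a one-nearest-neighbour classifier. The training points and labels below are invented for illustration.

```python
# Illustrative sketch of supervised learning: a one-nearest-neighbour
# classifier "trained" on labelled points, then used to classify a new input.

import math

def nearest_neighbor(train, query):
    """Return the label of the training point closest to the query."""
    best_label, best_dist = None, math.inf
    for features, label in train:
        d = math.dist(features, query)     # Euclidean distance (Python 3.8+)
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Hypothetical labelled training data: two well-separated groups.
training_set = [((1.0, 1.0), "cat"), ((1.2, 0.8), "cat"),
                ((8.0, 9.0), "dog"), ((9.0, 8.5), "dog")]

print(nearest_neighbor(training_set, (1.1, 0.9)))  # prints "cat"
```

Real systems, of course, learn far richer decision functions from far larger data sets, but the structure is the same: labelled examples in, a rule for classifying new inputs out.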

Unsupervised methods of machine learning are concerned with revealing patterns in large data sets. Unsupervised methods include dimension reduction methods (such as principal component analysis, manifold learning, factor analysis, random projections and auto-encoders) and clustering procedures (Jordan and Mitchell 2015). Dimension reduction methods analyze unlabeled data by making assumptions about the structural properties of the data (J. A. Evans and Aceves 2016). Clustering procedures, on the other hand, find a desired labeling of the data based on observation of the available data and a rule for predicting future data. Bishop believes that the growing use of methods such as Hopfield neural networks and self-organizing maps reflects their ability to model complex and non-linear relations without the restrictive assumptions needed for conventional, parametric19 approaches (C. M. Bishop 2006).

Beyond these, deep networks such as convolutional neural networks (CNNs) have in recent years become a rapidly evolving and highly impactful method of supervised learning. Deep networks are complex multilayer networks of distinct units. Each unit in the network computes a simple parameterized function of its inputs (Jordan and Mitchell 2015), where the parameters can be adjusted through gradient-based optimization algorithms. In recent years such deep learning systems have been applied to large digital collections of text, images, voice samples, and videos, yielding promising outcomes in computer vision and speech recognition (G. Hinton et al. 2012), (Krizhevsky, Sutskever, and Hinton 2012).
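The statement that each unit computes a simple parameterized function of its inputs can be made concrete with a tiny forward pass. The weights and biases below are invented for illustration; in practice they number in the millions and are found by gradient-based optimization rather than written by hand.

```python
# Minimal sketch of a two-layer feed-forward pass: each unit applies a
# nonlinearity to a weighted sum of its inputs plus a bias.

def relu(x):
    """A common unit nonlinearity: pass positive values, clip negatives to 0."""
    return max(0.0, x)

def layer(inputs, weights, biases):
    """One layer: every output unit is relu(weighted sum of inputs + bias)."""
    return [relu(sum(w * x for w, x in zip(ws, inputs)) + b)
            for ws, b in zip(weights, biases)]

def forward(x):
    # Hypothetical hand-picked parameters; real networks learn these.
    hidden = layer(x, weights=[[0.5, -0.2], [0.3, 0.8]], biases=[0.1, -0.1])
    output = layer(hidden, weights=[[1.0, 0.5]], biases=[0.0])
    return output

print(forward([1.0, 2.0]))   # a single output activation near 1.1
```

Stacking many such layers, with parameters tuned by gradient descent, is what distinguishes deep networks from the single-layer perceptron mentioned earlier.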

In between supervised and unsupervised learning lies reinforcement learning. The aim of reinforcement learning is to use training data to identify whether an action is correct or not, based on reward or ‘punishment’. The goal of the system is to learn until all actions identified are correct (McCallum 1995). However, this identification takes place in a hidden state. If an action is identified as incorrect and results in a harmful outcome, finding the correct action still remains a problem, and training by trial and error may take a long time. Instance-based state identification can considerably reduce the number of training steps (McCallum 1996). Reinforcement learning involves an agent acting in a dynamic environment. The focus of the learning task is to train the agent with a strategy for choosing actions in any given state so as to maximize correct outcomes over time.

19 Parametric and non-parametric models have been explained in the section on Regression analysis above.
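The reward-and-punishment idea can be sketched with one of the simplest reinforcement settings, an epsilon-greedy agent learning by trial and error which of several actions pays off most often. The reward probabilities, exploration rate, and step count below are invented for illustration.

```python
# Hedged sketch of trial-and-error learning: the agent mostly exploits the
# action with the best running-average reward, but explores at random with
# probability epsilon, so it can discover better actions over time.

import random

def train_agent(reward_probs, steps=5000, epsilon=0.1, seed=42):
    rng = random.Random(seed)
    counts = [0] * len(reward_probs)       # times each action was tried
    values = [0.0] * len(reward_probs)     # running-average reward per action
    for _ in range(steps):
        if rng.random() < epsilon:         # explore: random action
            a = rng.randrange(len(reward_probs))
        else:                              # exploit: best action so far
            a = values.index(max(values))
        reward = 1.0 if rng.random() < reward_probs[a] else 0.0
        counts[a] += 1
        values[a] += (reward - values[a]) / counts[a]   # incremental mean
    return values

# Action 1 truly pays off most often (probability 0.8).
values = train_agent([0.2, 0.8, 0.5])
print(values.index(max(values)))           # the agent should settle on action 1
```

The slow convergence of pure trial and error is visible here: thousands of steps for three actions, which is why techniques such as instance-based state identification matter in larger state spaces.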

In addition to the classical approaches, such methods of pattern recognition and classification can prove invaluable in forecasting, with the potential to transform the landscape of prevalent forecasting methodology. On a tactical level, machine learning methods are goal-oriented: some are engineered for classification and pattern recognition, others toward deriving numerical values based on a training dataset, and so on. They also vary in being supervised or unsupervised, in the amount of data needed for feasible application, in the resources they require, and in the speeds at which they perform.

Examples of some simple, fundamental machine learning methods include K-Means for clustering, Naive Bayes and Support Vector Machines (SVM) for classification, Principal Component Analysis (PCA) for dimensionality reduction, AdaBoost for boosting the performance of models like decision trees, and logistic regression for binary classification (Amatriain et al. 2011). Some of the most commonly used analytics tools and machine learning libraries exist in R, Python, Teradata, SAS etc. Some methods can be readily performed on state-of-the-art laptops, while others require substantial infrastructure in the form of GPU-based high-performance compute clusters. Each method has its own merits and limitations, and their selection depends on the desired end goal.
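Of the fundamental methods just listed, K-Means is compact enough to sketch in full. The data points and starting centres below are invented for illustration; the two-step loop (assign points to the nearest centre, then move each centre to the mean of its cluster) is the core of the method.

```python
# Minimal K-Means sketch with k=2 and invented 2-D data.

def kmeans(points, centers, iterations=10):
    for _ in range(iterations):
        # Assignment step: attach each point to its nearest centre.
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda i: (p[0] - centers[i][0]) ** 2
                                + (p[1] - centers[i][1]) ** 2)
            clusters[i].append(p)
        # Update step: move each centre to the mean of its cluster.
        centers = [
            (sum(p[0] for p in c) / len(c), sum(p[1] for p in c) / len(c))
            if c else centers[i]
            for i, c in enumerate(clusters)
        ]
    return centers

# Two visually obvious groups; the centres migrate to their means.
data = [(1, 1), (1.5, 2), (1, 0.5), (8, 8), (9, 9), (8.5, 7.5)]
final = sorted(kmeans(data, centers=[(0, 0), (10, 10)]))
print(final)
```

Production implementations add smarter initialization and convergence tests, but the assign/update alternation shown here is unchanged.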


Strengths & Weaknesses:

Strengths: The enormous rate at which digital data is generated today warrants a digital medium to process it. Without automated systems based on machine learning, it is difficult to imagine such a digital ecosystem. With advancements in machine learning, and in particular deep learning, these mechanisms are becoming more robust and less expensive to implement.

An interesting aspect of unsupervised and deep learning models is ‘feature learning’ of previously unknown features. Typically, a deep machine learning system provides estimates based on input observations. Once initialized and trained on a data set, the system can ‘learn’ feature representations for a given task. Conventional statistical methods needed experts to formulate the features by detailing multiple parameter sets. Machine learning has expedited and automated this process, making it possible to find relevant features even in disordered data sets. This has opened opportunities in the field of pattern recognition, making advanced face detection, face recognition, speech recognition and image classification possible.

One of the main differentiators of machine learning methods from their statistical counterparts is their ability to handle a large number of dynamic parameters. A deep neural network can work with a billion such tunable parameters. Machine learning tools use gradient-based methods to optimize this large number of parameters. When the parameters are appropriately tuned, they enable the ‘system’ to function properly. It is not possible to find the optimal setting for such a large number of parameters manually, hence methods like stochastic gradient descent are used.
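The gradient-based tuning just described can be shown at its smallest possible scale: stochastic gradient descent fitting a single parameter. The model (y = w·x), the learning rate, and the noiseless toy data with true weight 3 are all invented for illustration; real systems apply the same update rule to millions or billions of parameters at once.

```python
# Minimal stochastic gradient descent sketch: fit one weight w so that
# y = w * x matches toy data, processing one sample at a time ("stochastic").

import random

def sgd(samples, lr=0.01, epochs=200, seed=0):
    rng = random.Random(seed)
    samples = list(samples)                # avoid mutating the caller's list
    w = 0.0                                # start from an arbitrary setting
    for _ in range(epochs):
        rng.shuffle(samples)               # visit samples in random order
        for x, y in samples:
            error = w * x - y
            w -= lr * 2 * error * x        # gradient of squared error w.r.t. w
    return w

# Noiseless toy data generated with a true weight of 3.
data = [(x, 3.0 * x) for x in [-2.0, -1.0, 0.5, 1.0, 2.0]]
w = sgd(data)
print(round(w, 3))                         # converges to the true weight, 3.0
```

Each update nudges the parameter a small step down the error gradient; repeated over many passes, the weight settles at the value that minimizes the error, which is exactly what cannot be done by hand for a billion parameters.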


Weaknesses: Machine learning models are typically trained and validated on a smaller data set, and then applied to much larger data sets for the purpose of prediction. Any inherent biases present in the training dataset are carried along with the model, which can lead to inaccurate results. The nature of this limitation lies in the fact that machine learning models have their roots in statistical methods, which exhibit similar limitations. This limitation is simply the machine version of ‘you only see what you know’, a challenge that has confronted qualitative methods for centuries.

As much as machine learning methods are talked about as the future of forecasting, they are not applicable in all cases. In fact, researchers often argue that they do a better job at identifying patterns than at predicting them (Shmueli and others 2010). In most situations, machine learning algorithms need to be trained using large volumes of data. Sometimes such volumes of data are not available, or are extremely difficult to obtain and manage.

Conclusion:

Machine learning is a sophisticated tool. As with any tool, we need to be aware of when and for what purpose machine learning can be used, and how to choose between simple and complex algorithms. Working with machine learning systems necessitates a clear scope and definition of what needs to be done, on the basis of which multiple systems or ensembles need to be tested and their results assessed before one kind is finalized. In terms of their forecasting application, they essentially make us aware of the expected behavior or mechanism of a system. This sometimes manifests as indicating a future possibility, but many times it represents a means of automating or expediting an existing


process. In addition, machine learning needs large volumes of data to perform well and hence may be ruled out due to the non-availability of data or the expense associated with acquiring it.

It is important to note here that, since machine learning tools provide estimates based on observations, machine learning as a method overlaps between methods of observation and estimation. The structure of the chapters lays out the order of the thesis, but machine learning as a method can be part of methods of observation as well as of estimation.

Morphological Analysis

Method introduction:

Borrowing from Tom Ritchey’s (Álvarez and Ritchey 2015a) (Fritz Zwicky 1967) description of morphological analysis, the method refers to the analysis of structural relationships within the scientific discipline where the investigation is situated. Fritz Zwicky (F. Zwicky 1971), who is credited with creating the framework for general morphological analysis (GMA), used it to systematically organize and analyze multi-faceted problems in astrophysics. The most important aspect of morphological analysis is that it can systematically analyze problems which cannot be quantified (Álvarez and Ritchey 2015a).

History:

The term ‘morphology’ was first introduced into scientific discussion by the author and scholar J.W. von Goethe in his “Morphology of Plants” (Ritchey 2011b). It was formalized in its modern form by Fritz Zwicky of Caltech (Ritchey 2011a). Zwicky developed the morphological analysis methodology in the 1930s to systematically examine all possible combinations of certain attributes in the field of astrophysics (Fritz Zwicky 1947), (Fritz Zwicky 1948a), (Fritz Zwicky 1948b). Zwicky’s (Fritz Zwicky 1969) now famous introduction to the generalized approach said:

“Attention has been called to the fact that the term morphology has long been

used in many fields of science to designate research on structural interrelations

– for instance in anatomy, geology, botany, and biology. ... I have proposed to

generalize and systematize the concept of morphological research and include

not only the study of the shapes of geometrical, geological, biological, and

generally material structures, but also to study the more abstract structural

interrelations among phenomena, concepts, and ideas, whatever their character

might be.”

The origins of morphological analysis lie in the study of the form, structure and structural relationships between the parts or objects of the particular scientific discipline where the term is used (Álvarez and Ritchey 2015a). Since morphological analysis is based on systematically populating and examining all possible combinations of given attributes in a possibility space, it is also a logical tool for foresight research studies. Zwicky applied morphological analysis (Ayres 1969) in fields such as astrophysics and the design of jet engines from the 1930s onward, illustrating how foresight research can limit the possibility space and therefore development cost. By the 1960s, morphological analysis had begun to be used more widely for forecasting studies.

It is also important to discuss a few other lines of the morphological approach which have garnered credibility as methods of future scenario generation. Two such approaches are Rhyne and Coyle’s Field Anomaly Relaxation (FAR) and Tom Ritchey’s further developed “General Morphological Analysis” or GMA (P. Bishop, Hines, and Collins 2007). Although FAR was developed independently, it turned out to be closely related to morphological analysis (Rhyne 1974). General Morphological Analysis (GMA) was described by Ritchey as a “general method for non-quantified modeling” (Álvarez and Ritchey 2015a). This lineage adheres most closely to the procedural tenets of the morphological approach described by Zwicky and hence represents its canonical perspective.

Rhyne’s approach, Field Anomaly Relaxation (FAR), differs from Zwicky’s morphological analysis in that it limits the number of parameters to no fewer than six and no more than seven, based on the cognitive finding that human working memory can handle about seven different elements (Voros 2009), (G. A. Miller 1956). Tom Ritchey’s General Morphological Analysis (GMA) was developed at the FOI (the Swedish Defense Research Agency) and adheres most closely to Zwicky’s morphological approach (Kosow and Gabner 2008). Many of the factors to be considered in evaluating future scenarios involve non-quantifiable aspects from the social, political and cognitive dimensions. In 1995, with the development of advanced computer support for General Morphological Analysis, it became possible to develop inference models even for problems which are not quantifiable (Ritchey 1998). This imparted additional functionality to the method, expanding its area of application and relevance (Pillkahn 2008).

Fig. 2B.8: Framework for scenario projections created using Morphological Analysis. (Pillkahn 2008).

Applications:

Zwicky applied general morphology to a number of complex socio-technical problems in different fields of inquiry which required an integrated view of technical, political, psychological and ethical factors (Ritchey 1998). Over the years, morphological analysis has been used as a general method for non-quantified modelling in a variety of areas (Álvarez and Ritchey 2015b), such as the design of engineering systems, product design, architecture, futures studies, scenario projections, technological forecasting, innovation and knowledge management, security strategies and war prevention, organizational design, policy analysis, and management science.

Pillkahn suggested that morphological analysis also provides a versatile framework for scenario projections which can be updated periodically (see Fig. 2B.8). For projecting future scenarios, many non-quantifiable social, political and ideological variables are taken into account. Such a framework may be achieved by creating a so-called “morphological field”: listing the elements in a row and their respective manifestations/variations in columns beneath the respective elements. By assessing the consistency of all combinations of manifestations, it can be estimated how well each manifestation matches all the other individual manifestations. Thus, based on the number of elements and manifestations, a large number of possible combinations that each define a scenario may be reached. However, not every manifestation may be credible or consistent with the other manifestations; these can be filtered out while assessing the combinations for future scenarios.

Another area of frequent application of morphological methods is organizational strategy and policy making. Voros (Voros 2009) illustrates the application of morphological analysis in policy-making by creating a ‘morphological solution space’ for governmental agencies or public-sector organizations based on the external contextual environment, and then assessing the consistency of configurations with the internal policy space. Computer software based on morphological analysis has also been incrementally adopted in the sphere of organizational strategy, such as the tool developed by Coyle (www.actifeld.com) (Coyle 2004). Michel Godet, who has extensively used morphological analysis as a tool for strategy research, has also developed a suite of software (www.3ie.fr/lipsor) with his collaborators (Michel Godet et al. 2004).

Morphological analysis can also be used to scan for weak signals, ‘wild cards’ and black swans, by incorporating into the framework a pre-emptive stance toward an emerging shift or ‘newness’, rather than a passive, receptive position. One such variation of morphological analysis, particularly tailored to the analysis of emerging issues for long-range scenarios (from multiple years to decades), was carried out by Graham Molitor (Molitor 2003).

Full Description:

Zwicky described the process as starting with an exhaustive listing and definition of all the independent attributes or parameters involved in the problem space. The value or values that each parameter may take are also listed. These values may be numerical, categorical or simple qualitative descriptions. Hence, if there are n parameters in the problem space, and each parameter pi (i = 1…n) is characterized by an independent and irreducible number of values ki, then these n parameters will generate an n-dimensional ‘possibility space’ (Voros 2009). This possibility space incorporates all possible combinations of parameters and their corresponding values. Voros represents the total number of independently possible combinations as the product of all ki values (Voros 2009):

N = k1 × k2 × … × kn

The mathematical notation does not denote a quantitative process; rather, it represents a combinatorial model of non-quantified entities. The formula gives the total number of theoretically possible configurations for the given parameters and dimensions. However, not all of the configurations attained will be feasible or count toward a meaningful possibility. Hence, a reduction step is carried out next to remove any anomalies and leave in the ‘solution space’ only those possibilities which can be considered consistent in practice. Voros opines that the process of graphically laying out sequences of the potential future evolution of configurations is perhaps the most powerfully intuitive aspect of morphological analysis (Voros 2009). An example of such a graphical representation is the so-called “Zwicky Box” (Ritchey 1998).
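The enumeration-and-reduction process just described can be sketched in a few lines. The parameters, their values, and the cross-consistency judgment below are invented for illustration; in a real study these would come from the analysts' subjective assessment of each pair of manifestations.

```python
# Illustrative morphological field: 3 parameters with 2 values each, so the
# possibility space holds 2 * 2 * 2 = 8 configurations. A cross-consistency
# check then filters out contradictory configurations, leaving the
# 'solution space'.

from itertools import product

field = {                                  # n parameters, each with k_i values
    "economy":    ["growth", "recession"],
    "regulation": ["strict", "lax"],
    "adoption":   ["mass market", "niche"],
}

# Hypothetical pair of values judged mutually inconsistent by the analysts.
inconsistent = {("recession", "mass market")}

def consistent(config):
    """A configuration survives if no pair of its values is inconsistent."""
    return not any((a, b) in inconsistent or (b, a) in inconsistent
                   for a in config for b in config)

space = list(product(*field.values()))     # the full possibility space
solutions = [c for c in space if consistent(c)]
print(len(space), len(solutions))          # 8 total, 6 survive the reduction
```

The product of the k_i values (here 2 × 2 × 2) reproduces the combinatorial count given above, and the filtering step is the non-quantified 'reduction' that yields the solution space.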

Strengths & Weaknesses:

Strengths: A morphological analysis often appears confusing at the onset of the process. However, in cross-functional or multidisciplinary studies, it can help initiate the investigation by objectively collecting the parameters of a system and their respective variations. It can also be carried out without a large amount of data, and even for non-quantified problems. Nor does it require a lot of time to execute and analyze.

Starting an exploration with morphological analysis, or even preceding a study such as Delphi with a morphological analysis, can provide more structured questions for the Delphi discussion. As such, its main power is to provide a systematic starting framework for an investigation (Wissema 1976).

Weaknesses: The most limiting aspect of the procedure of morphological analysis is that there is no scientific or systematic way of establishing the fundamental functions, parameters or elements of the subject being studied (Álvarez and Ritchey 2015a).

While morphological analysis, as defined by its practitioners, acts as an idea-generation framework through the systematic evaluation of a system’s components, each of these ideas further requires study, supplemented by another method such as Delphi or scenario projection, to fully define the concept.

Finally, like most interpretative methods, a successful morphological analysis requires the team executing the analysis to stay as objective, focused and unbiased as possible. Each proposal must be evaluated on its factual merits and related objectively to the analysis. The approach taken by the participants can greatly impact the outcomes.

Conclusion:

Morphological analysis is a powerful interpretative tool that helps in communicating the explicit and implicit aspects of a problem space. It does not necessarily rely on the availability of past data, unlike regression or time series methods, but it is applicable only to the early part of a qualitative forecasting inquiry. A morphological analysis is most valuable when used in combination with other methods. With the development of multiple morphological analysis software packages, the time taken to carry out the analysis has also been reduced considerably.


C. METHODS OF INTERVENTION

Interventional methods20 go beyond the observation and estimation of trends. They refer back to the discussion of different scenarios at the beginning of this chapter (Paap and Katz 2004), where changes in the overall environment give rise to new needs. Thus, we interpret interventional methods as those that help estimate a change in existing situations due to external interference. A variety of methods can help estimate the conspicuous impact of changes happening around us which may lead to new needs. Adhering to the focus on changes in consumer preferences and needs, we review Delphi, ethnography, surveying, and design thinking methods in detail here.

Delphi

Method Introduction:

One of the co-developers of the method, Olaf Helmer, defines the Delphi method as using informed expert opinion, gathered, analyzed and fed back through a series of prudently created questionnaires, until a converged opinion of the future is reached (Helmer 1967).

History:

The Delphi method was developed by members of the RAND Corporation, a think tank focused on futures research, between the 1950s and 1960s (Helmer and Rescher 1959). Some key members identified with developing the initial method were Olaf Helmer, Nicholas Rescher and Norman Dalkey (Helmer and Rescher 1959). The intent was to gather a legitimate and holistic fore-view of the impact of technological developments on warfare during

20 In this discussion, Interventional Methods do not refer to medical or clinical studies involving experiments with varying interventions.

89

the initial Cold War years (RAND 2017). The RAND team focused on extracting true expert consensus through unbiased debates without the impediments of personality clashes (Helmer and Rescher 1959). The anonymity of the respondents is a key aspect- their responses are not attributed to them by name. The other important factor is feedback and synthesis. In order to do away with the influence of oratory and pedagogy, the Delphi analysts go through the reasons provided by the participants for extreme opinions and evaluate them based on equal weight and feed it back to the panel for further analysis.

Applications:

The Delphi method has been a popular tool for forecasting and decision-making based on experts’ opinions since its inception in the late 1950s (Okoli and Pawlowski 2004a). The initial Delphi studies and reports assessed the inclination of long-term trends in areas such as science and technology, space research, population growth, war prevention, and defense mechanisms (Theodore J. Gordon and Helmer 1964). In subsequent decades the research areas diversified, and the Delphi method was implemented for more sophisticated applications such as robotics, the spread of the World Wide Web, and digital connectivity. The method was later increasingly adopted for questions about public policy, economic trends, technology in education, healthcare, and business strategy.

With the advent of Real-Time Delphi, many forecasting projects are now conducted in real time, reducing the time taken to carry out multiple questionnaire rounds (T. Gordon and Pease 2006). The TechCast Project (“Techcast” 2017) is one such platform, which uses a panel of hundreds of experts situated across the world to collaborate on real-time forecasting studies in all fields of science and technology.


In a detailed study on the expansion of e-commerce in Sub-Saharan Africa, Okoli and Pawlowski identified factors affecting its dissemination and illustrated how a strategic design choice of the method can provide particularly insightful results (Okoli and Pawlowski 2004b). Recently, we see wider acceptance of the Delphi method in the area of advanced healthcare. Dmitry Khodyakov, a senior sociologist at the RAND Corporation, in a recent paper on Online Modified Delphi (OMD), illustrates how this modern-day approach can be used to engage much larger and more diverse panels of experts and stakeholders in developing health services performance measures for advanced research on arthritis (Khodyakov et al. 2016). Such approaches stimulate interest in, and the credibility of, the Delphi approach in the important areas of human well-being and the future.

Method Description:

Conducting a Delphi study begins with identifying the subject area and scope of research; the next step is to identify pertinent subject-matter experts. The identification and selection of experts is crucial to the quality of responses received as part of the inquiry.

During the initial contact, the selected Delphi experts are briefed about the topic of inquiry and invited to participate. The statements made by the panelists are not attributed to them by name when the survey outcomes are fed back to the panel, so that panelists can express honest opinions to the best of their knowledge. The researchers conducting the inquiry prepare the questions. The overall scope of the inquiry is broken into phases and corresponding sets of sequential questionnaires. The first questionnaire addresses the first phase of the inquiry, informing the subsequent ones. In a second questionnaire, the next set of topics is presented to the group, and so on. After each round, analysts or researchers assess the range of responses. They may find that most responses fall within a range of possibilities, while others represent extreme opinions. In such a case, the participants may be asked to either revisit their response or validate it through other means. The reasons thus sought for extreme opinions inform the synthesis of the next set of questions. The extreme opinions are then presented to the panel along with the general judgment reached by the group. In view of what is presented, the panel may revisit or refute their opinions, followed by a final round in which a convergent assessment is presented as the conclusion. Even if participants do not converge toward one opinion, the reasons for extreme responses are made explicit before the expert panel. This is valuable to the researchers conducting the study, as it gives them further ideas to test and validate against.

Since the Delphi method represents the synthesis of the opinion of a particular group of experts, its results cannot be deemed statistically significant, as the number of respondents in the expert panel is typically small. Yet it provides the researchers with rich expert knowledge in the domain, along with the points of divergent opinion and the opportunity to investigate them further.
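Purely as an illustration, the analysts' round-by-round synthesis described above can be sketched in code. The panel, the numeric estimates, and the convergence threshold below are all hypothetical: a real Delphi inquiry works with qualitative reasoning rather than bare numbers, and this sketch only mirrors the mechanics of flagging extreme opinions and checking for convergence.

```python
import statistics

def delphi_round(estimates, tolerance=0.10):
    """Summarize one Delphi round: the panel median, the panelists
    holding 'extreme' opinions (outside the interquartile range),
    and whether the round has converged."""
    values = sorted(estimates.values())
    median = statistics.median(values)
    quartiles = statistics.quantiles(values, n=4)  # [Q1, Q2, Q3]
    q1, q3 = quartiles[0], quartiles[2]
    extremes = {name: v for name, v in estimates.items() if v < q1 or v > q3}
    # Convergence: interquartile spread is small relative to the median.
    converged = (q3 - q1) <= tolerance * abs(median)
    return median, extremes, converged

# Round 1: anonymous numeric forecasts from a hypothetical five-expert panel.
round1 = {"A": 12, "B": 15, "C": 14, "D": 30, "E": 13}
median, extremes, converged = delphi_round(round1)
# Analysts would now feed the median, together with the reasons given
# for the extreme responses (here panelists A and D), back to the
# panel for the next questionnaire round.
```

In an actual study the feedback consists of the panelists' anonymized reasoning, not merely the statistics; the numbers here stand in only for the flag-and-iterate loop.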

Strengths & Weaknesses:

Strengths: The Delphi method’s key strength is its ability to objectively discuss questions that are, for various reasons, beyond factual inquiry (Theodore Jay Gordon 1992). While the method provides an opportunity to systematically process expert knowledge and opinion, the presence of conflicting responses, and the rationalizations offered for such responses, is an extremely valuable product of the study.

Weaknesses: Delphi studies need meticulous planning in order to be performed properly. As stated earlier, the selection of participants, anonymity, and feedback synthesis are crucial to obtaining good results. Selection of participants is a critical aspect of the approach: a panel lacking representation of relevant experts can misguide the whole study. Since the method requires multiple rounds of surveys followed by synthesis and feedback, the process can be very time-consuming. As a result, participants may drop out after a certain point or lose interest. Conventionally, each survey round required three weeks, making a three-round Delphi a three- to four-month affair.

Woudenberg evaluated the Delphi method based on a review of its quantitative applications and found that the superior accuracy of the Delphi method over other methods could not be proven (Woudenberg 1991). His research stated one of the glaring critiques of the method: that often, in reality, the consensus among experts is a result of pressure to conform to the larger general opinion, and is further statistically mediated by analysts before being fed back to the group. This emphasizes the importance of maintaining the anonymity of participants, unbiased recording of opinions, and an unbiased feedback loop for reliable and truthful results.

Conclusion:

The Delphi method is a versatile tool for decision-makers and continues to be used for forecasting and directional insights. When the method has been evaluated for accuracy and compared with techniques of similar methodology, no decisive findings could be reached either in favor of or against it (Landeta 2006). Nonetheless, interest in the technique and its popularity continue, and advancements such as online and real-time modes of conducting the Delphi have addressed some of the criticism around it. In the last five years, there has been even greater evidence of the technique being employed in the fields of business, economics, psychology, and the health sciences. The Delphi method can also be combined with other synthetic methods such as cross-impact analysis (Helmer 1977). It may thus be concluded that the Delphi method is an interpretative method for long-range decision making that greatly depends on the inherent knowledge and biases of the experts participating in the process.

Ethnography

Method Introduction:

Ethnography is a systematic and comprehensive method of social and cultural research (Hymes 1977). Michael Rosen defines ethnography as “the method of participant-observation, indicating that a researcher is directly involved in the life of a social group, collecting data through various forms of observation, interviewing, memo and file reading” (Rosen 1991). Ethnography entails the collection and analysis of data, through which the culture and behavior of a social group is interpreted and conveyed to a larger audience (Akins and Beschner 1980). Ethnographers are trained in observing, recording, and analyzing the information collected (Akins and Beschner 1980) via observations of and interactions with the group over a period of time (Hammersley and Atkinson 2007). The understanding and interpretations from an ethnographic study may be documented and presented in the form of writings, images, audio, or video presentations (Rosen 1991).

History:

Ethnography as a method has its roots in the discipline of social and cultural anthropology (Hoey 2014). Han F. Vermeulen,21 a Dutch historian specializing in the history of anthropology during the German and Russian Enlightenment, traces the origin and development of ethnography to the 18th century (Han F. Vermeulen 1999). His work throws light on the development of ethnography as part of the research of German-speaking scholars at the Russian Academy of Sciences (Hendrik Frederik Vermeulen 2008).

21 Han F. Vermeulen’s research is credited with establishing the emergence of ethnography and ethnology in the 18th century during the German and Russian Enlightenment, far before the development of social and cultural anthropology as subjects in the UK and USA in the 19th century. He is currently associated with the Max Planck Institute for Anthropological Research in Halle and the Max Planck Institute for the History of Science in Berlin.

Ethnography emerged as a discipline due to (a) the scholarly interest of German scholars such as Messerschmidt, Müller, Gmelin, Steller, and Fischer (Hendrik Frederik Vermeulen 2008) and (b) the Russian empire’s agenda to acquire more in-depth knowledge of the people living in its multicultural and expanding territories (especially diverse Siberia) for tax and legal purposes. The combination of these factors, together with the fact that ethnography was based on empirical observations, led to ethnography being adopted as the new scientific practice in Russia for describing peoples during the first half of the 18th century (Hendrik Frederik Vermeulen 2008). The Russian authorities hired scholars to foray into the empire and collect information on its culture and nature, developing ethnography as a comprehensive and descriptive study focusing on people.

Historians Schlözer, Kollár, and Müller at the University of Göttingen spearheaded the formulation of ethnography as the science of peoples. Schlözer and his colleague Gatterer are credited with first incorporating ‘Ethnographie’ and ‘Völkerkunde’ (ethnology) into the mainstream study of history (Han F. Vermeulen, Darnell, and Murray 2015).

Applications:



The term ‘ethnography’ embodies a two-fold meaning. Literally, “Ethnos” means people and “graphein” means writing, thereby implying “writing about people.” The practice of ethnography is the qualitative process of studying the surroundings and the inhabitants through participant-observation and interviews (Akins and Beschner 1980). The term “ethnography” is used to refer to the product of an ethnographic study (Hoey 2014).

Since ethnography depends on systematically understanding the context and possible issues from within the cultural environment, rather than from an outside and distant perspective, it has been utilized by researchers in a broad variety of fields. It has been applied in extremely niche environments to better understand challenges within those environments, and it has also been utilized to understand the cultural contexts of entire tribes.

Researchers at Hitachi (Kashimura et al. 2014) applied an ethnographic research methodology in the area of construction management systems. Ethnography enabled the company to understand work practices in the field and highlight the inherent challenges, which led the researchers to develop next-generation construction systems for power plants. The main objective of the ethnographic research was to uncover the cause of delays in the delivery of pipes, which were one of the key components in the construction of the power plant (Kashimura et al. 2014).

The ethnographers assessed the processes at the pipe manufacturing factory (Kashimura et al. 2014) by visiting the factory for four days. Three ethnographers joined the factory workers, observed the facilities for 2-3 hours, and conducted 90 minutes of interviews each day (Kashimura et al. 2014). Working alongside the regular workers, the ethnographers were able to closely observe processes and practices that would be hard for outsiders to observe.

Participant-observation carried out in such a manner enabled the ethnographers to identify the hidden needs of the staff and the process issues (Kashimura et al. 2014). As a result of the ethnographic study, several issues were brought to light, such as the handling methods used by the workers, project schedules being updated once every two weeks when they should have been updated daily, and an overall lack of collaboration between the different manufacturing units (Kashimura et al. 2014). By sharing the results of the ethnographic study with the workers, the company was able to reduce delivery delays from 17% to 1% (Kashimura et al. 2014).

Within business and management, one of the areas where the ethnographic method has been used successfully is organizational ethnography (Eriksson and Kovalainen 2015), including occupational careers and bureaucracies. Organizational theorist Van Maanen describes the purpose of organizational ethnography as “to uncover and explicate the ways in which people in particular work settings come to understand, account for, take action, and otherwise manage their day-to-day situation” (Van Maanen 1979). Rosen states that organizational ethnography focuses on studying groups of people who are organized in social relations towards the fulfillment of specific objectives (Rosen 1991). Rosen further notes that while the principles and procedures of organizational ethnography are aligned with those of general ethnography, it differs in the sense that the people being studied interact with each other under the specific premise of a goal-oriented activity (Rosen 1991). The ethnographers in this case situate themselves within an organization comprising people like themselves, rather than a distinct social, cultural, or religious group (Rosen 1991). An ethnographic study is conducted when the problem to be studied can best be investigated by the method of ethnography and conveyed through its documentation. By situating themselves within the organizational group, ethnographers can closely witness and gather data about day-to-day behaviors and the meanings of the interactions between people, which would otherwise stay hidden (Rosen 1991).

In the area of healthcare, ethnographers participating as a “fly on the wall” have evaluated the functioning of healthcare systems and the work practices in epidemiology, genetics, and biotechnology, among other areas in the medical and healthcare domain (Caprara and Landim 2008). The genre of ethnography in healthcare came about during the second half of the 20th century, as ethnographers increasingly worked on creating an understanding of the cultural beliefs that shape healthcare practices (Higginbottom, Boadu, and Pillay 2013). Higginbottom et al., referring to the use of ethnography in the field of healthcare, state that ethnographers have often focused on determining the cultural beliefs and practices around an illness rather than studying the group of people involved (such as patients and practitioners) (Higginbottom, Boadu, and Pillay 2013).

Hodgson, for example, utilizes ethnography to study the cultural influences around HIV and what they mean to healthcare workers (Hodgson 2000). In his study, Hodgson explains the importance of ethnography, particularly in the case of multi-factorial and multi-cultural diseases (Hodgson 2000). Savage notes that the emphasis ethnography places on ‘context’ makes it particularly relevant for the field of healthcare (Savage 2006). Due to this emphasis on context, the findings from ethnography can be more meaningful and applicable to the practice of healthcare (Higginbottom, Boadu, and Pillay 2013).

In addition to creating understanding about the cultural perceptions related to health issues, ethnography has been utilized in researching the experience of illness (Nichter 1987), exploring patient experiences at mental health institutes (Goffman 1961), and recounting the change in the self-perception of patients with terminal illnesses and the ethical practices in such cases (Lawton 2001).


Corporate ethnography, with a focus on commercial applications, has led to multiple innovations since the first half of the 20th century by uncovering unknown or unmet needs. As a result of 3M engineer Richard Drew’s observation of the process of dual-tone paint application on automobiles, he came up with the concept of ‘masking tape’ in the 1920s (Schlack 2015). While such examples of studies in corporate organizations may not strictly adhere to the scholarly definition of ethnography, as the individuals engaging in such studies are members of the internal workforce rather than external ethnographers, they do present innovation scenarios within companies. Companies encourage employees to identify opportunities for innovation and improvisation with regard to processes, people, and technologies. Over the years, corporate ethnography has evolved from being a method to initiate product innovation in large corporations to also being utilized to inform long-term and strategic planning (Anderson 2009). Intel leveraged ethnography to explore new avenues and markets for its products. A detailed and strategic ethnographic study in 1995 revealed that there existed a huge untapped market for Intel products outside of their traditional market of offices and workplaces: that of processors for home and personal use (Anderson 2009). In more recent strategic applications, Intel has used ethnographic study to understand consumer behavior and to gain insights into hidden needs pertaining to TV, personal computer, and smartphone usage, which even surveys and data analysis cannot provide (Anderson 2009).

Over recent decades the method of ethnography has expanded from use in physical environments to virtual ones. It is now being used to understand online communities, including mobile users and the user-centered experience of software applications. The practice of ethnographic research through virtual participant-observation on the web is also termed “netnography” or “webethnography” (Prior and Miller 2010). The possibility of observing practices in the virtual world has widened the scope of application of ethnography, especially in areas and conditions where physical presence is either not possible or not feasible.

Another scope for online ethnography is understanding the shift in digital consumer patterns, preferences, behavior, and decision making through online forums, responses to online campaigns, blogs, etc. (Kozinets 2002). The research agency “C Space” ventured into conducting ethnographic studies through mobile phones in 2010 (Schlack 2015). This was a fresh approach to ethnography, not only because of the medium used to collect “fieldwork” information but also because it was crowdsourced: real clients sent in files of images, sights, sounds, and experiences of how products were being used in real life, enabling researchers and innovation strategists to immerse themselves in the environment where customers were making decisions.

The realm of software applications readily leverages the tactics of ethnography to improve the user experience of its products, applications, and interfaces. Scott Stiner, whose company produces software solutions, relies heavily on ethnographic research to glean his clients’ needs and understand their day-to-day business environments and goals (Stiner 2016).

Similar to ethnography, cognitive ethnography also involves learning through participant-observation by immersing oneself in the natural environment being studied, but with a specific focus on cognitive processes and context (Dubbels 2014). Thus, cognitive ethnography bridges the dichotomy between the disciplines of anthropology and psychology (Ball and Ormerod 2000). Edwin Hutchins, often called the father of cognitive ethnography, emphasizes the distributed cognitive interactions between different contexts and environments (Hutchins 2000). Hutchins, a pioneer in the field of distributed cognition, laid down the ethnographic approach to understanding instruction in real-world situations (Hutchins 1995). His work “Cognition in the Wild” proposed an approach to cognitive systems suggesting that a cognitive system may be composed of multiple members among whom cognition is distributed (Hutchins 1995). Citing the difference between ethnography and cognitive ethnography, Williams states that rather than focusing on the meanings that a social group creates, cognitive ethnography focuses on how those meanings are created (Williams 2006).

Designers and design thinkers striving for breakthrough innovations have also been relying on ethnographic methods, especially in the past couple of decades with the advent of human-computer interface design (Beckman and Barry 2007). The process of innovating involves not only a deep understanding of the usage context but also a broad view of the entire system in which a new product or solution will be situated (Beckman and Barry 2007). Ethnography has been increasingly called upon to facilitate participatory design that includes humans and computers within the same workspace (Beckman and Barry 2007). Within traditional consumer research, which requires the designer to understand how users engage with a product and where opportunities for innovation lie, ethnography provides critical value by decoding human experiences (Beckman and Barry 2007). Designers can use ethnography not only to explore how users interact with the product, but also to understand the meanings and intentions behind those interactions (Beckman and Barry 2007). By uncovering what underlies the meanings and intentions gathered by the ethnographers, designers can think strategically about meaningful innovations.


Method Description:

As stated earlier, the main aim of ethnography is to provide holistic insights about the environment of interest, its inhabitants, and their practices through detailed observation and documentation (Hoey 2014). In-person, first-hand participant-observation is the preferred means of observation, while remote observations are also carried out whenever appropriate. Both formal interviews and informal interactions enable the ethnographer to become steeped in the cultural environment of the study (Akins and Beschner 1980). Ethnographers have relied on participant-observation as it allows them to actively participate in the setting alongside the original inhabitants and observe interactions, rituals, and actions (DeWalt and DeWalt 2011).

Researchers may acquire a temporary role to enable interactions with the community and people they are observing, but Li warns against covert observation and points to the ethical use of participant-observation (Li 2008). The technique of participant-observation allows the ethnographer to portray an inside perspective of the environment and people being studied (Emerson, Fretz, and Shaw 2001), but the fact that it is one individual’s perception is a source of bias (Li 2008). Participant-observation allows the ethnographer to create in-depth field notes and even conduct detailed interviews, but the ethnographer’s presence may cause the behavior of the native inhabitants to change (DeWalt and DeWalt 2011). It must also be noted that the data collected during participant-observation is mostly qualitative in nature and may not yield categorical quantitative values (Emerson, Fretz, and Shaw 2001). A seasoned ethnographer will employ a combination of techniques to create a holistic view of the study and minimize biases. The ultimate purpose of an ethnographic study is to highlight otherwise undetected issues, which either inform the current social understanding of the culture or environment in question or help develop design solutions addressing those issues. Methodologically, ethnographers work towards collecting, analyzing, and then triangulating the data to derive frameworks or themes (Reeves, Kuper, and David Hodges 2008). Conducting fieldwork and documenting detailed observations through field notes are critical components of an ethnographic study.

Before initiating an ethnography, the research group must identify the group or environment to be studied based on social, geographic or other pertinent demographic factors.

“Fieldwork” is the primary and most crucial component of the ethnographic study. From immersing oneself in the native environment and establishing an understanding or relationship with the subjects while being transparent about the objective of the study, to identifying and expressing in words the sensorial experiences related to the environment, fieldwork incorporates all such activities, documented with the use of fieldnotes. Fieldnotes keep a record of the ethnographer’s observations of the interactions between subjects, of dialogues and conversations as part of an interview or otherwise, of lifestyle, and of information on people and personas (Hoey 2014). Other data may also be collected in the form of images, audio/video recordings, material samples, etc.

The data collected through fieldnotes, media recordings, and any additional empirical records are then sorted, analyzed, and triangulated to generate themes, patterns, and tentative theoretical explanations of the practices and actions observed in the study (Reeves, Kuper, and David Hodges 2008). The insights and deductions from ethnography are utilized in a wide variety of fields, as explained above. In addition to the social sciences and cultural studies, healthcare, game studies, multimedia applications, security, and now virtual reality applications are some of the fields benefitting from ethnography, which provides a reliable base for innovative and strategic design thinking.

Strengths & Weaknesses:

Strengths: An ethnographer’s in-situ presence and observation of the environment and the participants being studied allow them to closely analyze interactions, behavior, and patterns (Weston 2013). This can enable them to highlight even unexpected issues, or issues otherwise difficult to uncover through surveys and similar instruments.

Another aspect of ethnography is the documentation produced, including the narrative referred to as “the ethnography.” These are detailed and descriptive documents presenting an opportunity to scrutinize underlying behavioral and emotional aspects of the subjects. Such documentation helps in applying appropriate context and perspective to a highlighted issue.

Thus, when researchers and strategists propose solutions, there is a higher likelihood of such solutions being adaptable to the native environment (Takegami et al. 2014). The ethnography is rich in detail, providing a reflexive account (Stoller 2015), but it is mandatory that ethnographers create the ethnographic account in line with appropriate ethics and transparency guidelines.

Weaknesses: One of the main challenges of an ethnographic study is that it may require the researchers to spend extended periods of time in the native environment, studying and, in some cases, also interviewing the users. The documentation produced as an outcome of the study is also expected to be contextually rich and descriptive; it might require lengthy transcriptions of user interactions and analysis of collected data as well. Hence, preparing the final ethnographic product also takes time.


The duration of the study may not always and necessarily be long. However, ethnographers have observed that during a shorter study users may still act under some pretense, inhibiting the researchers from truly observing their natural behavior and practices (Weston 2013). Qualitative studies involving surveys and direct questionnaires may not be exhaustive in terms of the questions asked, and respondents may deliberately or unknowingly omit something of importance. This can prevent a true and in-depth understanding of the environment and context, and lead to a missed opportunity to unearth an important insight. Hence, a seasoned ethnographer knows to note not only what the subjects say but, more importantly, to pay attention to what they do in different situations without being prompted by the researcher.

One of the keys to avoiding biases and following ethical and transparent practices in capturing ethnographic information is choosing an ethnographer with the right experience and bent of mind to carry out such studies. As this is a subjective research method, the perspective of the ethnographer can greatly influence the future design of the product or solution proposed for that environment. Hence, it is critical that researchers do not misrepresent the findings and that the findings do not get “lost in translation.”

Conclusion:

Ethnography, through the use of participant-observation, helps identify and understand the origin of a design problem. It brings to light the real user-centered domain where the problem is situated, and the practices, processes, expectations, and context involved. Ethnographers use participant-observation to see the natural environment of the specific group through the eyes of its people, to participate in their culture and their day-to-day life, and to empathize with their issues and challenges by spending extended time with them (Beckman and Barry 2007). When trying to understand the experience of a user, customer, or worker, ethnographers can take on their role and go through the same course. In some cases, when it is difficult to place oneself in the natural setting, or possible only for a short period of time, other techniques such as shadowing a natural inhabitant, contextual inquiries, ‘a-day-in-the-life-of’ accounts, formal interviews and informal conversations, and video or audio recordings can help ethnographers gain valuable insights (Beckman and Barry 2007). Because of its reflexive nature (Reeves, Kuper, and David Hodges 2008), ethnography is ideal for the stages that revolve around problem inquiry, ideation, and conceptualization that may point to a future design outcome.

The most critical aspect of an ethnographic study is the choice and selection of the ethnographer. Even if there is a team involved, the primary responsibility of scoping, designing, conducting, documenting, and analyzing the study lies with the principal ethnographer/s. Hence, it is essential that the ethnographers have the required skill and experience in the domain of the study. The ethnographer also must be fair and conduct the study ethically and with transparency.

Ethnographic research takes time. It requires patience, empathy, and honesty on the part of researchers to immerse themselves in the study environment. Ethnography is inductive and reflexive in nature (Reeves, Kuper, and David Hodges 2008). The relationship that develops between the researcher and the subjects is crucial in decrypting the internal systems and processes of the environment (Stoller 2015). Many complex social issues warrant the deep, rich narrative that diligently conducted ethnography can provide, and they benefit from its interpersonal account of situations and interactions.


Design Thinking

Method Introduction:

Design thinking is a problem-solving method. It provides a systematic methodology for innovation and new product development. Designers and strategists rely on design thinking to solve complex problems through the synthesis of desirable solutions for a preferred future. The founder of the design consultancy “Creativity at Work” writes that “Design Thinking draws upon logic, imagination, intuition, and systemic reasoning, to explore possibilities of desired outcomes that benefit the end user” (Naiman 2016).

History:

Design thinking emerged from the need to combine human, technological, and strategic needs in the post-World War II era, when the methodological and scientific application of design could better people’s lives and address everyday problems (Dam and Siang 2017).

In his 1992 article “Wicked Problems in Design Thinking”, Richard Buchanan shed light on the epistemological origin and evolution of the concept of design thinking (Buchanan 1992). He traces the development of the sciences since the Renaissance, and their formalization over time into distinct disciplines. Buchanan stated that the emergence of design thinking was in fact the convergence of these distinct scientific disciplines into an integrated problem-solving practice (Buchanan 1992). In the early 1960s, referring to the integration of design practice with the rational sciences,22

22 Gaston Milhaud explains Rational Sciences as an attempt to explain and provide reasoning for things (Milhaud 2006).


Buckminster Fuller called the 1960s the decade of “design science” (Cross 2001). During the mid-1960s, another development put design thinking and its significance for problem-solving in perspective. With a focus on the application and scope of design thinking, the term

“wicked problems”, referring to multi-dimensional and complex problems, was coined by Horst Rittel (Rittel and Webber 1973). Design theorists believed that, due to the multi-faceted nature of wicked problems, design thinking is uniquely apt to resolve them (Dam and Siang 2017). From radical technologists and design theorists in the 1960s to the computer and cognitive scientists of the 1970s, more and more intellectuals began establishing design as a scientific method. Following Herbert Simon’s pioneering work on artificial intelligence, Robert McKim’s work (McKim 1972) showed that integrating what were called the right- and left-brain modes of thinking can best tackle problem solving, instituting various aspects of design thinking.

Developing over the years, design thinking was formalized as a structured discipline for problem-solving. In 1987, Peter Rowe’s book “Design Thinking” laid out a framework for a design-thinking-led problem-solving process. The book illustrated the process of “inquiry” carried out by architectural designers in finding creative solutions to practical problems (Rowe 1991).

With the merger of two design firms, David Kelley Designs and ID Two, the new design firm IDEO was formed in 1991, with a path-breaking approach to creative problem-solving in the areas of business, organization, and electronic product design. Over the next ten years, the design projects undertaken by IDEO ranged from healthcare and organizational structuring to the innovation of legacy models and alternative learning experiences (T. Brown 2009).

Providing successful solutions for such projects gave IDEO the reputation of being not just a design agency for consumer products, but one that could design and innovate holistic consumer experiences (T. Brown and Wyatt 2010).

Internally, the new nature of these design assignments came to be referred to as “design with a small d” (T. Brown and Wyatt 2010). David Kelley, one of the founding partners of IDEO and founder of Stanford University’s d.school, often ended up using the word “thinking” to explain the nature of these distinguished design efforts. IDEO has since been famously regarded as one of the companies credited with bringing the practice of “Design Thinking” to mainstream design and innovation.

Applications:

In 1992, when Richard Buchanan wrote about “wicked” and complex multi-dimensional problems in the area of design thinking, he referenced four established areas that warranted design solutions (Buchanan 1992): a) visual communications design, b) product design, c) organized services, and d) complex systems or lifestyle environments.

The first area, visual communications and semiotics, dealt with graphic and typography design in the fields of publishing and advertising. Over the years, this began to include the communication of ideas in creative fields such as photography and filmmaking, and even contemporary desktop computer displays (Buchanan 1992). The second area, product design, targeted the conventional theories and principles for the form and visual appearance of material objects of everyday use, such as domestic appliances, apparel, and vehicles.

Design thinking took this practice a notch further to address not just the appearance of products but also the physical, psychological, social, and cultural relationship between the product and the user (Buchanan 1992). Designing for organized services and activities was the third area highlighted by Buchanan. Service design was about designing processes that incorporate people and physical and logistical resources in efficient synchronization in order to achieve specified desired outcomes (Buchanan 1992). After visual communications, consumer products, and organized services and activities, the fourth area where design thinking played a significant role in addressing problems was that of environment and system design (Buchanan 1992). This fourth area cared for not only the design of actual spaces for different activities such as playing, learning, working, and living, but also the larger frameworks of cultural and ecological environments: their purpose, integration, and interaction with humans, catering for holistic experiences and functions.

While clearly marking out the contemporary design areas where design thinking can address problems and resolve creative bottlenecks, Buchanan also states that limiting design thinking to such areas risks keeping its benefits from being applied in various other scenarios and areas, such as business, organizational planning, information technology, education, healthcare, and even software development. Alan Hall describes how design thinking plays a role in establishing an organization’s culture, so that employees can better contribute towards the organization’s larger objectives while also growing individually (Hall 2012). Akin to organizational planning, businesses also need to streamline their overall processes in order to serve their customers better. Whether it is the creation and development of new products by understanding user needs, the marketing of those products, or the planning of future business goals, design thinking helps optimize overall business processes (T. Brown 2009).


The field of learning and education also offers various case studies where design thinking has enabled more effective learning. Koh et al. (2015), in their paper on design thinking and education, recommend moving away from the conventional education system in favor of one that embraces design thinking as part of the curriculum. They further explain that design thinking’s epistemological approach inculcates real-world problem-solving skills in students (Koh et al. 2015). Rivka Oxman introduced the pedagogical concept of think-maps (Oxman 2004) for teaching and learning based on design thinking. Think-maps lay out a conceptual framework that distills the lessons learned from the education imparted (Oxman 2004).

The tools of design thinking have helped in the field of healthcare as well. Medical schools and institutions are encouraging the adoption of design thinking not just to improve patient experience, but also to innovate in the realm of medical devices and apps. In 2014,

Sidney Kimmel Medical College in Philadelphia launched “JeffDESIGN”, the first design program within a medical school (Criscitelli and Goodwin 2017), with the objective of inculcating human-centered design thinking for solving medical and healthcare challenges. As part of this program, medical students gain hands-on experience rethinking medical devices, environments, and services by working with designers, architects, and technicians (Criscitelli and Goodwin 2017).

Social innovation is one of the typical areas where design thinking has been extensively applied and has successfully yielded holistic and sustainable solutions. Tim Brown is the CEO and president of the global design firm IDEO, which works with some of the world’s largest and most renowned brands and corporations to explore and innovate relevant design solutions (T. Brown and Wyatt 2010). Brown states that non-profit and social benefit organizations are adopting the design thinking approach more than ever before as a means to problem-solving, because it allows “high-impact solutions to bubble up from below rather than being imposed from the top” (T. Brown and Wyatt 2010).

The field of information technology and software design is also leveraging design thinking to account for user needs and experience. The design and development of software applications and systems warrants user empathy not only to draft a proof of concept but also to design and manage processes and workflows. The technology company Apple Inc. has been famous for integrating design thinking not only into its product and service design practices but also into its organizational structure (Gibbons 2016). In the domain of software development and programming, design thinking necessitates that a user-centered interface is designed first (R. C. Martin 2002). This implies that the design of a software product or application starts with the user experience of the user interface and works back to the design of the database – this is referred to as the “Dependency Inversion Principle” (R. C. Martin 2002).
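As a minimal sketch of the Dependency Inversion Principle described above (illustrative names only; this is one common interpretation, not code from R. C. Martin’s text), the user-facing layer defines the abstraction it needs, and the storage layer implements it, so that design can proceed from the user interface back to the database:

```python
from abc import ABC, abstractmethod

class PreferenceStore(ABC):
    """Abstraction owned by the user-facing layer; storage backends
    depend on this interface, rather than the UI depending on a database."""
    @abstractmethod
    def save(self, user: str, prefs: dict) -> None: ...
    @abstractmethod
    def load(self, user: str) -> dict: ...

class InMemoryStore(PreferenceStore):
    """Stand-in backend; a real database adapter would implement
    the same interface without the UI code changing."""
    def __init__(self) -> None:
        self._data: dict = {}
    def save(self, user: str, prefs: dict) -> None:
        self._data[user] = prefs
    def load(self, user: str) -> dict:
        return self._data.get(user, {})

class SettingsScreen:
    """High-level, user-facing component: depends only on the abstraction."""
    def __init__(self, store: PreferenceStore) -> None:
        self._store = store
    def apply_theme(self, user: str, theme: str) -> None:
        prefs = self._store.load(user)
        prefs["theme"] = theme
        self._store.save(user, prefs)

store = InMemoryStore()
screen = SettingsScreen(store)
screen.apply_theme("ada", "dark")
print(store.load("ada"))  # {'theme': 'dark'}
```

Because the dependency points from the storage layer toward the interface that the user-facing design requires, the database can be designed, or swapped, last.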

Method description:

Over the decades, design thinking has created its identity as the “scientised” design practice (Cross 2001). Even though the design firm IDEO and David Kelley have been credited with popularizing design thinking (T. Brown 2009), it has existed for much longer. In Design Thinking 101 (Gibbons 2016), Nielsen Norman Group’s Sarah Gibbons writes about the “learning by doing” approach adopted by the designers Charles and Ray Eames


to design their famous “Eames chairs”. The Eames’ practice itself has roots in earlier approaches such as the Bauhaus and the Arts and Crafts movement (Neuhart et al. 1989).

Design thinking is a systematic approach to solving real-world problems through a hands-on and user-centric methodology utilizing actual customer insights. Due to its ability to innovate actual consumer products from the early stages, it has been widely adopted in business (T. Brown 2009). The Nielsen Norman Group outlines the process of design thinking in six steps:

• Empathize – the researchers should place themselves in the users’ position to be able

to understand their perspective and their challenges. Empathizing with the target users

allows for better research and information gathering (Gibbons 2016).

• Define – Articulating the challenge correctly sets the path to finding the solution for

the users’ problems. The research and information gathering should serve the purpose

of clearly articulating the problem and the opportunity for finding an innovative

solution.

• Ideate – A free and unrestricted attempt to brainstorm all possible ways to address the

users’ problem using creative problem solving that leverages collaborative thinking is

the most crucial aspect of design thinking.

• Prototype – The brainstormed ideas can then be collected and refined to generate a

prototype (or more than one if needed) with the desired degree of fidelity, depending

on the time and resources available.

• Test – Once a prototype, a proof of concept or a minimum viable product is ready, it

must be tested with real users. The way users interact with, use, or experience the prototype must be carefully observed. Their feedback, both positive and negative, should be documented. In case of shortcomings, the designers can go back to the drawing board to work on improved iterations, retaining the features that worked. Testing and prototyping can be a cyclically iterative process until all aspects of users’ needs are addressed.

• Implement – The finalized solution or product vetted by the above process should

alleviate the challenges and problems and be able to impact the users’ lives positively.

Once implemented, the test of a good design, whether physical or conceptual, is that it should be naturally welcomed and adopted by the audience it was designed for (Alexander 1977). Tim Brown’s assessment of the role of design thinking in the creation of “good and successful design” is that design thinking combines a designer’s acumen and methods to meet users’ needs with a technologically feasible solution that creates value for the customer as well as for the business (T. Brown 2009). The legendary and visionary designer Dieter Rams gave the design community “ten principles” for defining “good design” (Rams 1980). Rams’ principles outline design that is innovative, aesthetic, purposeful, long-lasting, unobtrusive, and environmentally friendly (Rams 1980). He envisioned the changing role that design and designers need to play in society. Rams believed that while design has the ability to alter experiences, from small products to large cities, it must do so ethically and with moral values (Rams 1999). Rams stressed that the most important aspect of “good design” is the thought behind it, as such design embodies global consciousness and thoughtfulness toward society (Rams 1999). The


following statement by Rams conveys the power that design possesses, along with the responsibility that patrons of design share:

“only when the design is really understood and accepted by the whole population that a lasting rise in the quality of life can be achieved through design” (Rams 1999).

Through his constant experimentation with design and its larger concept, Rams understood very early on that when we as users demand a new design, we must ensure that it aligns with the larger goals of being environmentally friendly, human, and, most importantly, “future-protecting” (Rams 1999). Design thinking as a method must carry this understanding through all stages of problem-solving and design creation. Only then can the resulting design be considered valuable and responsible.

Business professor and strategy consultant Roger Martin identified three kinds of logic (R. L. Martin 2009) that design thinkers can leverage: deductive, inductive, and abductive. While deductive logic (inferring from previously established premises) and inductive logic (generalizing from several specific instances) promote thinking about what should be or what is (Dunne and Martin 2006), advocates of design thinking favor abductive logic because it encourages thinking about what might be (Dunne and Martin 2006).

Charles Peirce, known as the father of Pragmatism, emphasized that abduction is “the process of forming an explanatory hypothesis. It is the only logical operation which introduces any new idea” (Fann 2012). Where no existing knowledge or explanation of a phenomenon is available, abductive logic explains it by virtue of what is experienced or observed. It is this cognitive process that informs design thinking.


Strengths & Weaknesses

Strengths: Design thinking certainly offers benefits by virtue of its methodology. For instance, design thinkers employ a user-centric approach that focuses on understanding user struggles and needs by empathizing with the user. This creates a realistic account of the challenge. The process also places strong emphasis on testing prototypes until a satisfactory (even delightful) solution is reached. A design solution that is successfully accepted by its target consumers succeeds because it creates true value for the user, value that no other product or solution on the market has so far offered. The products or solutions thus created bring tremendous commercial benefits for businesses as well.

Design thinking also provides organizational opportunities for any company implementing the method. Cross-functional teams can apply collaborative and multi-disciplinary thinking, drawing on each other’s perspectives. It also gives organizations the opportunity to review their processes holistically, and it inculcates a problem-solving mindset among all participating groups.

Weaknesses: Despite its advantages for organizations, the implementation of design thinking is often challenged in businesses and companies. Organizational structures lacking cross-functional communication find it hard to embrace design thinking despite its advantages. Oftentimes, the leadership may not want to invest in or fund an initiative without a proven business model. Leadership focused on immediate business targets may lack the vision to understand the potential benefits of design thinking and hence may ignore or postpone such considerations. Apart from these practical challenges to adopting design thinking in business, other factors hinder its acceptance. A lack of awareness and education about the application of design thinking still prohibits many from using it, as they are unsure of the situations or problems where it is relevant. Corporations that claim to understand design thinking may ask their design teams to start the design process by benchmarking against competitors’ designs, whereas genuinely adopting design thinking ensures originality and innovation in the process, thereby addressing the design context. Sadly, many in-house design teams, under pressure from leadership, are forced to make up data to validate an improvisation approach rather than using data to predict opportunities for innovation.

Conclusion

Design thinking has the potential to create value by addressing the problems users face today. Thinking about the future needs of users requires analyzing past and present scenarios in addition to looking at other sources of information. The most critical aspect of design thinking that aligns with the DNA of any method involving thinking about the future is its abductive logic. Abductive reasoning helps us think about what might be, instead of deducing and inferring from past events only (Dunne and Martin 2006). Methodologically, observing and engaging with users helps us understand behaviors and attitudes. This information is critical not just for the design thinking exercise itself but also as a baseline for understanding the future preferences of users. In some cases, the insights revealed by users, or through collaborative thinking sessions that are part of design thinking exercises, may lay the foundation of future innovation.


While many perceive design thinking as an ambiguous method, and there may be concerns about its subjective process and scalability, it aligns closely with thinking about future solutions and forecasting in its insight-gathering approach (Roumiantseva 2016).

Some other methods of qualitative research

Some other methods of qualitative study are so frequently and ubiquitously used that they no longer need formal description; these include surveys via questionnaires, interviews, and focus groups.

Surveys and questionnaires – surveys are frequently used to gather information and are often distributed widely via different media channels, over social media, and even through journals. Hence, for the reader to place confidence in the results reported through surveys, it is important that researchers take into account the technical aspects and quality controls related to the generation and analysis of surveys.

A survey is simply the process of collecting data on a given topic or a certain situation (Ferber et al. 1980). In social science, however, a survey is synonymous with collecting information through a series of pre-set questions from a certain number of “sample” individuals. Ideally, surveys should be targeted towards a specific sample set who satisfy certain criteria, but today, when surveys can be conducted via social media, online polls, and email, strict control and vigilance over the participants may not always be possible. Surveys collecting information from a number of respondents use a pre-designed set of questions to gain insights into a specific situation. The sample size of the individuals being surveyed may vary from study to study based on the context and needs of the research and on the estimated total population (Ferber et al. 1980). While online surveys are becoming popular today, surveys are also conducted in person and/or over the phone. Surveys may be classified by their target sample, their method of data collection, or the type of information they hold (Ferber et al. 1980):

• Target sample: researchers create a survey sample from a specific group, community

or individuals satisfying set criteria, such as those belonging to a certain demographic

profile, owning or using certain products or services, engaged in a certain occupation

or profession etc.

• Method of data collection: surveys may be conducted in person or virtually. Surveys

can also be self-administered by participants through a common form and submitted

back to the researcher. Based on the same principle, online survey hosting services

like Surveygizmo, SurveyMonkey, eSurveysPro etc., have become popular. Many

organizations also offer surveys for customer feedback on their digital distribution

channels through third-party providers such as Foresee Feedback, Responster etc.

• Type of information: Surveys are powerful in that they can measure opinions as well as facts. Surveys are not only a great way to analyze the success or failure of past actions; based on the kind of questions asked, they can also provide a deeper look at people’s behavior and attitudes, and hence the likelihood of their future actions. From having people give closed-ended quantitative answers, choose between given options, or indicate levels on a scale, to inviting an entirely open-ended response, the information captured in surveys can be put to very different uses. Quantitative information collected about actions taken in the past can be used to postulate future “demand” in the case of commercial and other planning activities. Qualitative information gathered over a period of time can be utilized to identify trends specific to that area and interest.
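The sample-size consideration noted above can be made concrete with a standard calculation (here, Cochran’s formula with a finite-population correction; this particular formula is a common convention, not one prescribed by the source):

```python
import math

def sample_size(population: int, confidence_z: float = 1.96,
                margin_of_error: float = 0.05, proportion: float = 0.5) -> int:
    """Estimate a survey sample size using Cochran's formula,
    then apply the finite-population correction."""
    # Cochran's formula for an effectively infinite population
    n0 = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    # Finite-population correction scales n0 down for smaller populations
    n = n0 / (1 + (n0 - 1) / population)
    return math.ceil(n)

# Beyond a point, a larger population barely changes the required sample
print(sample_size(1_000))    # 278
print(sample_size(100_000))  # 383
```

The defaults encode a 95% confidence level (z = 1.96), a 5% margin of error, and the most conservative assumed proportion of 0.5.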

Surveys are an important tool if they are appropriately designed to address the primary objective of the information-gathering exercise. Whether the objective is to collect feedback on past actions for the purpose of improvement, to understand behavior around certain products or services, to understand user challenges in the current situation, to gather information for reporting on current indices of some kind, or to assess the need for a new product or service – the design of the survey questionnaire is critical to meeting that objective. The researcher may choose to include questions involving ranking options, open-ended feedback, or positive/negative responses, in order to attain the right mix, sequence, and length of information that satisfies the research objective. Researchers may also stipulate the method in which the survey should be conducted and how records should be maintained. However, it is also critical that the execution of surveys is carried out in an ethical and principled fashion, free of malpractice. The paradigm of archeological methods of survey, not further discussed here, may be helpful too (cf. https://en.wikipedia.org/wiki/Survey_(archaeology)).

Similar to surveys and questionnaires, a focus group is an established market research method that elicits participants’ feedback. However, it differs considerably from a survey in that it involves multiple respondents in a single session with the interviewer(s).

A focus group is an instrument for gathering feedback from collective discussion. The key to a productive focus group, however, lies with the moderator and the design of the questions. A moderator experienced in the method and well-acquainted with the research area can bring out the most meaningful insights from a focus group session. The objective of focus groups is to record the perceptions, beliefs, and reactions of the participating group about a specific product, service, or proposed new idea. It is a qualitative method in which collective participation allows mutual interaction between respondents, which may result in their opinions being influenced as well, similar to real-life situations. The sample of participants should, however, be limited to a small group in order to keep the discussion meaningful and avoid ‘noise’.

Recap

Based on the methods and the categories of methods described in this chapter, Figure 2C.1 recaps the categorization:


[Figure: Chapter 2 overview. Chapter 2A Observational Methods: Environmental Scanning, Nystrom’s Framework, Design Trends Analysis. Chapter 2B Estimation Methods: Statistical Modeling, Cross Impact Analysis, Decision Analysis, Machine Learning, Morphological Analysis. Chapter 2C Interventional Methods: Delphi, Ethnography, Design Thinking, Other Methods.]

Fig. 2C.1: Forecasting methods discussed in this chapter, grouped under proposed categories of Observation methods, Estimation methods and Intervention methods.


CHAPTER 3

CASE STUDIES: AN EVALUATION

Chapter 3 examines three sample cases and attempts to identify common threads yielding the properties or attributes that may help define a forecasting framework (Q 2). The chapter provides three different accounts where a strategic forecasting approach could have altered the course of events that the organizations went through. The cases recounted include the financial and identity struggles of the British lifestyle and fashion brand Laura Ashley (Sull 1999); the crisis that the Britannica organization went through with the advent of CD-ROM encyclopedias (Greenstein and Devereux 2017); and the desperate attempt by the quintessential American muscle-car maker Pontiac to restore its identity and success (Holmstrom and Newhardt 2011).

Each case study examines the struggles these companies experienced by virtue of changing times, ineffectual forecasting actions and subsequent detrimental outcomes.

Case study 1: Laura Ashley

• Introduction

Industry: Women’s Apparel/Fashion

Their position within the industry:

Laura Ashley is a British fashion and home-furnishings brand established by Laura and

Bernard Ashley in 1953. Laura’s Welsh upbringing instilled in her the aesthetic sensibilities

of life in the British countryside, and through her designs she wanted to promote

traditional British values. The 1970s were one of the most profitable eras of the company

(Laura Ashley e-commerce website 2018). From humble beginnings in 1953, when Laura and Bernard Ashley started screen printing in their flat, sales had reached £300,000 by 1970 (Sull 1999), or around $421,000 in current value. The store in Fulham, which had opened the same year, sold 4,000 dresses in a single week (Laura Ashley e-commerce website 2018). Licensing operations and company-operated stores expanded to

Australia, , , Paris, and San Francisco during the 1970s (Laura Ashley e-commerce website 2018). In 1979, the year of its 25th anniversary, the company recorded a turnover of £25 million, or over $35 million in current value. By the end of the 1980s, the company had 450 stores worldwide.

• Status before the incidence

By the end of the 70s, Laura and Bernard moved to . This evoked a sense of greater grandeur in Ashley’s upcoming collections (Laura Ashley e-commerce website 2018). The couple had conceived and grown their brand with careful planning and control (Sull 1999).

Together, they carefully supervised all areas of their business, from design and manufacturing to distribution and retail. They cultivated a “Made in ” image for their brand through their centrally located production facility in Wales (Sull 1999). At a time when many British businesses were shutting down due to labor issues, the Ashleys managed to keep their employees happy by providing above-average salaries and benefits (Sull 1999). Not only were their employees loyal; over the years the Ashleys had also developed close relationships with their suppliers and franchisees. Their customers had become devoted patrons of the brand and the values it embodied (Sull 1999).

• The Challenge: Issues, pain points and challenges

Tragedy struck when Laura Ashley passed away in 1985. The well-oiled machinery that Laura and Bernard Ashley had established over decades hit a roadblock in the late 1980s.


Bernard Ashley attempted to keep the company and its sensibilities on the track set by his late wife (Sull 1999). Fashion, however, had evolved. The Laura Ashley brand evoked rich countryside aesthetics, but the 80s was the decade of “Power Dressing”, when more and more women joined the workforce. They were seeking professional and practical clothing. Critics accused Laura Ashley of being “stuck in the past”. Manufacturing technology was changing fast. Most fashion houses began to outsource production due to cheaper labor and an economic climate conducive to international trade (Sull 1999). The means the company had used to clamp down on labor issues in the 70s were now proving expensive. Not keeping in touch with the changing times cost the company dearly. Outdated designs ultimately forced customers to abandon the brand, and expensive labor and production processes adversely impacted the company’s finances (Sull 1999).

• Why did they have this problem?

At the outset, the company had failed to catch the pulse of changes in customer needs and lifestyles, and in business and manufacturing processes. By the end of the 80s, they had realized their mistake. As a corrective measure (Sull 1999), an external consultant was commissioned to determine a route to recovery. Based on those recommendations, the company’s board hired one CEO after another, in the hope that a new CEO would be able to profitably restructure the company. Each CEO was given the single-point agenda of increasing sales and reducing costs (Sull 1999). However, due to the coming and going of multiple CEOs in quick succession, the company lost direction and identity. The image that Laura and Bernard Ashley had cemented over decades began to blur, as different CEOs focused on different tangents. Even after changing seven CEOs in just one decade, the company could


not be steered in the direction of contemporary fashion and remained wrought with financial difficulties.

• Analysis

The failure happened because the problem was tackled as a business problem only. The instructions were to increase sales and reduce costs. But the real problem required a creative approach as well.

What did the Laura Ashley management expect to happen and why?

The management at Laura Ashley knew that they had secured the two key pillars of the company: first, the brand had a unique identity and design profile based on Laura Ashley’s creative vision; second, they were paying their employees better than the industry average and believed that they could rule out any labor problems. Manufacturing in the UK added to the brand’s identity as a home-grown British brand. Hence, their plan for the future was to stay on the path shown by the founders, Laura and Bernard Ashley, but they unfortunately became complacent about the changes being brought about by world events. While difficult economic conditions in the UK led many smaller fashion brands to close, Laura Ashley went on to establish standalone and franchise retail stores internationally. Her floral prints and lace dresses were still popular with the

younger audience. However, another significant trend that had started appearing in the

mid-1970s was that of androgyny. As more women were taking to working outside the

house, elements of male dressing were being incorporated into their attire. Women wore

trousers more than ever before, from flared legged at the beginning of the decade to a

straighter look by the end of the 70s. This trend was to transform into a more defined

126

“power look” by the 80s. By the time of Laura Ashley’s death, the trajectory of women’s fashion style had already started to turn against the essence of the brand.

What was the forecasting need in this case?

The forecasting need, in this case, was to chart the company’s path to a preferred state of business profitability by assessing the impact of contemporary events. In order to attain a commercially profitable state, the management should have invested in forecasting the changing taste in fashion over the years to come. When Laura Ashley started her business in the 1950s, it was a time when people were tired from the war and wanted a reprieve. With both men and women back from the war, society began to reform. Forces of systematic chauvinism also began to push women back into the homemaker role (Zapała-Kraj 2014). Magazines and other print media further reinforced that perception (Lamb 2011). The aesthetics of Ashley’s prints and designs perfectly reflected that mindset and led to their popularity. The 1980s, however, was a time when women were once again ready to assert themselves in the workplace and were dressing to that sentiment. Long-hemmed dresses with laces and bows were now being considered regressive. Laura Ashley, as a brand, took no action to address that sentiment and did not evolve its design profile. At such a time, the company should have engaged in a forecasting exercise with a focus on the creative direction for the future.

On the business side of things, the trade climate and labor laws were changing rapidly. From a commercial profitability standpoint, there was a need to re-evaluate the “Made in U.K.” manufacturing model and the overall business direction for the future. Especially since the brand was going global, keeping manufacturing limited to the facilities in the U.K. warranted a re-examination of not just production but also distribution processes.

What (methods) could have prepared them for the change?

Analysis of global design trends, environmental scanning of social, economic, cultural, and political events, and a sense of the zeitgeist would have helped them pre-empt the direction in which new trends were emerging over the longer time range.

In addition to forecasting for creative direction, the management should have also assessed their business model and practices. Understanding the impact of changing trade and labor laws would have helped the company make better decisions with respect to global procurement, production, and distribution. Application of methods such as cross-impact analysis and decision analysis could have helped the company manage its brand image while maintaining profitability.
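To make the mechanics of cross-impact analysis concrete, the idea is to adjust the estimated probability of each trend event when another event is assumed to occur. The sketch below is a minimal illustration in Python; the event names, initial probabilities, and impact multipliers are hypothetical, invented for this example rather than drawn from the Laura Ashley case data.

```python
# Illustrative cross-impact analysis: revise the probability of each event
# given that another event is assumed to occur. All numbers are hypothetical.

# Initial probabilities of trend events over the forecast horizon.
events = {
    "androgynous_office_wear": 0.6,   # the "power look" gains ground
    "offshore_manufacturing": 0.7,    # industry shifts production abroad
    "romantic_florals_decline": 0.5,  # demand for the classic look fades
}

# impact[a][b]: multiplier applied to P(b) if event a occurs (>1 raises it).
impact = {
    "androgynous_office_wear": {"romantic_florals_decline": 1.5},
    "offshore_manufacturing": {"romantic_florals_decline": 1.1},
}

def conditional(event_occurred, probs, impact):
    """Return probabilities of the remaining events given one event occurs."""
    adjusted = {}
    for name, p in probs.items():
        if name == event_occurred:
            continue  # the occurring event itself is no longer uncertain
        factor = impact.get(event_occurred, {}).get(name, 1.0)
        adjusted[name] = min(1.0, p * factor)  # cap at certainty
    return adjusted

print(conditional("androgynous_office_wear", events, impact))
# the decline of romantic florals becomes more likely (0.5 -> 0.75)
```

Running the scenario for each event in turn exposes which trends reinforce one another – exactly the sensitivity a forecaster would want before committing to a creative direction.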

Were there any limitations in running forecasting research?

One of the limitations that could have prevented the management at Laura Ashley from pre-empting the future was that past sales numbers did not present a cause for concern until the late 1980s. But by then it was already too late. The frequent replacement of CEOs further reduced the company’s lead time to create new collections, in an industry where lead times are already tight. Also, the management stayed focused on monitoring and planning for the short term due to the constant change in direction from the CEOs. This did not allow them to lay out a long-term strategy or to plan and execute based on a long-term creative and business direction.

Case study 2: Encyclopedia Britannica

• Introduction

Industry: Print encyclopedia

Their position within the industry:

Encyclopedia Britannica is one of those products that has seen a lifeline of not just years or decades but centuries. From its origins in Scotland in 1768 (Greenstein and Devereux 2017), it went on to build a much-respected reputation. Its contributors were some of the most celebrated thinkers of the time, including various Nobel prize laureates (Greenstein and Devereux 2017). After its purchase by American investors in 1901, its operations moved to New York (Britannica 1965). The eleventh edition, released in 1911 with 29 volumes, is the most acclaimed edition, creating a much wider appeal through its content. Its headquarters moved to Chicago in 1930. Its operations were first bought by William Benton in 1943 (Britannica 1965), and later the William Benton Foundation, with the University of Chicago as its beneficiary, acquired complete ownership (Greenstein and Devereux 2017). By the late 1980s, Britannica had not only established a very profitable business, but also an unrivaled position in the Anglo-Saxon encyclopedia world as the “august repository of serious information” (Greenstein and Devereux 2017).

• Status before the incident

While Britannica encyclopedias certainly commanded authority in terms of the knowledge offered, the leadership took some other factors for granted. Because of their established authority, the best scholars and academics were eager to become contributors in return for very small compensation. More importantly, they believed so firmly that their readers valued the knowledge and information offered by Britannica that they never paid attention to their customers’ changing needs, such as the fact that these encyclopedias were not always easy to use and read. They also overlooked that some of their competitors, albeit lesser known, were addressing these concerns. Further, they were confident of their prestigious image and extensive content and assumed their customers would always pay for the product.

The primary target audience for Britannica was middle-income families with small kids. These parents attached a lot of value to their kids’ education and academic performance. For them, the Britannica Encyclopedia was a way to provide answers to their kids’ general curiosity and questions, which the school books seldom provided. This was exactly the pitch Britannica’s sales force used when engaging with potential customers. In fact, the sales representatives formed a vital part of the business model, since they went door to door and made in-person pitches to potential customers. It is interesting to note that while the product itself was based on decades of extensive scholarly knowledge, the business model was extremely dependent on the sales team and their traditional methods of selling. This makeup of the company made it prone to risk as the technology around consumption and distribution of information was quickly changing.

• The Challenge: Issues, pain points and challenges

While the Britannica Encyclopedia was going steady with its sales, Microsoft had launched the operating system Windows 1.0 and was now looking to diversify into new innovative projects. Microsoft approached Britannica to forge a partnership that would combine its leading-edge multimedia capabilities with Britannica’s knowledge content through CD-ROMs (Greenstein and Devereux 2017). However, not assessing the future trajectory of personal computers, Britannica declined. They claimed that putting the encyclopedia on compact discs did not align with their traditional selling model and that, moreover, the personal computer market was small and limited (Greenstein and Devereux 2017). Undeterred by this, Microsoft continued to seek out a partner to launch its multimedia encyclopedia initiative. Ultimately, it struck a deal with Funk and Wagnalls. However, since Funk and Wagnalls was far from respected in the reference industry and almost defunct as a business, Microsoft decided on the name “Encarta” for its encyclopedia venture. Microsoft compensated for the questionable quality of the knowledge content that came from Funk and Wagnalls with images, sound bites, and even video content that users could enjoyably and easily use on their personal computers. The Encarta encyclopedia, which was provided free of charge with new Windows PCs, garnered a much better response than anticipated. On the other hand, despite turning down Microsoft’s offer, Britannica felt pressured to enter the CD-ROM market. Unsure about using the Britannica brand name for this task, they decided to use one of Britannica’s lesser-known brands, “Compton’s”, and launched Compton’s CD-ROMs, available free with the print version. Compton’s Encyclopedia became the first encyclopedia to be available on CDs, in 1989. However, with an exorbitant price tag for independent CDs and a sales force lacking any interest in selling CDs, as well as the know-how to demonstrate multimedia features on PCs, the effort did not take off.

After an extremely profitable 1990, Britannica’s sales became stunted in 1991 and 1992 and dived seriously shortly after that. More and more households now became familiar with the name “Encarta”. Children found the rich multimedia content highly engaging. As online learning became the norm, Britannica found it hard to catch up. It eventually had to be sold for a mere $135 million to Jacob Safra in 1996.

• Why did they have this problem?

Rather than charting the trajectory of future changes in the consumption and distribution of reference knowledge content, the Britannica organization remained heavily centered on its culture of salesmanship. They did not pay attention to the technological changes happening in peripheral industries, nor did they engage in any attempts to better understand consumer needs and pain points. Other than revising and updating the text, there was hardly any focus on innovation or diversification. The sales force abhorred the idea of selling encyclopedias on CDs and could not make a timely strategic pivot from their traditional model and training. Microsoft, on the other hand, despite being a $140 million startup, was not content with just being the operating system provider for IBM. Microsoft’s founder Bill Gates was a visionary and encouraged spending heavily on research and innovation initiatives. While Microsoft was quickly establishing a name and trust for itself in the personal computer market, Britannica was already a well-respected brand. A partnership – had it happened – between the unparalleled leader of the reference industry and the soon-to-become tech giant could have charted an entirely different path for the Britannica organization.

• Analysis

The failure did not happen due to inaction. Failure was a result of ineffectual decision making – taking inappropriate action at the inappropriate time. Well-suited forecasting methods could have helped Britannica gauge the emerging needs and changing context, and avoid an undervalued sell-out.

What did Britannica expect to happen and why?

The sales numbers for Britannica showed an upward graph throughout the 1980s, with sales peaking in 1990. A method for forecasting the company’s future that accounted for the sales figures from past years would have only shown a continuation of the growth pattern. Hence, the managers at Britannica found little cause for concern. Greenstein (S. Greenstein 2005) refers to this as the “foresight trap”.

What was the forecasting need in this case?

The forecasting need, in this case, was indeed “predicting”. However, the prediction needed to be holistic, taking into account not just the quantitative sales data from past years, but also other factors impacting the print encyclopedia industry and its target customers.

What (methods) could have prepared them for the change?

Forecasting research that helped the organization make a business decision on whether or not to venture into the CD-ROM encyclopedia business could have changed the course of events for Britannica. Among the methods discussed in Chapter 2, Cross Impact Analysis and Decision Analysis could have helped the management take into account the multiple moving parts in the situation. Even after the launch of Encarta, the company did not engage with their customers to understand whether their preferences in educational tools were changing. An exercise such as a survey could have helped them gauge the increasing interest in the adoption of personal computers for education. A carefully designed ethnography study could have highlighted children’s higher susceptibility to learning via multimedia methods, provided Britannica thought in that direction while time permitted.
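A decision analysis of this kind can be as simple as comparing the expected values of the options on the table. The sketch below illustrates the mechanics in Python; the option names, probabilities, and payoffs are hypothetical placeholders for illustration only, not figures from the Britannica case.

```python
# Minimal decision-analysis sketch: compare expected payoffs of two options.
# All probabilities and payoffs are hypothetical (in $ millions).

options = {
    "partner_on_cdrom": [
        # (probability, payoff) pairs for possible states of the world
        (0.6, 400),   # multimedia encyclopedias take off
        (0.4, -50),   # CD-ROM market stays niche
    ],
    "stay_print_only": [
        (0.6, -200),  # multimedia takes off and print sales collapse
        (0.4, 250),   # CD-ROM stays niche; print business continues
    ],
}

def expected_value(outcomes):
    """Probability-weighted payoff of one option."""
    return sum(p * payoff for p, payoff in outcomes)

best = max(options, key=lambda name: expected_value(options[name]))
for name, outcomes in options.items():
    print(f"{name}: expected value = {expected_value(outcomes):.1f}")
print("recommended:", best)
```

The value of the exercise lies less in the arithmetic than in being forced to write down the states of the world and their likelihoods – the step Britannica’s management skipped when it dismissed the personal computer market outright.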

Were there any limitations in running forecasting research?

Microsoft had first approached Britannica for a partnership in the mid-1980s. Britannica decided to turn the offer down. Britannica’s argument for not accepting it was that they viewed the print encyclopedia as a serious educational tool and doubted that a CD-ROM could ever attain that meaning for the customers. Customer aspirations and beliefs, however, were changing quickly with the advent of the personal computer.

Britannica’s organization was centered around its sales force (Greenstein and Devereux 2017). Because of the commission associated with the sale of print encyclopedias, they were not motivated to sell CD-ROM encyclopedias. After the launch of Encarta, even when Britannica decided to put all the text on CD-ROMs, to be provided free of charge with the purchase of print encyclopedias, their sales force failed to understand the need for or significance of this decision.

Britannica had the opportunity to conduct forecasting research on at least two different occasions. At the time when Microsoft approached them, Britannica had the time to conduct a study of changing customer preferences and undertake a holistic decision-making exercise. After the launch of Encarta, when the sales force sincerely doubted the significance of CD-ROM encyclopedias, Britannica could still have conducted a study, such as a survey, to unearth what their customers were interested in, rather than acting on the sales force’s emotion and antipathy for a CD-ROM encyclopedia.

Hence, Britannica needed a forecasting solution that could help predict the future course and provide the means for decision-making, taking into account the changing parameters of technology consumption and consumer aspirations and preferences.

Case study 3: Pontiac Aztec

• Introduction

Industry: Automobile

Their position within the industry:

The Pontiac car brand was established in 1926 by Alfred Sloan, the then President of General Motors, specifically to address the market between its Oldsmobile and Chevrolet brands (Liu, Rouse, and Yu 2015). Pontiac offered performance driving at modest prices, which struck a chord with the audience for the first few decades and established its position. However, its body-frame, conservative design, and reliable performance gave it a “grandma image” (Holmstrom and Newhardt 2011) (Fig. 3.1).


Fig. 3.1: Pontiac Bonneville in 1956. People started associating the reliable yet stodgy-looking (Holmstrom and Newhardt 2011) car with the grandma in the family, hence giving it the “grandma image”. Image source: (Holmstrom and Newhardt 2011).

It is this “grandma image” that Pontiac’s then general manager Semon “Bunkie” Knudsen wanted to change (Liu, Rouse, and Yu 2015). He steered the brand away from the “grandma” image with the launch of the Tempest model in 1964 (Liu, Rouse, and Yu 2015). This car featured the GTO23 package, which added a sense of positivity with usability. The company had estimated a sales target of 5,000 cars, but actually ended up selling 32,450 of them (Holmstrom and Newhardt 2011). Pontiac now bragged about its waiting lists. Pontiac was able to establish a unique position for itself in the market based on strong and reliable performance (Liu, Rouse, and Yu 2015). Its success created the new category of “American muscle cars” (Zuehlke 2006). Pontiac went on to launch other variants of its performance cars, such as the Firebird and Grand Prix.

23 GTO stands for “Gran Turismo Omologato”, best known from the later Ferrari GTO (Holmstrom and Newhardt 2011). The GTO package offered with the Pontiac Tempest in 1964 included a 389 cubic inch V8 engine with 343 final bhp at 4800 rpm (Holmstrom and Newhardt 2011), which was strong, yet not at the sports-car-like top of the market at the time.


• Status before the incident

Pontiac had cemented its position so strongly in the market that when the first oil crisis struck America in 1973, other American muscle cars’ sales stumbled but Pontiac’s remained steady (Zuehlke 2006). However, Pontiac began to change its leadership too frequently. Between 1969 and 2009, Pontiac brought in a new General Manager every four years on average (Liu, Rouse, and Yu 2015). Changing leadership too frequently led Pontiac’s firm vision and strategy to waver. Loss of clear focus caused the design direction to blur. By the late 1980s, Pontiac was investing less and less in improving technology and over-sharing batch-manufactured components with other GM brands (Liu, Rouse, and Yu 2015). Japanese automakers were also gearing up and attracting the American buyer.

• The Challenge: Issues, pain points, and challenge

Batch manufacturing with other GM models decreased Pontiac’s distinct appeal, and sales suffered (Liu, Rouse, and Yu 2015). Decreased cash flow prevented investment in newer technology. GM entering into a joint venture with Japanese automaker Toyota meant stronger competition at home (Liu, Rouse, and Yu 2015). By the late 1990s, Pontiac tried something different from its performance car image in hopes of capturing a larger audience, but it backfired as it further diluted the brand’s image. One such offering which particularly hurt the company was the Pontiac Aztec model (A. Miller 2015). In trying to attract a wider market, from younger buyers looking for something offbeat to somewhat older ones who wanted the convenience of a minivan, Pontiac frankensteined a design which failed to appeal altogether (A. Miller 2015).

Fig. 3.2: Pontiac Aztec aimed for a “bold” design with plastic fenders and headlights detached from the main body. Image source: (A. Miller 2015).

In the desperation to regain market share and increase sales, Pontiac sought a new kind of automobile that would once again carve out its unique place in the market. Aztec’s research team conducted their customer studies, but unfortunately the design team failed at interpreting and articulating what customers were looking for (A. Miller 2015). Despite some of GM’s best designers being assigned to the project, they failed to sense the need of the market. While the design was in line with contemporary mainstream design, it was out of sync with what the audience desired. The designers integrated elements of modularity, a two-tone color scheme, and edginess, which defined the automobile design style of the 1990s (Covello 2003). More than one designer pushing for their own “version” of the design, combined with the lack of any guardrails from leadership, resulted in a “compromise” design that made everyone on the design team happy (A. Miller 2015). It was described as an audacious design with a two-tone exterior – the promotional imagery of the Aztec model often showed a yellow body with grey plastic fenders and indicator lights detached from the main headlights (A. Miller 2015) (see Fig. 3.2). The Pontiac Aztec was launched in 2000. Despite being on budget, on time, and offering advanced engineering for its time and segment, it turned out to be a huge disappointment (A. Miller 2015). The sales were so minimal that GM could not even break even. Despite much legwork, when the numbers did not look up, the company decided to lay it to rest in 2005.

• Why did they have this problem?

The confused approach towards creating a vehicle that would uniquely stand out, as well as capture an extremely wide and varied customer base, actually resulted in the creation of a vehicle that no one wanted. The leadership wanted to create an impact similar to the Pontiac Tempest GTO. The designers failed to read the signals from the customers at the right time and to align around common design guidelines. With a diluted brand image and nothing unique to offer, the Aztec was registered as a failed attempt from the Pontiac arsenal. Pontiac overall kept declining steadily through the 2000s. Finally, not investing in technology with the times, and missing clear market signals, led to the end of the American muscle car brand Pontiac in 2010 (Liu, Rouse, and Yu 2015).

• Analysis

The concept of the Aztec was ill-timed and misconstrued. The market was not ready for true cross-over or multi-utility vehicles. Confused messaging about it being a performance vehicle, “bold” in design, catering to the young, the old, and everyone in between, set the Aztec on a path to failure. Once again, the failure occurred due to incorrect assessment of what was needed at the time.24 Hence, forecasting methods could have helped Pontiac gauge the emerging needs and changing context, and deliver a product that could have set the course for the future.

In the desperation for commercial success, Pontiac tried to re-do what it had done in the mid-1960s without considering what factors had changed in the past three decades. The direction was to recreate a unique identity for Pontiac cars, just like the GTO package in the 60s, while keeping costs low. But the company failed to capture the pulse of the market.

What did Pontiac expect to happen and why?

Pontiac prepared to focus all its resources on creating a unique identity for its cars once again. With a very tightly controlled budget and timeline, they engaged some of the best designers and engineers at General Motors. They expected a revival of the brand, brought about by a design which would strike the right chord with customers.

What was the forecasting need in this case?

Pontiac needed a paradigm-shifting product but did not plan for it accordingly. Since most of their existing product line was failing, they needed a different approach to alter those numbers. Interventional research that exposed what the customers really needed and revealed an opportunity for out-of-the-box innovation was the need of the hour. Considering the lead time in the automobile industry, Pontiac needed a forecasting strategy that would work for a longer-term future.

24 In my mega trends research for 2011, I highlighted the indicators which heralded that MUVs or multi-utility vehicles would make a successful category for the Indian market. The success of subsequent vehicles such as the Renault Nissan, Mahindra XUV and Ford EcoSport in the Indian consumer automobile market only validated that forecasted trend.

What (methods) could have prepared them for the change?

Methods which gathered information to break away from the status quo and a dated view of customer preferences, and which included emergent factors in the American automobile industry, would have proven helpful. Pontiac invested in market research. The scope of that market research was to identify what kind of vehicle could help Pontiac capture the widest market demographics. The design team construed the research findings to conclude that Pontiac needed a vehicle which could be adopted by the young adventurer, the older convenience seeker, and everyone in between, while being audacious in its design elements. Systematic investigation through a combination of methods such as environmental scanning, design thinking, morphological analysis, and Delphi, depending on available lead time, could have helped the company define its scope better and saved it from the catastrophic fate that the Aztec succumbed to.
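Morphological analysis, in its simplest form, crosses every value of every design parameter to enumerate the full space of candidate concepts before screening them, instead of collapsing the space into a single compromise. A minimal sketch in Python, with hypothetical parameters loosely inspired by the Aztec brief (the parameter names and values are invented for illustration):

```python
# Morphological-analysis sketch: enumerate the design space by taking the
# Cartesian product of every parameter's values. Parameters are hypothetical.
from itertools import product

parameters = {
    "body_style": ["coupe", "sedan", "crossover", "minivan"],
    "target_buyer": ["young adventurer", "family", "convenience seeker"],
    "styling": ["conservative", "sporty", "bold two-tone"],
}

def morphological_box(parameters):
    """Yield every combination of parameter values as a dict."""
    names = list(parameters)
    for combo in product(*parameters.values()):
        yield dict(zip(names, combo))

configs = list(morphological_box(parameters))
print(len(configs))   # 4 * 3 * 3 = 36 candidate concepts to screen
print(configs[0])
```

Each of the 36 combinations can then be scored against feasibility and consumer-research criteria, making it explicit that “bold styling for everyone in every body style” is one cell of the box, not its whole.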

Were there any limitations in running forecasting research?

As the lead times associated with the automobile industry are at least three to five years (Unger and Eppinger 2009), they should have invested in forecasting research on an on-going basis rather than on a need-to basis. More importantly, when seeking a paradigm shift, quantitative data from past sales records could hardly have helped them define the future strategy. Pontiac needed a forecasting plan that could articulate an innovative approach, resonate with customer preferences and attitudes, and remain relevant for the long-term future.

141

Conclusion

In the above-mentioned cases, the companies struggled commercially as a result of failing to keep track of changing consumer preferences. Even though the forecasting needs varied in each case – Laura Ashley needed to forecast for the preferred state; Britannica needed to predict the impact of technology on the timing to pivot; and Pontiac needed to plan for a paradigm shift – they all needed a mechanism to sense the changing trends in their industries.

Upon analysis of the case studies, we find common threads among all three case studies that would help forecasters select the appropriate methods for forecasting. These common threads are the forecasting attributes that determine the applicability of methods to specific situations.

In the cases of Laura Ashley and Pontiac, we saw that while both companies had sales data from past years, the steady decline in numbers would have only raised questions rather than provided any answers. It did, however, establish that their current approaches were not working. The Britannica encyclopedia, on the other hand, had an upward sales graph in 1990, which showed no cause for immediate concern. Britannica’s sales graph inverted soon after the launch of Encarta. Hence, the availability of past data, and how it helps in forecasting the company’s future, is an important attribute to consider while choosing forecasting research methods.

Another consideration while planning for the future is that of lead time. In the case of Laura Ashley, the fashion industry operates on much shorter lead times, churning out new collections each season or every year. In comparison, the automobile industry functions on much longer lead times, as the components and processes involved are much greater in number. Shorter lead times in the case of Laura Ashley necessitated that they kept themselves updated on the latest trends at all times to respond to the changing fashion requirements of customers. Longer lead times in the case of Pontiac implied that their forecasting research should have focused on long-term consumer trends and outcomes. Shorter lead times in the case of Laura Ashley meant that they delivered collections season after season, but with a long-term understanding of evolving consumer taste.

The nature of forecast sought also differed considerably in each of the three cases. For Laura Ashley’s management, commissioning a forecast that combined creative direction and business decision-making would have been the ideal solution. Pontiac needed an interventional and innovation-led approach, with methods that could deliver a paradigm shift. Britannica needed a prescriptive forecast that allowed them to take business decisions accounting for external factors that were changing the conventional model of encyclopedia sales.

The desired forecasting range is a factor tied to the lead time, the industry, the nature of forecast, whether the data available from the past is quantitative or qualitative, whether trend analysis is happening on a continuing basis, and, most importantly, what the forecasting need is. For instance, Laura Ashley doing something different every season blurred their brand identity. It might seem that since the lead times for the fashion industry are short, the desired forecasting range would be short term. However, that is far from true. The creative direction needed to sense the larger shift in women’s self-perception, necessitating a long-term creative strategy. Acting on that long-term creative strategy, they should have designed the seasonal collections. Measuring the commercial response to the different designs should have been used to indicate the customers’ mood. Responding to customers’ demand, while holding on to the quintessential Laura Ashley design elements, would have kept the company relevant to its patrons and commercially viable. In combination with the creative impasse, Laura Ashley, the company, also needed to forecast the impact of changing global trade dynamics in order to take appropriate business decisions.

Hence, the determination of the forecasting need is the first and foremost aspect of initiating any kind of forecasting research. Then, the availability of past information (quantitative or qualitative), lead times, nature of forecast, and forecasting range are all attributes essential to the selection of appropriate methods of forecasting. Chapter 4 dives deeper into the topic of forecasting needs and attributes, and how they enable the selection of forecasting methods to create the forecasting framework.


CHAPTER 4

FORECASTING NEEDS AND ATTRIBUTES

We begin this chapter with a recap. Chapter 2 collected, reviewed, and analyzed forecasting methods and sorted them into the categories of methods of observation, estimation, and intervention. Chapter 3, addressing Q 2 of the sub-questions, determined the common attributes among the case studies which are important factors in the selection of forecasting methods.

Additionally, Chapter 3 discovered that ascertaining the need for forecasting is the foremost task of initiating a forecasting inquiry. Next in the series of sub-questions are:

Q 3: What determines the combination of more than one forecasting method towards a design process?

Q 4: What warrants the need for forecasting studies to be the starting point of a design process?

Chapter 4 addresses Q 3 and Q 4 and proposes arranging the methods and attributes in a framework that may address the key research question.

Attributes for Selecting Forecasting Methods

Attributes are the factors which the forecaster will need to refer to in order to determine which forecasting methods are available to them within the caveats of operationalization. Examining the case studies in Chapter 3, we outlined some common factors that determine the methods to be implemented in alignment with the larger need for forecasting. The common factors or attributes that came to the fore through the case studies were the availability of existing data, the forecasting range, lead times, and the nature of forecast sought. In order to understand which methods are available to be selected from, the forecaster must understand the role these attributes play in the selection of methods.


Availability of existing data

Regular tracking of data can reveal the level to which a product or design is resonating with the consumer. Declining numbers in sales or acceptance indicate that consumer preference has shifted and the product or design needs to be re-evaluated. When working with simple models, using existing data may pose a limitation, as the model may assume that past conditions will not change in the future (Makridakis, Wheelwright, and Hyndman 2008). Sophisticated models that assume change together with the existing data can help with this limitation (Taleb 2007), especially when looking at long-term forecasts. Hence, while the availability of existing data does guide the selection of forecasting methods, the forecaster must understand the range for which the forecast will be valid. When the need for forecasting is around postulating the relevance of a product with consumers as it is, i.e. without any substantial design change, statistical methods of forecasting can be helpful in predicting the future by absorbing existing data. Statistical methods are relevant in the case of products with short life cycles, as the required forecasting range for such products is also short term. Hiray (Hiray 2011), in the context of business management, states that statistical methods like time-series analysis are particularly effective over the short term. Sun et al. (Sun et al. 2008) applied sales data to plan for retail trends, and Lin et al. (Lin et al. 2010) traced textile colors using machine learning. However, when no data is available, such as in the case of a new product, the forecaster must select methods which help gather insights around the viability of the new product. Methods such as design trends analysis, environmental scanning, and Nystrom's framework serve the purpose of gathering and analyzing information based on the present needs and preferences of consumers. Once initiated, these methods work through an ongoing collection


of information to identify patterns and trends. Forecasters or forecasting agencies oftentimes analyze design trends and scan the environment on an on-going basis. In the case of methods like Ethnography, Surveys, the Delphi method, and Design Thinking, data acquisition is part of the method, and therefore the non-availability of past information is not a deterrent in the selection of these methods. Forecasting research from such methods can inform the designers' creative direction. When organizations aspire to groundbreaking innovation to win over market share and consumers' hearts, the guiding data must come from the consumers themselves.
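The short-term use of statistical methods described above can be sketched in code. The following minimal example uses hypothetical monthly sales figures and a plain least-squares linear trend (an assumption for illustration, not a technique drawn from the cited studies) to extrapolate existing data one short-term step ahead:

```python
# Minimal sketch: a least-squares linear trend fitted to hypothetical
# monthly sales for an existing design profile, extrapolated one
# short-term step ahead. Figures are invented for illustration.
def linear_trend_forecast(history, steps_ahead=1):
    n = len(history)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(history) / n
    # Ordinary least-squares slope and intercept.
    slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, history)) \
            / sum((x - mean_x) ** 2 for x in xs)
    intercept = mean_y - slope * mean_x
    return intercept + slope * (n - 1 + steps_ahead)

# Hypothetical monthly sales (units) over six months.
sales = [120, 126, 131, 140, 138, 147]
print(round(linear_trend_forecast(sales), 1))  # → 151.7
```

As the text cautions, such a model assumes past conditions persist; it is useful precisely because the required forecasting range is short.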

[Fig. 4.1 is a matrix charting the forecasting methods (Ethnography, Survey, Delphi, Design Thinking, Environmental Scanning, Design Trends Analysis, Nystrom's Framework, Machine Learning, Statistical Modeling, Cross-Impact Analysis, Decision Analysis, Morphological Analysis) against the attribute "methods dependent on availability of existing data or information": Yes (Y) or No (N).]

Fig. 4.1: Forecasting methods and respective dependence on availability of existing data.


Interventional methods such as ethnography, surveys, and design thinking help carve out innovations when no prior data is available. Rinallo and Golfetto (Rinallo and Golfetto 2006), in their paper on marketing strategies at Premiere Vision Paris, a leading trade fair for the apparel industry, explain how utilizing ethnographic methods can unearth the long-term tactics of French and Italian fabric producers to provide textile innovations. They also explain how value creation is possible irrespective of what sold in the past. Fig. 4.1 represents a view of methods with respect to the availability of existing data. As part of future work, as more and more instances of the selection of forecasting methods are collected, the varying extent to which methods are dependent on existing data can be better defined.

Forecasting range

The concept of forecasting range can be interpreted in multiple ways. The range for which forecasting is required depends on various factors, such as the industry for which forecasting is needed, how the forecast is to be used, and what kind of information and resources are available.

The forecasting range thus becomes an important stipulation in the selection of forecasting methods. Forecasts can be short, mid, or long range, but essentially they are components which feed into planning for the short-, mid-, and long-term future (Armstrong 1985). Short-range forecasts can range from a few months to a year, mid-term forecasts range from a year to three years, and long-term forecasts can go up to 10 years (Fahey, King, and Narayanan 1981).

Short-range forecasts can utilize past data and estimate the expected future in terms of product sales, merchandise planning, and which design profiles to stock, re-order, or manufacture. Mid-range forecasts benefit from the observation and analysis of current trends for design planning, as consumer choices are expected to change only moderately over the intermediate range. Long-range


forecasts create larger future scenarios where the likelihood of change is higher and dependent on various factors. Hence, forecasts become less definitive and more prescriptive as the forecasting range changes from short to long term. Fig. 4.2 represents a view of methods with their respective forecasting ranges.

[Fig. 4.2 is a matrix charting the same forecasting methods (Ethnography through Morphological Analysis) against the attribute "forecasting range": short (Sh), medium (M), or long (Lg).]

Fig. 4.2: Forecasting methods and respective forecasting ranges.

Little et al. describe the change in uncertainty (Fig. 4.3) as time progresses from a project management perspective (Gryphon et al. 2006). Little explains that at the time of initiation of a


project, expectations and requirements are only broadly defined; hence, the estimates also include a large variance (Gryphon et al. 2006). As the project progresses, estimates can be provided with better precision, thus reducing the uncertainty. When forecasting for short-term design projects, better "estimates" from today can help plan the next design iteration, as compared to a design project with a long-range horizon.

[Fig. 4.3 shows the cone of uncertainty: a cone opening from "Today" along the time axis toward the "Future Horizon", enclosing the widening space of plausible futures.]

Fig. 4.3: Cone of Uncertainty. As the range of time increases, a variety of plausible future outcomes become possible based on factors that may not be known today (McGovern 2017).

During hurricane season, the National Hurricane Center releases a "cone of uncertainty" on its hurricane forecast maps, showing the probable track of the center of the hurricane at periodic intervals (Fig. 4.4). Meteorologist Danielle Banks explains that typically a "cone of uncertainty" may show the forecast up to five days out from the last recorded position of the storm (Banks 2017). If the hurricane forecast cone projects the track at twelve-hour intervals, then the error in the forecasted track will be least for the next 12 hours but will increase with an increase in the forecasting period. The cone broadens with a longer forecasting range


(greater uncertainty about the track five days in advance). Conversely, the certainty of the forecasted track increases with a shorter forecasting range (12 hours) (McGovern 2017).
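The widening of the cone can be illustrated numerically. Under the simplifying, purely illustrative assumption that each twelve-hour forecast step adds an independent track error (the per-step figure below is invented, not an NHC value), the accumulated uncertainty grows with the square root of the number of steps:

```python
import math

# Illustrative only: if each 12-hour step contributes an independent
# track error with a standard deviation of 40 nautical miles, the
# accumulated uncertainty after k steps grows as sqrt(k) * 40.
STEP_ERROR_NM = 40.0  # assumed per-step error, not an official figure

def cone_radius(hours, step_hours=12):
    steps = hours / step_hours
    return math.sqrt(steps) * STEP_ERROR_NM

for h in (12, 24, 72, 120):  # out to the five-day horizon
    print(f"{h:>3} h: ±{cone_radius(h):.0f} nm")
```

The printed radii roughly triple between the 12-hour and five-day horizons, mirroring how the cone narrows for short ranges and balloons for long ones.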

Fig. 4.4: Hurricane Forecast Cone. As the forecasting range of time increases the uncertainty of the forecasted track also increases (McNoldy 2013).

Lead time

Lead time (Unger and Eppinger 2009) is an important factor in the product development cycle of any product. Johnson and Kirchain (Johnson and Kirchain 2011) define it as the time taken to develop the product's proof of concept from the initial conceptualization point. From a manufacturing perspective, lead time depends on the physical availability of resources, materials, the number of components, and the processes involved. From a supply chain perspective (Mather 1986), managing lead times is important to satisfy customer demands in a timely manner while keeping inventory stocks low. However, from a design forecasting standpoint, lead time is critical to adjudge the strategic window of opportunity. The forecaster's role is to assess the speed and direction of change in consumer needs and preferences. Based on the forecasting research, the forecaster needs to be able to inform product design on what will be relevant and appealing to the consumer, taking into


account the lead time that would have elapsed by the time the product launches. In their 2011 paper on automotive design, Johnson and Kirchain (Johnson and Kirchain 2011) established that lead time is a key factor in making structural and technological decisions about product design. Prior research on the topic of lead times by Clark and Fujimoto (Clark and Fujimoto 1989) states that more innovative products require longer lead times than those which are only iteratively innovative. Not accounting for appropriate lead times in the selection of forecasting methods can prove detrimental. The Pontiac Aztek was a product developed under immense pressure and shrunken timelines. The need for a product that was to create a unique

[Fig. 4.5 is a matrix charting the same forecasting methods against the attribute "lead time", i.e. the time needed to carry out the method: short (S) or long (L).]

Fig. 4.5: Forecasting methods and respective lead times.


identity and product positioning required a paradigm-shifting innovative approach and the corresponding lead time. Shrinking the lead time resulted in the selection of methods that provided inadequate and ambiguous creative direction. Hence, forecasting methods must be selected based on lead times, in tandem with the forecasting need and strategic design motive. Fig. 4.5 represents methods and their respective lead times.

Nature of Forecast required: Creative direction/business decision/predicted value

The purpose of forecasts germane to the field of design is to inform designers of changing consumer needs and preferences, so that relevant design solutions can be created. Factors such as social, cultural, economic, political, or technological developments that can alter consumer needs and preferences are less likely to change over a short period of time. But such factors can vary greatly over a longer time frame. As described above, lead times also vary for different product development cycles. With the help of statistical methods, forecasters are equipped to provide definitive predictive values for the near term that reflect consumers' interest in product features and design profiles through sales estimates. Such forecasts provide quantitative values as an indication of the future. The purpose, however, morphs into a more prescriptive one as organizations seek creative direction for new product development in the long-term future that will satisfy the future needs of consumers. Methods like Morphological Analysis, Design Thinking, or Delphi can fulfill that role. The nature of forecasting sought in this case is to indicate the creative direction for the future. In the intermediate future, designers and organizations also have to look out for competition, in addition to shifts in consumer taste and current developments. The purpose of a forecasting inquiry in such a situation becomes to aid decision making for the business, and methods such as decision analysis can be called upon. Depending upon the need, an appropriate


forecasting process can be utilized to provide either (a) quantitative values, (b) a business decision, (c) creative direction, or a combination of these. Fig. 4.6 represents a view of methods and the nature of forecast they provide. Combining Fig. 4.1, Fig. 4.2, Fig. 4.5, Fig. 4.6, and Fig. 2C.1, we can view the forecasting methods and their respective attributes in Fig. 4.7.

[Fig. 4.6 is a matrix charting the same forecasting methods against the attribute "nature of forecast": creative direction (C), business decision (B), or quantitative value (Q).]

Fig. 4.6: Methods and respective nature of forecast.


[Fig. 4.7 is a composite matrix reading from the forecasting needs (Predict, Prefer, Paradigm-shift) through the categories of forecasting methods (Interventional, Observational, Estimation) to the individual methods (Ethnography, Survey, Delphi, Design Thinking, Environmental Scanning, Design Trends Analysis, Nystrom's Framework, Machine Learning, Statistical Modeling, Cross-Impact Analysis, Decision Analysis, Morphological Analysis) and their attributes: dependence on availability of existing data (Y/N, or collection of data/info as part of the process), forecasting range (Sh/M/Lg), lead time (S/L), and nature of forecast (C/B/Q).]

Fig. 4.7: Forecasting methods and their respective attributes.


Need for Forecasting

The different purposes or functions for which forecasters, designers, and researchers need forecasting research range from being able to predict the future state (Chambers, Mullick, and Smith 1971a) based on past patterns and react to it; to implementing change (to nudge) in present norms or circumstances (Rhyne 1981) (Asur and Huberman 2010) in order to attain a preferred state of affairs in the future; to preparing for a paradigm shift in consumer needs (Paap and Katz 2004) that may not be imminent yet but, if discovered, can drastically impact the state of the art. Of the cases discussed in Chapter 3, Laura Ashley needed to plan for a preferred future, Pontiac needed to plan for a paradigm shift, whereas Britannica needed to predict the impact of change on its business model. However, these three needs, to 'Predict', 'Prefer', and create a 'Paradigm-shift', are not mutually exclusive (Gans 2016). A forecasting ask may require addressing all three of these needs, any two, or just one.

To Predict

To predict is the function of forecasting research that enables a calculated and informed response based on the observation and study of past and/or current data. Past data could be available in an objective/quantitative format, or it could be available in a qualitative format (a collection of rich descriptive information which may have been organized by a common theme or story).

This function of forecasting research answers short-term queries under the condition of constancy (Makridakis, Wheelwright, and Hyndman 2008). In the case of the Britannica encyclopedia (Greenstein and Devereux 2017), the sales data from past years showed an upward graph, with sales being the highest in 1990. However, the numbers started declining the very next year with


the advent of Encarta (Greenstein and Devereux 2017). Referring only to methods which can estimate a future value based on past information would create an incomplete and incompetent framework in the case of a need for "prediction". In order to cater to the need to predict, the forecasting framework should include not only methods of estimation but also methods that allow for observation of current happenings, to validate or highlight the presence of other factors which may change the projection. Methods of estimation, which project a future value based on past data, can cater to the need for prediction only in two cases: when the prediction happens in a vacuum and no other external event can alter that prediction; or when the prediction is needed for such a short-term future that external events may not impact the outcomes. In most real-life cases, where both these conditions are absent, the forecaster must take into account methods of observation as well as estimation for the purpose of prediction. Hence, To Predict: Observation + Estimation.

The nature of insights 'predictability' provides assists with decision making towards periodic sales forecasts, expansion of resources, production planning, logistics planning, marketing strategy, merchandise ordering, inventory assessment, etc. Organizations working in situations such as rapid-growth scenarios, day-to-day planning, or periodic assessment may leverage forecasting focused on predictability. Areas of business and corporate planning related to revenues, sales, production, etc. are some examples where insights from 'predictability' play a key role.

Operationalizing 'predictability' would require selecting methods which satisfy a certain set of attributes. The attribute of availability of past data, qualitative and/or quantitative, will dictate which methods can be selected and is therefore most relevant in the context of predictability. Methods of observation and estimation will differ based on whether the data is


qualitative or quantitative. Examples of methods of observation for qualitative data would include environmental scanning, collecting and analyzing design trends, Nystrom's method of capturing the Zeitgeist, and preparing data files for machine learning algorithms. Methods of observation for quantitative data would include the collection of pertinent data sets, such as survey data and other records regarding the incumbent trend.

The attributes of the desired forecasting range and the nature of forecast required further help in selecting the forecasting method. Depending on how far into the future the forecast is sought, the forecaster will need to determine the method of estimation. Some methods which can predict the future of a trend based on available quantitative data include statistical methods such as regression analysis and time series analysis. Methods such as cross-impact analysis, decision analysis, and morphological analysis are useful for handling descriptive data. The available lead time can further help in narrowing down the methods of observation and estimation. Fig. 4.8 shows the attributes for selecting methods arranged by degrees of relevance for 'predictability'. On a scale of 0 to 2, '0' represents that the attribute is not relevant towards the need to predict, '1' represents that the attribute is moderately relevant, while a score of '2' represents that the attribute is most relevant toward the need to predict.

PREDICT: Availability of Data = 2; Forecasting Range = 1; Lead Time = 1; Nature of Forecast = 1.

Fig. 4.8: Arranging attributes by their relevance towards the need for 'Predictability' on a scale of '0' to '2'.
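The relevance weighting in Fig. 4.8 lends itself to a simple scoring scheme. In the sketch below, the weights for the need 'Predict' come from Fig. 4.8, while the per-method attribute ratings are hypothetical placeholders (the actual ratings are summarized in Fig. 4.7):

```python
# Relevance weights for the need 'Predict' (from Fig. 4.8, scale 0-2).
PREDICT_WEIGHTS = {
    "availability_of_data": 2,
    "forecasting_range": 1,
    "lead_time": 1,
    "nature_of_forecast": 1,
}

def relevance_score(method_ratings, weights):
    """Weighted sum: how well a method's attribute ratings (0 or 1)
    align with the attribute relevance of a forecasting need."""
    return sum(weights[a] * method_ratings.get(a, 0) for a in weights)

# Hypothetical ratings for two methods (1 = attribute satisfied).
time_series = {"availability_of_data": 1, "forecasting_range": 1,
               "lead_time": 1, "nature_of_forecast": 1}
delphi      = {"availability_of_data": 0, "forecasting_range": 1,
               "lead_time": 0, "nature_of_forecast": 1}

print(relevance_score(time_series, PREDICT_WEIGHTS))  # → 5
print(relevance_score(delphi, PREDICT_WEIGHTS))       # → 2
```

With these assumed ratings, the scoring ranks a data-driven method like time-series analysis well above Delphi for the need to predict, consistent with the argument of this section.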


To Prefer

'Preferability' addresses implementing a change in the present situation in order to attain a preferred state in the future. This function of forecasting research focuses on creating a future by identifying agents that are driving a change in consumer preferences. Oftentimes, incremental innovation strategies (Norman and Verganti 2014) are based on this approach. 'Preferability' guides towards a future that is for the general betterment of the organization, society, or a specific audience. This is why it may also be looked upon as a problem-solving strategy that generates recommendations and not necessarily numerical results.

'Preferability' insights inform decision making and internal objectives about business strategy, the kind of product development effort needed, whether the need is product innovation or product improvement, price positioning, product architecture, target markets, resource expansion estimates, market distribution, economic trends, color-material-finish explorations, etc. Insights of such nature are crucial when organizations are aiming for better profits, higher market share, increased engagement, or, in general, creating a better impact than the current situation. Some typical situations when forecasting research for the purpose of 'preferability' is called upon are the identification of market or offering gaps, new product or process development, new market entry strategies, early product introductions, diversification of brand or portfolio, communication strategy and media plans, trade policies, environmental impact, etc. Industries commissioning 'preferability' research include fashion, textile and lifestyle goods, automobiles, construction and architecture, digital personalization and recommendation engines in e-commerce or retail, robotics in social realms, the music, film, and publishing industries, education, etc.


Preferability research focuses more on an assessment of the present than on a collection of past data.

Methods such as Environmental Scanning, analysis of Design Trends, and Nystrom's Framework are pertinent for the assessment of present trends. A forecasting inquiry geared for 'preferability' may extract value from past data to gauge the current situation, but does not depend upon the empirical precision of past values. Based on their analysis of current information acquired by employing one of the aforementioned methods, forecasters can make recommendations which provide creative direction or guide towards a strategic business decision. The decision makers can then estimate the impact of implementing the recommendations. In Laura Ashley's case, the management could have estimated the impact of outsourcing their production. A change in creative direction for Laura Ashley would have warranted continuous monitoring of sales every season. The merchandise selling out most would have confirmed the resonance of the creative direction with the audience and provided an estimation of future sales. The need to attain a preferred state hence encapsulates both observation and estimation.

Operationally, preferability research focuses on mining changes in consumer preferences, resulting from on-going changes around the consumers, to postulate the future shape of things. The expectation from it is not to generate precise numbers but to provide a directional recommendation in order to attain the preferred state in terms of the desired nature of forecast. Determining the nature of forecast sought is therefore the most relevant attribute in the context of preferability. The forecaster would rely on methods of observation and estimation to underscore the current state. For this reason, the availability of existing data becomes an important factor. Ideally, the observation and analysis of happenings and trends in the surroundings should be an on-going process and should allow for lead times depending on the industry where it applies. The direction should be


able to provide strategic guidance for a mid- to long-term forecasting range. Automated or computational applications may be helpful in studying inter-dependencies between impacting factors but are not mandatory as far as creative direction is needed. Fig. 4.9 shows the attributes for selecting methods arranged by degrees of relevance for 'preferability'. On a scale of 0 to 2, '0' represents that the attribute is not relevant towards the need to prefer, '1' represents that the attribute is moderately relevant, while a score of '2' represents that the attribute is most relevant toward the need for preferability.

PREFER: Nature of Forecast = 2; Availability of Data = 1; Lead Time = 1; Forecasting Range = 1.

Fig. 4.9: Arranging attributes by their relevance towards the need for 'Preferability' on a scale of '0' to '2'.

Paradigm Shift

Before approaching the paradigm shift, we explore the concept of a paradigm. Thomas Kuhn, in his canonical work "The Structure of Scientific Revolutions" (Kuhn 1962), defines a paradigm as the "continuation of a particular research tradition" (Kuhn 1962) that is accepted by its community to provide a basis for its further practice, as well as to provide model solutions to the problems in that discipline for a period of time (Kuhn 1962). By defining a "paradigm" in this way, Kuhn related it to "normal science", which represents research and rules based on a specific scientific achievement in the past (Kuhn 1962).

When the rules, practices, and assumptions of the current paradigm or normal scientific activity are unable to provide an answer to a problem, it represents a breakdown


or an anomaly (Kuhn 1962). The practitioners of a discipline may try to resolve the anomaly, but if even after repeated attempts it remains unresolved, the situation may pose a crisis of the current paradigm, and the practitioners begin their search for new rules (Kuhn 1962). The search for new rules leads the discipline to investigate new practices and assumptions that can serve as a basis for further research (Kuhn 1962). This identification of new rules and practices, due to anomalies and the crisis of the previous paradigm, is what Kuhn referred to as a "paradigm shift" or a "scientific revolution" (Kuhn 1962). Kuhn points out that the investigative research done around an anomaly may sometimes indicate the outlines of the new and upcoming paradigm (Kuhn 1962). He suggests that the emergence of the new paradigm actually occurs even before a crisis with the existing paradigm can be spotted; however, this is not to say that all anomalies will result in a crisis and subsequently in a paradigm shift (Kuhn 1962).

Referring back to the discussion in Chapter 2 about the drivers of innovation, Paap and Katz state that in order to truly sense the future, one must pay attention to changing customer needs and preferences (Paap and Katz 2004). They represented three distinct cases of interaction between changing needs, drivers25, and technology that may result in a paradigm shift (Paap and Katz 2004):

a) Old technology matures relative to the dominant driver.

25 (Paap and Katz 2004) define ‘driver’ as the performance characteristic with the greatest leverage for customer decision making.


b) The previous driver matures, a new driver emerges, and old technology is unable to meet the needs of the new dominant driver.

c) The environment or the external factors change.

These three cases are also aligned with Kuhn's explanation of the emergence of a new paradigm and emphasize the relevance of paradigm shifts in forecasting consumer trends.

In order to track, understand, and respond to a paradigm shift, methods of observation, estimation, as well as intervention would be required. However, the situation of a paradigm shift is not limited to responding to one. It also includes cases where an organization or an individual is trying to achieve or facilitate a paradigm shift. The Pontiac Aztek example in Chapter 3 points to the latter case. Kim and Mauborgne (Kim and Mauborgne 2004) state that Blue Ocean strategies can create an entirely new and unique value for the consumer, leading to a new market space where competition becomes irrelevant, thereby aiming for a paradigm shift. Kim and Mauborgne (Kim and Mauborgne 2004) further explain that, while reacting to competition, the strategic profile of a company or product loses its uniqueness and becomes 'muddled' with those of others operating in the space. The strategic profiles of innovators or innovations boldly stand apart as they decide to dramatically increase focus on certain factors of competition and bow out of others, or create completely new factors altogether (intervention). It is this process of determining such strategic decisions that paradigm-shifting research finds answers to. Hence, a forecasting inquiry dealing with the need for a paradigm shift needs to include methods of observation and estimation but, most importantly, methods of intervention to identify that unique value.


In the case of Pontiac, discussed in Chapter 3, it was this paradigm shift that they were seeking. One of their attempts resulted in the Pontiac Aztek, which failed to strike a chord with customers. Primed by the past data on steadily declining sales, Pontiac needed to bring together observation of current market trends as well as interventional mechanisms to draft a strategy of re-invention.

When Kuhn explained the concepts of paradigm and paradigm shift in his 1962 book "The Structure of Scientific Revolutions" (Kuhn 1962), their influence went far beyond just the scientific community. Kuhn's theory provided an explanation for how science changes over time (Kuhn 1962). Kuhn deemed that scientific changes come about in two key ways: incremental changes in the course of stable "normal science", and drastic scientific revolutions (Kuhn 1962).

Fig. 4.10 shows that during the course of scientific development, a paradigm-shift phase is characterized by a radical transition.

Fig. 4.10: Graph representing a paradigm shift in a product’s performance as the time progresses (Cols 2016).


Thinkers, philosophers, and authors in a variety of fields, including pop culture, started giving Kuhn's concepts of paradigm and paradigm shift their own interpretation beyond Kuhn's theory of science (Masterman 1970). A few years after Kuhn published "The Structure of Scientific Revolutions", Masterman, upon careful review of Kuhn's work, identified twenty-one different uses of the word paradigm in Kuhn's book (Masterman 1970). This drew attention to the fact that Kuhn's definition of a "paradigm" could be interpreted in multiple ways and was somewhat loose (Wendel 2008). Despite the fact that Kuhn's critics found his definition of paradigm shift to be crude and open to interpretation (Wendel 2008), Kuhn did provide defining characteristics of a paradigm and a paradigm shift. Robert Fulford's description shows how, by 1990, the concepts of paradigm and paradigm shift had come to be accepted and resonated within social, scientific, and economic domains alike (Fulford 1990).

In a paradigm shift, existing well-accepted assumptions give way to new ones (Kuhn 1962). Elucidating the characteristics of a paradigm shift, Kuhn explained that a paradigm shift occurs when the existing paradigm is no longer able to provide answers to questions despite multiple attempts by its community of practitioners, and is replaced with a newer paradigm that has already garnered support within the discipline and will hold even as new information is brought to light (Kuhn 1962). He also warned that unless a sizeable number of scientists in a field agree and rally for the new paradigm, a paradigm shift will not occur (Kuhn 1962).

A recent case in point is the increasing degree of digital collection and sharing of personal data for targeted marketing. Marketers have increasingly utilized the potential of big data over the past decades to drive profit for their clients through the profiling of customers. Digital platforms such as Facebook have encouraged their users to share their personal data through


loosely defined and sometimes even illicit practices of gaining consent (Lewis 2018). So much so, that sharing personal information and experiences with an online community of strangers came to be regarded as being a 'professional influencer', whom big companies seek out for their "reach". In such a paradigm, when Cambridge Analytica used the data of 87 million Facebook users and their friends, the incident framed a tipping point heralding a paradigm shift in practices around personal digital data, marketing, technology, and regulations. Many opine that it was not a breach but a deliberate exploitation of loosely defined regulation around the collection and sharing of digital data. Aleksandr Kogan and Christopher Wylie have been credited with creating the Facebook quiz "This is Your Digital Life" that was used to obtain the personal information of users, which Cambridge Analytica later used to strategize the 2016 election campaign for Donald Trump (Haupt 2018). The personality-predicting model built by Cambridge Analytica can be thought of as a highly advanced and sophisticated version of the customer profiling that companies might use for marketing. With the focus on 17 states crucial for Trump's campaign, 32 personality types were identified. The personality types deduced potential Trump supporters based on their likes and digital behavioral patterns, who were then strategically targeted throughout the campaign (FirstHive 2017). This event highlighted the gaping lack of proper regulation and the growing misuse of big data for targeted marketing. However, these events did not occur overnight. Resentment against harvesting customers' personal data for marketing had been brewing, while third-party data harvesting had been on the rise due to the low level of regulation around data protection. While policy makers and authorities are questioning legislation, pioneers in technology are already discussing secure alternatives for data collection and sharing. The General Data Protection Regulation (or GDPR) will provide customers stronger


control over their data, while creating stricter laws on data protection for companies to comply with (Tankard 2016). This will cause a shift in how companies acquire marketing data and shift focus towards responsible methods of using digital data. On the technology front, zero-knowledge end-to-end encryption that protects data can limit the exploitation of personal information by companies (Brassard 2003). Open secure techniques such as Secure Multi-Party Computation (SMPC) based on differential privacy enable secure sharing of data (Eigner et al. 2015). Fully Homomorphic Encryption (FHE), another technology, developed by Craig Gentry, enables data analysis without actually exposing the data (Gentry, Halevi, and Vaikuntanathan 2010). Rapid research in blockchain technology also promises a secure infrastructure for managing digital data. This episode of Facebook and Cambridge Analytica represents a paradigm shift that we are in the midst of.

Forecasting Range  Availability of Data  Lead Time  Nature of Forecast
PARADIGM-SHIFT  2  1  0 or 1  1

Fig. 4.11: Arranging attributes by their relevance towards the need for ‘Paradigm shift’ on a scale of ‘0’ to ‘2’.

Summary

The forecasting needs, and the categories of forecasting methods required to fulfill those needs, are summarized in Fig. 4.12.

Combining Fig. 2C.1 (forecasting methods organized by categories), Fig. 4.12 (categories of forecasting methods germane to different forecasting needs), and Fig. 4.7 (forecasting methods and their respective attributes), we can prepare a holistic vision of methods and their respective attributes arranged by forecasting needs and method categories. Organizing the forecasting methods based on the forecasting needs identified in this section, together with the attributes, provides a holistic view of the composite forecasting framework, as shown in Fig. 4.13.

Fig. 4.13 reads from left to right, presenting the forecasting needs (to predict, to prefer, and paradigm-shift), the categories of methods (interventional, observational, and estimation), the forecasting methods as described in Chapter 2, and their attributes, in one view.

Composite Forecasting Framework

Finally, as a synopsis, we present the composite forecasting framework as a dynamic set of questions that functions as a forecaster’s toolkit for selecting an appropriate set of forecasting methods based on the forecasting need, bounded by the operational caveats inherent in the relevant attributes. Situations warranting a forecasting inquiry may arise from a variety of contexts and may require addressing the forecasting needs (Predict, Prefer, Paradigm-shift) distinctly or in combination.

The composite framework (Fig. 4.8 and Fig. 4.7) also illustrates the possibilities where the situation involves more than one distinct forecasting need. For instance, if the forecasting need is to drive preferability through a paradigm-shifting move, the framework equips the forecaster to select methods holistically and to synchronize their application to satisfy both needs.

To illustrate the functioning of the framework, we create three distinct samples, one for each forecasting need. These samples exemplify the role played by the attributes in the selection of methods, and hence the order in which the methods might be considered.


[Figure: forecasting methods grouped by need (predictability, preferability, paradigm-shift) and category (intervention, observation, estimation), annotated with forecasting range (Sh/M/L), lead time (S/L), nature of forecast (C/B/Q), and dependency on available data (Yes/No).]

Fig. 4.12: The composite forecasting framework: systematic arrangement of forecasting needs, attributes, and methods.


Each example starts with identifying the forecasting need, followed by subsequent filter questions based on the operational attributes. The first filter question for each forecasting need corresponds to the most relevant attribute for that need. Fig. 4.13 combines Fig. 4.8, Fig. 4.9, and Fig. 4.11, representing the three forecasting needs – predictability, preferability, and paradigm-shift – and the attributes most relevant for each. The first sample addresses the forecasting need to “Predict”, the second sample addresses the need to plan for a “Preferred” state, and the third to design for a “Paradigm-Shift”. Fig. 4.14 (A, B, C) provides a walk-through of the forecasting framework by exemplifying these three cases of ‘predictability’, ‘preferability’, and ‘paradigm shift’.

Forecasting Range Availability of Data Lead Time Nature of Forecast

PREDICT 1 2 1 1

PREFER 1 1 1 2

PARADIGM-SHIFT 2 1 0 or 1 1

Fig. 4.13: Arranging attributes by their relevance towards the forecasting needs – predictability, preferability, and paradigm shift – on a scale of ‘0’ to ‘2’.

The relevance of the attributes to the forecasting needs determines the order in which the forecaster selects the forecasting methods. When approaching more than one forecasting need simultaneously, the forecaster addresses the respective most relevant attributes simultaneously, as depicted in Fig. 4.16.
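The relevance scores in Fig. 4.13 can be read as an ordering rule: for each need, the attributes are applied as filters in descending order of relevance. The following is a minimal sketch of that rule; the short key names and the 0.5 stand-in for the table’s ‘0 or 1’ entry are our own illustrative choices, not part of the framework itself.

```python
# Relevance of each attribute to each forecasting need, per Fig. 4.13.
# The '0 or 1' lead-time entry for paradigm-shift is encoded as 0.5 here.
RELEVANCE = {
    "predict":        {"range": 1, "data": 2, "lead": 1,   "nature": 1},
    "prefer":         {"range": 1, "data": 1, "lead": 1,   "nature": 2},
    "paradigm-shift": {"range": 2, "data": 1, "lead": 0.5, "nature": 1},
}

def filter_order(need):
    """Attributes sorted by descending relevance; ties keep table order."""
    scores = RELEVANCE[need]
    return sorted(scores, key=lambda attribute: -scores[attribute])

# The highest-scoring attribute becomes the first filter question:
print(filter_order("predict"))         # 'data' comes first
print(filter_order("paradigm-shift"))  # 'range' comes first
```

Under these scores, ‘predict’ begins with data availability, ‘prefer’ with the nature of forecast, and ‘paradigm-shift’ with the forecasting range, matching the functional summary given later in Fig. 4.15.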


[Figure: branching tree of filter questions (Is past data/information available? → lead time → forecasting range) for the three needs – predict, prefer, and paradigm shift – terminating in recommended methods.]

Fig. 4.14A: The framework arranges the methods, attributes, and questions in a branching tree to allow for method selection in a systematic manner.


The questions in the Forecasting Framework shown in Fig. 4.14A are in the following order:

• What is the forecasting need?
- To predict
- To plan for a preferred future state
- To cause a paradigm-shift

(Based on the selection of planning for a preferred future state, the next question, based on the ‘nature of forecast’ operational attribute, is stipulated.)

• What is the nature of forecast sought?
- Creative Direction
- Business Decision
- Quantitative Value

(For each of the above options, questions related to the rest of the attributes are stipulated in the following order.)

• Is past data/information available?
- Yes
- No

• What is the lead time?
- Short
- Long

• What range of forecast is needed?
- Short term
- Midterm
- Long-term


[Figure: branching tree of filter questions (availability of past data → nature of forecast → lead time → forecasting range) terminating in recommended methods.]

Fig. 4.14B: Forecasting methods and their respective attributes focused on the need to predict. *Data collection is part of the process of carrying out the method.


The questions in the Forecasting Framework shown in Fig. 4.14B are in the following order:

• What is the forecasting need?
- To predict the future state
- To plan for a preferred state
- To prepare for a unique future state

(Based on the selection of planning for a preferred future state, the next question, based on the ‘nature of forecast’ operational attribute, is stipulated.)

• Is past data or qualitative information available?
- Yes
- No

• What is the nature of forecast sought?
- Creative Direction
- Business Decision
- Quantitative Value

• What is the lead time?
- A short amount of time
- Longer lead times are acceptable

• What range of forecast is needed?
- Short-term forecast
- Medium-range forecast
- Long-range forecast


[Figure: branching tree of filter questions (forecasting range → nature of forecast → availability of past data → lead time) terminating in recommended methods.]

Fig. 4.14C: Forecasting methods and their respective attributes focused on the need for a paradigm shift.


The questions in the Forecasting Framework shown in Fig. 4.14C are in the following order:

• What is the forecasting need?
- To predict the future state
- To plan for a preferred state
- To prepare for a unique future state

(Based on the selection of planning for a preferred future state, the next question, based on the ‘nature of forecast’ operational attribute, is stipulated.)

• What range of forecast is needed?
- Short-term forecast
- Medium-range forecast
- Long-range forecast

• What is the nature of forecast sought?
- Creative Direction
- Business Decision
- Quantitative Value

• Is past data or qualitative information available?
- Yes
- No

• What is the lead time?
- A short amount of time
- Longer lead times are acceptable


Fig. 4.14A, B, and C illustrate the variants of the forecasting framework approaching the three forecasting needs of ‘predictability’, ‘preferability’, and ‘paradigm shift’. Working with the framework, the forecaster can first and foremost establish the need for carrying out a forecasting exercise. The need could distinctly seek ‘predictability’, ‘preferability’, or ‘paradigm shift’, or could represent a combined forecasting approach addressing more than one need – for instance, a situation that requires forecasting for ‘preferability’ through a ‘paradigm shift’. Fig. 4.14A, B, and C show the structure of the framework, which starts with identifying the forecasting need. Identifying the need determines which attribute must be checked first. For instance, if the forecasting need identified is to predict, the forecaster’s next step is to check whether existing data is available to understand the patterns so far. The first attribute determines the second attribute to be called upon, and so on, until the recommended forecasting methods are arrived at. Fig. 4.15 summarizes the general structure of the framework.


I. Identifying the forecasting need

II. First operational attribute is selected and applied based on the forecasting need:

a. Predict → Availability of past data
b. Prefer → Nature of forecast sought
c. Paradigm-shift → Forecasting range

III. Second operational attribute is selected and applied.

Subsequently, the third and fourth operational attributes are applied. The selection of each attribute depends on the previous one, until all the attribute filters have been applied and yield a recommendation of forecasting methods.

IV. The recommended forecasting methods are yielded.

Fig. 4.15: Functional Summary of the Forecasting frameworks represented in Fig. 4.14A, 4.14B and 4.14C.
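The cascade summarized in Fig. 4.15 can be sketched as a sequence of attribute filters over a table of methods. The attribute values below are a hypothetical, simplified subset chosen for illustration only – they are not the authoritative attribute catalogue of Chapter 2.

```python
# Hypothetical subset of forecasting methods with illustrative attribute
# values: data dependency, lead time required, feasible forecasting ranges,
# and the nature of forecast produced.
METHODS = {
    "Statistical Modeling": {"needs_data": True,  "lead": "short",
                             "range": {"short", "medium"}, "nature": "quantitative"},
    "Machine Learning":     {"needs_data": True,  "lead": "short",
                             "range": {"short", "medium", "long"}, "nature": "quantitative"},
    "Delphi":               {"needs_data": False, "lead": "long",
                             "range": {"medium", "long"}, "nature": "business"},
    "Survey":               {"needs_data": False, "lead": "short",
                             "range": {"short"}, "nature": "business"},
    "Design Thinking":      {"needs_data": False, "lead": "short",
                             "range": {"short", "medium"}, "nature": "creative"},
    "Ethnography":          {"needs_data": False, "lead": "long",
                             "range": {"medium"}, "nature": "creative"},
}

def recommend(have_data, lead, rng, nature):
    """Apply the attribute filters in sequence; the survivors are the
    recommended methods, mirroring the branching of Fig. 4.14."""
    out = []
    for name, a in METHODS.items():
        if a["needs_data"] and not have_data:
            continue  # filter: method depends on past data we lack
        if a["lead"] == "long" and lead == "short":
            continue  # filter: method needs more lead time than available
        if rng not in a["range"]:
            continue  # filter: method cannot cover the required range
        if a["nature"] != nature:
            continue  # filter: wrong nature of forecast
        out.append(name)
    return out
```

For example, `recommend(have_data=False, lead="short", rng="short", nature="creative")` filters this illustrative table down to Design Thinking; changing any one answer changes the surviving set, which is exactly the branching behavior the framework describes.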

Fig. 4.16 further illustrates a situation in which more than one forecasting need is to be pursued concurrently. The example in Fig. 4.16 shows the framework’s integrative approach by taking the needs of ‘preferability’ and ‘paradigm-shift’ into consideration simultaneously. As discussed above (also see Fig. 4.15), the first operational attributes for ‘preferability’ and ‘paradigm shift’ are applied concurrently. In the case of ‘preferability’, the first operational attribute to be applied is the nature of forecast sought. In the case of ‘paradigm-shift’, the first operational attribute to be applied is the forecasting range. In Fig. 4.16, during the first stage, the methods are split by the nature of forecast sought into methods providing creative direction, business direction, and quantitative values. Simultaneously, the methods are split by forecasting range into methods providing a short-term, mid-term, and long-term forecast. Assuming the requirement is a long-term creative forecast, at this stage the methods which provide strictly quantitative values or business directions, or only short-term or mid-term forecasts, are eliminated. Among the methods which provide creative direction and the ones which provide long-term forecasts, only the methods which satisfy both conditions are selected, and the rest are eliminated. Thereafter, the attribute determining dependency on the availability of data, and subsequently the attribute determining lead times, are applied to yield the recommended forecasting methods. A situation seeking preferability and paradigm-shift through a long-term creative forecast can utilize environmental scanning and design trends analysis if the available lead times are long. Alternatively, if the lead times are short, the forecaster can leverage design thinking. The example illustrated in Fig. 4.16 is only indicative; it attempts to show that, by using discretion in deciding the key decision points for filtering methods, the forecasting framework can address more than one forecasting need. Similarly, if the categories of an attribute are not mutually exclusive, the forecaster can pursue multiple categories of that attribute. For instance, if in the present example the forecaster were looking for methods to achieve a long-term forecast for creative as well as business direction, methods which fulfill both criteria under the nature of forecast required could be included for consideration. In this way the framework can adopt an integrative approach to addressing the different forecasting needs. When forecasters understand the context and need for a forecasting study, the framework provides a guided yet flexible and integrated approach to the selection of forecasting methods.


[Figure: concurrent flow for the needs ‘Prefer’ (first attribute: nature of forecast – creative direction / business decision / quantitative value) and ‘Paradigm shift’ (first attribute: forecasting range – short / medium / long), followed by availability of past data/information and lead time, terminating in the recommended methods.]

Fig. 4.16: Example of forecasting framework when taking the forecasting needs for ‘preferability’ and ‘paradigm-shift’ into consideration simultaneously.


CHAPTER 5

APPLYING THE COMPOSITE FORECASTING FRAMEWORK

The forecasting framework described in Chapter 4 addresses the key research question stated at the outset of the dissertation. However, the framework remains dynamic with respect to the needs and the pool of methods it incorporates, and it holds the possibility and necessity of improvisation. For the current set of methods (as described in Chapter 2), it still needs to be verified how the framework can be applied to different situations. In this chapter, we examine the forecasting framework by revisiting the case studies of Chapter 3 based on the information available.

Re-visiting “Laura Ashley”

On the basis of the analysis of Laura Ashley’s case in Chapter 3, the forecasting need was to chart the company’s path to a preferred state of business profitability by assessing the impact of contemporary events. Hence, the primary forecasting need here is to plan for a future state by establishing a creative vision and taking informed business decisions based on contemporary global changes in trade and labor laws. Fig. 4.14A is reviewed and filtered to fit the case of Laura Ashley Plc.

The manifestation of the forecasting framework for Laura Ashley presented here (Fig. 5.1) has been made under multiple assumptions. As the events occurred in the past, and the exact details of what information was available to the Laura Ashley management, and what the internal work culture and hierarchy were, are not known, assumptions have been made towards a minimum viable framework. The need for the forecasting inquiry has been established as planning towards a preferable state, so the other two flows (‘Predict’ and ‘Paradigm-shift’) have been omitted. Further, as the case study illustrates, the management needed a bifocal strategy: to resurrect a creative vision that was in tune with the times, as well as cognizance of the changing trade climate to inform business decisions. Hence, in response to the nature of forecast sought, we maintain flows for both creative vision and business decision, while leaving out the stream for quantitative value. As it cannot be said with certainty whether the designers and the internal team had been maintaining past data or qualitative information about changing preferences, and to what extent, we take the minimum viable route and proceed with the assumption that such data or information was not available. However, if any of that data and information was available, which most likely would have been the case, the framework would have adopted the subsequent flow of attributes accordingly. With the assumption that no data or information was available, methods such as Ethnography, Surveys, and Design Thinking present themselves to seek out a creative direction, while Delphi and Surveys (oriented towards collecting trade-related data) are available to inform a business decision. The attribute of ‘Lead Time’ would have depended on different factors and the choice of the then management. Typically, the fashion industry operates on shorter lead times, as new collections are launched every season (Sproles 1981).

Ideally, if longer lead times were possible, it would have been at the forecasters’ discretion to judge the relevance of methods, since the brand needed not just a short-term but also a long-term creative vision to break the brand inertia. Here, for both the creative direction and business decision flows, we chose methods with short lead times. Finally, the filtered methods are adjudged based on their forecasting range. The methods yielded through the operational caveats of the attributes are:

• Creative direction
- Short-term forecast: Surveys focused on sensing the current fashion preferences and customer needs.
- Short-term forecast: Design thinking, to analyze the ineffective design palette and recommend pivot options for the next collection(s).
- Mid-term forecast: Design thinking.

• Business decision: Surveys oriented towards collecting trade-related data, to assess short- to mid-term strategy and roadmap.

[Figure: flow for the need ‘Prefer’ – nature of forecast sought (creative direction / business decision) → availability of past data/information (assumed ‘No’) → lead time (short) → forecasting range, terminating in Surveys, Design Thinking, and Ethnography for creative direction, and Surveys and Delphi for business decision.]

Fig. 5.1: Manifestation of Forecasting frameworks for the case of Laura Ashley.


Revisiting “Britannica Encyclopedia”

The failure of the Britannica encyclopedia was a result of ineffectual and mistimed decision making. Despite regular tracking of the data, the leadership could not correctly predict their sales numbers in an environment where technology was fast evolving, and they missed the opportunity within educational media. Due to the lack of strong motivation and evidence, it was hard to change the mindset of the sales force. Therefore, the forecasting framework for Britannica’s case needs to address the issue of prediction. Additionally, when Microsoft approached Britannica about digital CD-ROM encyclopedias, rejecting the offer was a business decision which needed to be better informed by the changes taking place in the industry as well as in society.

Fig. 5.2 represents the manifestation of the forecasting framework for the Britannica encyclopedia. Since the need to predict is being addressed, the ‘Prefer’ and ‘Paradigm-shift’ flows have been omitted. The case study indicates that past data regarding the sales of encyclopedias was regularly tracked, and that 1990 was a record year for sales. Therefore, the stream of subsequent attributes pursues the options where past data/information was available. The methods within the purview of a business decision are Cross-Impact Analysis and Decision Analysis. Applying the attributes of lead time and forecasting range successively again yields Cross-Impact Analysis and Decision Analysis as the recommended methods for Britannica’s then situation.


[Figure: flow for the need ‘Predict’ – availability of past data/information (‘Yes’) → nature of forecast (business decision) → lead time → forecasting range, terminating in Cross-Impact Analysis and Decision Analysis.]

Fig. 5.2: Manifestation of forecasting framework for the case of Britannica Encyclopedia.


Revisiting “Pontiac Aztec”

The Pontiac Aztec misfired at multiple levels despite being an engineering feat. The organization was looking for a path-breaking identity but was confused about the target audience, and hence about their needs. The contemporary market also included Japanese automobile manufacturers as competition. The analysis of the Pontiac Aztec case in Chapter 3 highlights bringing about a paradigm-shift, through the new vehicle’s form and function, as the primary need.

The methods are filtered against the first attribute – the nature of forecast sought. For new product development, the key focus is on establishing a creative direction, so the other two options (quantitative value and business decision) are left out of further consideration. By virtue of the case being positioned in the automobile industry, only the methods with a long-term forecasting range qualify under the caveat of forecasting range. Since this was a case of new product development, the availability of robust quantitative data from the past is ruled out here.

However, as this was also a longer-term project, methods which are based on information collected and analyzed over a period of time can be considered. Thus, the recommendations yielded include Environmental Scanning, Design Trends Analysis, and Morphological Analysis. It cannot be known with certainty what kind of qualitative studies were performed for the conceptualization of the Pontiac Aztec. Therefore, if it is assumed that regular collection and analysis of qualitative information was absent, Ethnography is the recommended method.

Ethnography could have been instrumental for Pontiac in revealing real consumer insights and addressing an existing feature or functionality gap, thus giving Pontiac the unique selling proposition it had been looking for. Fig. 5.3 represents the framework manifested for Pontiac’s situation.


[Fig. 5.3 diagram: starting from the forecasting need (paradigm shift) and a long forecasting range, the candidate methods (Environmental Scanning, Design Trends Analysis, Design Thinking, Delphi, Morphological Analysis) are filtered by nature of forecast, lead time, and availability of past data/information, yielding Ethnography when no past information is available.]

Fig. 5.3: Manifestation of forecasting framework for the case of Pontiac Aztec.

Laura Ashley, Encyclopedia Britannica, and Pontiac Aztec were all cases that occurred in the past. With the limited information available about these cases, Fig. 5.1, 5.2 and 5.3 exemplify the application of the forecasting framework to them. To examine the application of the forecasting framework with respect to current times, I review the contemporary situation of Tiffany & Co. based on its performance over the last ten years. The following section takes a detailed look at the company’s origins, their evolution over time, recent challenges, and scenarios for application of the forecasting framework.

• Introduction

Industry: Luxury Jewelry and specialty retailer

Their position in the industry:

The company was founded by Charles Lewis Tiffany and John B. Young in 1837 in New York with a capital of $1,000, as a stationery and fancy goods store (Tiffany & Co. 2018a). In 1841, J.L. Ellis joined as the third partner, but by 1853 Tiffany had bought out his partners’ shares and renamed the company Tiffany & Co. In 1845, they published the first mail-order catalog in the U.S., called “The Blue Book,” introducing “Tiffany Blue” for the first time (Tiffany & Co. 2018b). At the time the catalog included a list of ‘useful and fancy articles’ (Tiffany & Co. 2018a). The company’s records state that Tiffany picked the color as it seemed to be popular with brides in the 19th century. Some unverified sources also mention that a portrait of Empress Eugenie de Montijo sporting a turquoise cape, a figure touted to have had a tremendous influence on fashion during the 19th century, inspired Charles Tiffany to pick the color. It is interesting to note that while the spirit of the times inspired Tiffany to select Tiffany Blue as the official color, the color eventually became an icon of luxury and style across the world. This resembles how trend forecasters today, such as Li Edelkoort, collect a sense of what they see around themselves and then project it through a trend story to be embraced by brands and consumers.

Working with specialty and luxury items, Tiffany & Co. made great strides over the course of the 19th century. Charles Tiffany came to be known as the “king of diamonds” after he acquired crown jewels of France in 1848 (Tiffany & Co. 2018a). In 1851, the company instituted the 925/1000 standard for sterling silver as practiced in Britain, which, due to his efforts, was later adopted by the United States (“Charles Tiffany” 2014). At the 1867 world fair in Paris, Tiffany & Co. earned laurels for silver craftsmanship, making them not only the first American company to win such an international accolade but also firmly establishing the company’s reputation as a design house (Tiffany & Co. 2018a). Over three decades, Tiffany & Co. established themselves as premier silversmiths and dealers of luxury timepieces. Tiffany & Co.’s artwork and designs were known to be inspired by nature. In 1878, Tiffany & Co. acquired one of the world’s largest diamonds at 287.42 carats, which was cut to 128.54 carats with 82 facets (Tiffany & Co. 2018a) and later named the “Tiffany Diamond”. Tiffany & Co. now became known for their diamond craftsmanship in addition to that of silver and other precious materials. Tiffany & Co. is also credited with creating the Tiffany “Engagement setting,” in which the diamond is raised to stand above the band into the light (Tiffany & Co. 2018a). Over the decades, Tiffany & Co. introduced many other precious gemstones to the American market (Tiffany & Co. 2018a).

Over the decades, Tiffany & Co. also garnered some of the most prominent members of American society as their clients. From politicians and Presidents to European royal families, Tiffany & Co. came to be trusted for their knowledge of precious materials as well as their exquisite sense of design.

Upon the death of Charles Lewis Tiffany, his son Louis Comfort Tiffany, who was considered a world leader of the Art Nouveau movement, became the company’s first official design director. In 1955, Tiffany & Co. was taken over by businessman Walter Hoving. He steered the company’s philosophy and image in a new direction. His vision was to maintain the focus on quality while making the brand affordable not just for the uber-rich and royals but also for the common people who aspired to own a Tiffany product (Lieber 2017).

Hoving wanted the brand to focus on “good design” that could be mass merchandised (Lieber 2017). To this end, Hoving brought on Van Day Truex as the design director and Gene Moore for designing the window displays (Lieber 2017). Several other legendary designers such as Jean Schlumberger, Elsa Peretti, and Paloma Picasso joined the company over the years. Tiffany stores were opened in multiple cities across the country. Their affordable and contemporary designs struck a chord with a growing number of women while maintaining the luxury brand image well through the ’80s. In 1987, Tiffany & Co. forayed into international markets, especially Asia and Europe, and also filed their IPO (Lieber 2017). In 1997, the company launched the “Return to Tiffany” collection, which went on to become extremely successful (Lieber 2017). In 2009, Tiffany & Co. launched the Key Collection, followed by a collection based on a new metal alloy, “Rubedo” (Lieber 2017; Sicard 2013). The company grew from 51 stores in 2003 to 91 stores in 2012. As a result, net sales increased from $2.2 billion to $3.6 billion in 2011, but the profits have not been sustainable (Lieber 2017).


• Recent events and current status

The economic crisis of 2008–2009 had its impact on the net sales of Tiffany & Co. as well. Fig. 5.4 shows the steep decrease in net sales over the holiday season (November–December) of 2009. With the launch of “Collection T,” designed by Francesca Amfitheatrof, who was brought on board in 2013 (Sherman 2014), worldwide sales in 2014 touched $4.2 billion (Lieber 2017). Amfitheatrof, the company’s first ever female design director, hired after a search of five years, was let go three years into her appointment when the top line did not show much improvement. In 2015, the company witnessed another big slump in sales (Aaron 2018a), which continued in 2016. Amid growing investor concerns, Tiffany & Co. named Reed Krakoff as its Chief Artistic Officer and Alessandro Bogliolo as its new Chief Executive Officer in 2017. The holiday season of 2017 brought some respite for the struggling company, but whether the upward curve is sustainable remains to be seen.

• The Challenge: issues and pain points

As Tiffany & Co. went through year after year of declining sales, it dismissed the design head and the CEO one after the other in 2017. In an attempt to revive its appeal, Tiffany & Co. signed Lady Gaga to help strike a chord with its global customers, just as Marilyn Monroe once did. The dismissal of CEO Frederic Cumenal, which came just hours before Lady Gaga was to feature Tiffany & Co.’s “HardWear” collection during her 2017 Super Bowl performance, clearly projected the company’s internal struggles outward. When Francesca Amfitheatrof came on board as the design director, she found Tiffany’s product range very merchandise-led, a direction that was brought about by Hoving (Aaron 2018a). She wanted to bring back the uniqueness that was embedded in the DNA of the brand by the


founding Tiffanys (Lieber 2017). But her efforts could not bring sustainable commercial success beyond “Collection T” in 2014 (Tiffany & Co. 2018a). Overall, it seemed that having pivoted from the position of high-end luxury jeweler to a good-quality mass-merchandised brand, Tiffany’s could not keep its special edge intact and failed to strike a chord with its audience. According to an article by Victoria Murphy, published in November 2004 in Forbes magazine, online diamond jewelry brand Blue Nile was worth $154 million and selling as many diamond rings as Tiffany’s in the U.S. (V. Murphy 2004). According to the article, Mark Vadon, who owns the brand, stated his ambition: “We want to be the Tiffany for the next generation” (V. Murphy 2004). While Tiffany’s margins might have seen fluctuations, the company had enjoyed an undisputed aspirational monopoly in the minds of the people. Also, in 2006 Tiffany & Co. reported a 15% increase in their holiday sales year over year, pushing net sales to $818,087,000, which accelerated the company’s expansion plans for the following year (Aaron 2007). The economic crisis of 2008–2009, however, impacted the company’s performance negatively, bringing holiday sales down by 21% and overall sales down by 3% (Aaron 2009). Fig. 5.4 shows that since the economic downturn in 2009, Tiffany & Co. has yet to see the commercial success that would put the brand back in its undisputed position as leader in coveted luxury jewelry.

• Why did they have this problem?

Tiffany & Co. has a history of almost 200 years. Charles and Louis Tiffany gave it the prestigious image of a luxury jeweler revered by the most prominent denizens. During the first 100 years, Tiffany & Co. came to be known not just as pioneers of design and precious materials but for instituting standards of quality for America. In 1955, when Walter Hoving took over the company, he tried to make it a more attainable name for the masses while keeping the quality standards. By the turn of the century, however, the company seemed to have lost its edge as well as considerable market ground to newer and less expensive brands. In May 2016, in a newspaper article published by Reuters, Cavale and Patnaik wrote that Tiffany’s “old-world luxury” charm is not working for the millennials. But it appears that the withdrawal was not abrupt.

According to Lieber, in the post-WWII era, not many women worked outside the house (Lieber 2017). Jewelry was a high-value item mostly purchased on special occasions, such as weddings, for the purpose of gifting. The primary target for jewelry was men, as they made the purchases, while women’s role was that of passive influencers (Lieber 2017). Even by the ’60s and ’70s, instances of women buying jewelry for themselves were rare (Lieber 2017). Even if

[Fig. 5.4 chart: “Holiday Season Sales Performance By Year” – year-over-year percentage change in net sales worldwide and in the Americas, 2008–2018.]

Fig. 5.4: Graph representing percentage change in Tiffany & Co.’s holiday season sales year over year since 2008. Image source: Author after (Aaron 2018a).


women bought jewelry, it was bought as a gift for another female – a daughter, mother, or aunt (Lieber 2017). Thirty years later, the buying scenarios and demographics have changed drastically. Based on a survey conducted in 2015, which showed that the number of instances of women purchasing jewelry for themselves is on the rise, diamond retailer De Beers decided to focus equally on this audience as well as on those looking for bridal jewelry (Bergstein 2017). “Female self-purchase” was clearly the trend to pursue (Bergstein 2017). Another point that became clear from the Diamond Insight Report of 2016 was that these self-purchasing women were not awaiting a special occasion to buy themselves diamonds or jewelry, nor were they expecting jewelry only as gifts. Together with “self-purchasing women,” “just because” was also a trend defining jewelry buying patterns (De Beers 2016).

While the consumer landscape has changed drastically, analysts and loyalists both cite a lack of innovation in product design as the main reason for Tiffany & Co.’s troubles (Lieber 2017). Tiffany’s design collections in recent years have been lackluster, and the company is still relying on its legacy to carry it through the future (Lieber 2017). Consumers feel that the kind of unique and exciting designs seen in the era of Peretti and Picasso are simply missing today, and that the offering mainly includes variants of the engagement ring, and even those at unreasonably high prices (Lieber 2017). Not keeping tabs on customer preferences and becoming complacent is what has caused sluggish sales for Tiffany & Co. A less than optimal e-commerce channel has also been called out as a factor that leaves out the millennial audience (Lauchlan 2017).


The newly appointed team of CEO Alessandro Bogliolo and Chief Artistic Officer Reed Krakoff saw holiday season sales tick up a little in November–December 2017. In the press release issued by the company in January 2018, Bogliolo said that while the 2017 holiday sales had helped stop the downward trend of preceding years, the company needs to focus on evolving product offerings, customer experience, and strategic spending (Aaron 2018a).

• What is the forecasting need?

The 2017 worldwide net sales increased 4% YoY to $4.2 billion. The management would like to continue the upward trend in sales. Bogliolo has already communicated the areas the company is going to focus on from 2018 onwards – evolving product offerings, customer experience, and strategic spending (Aaron 2018b). This clearly announces the company’s plans to i) update their products and designs while keeping the “Tiffany’s” identity, ii) improve areas of customer interaction – both digital and physical, and iii) enhance impetus in the area of mergers and acquisitions. For our discussion, we will focus on the first two areas.

The company has been unsuccessful in attracting and exciting customers with its design offering in the preceding years. It is only understandable that they identified evolving their product offerings as one of the key business objectives. It has also been evident that Tiffany & Co. has not been able to grasp the pulse of the millennial audience and has failed to appeal to them as a brand. Addressing customer experience across all channels is undoubtedly the need of the hour. Within their strategic spending plans, reviewing the retail store vs. e-commerce expansion strategy also seems cogent. The forecasting framework will attempt to address this challenge. The actual application will depend on a dialogue between the forecaster and the company. After adequately understanding the context, the forecaster can approach the


situation appropriately. To show how a forecaster may proceed, two different scenarios are presented here – scenario 1 and scenario 2.

The creative implementation of this three-point plan requires an overarching strategy that connects the dots and narrates a story. Since the 2017 holiday season showed positive sales, understanding which products and designs were appreciated by customers will help establish a baseline for next year’s projection of those designs as well as indicate the prevalent design preferences. Therefore, predictability can be identified as a preliminary forecasting need. However, since the company aims to achieve a preferable future in terms of commercial success as well as brand image through its product designs, forecasting for preferability also presents a strong case. At the same time, while preferability research can help designers with planning future collections, a paradigm-shifting event can help the company achieve a breakthrough in any or all of the three identified areas. Fig. 5.5A refers to the application of the forecasting framework for this scenario, identified as scenario 1.

Fig. 5.5A illustrates applying the forecasting framework for all three forecasting needs – predictability, preferability, and paradigm shift – simultaneously. As explained in Chapter 4, the most relevant operational attribute associated with each need is used first to filter the forecasting methods. For the forecasting need to predict, the most relevant attribute is the availability of existing data. In this case, the records of annual sales, as well as details of collections and their market performance, are available with the company. For the need for preferability, the first operational attribute is identifying the nature of forecast sought – creative direction, business decision, or quantitative value. In this case, based on the focus areas stated by CEO Bogliolo, ‘creative direction’ and ‘business decision’ are selected in terms of the nature of


forecast sought. For the purpose of paradigm shift, the first attribute filter is that of forecasting range. In the industry of fashion, jewelry, and design, customers expect to see new collections every year if not every season. For this reason, the short and medium forecasting range options are selected. The forecasting methods that mutually satisfy these criteria are selected. Thereafter, the attribute of lead time is applied. The methods recommended with short lead time are Cross-Impact Analysis and Decision Analysis, while those with longer lead times are Design Trends Analysis and Nystrom’s Framework. While methods like Cross-Impact Analysis and Decision Analysis enable the forecaster to predict based on existing data, they also aid the process of business decisions towards preferability. The methods of Design Trends Analysis and Nystrom’s Framework are particularly geared towards finding a preferred creative direction. The forecaster’s findings, in this case, can help frame the storyboard or design brief.
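The sequential filtering described in this chapter can be sketched as a simple attribute lookup. The following Python sketch is illustrative only: the attribute values assigned to each method are assumptions made for demonstration, not a verbatim transcription of the framework’s classification tables.

```python
# Minimal sketch of the forecasting framework's attribute-based filtering.
# NOTE: the attribute values below are illustrative assumptions, not the
# dissertation's authoritative classification of each method.

METHODS = {
    "Cross-Impact Analysis":  {"nature": {"quantitative", "business"},
                               "needs_past_data": True,  "lead_time": "short",
                               "range": {"short", "medium"}},
    "Decision Analysis":      {"nature": {"business"},
                               "needs_past_data": True,  "lead_time": "short",
                               "range": {"short", "medium"}},
    "Design Trends Analysis": {"nature": {"creative"},
                               "needs_past_data": True,  "lead_time": "long",
                               "range": {"short", "medium", "long"}},
    "Nystrom's Framework":    {"nature": {"creative"},
                               "needs_past_data": True,  "lead_time": "long",
                               "range": {"short", "medium"}},
    "Ethnography":            {"nature": {"creative"},
                               "needs_past_data": False, "lead_time": "long",
                               "range": {"short", "medium"}},
}

def recommend(nature, past_data_available, desired_range):
    """Filter candidate methods by nature of forecast, data availability,
    and forecasting range, mirroring the step-by-step branching trees."""
    return sorted(
        name for name, attrs in METHODS.items()
        if nature in attrs["nature"]
        and (not attrs["needs_past_data"] or past_data_available)
        and desired_range in attrs["range"]
    )

# Scenario-1 style query: a business decision, past data available,
# medium forecasting range.
print(recommend("business", True, "medium"))
# → ['Cross-Impact Analysis', 'Decision Analysis']
```

A further filter on the `lead_time` attribute could then split the shortlist into short and long lead-time candidates, as the scenario walkthrough above does.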


[Fig. 5.5A diagram: the three forecasting needs (predict, prefer, paradigm shift) are filtered in parallel through their primary attributes – availability of past data/information, nature of forecast sought, and forecasting range – converging on Cross-Impact Analysis, Decision Analysis, Design Trends Analysis, and Nystrom’s Framework, which are then split by lead time (short: Cross-Impact Analysis, Decision Analysis; long: Design Trends Analysis, Nystrom’s Framework).]

Fig. 5.5A: Application of forecasting framework per scenario 1 for Tiffany & Co.

Another scenario that the company may want to pursue at this point could be to attain a long-term paradigm shift that establishes a unique position and identity for Tiffany & Co. once again. The paradigm shift could emerge from any realm – design, marketing, customer experience, e-commerce, or a different one. Running through the forecasting framework, the forecaster would have to consider the forecasting need for paradigm shift over a longer forecasting range. Fig. 5.5B exemplifies the selection of methods based on this scenario, identified as scenario 2. Selecting methods based on creative direction and business decision as the desired nature of forecast, the forecaster can move on to check the next attribute, which is the availability of past data. Given that the company has detailed records of the prior


collections as well as their commercial performance, the forecaster can filter methods based on the presence of this information. Applying the final attribute of lead time, the methods yielded are ones requiring longer lead times. But long lead times are in alignment with a company seeking a longer-term paradigm shift. The recommended methods include Environmental Scanning, Design Trends Analysis, and Morphological Analysis. The forecaster can thus approach the forecasting need for a paradigm shift methodically by pursuing any or all of the above methods, based on the availability of time and resources and other internal factors.

[Fig. 5.5B diagram: for each forecasting need (predict, prefer, paradigm shift) the candidate methods are filtered successively by forecasting range (short/medium/long), nature of forecast (creative direction, business decision, quantitative value), availability of past data/information, and lead time; for the long-range paradigm-shift branch, this yields Environmental Scanning, Design Trends Analysis, and Morphological Analysis.]

Fig. 5.5B: Application of forecasting framework per scenario 2 for Tiffany & Co.


Summary

This chapter brought together the case studies introduced in Chapter 3 and the forecasting framework synthesized in Chapter 4. Each case is approached from the perspective of a forecasting inquiry that could have altered the course of the unfavorable outcomes that ensued. The analysis of the case studies in Chapter 3 has been used as the basis for determining the forecasting need and, subsequently, the type of forecasting framework. In the absence of a firsthand narrative account of the exact happenings in each case, a set of assumptions had to be made to implement the framework. Additionally, the contemporary case study of Tiffany & Co. shows application of the forecasting framework in current times. At the very least, application of the forecasting framework synthesized in response to the research question stated in Chapter 1 does successfully offer recommendations of forecasting methods for each case. For the case situated in the fashion and apparel industry, the framework suggests a combination of Design Thinking and Surveys. For the case of Encyclopedia Britannica, the framework returned Cross-Impact Analysis and Decision Analysis as methods to aid timely decision making. In the case of the Pontiac Aztec, where a successful product could have salvaged a dwindling brand, the framework recommends an Ethnography study if no prior data is available. If prior data is available, or if data is collected as part of the forecasting research, the methods of Environmental Scanning, Design Trends Analysis, and Morphological Analysis, or a strategic combination of these, would have been most valuable in providing Pontiac a strategic, long-term creative direction.


CHAPTER 6

CONCLUSION

This dissertation is focused on the field of forecasting research informing holistic design strategy in the area of consumer goods. From the large existing variety of interpretations of forecasting practices, this thesis emphasizes the application of forecasting methods that can enable design processes to create solutions which are well timed and relevant to consumer needs and preferences. The dissertation is addressed to the larger community of designers, design researchers, forecasters, trend strategists, decision makers, producers, innovation strategists, and anybody who is interested and involved in the creation of new products and solutions that will touch human society.

The gap in research was a lack of perspective on understanding the needs and contexts for leveraging forecasting studies, and on how different forecasting methods could fulfill such needs. Forecasting methods are often discussed within the frame of reference of a specific situation, within a specific field. As discussed in Chapter 2, the research around forecasting methods usually involves the syntax of using forecasting method ‘x’ to forecast ‘y’. There is also incremental research work going on within the realm of different forecasting methods. Researchers inherently use forecasting methods prevalent and situated within their fields. In the context of design and design research, forecasting is not always considered the first step, nor is there an established go-to method. Due to such issues, it is often left up to designers to somehow understand the premise of the design challenge and articulate their own design brief, or struggle with a vaguely defined one. As designers proceed through the process of design development, internal and external biases may act as detractors. This thesis not only emphasizes the importance and role


of forecasting in the design process, it also addresses the aforementioned issues. Addressing the question of how appropriate forecasting methods can be applied to design needs, for the overall benefit of the design process, this thesis poses the key research question – “How can modular orchestration of forecasting methods enable their strategic selection to constructively inform design processes?” The forecasting framework synthesized in Chapter 4 aims to answer this main research question. Chapter 4 answers Q 2 (What attributes facilitate selection of forecasting methods towards informing the design process?) by identifying the operational attributes that determine the selection of forecasting methods. Further, Chapter 4 explains the different needs for forecasting and how they lend themselves to the selection of forecasting methods (Q 4 – What warrants the need of forecasting studies to be the starting point of a design process?). The forecasting framework brings together the forecasting needs, methods, and attributes to address Q 1 (How does a trend forecaster select from myriad different forecasting methods?), and to further illustrate Q 3 (What determines the order or combination of forecasting methods towards a design process?), Fig. 4.14A, B and C, and Fig. 4.16 show the selection of methods in a systematic manner through branching trees.

The forecasting framework provides a systematic approach to selecting the appropriate method for the specific design challenge. The dissertation delves into the functional orientation and aptitude of different forecasting methods in design research. Pertinent methods have been studied and grouped as methods of observation, estimation, and intervention. Forecasters and designers collaborating to create designs vetted by forecasting research also addresses the issue of personal biases influencing creative decisions. Designers have the responsibility to create aesthetic, innovative, yet ethical solutions. The forecasting framework guides the design


development and curbs the chances of personal biases towards design selection that other stakeholders may have, as the methods themselves are grounded in research methodologies. Further, three different case studies are used to illustrate the gap that the absence of forecasting research can create.

The cases indicated common factors which determine operational caveats for the application of forecasting methods. These operational factors are recognized as attributes for the selection of forecasting methods. In most real-world scenarios, the ‘cost’ of conducting a study is an important operational attribute in determining the forecasting method. The cost factor brings with it a great degree of bias towards the selection process. The cost attribute has not been included in this discussion in order to keep the selection framework focused primarily on the functional benefits of the methods.

Guided by the forecasting needs, the organization of forecasting methods and attributes makes the orchestration of forecasting methods possible. Application of the forecasting framework to the case studies, retrospectively and under certain assumptions, has been illustrated through the step-by-step branching trees in Fig. 5.1, Fig. 5.2 and Fig. 5.3. These assumptions were necessary given the information that is currently available about the cases. With better or more information about the three companies, the application as well as the results obtained could have been different. However, Chapter 5 indicates how a similar exercise could be carried out for a current case with more details informing the needs and attributes for the selection process. Comparing Fig. 4.15A and Fig. 5.1 illustrates how a forecaster would approach a situation where the forecasting need centered on a certain preferred future. Assuming that ‘preferability’ was the primary forecasting need in the case of Laura Ashley, Fig. 5.1 shows working through the forecasting framework by filtering methods through attributes. The attribute of nature of forecast is applied first to start the selection of methods.


[Fig. 4.15A diagram (reproduced): for the forecasting need ‘prefer’, methods are filtered by the nature of forecast sought (creative direction, business decision, quantitative value), the availability of past data/information, lead time, and forecasting range.]

Fig. 4.15A reproduced: Forecasting framework towards the forecasting need for ‘preferability’.

[Figure: the same branching tree, restricted to the ‘prefer’ branch of the forecasting need. The nature of forecast sought (creative direction or business decision), the availability of past data/information, the lead time (short or long), and the forecasting range (short, medium, or long) successively filter the candidate methods: environmental scanning, design trends analysis, Nystrom’s framework, ethnography, design thinking, morphological analysis, cross-impact analysis, survey, Delphi, and decision analysis.]

Fig. 5.1 reproduced: Manifestation of forecasting framework for the case of Laura Ashley.


The forecasting framework, however, is only a simple, initial foray into the field of forecasting for design. With newer forecasting methods being added to the list every year, the framework stands to benefit from the inclusion of new methods.

The forecasting framework presents a dynamic and scalable architecture that will improve with future iterations. It can be reviewed and revised as newer methods bring in new information about the selection of forecasting methods, thereby presenting the opportunity to refine the framework.
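The dynamic, scalable architecture described above can be sketched as a simple registry: each forecasting method is registered together with its attribute profile, and incorporating a newer method requires no change to the selection logic. The method names, the attribute keys, and the “AI-Assisted Scenario Mining” entry below are hypothetical illustrations, not methods discussed in this dissertation.

```python
# Hypothetical sketch: the framework as an extensible registry.
# Adding a newly developed forecasting method means registering its
# attribute profile; the selection logic itself stays unchanged.

registry = {}

def register(name, **attributes):
    """Add a forecasting method and its attribute profile."""
    registry[name] = attributes

def matching(**criteria):
    """Return methods whose profile satisfies every given criterion."""
    return sorted(
        name for name, attrs in registry.items()
        if all(attrs.get(key) == value for key, value in criteria.items())
    )

register("Delphi", need="prefer", past_data=False, lead_time="long")
register("Survey", need="prefer", past_data=True, lead_time="short")

# A hypothetical newly published method is added the same way:
register("AI-Assisted Scenario Mining", need="prefer",
         past_data=True, lead_time="short")

print(matching(need="prefer", past_data=True, lead_time="short"))
```

Because `matching` only compares against whatever attributes a profile declares, attributes can likewise be added, refined, or removed as the framework is iterated, mirroring the revision process described above.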

Another avenue for future work is extensive testing of the framework in current, real-world cases. Testing the framework through the cases discussed in Chapter 3 relies on assumptions: because those cases have already occurred and an exact narrative recounting all the details is not available, the framework proceeds without in-depth, first-hand information about the events. In testing the framework against current, real-world opportunities, the attributes can be further refined, added, or removed. By arguing for the incorporation of forecasting research into the overall design process, and by outlining a systematic way to approach forecasting methods based on specific contexts and design challenges, this dissertation also opens the scope for further conversations about the evolution of the forecasting landscape and how designers can leverage it. In the future, some forecasting methods might become redundant as more sophisticated methods take their place. While developments in artificial intelligence are rapidly reinterpreting the definition of forecasting, it is the mutual responsibility of forecasters and designers to leverage these advancements for the larger, ethical benefit of the environment and society.


Overall, it will be interesting to watch the relationship between forecasting research and design practice evolve with changes in both the forecasting and the design paradigms.

As explained at the beginning of this thesis, “Towards an orchestration of forecasting methods to devise strategies for design” comes from a place of personal motivation. Recounting my experience in the design industry, I have come across many instances where, due to the lack of proper forecasting research, design processes were derailed. I wrote this thesis with the aim of contributing a small starting point towards synergizing forecasting and design efforts, which in turn may lead to further opportunities to build more creative solutions that can be embraced by us and our times.

206

REFERENCES

Aaron, Mark L. 2007. “TIFFANY’S HOLIDAY SALES INCREASE 15%.” Tiffany & Co. January 2007. http://investor.tiffany.com/news-releases/news-release-details/tiffanys- holiday-sales-increase-15.

———. 2009. “Tiffany Reports 2008 Financial Results.” Tiffany & Co. March 23, 2009. http://investor.tiffany.com/news-releases/news-release-details/tiffany-reports-2008- financial-results.

———. 2018a. “Press Releases.” Tiffany & Co. 2018. http://investor.tiffany.com/press-releases.

———. 2018b. “Tiffany Reports Fiscal 2017 Results.” Tiffany & Co. 2018. http://investor.tiffany.com/news-releases/news-release-details/tiffany-reports-fiscal-2017- results. Aguilar, Francis Joseph. 1967. Scanning the Business Environment. Macmillan.

Akins, Carl, and George Beschner. 1980. “Ethnography: A Research Tool for Policymakers in the Drug and Alcohol Fields. Symposium Papers.”

Albright, Kendra S. 2004. “Environmental Scanning: Radar for Success.” Information Management 38 (3): 38.

Albright, Richard E. 2002. “What Can Past Technology Forecasts Tell Us about the Future?” Technological Forecasting and Social Change, TF Highlights from ISF 2001, 69 (5): 443–64. https://doi.org/10.1016/S0040-1625(02)00186-5.

Alexander, Christopher. 1977. A Pattern Language: Towns, Buildings, Construction. Oxford university press.

Álvarez, Asunción, and Tom Ritchey. 2015a. “Applications of General Morphological Analysis.” Acta Morph. Gen 4 (1).

———. 2015b. “Applications of General Morphological Analysis: From Engineering Design to Policy Analysis.” Acta Morphologica Generalis 4 (January).

Amatriain, Xavier, Alejandro Jaimes, Nuria Oliver, and Josep M. Pujol. 2011. “Data Mining Methods for Recommender Systems.” In Recommender Systems Handbook, 39–71. Springer.

Ames, Annette. 2008. “ for a Projected Future.” Clothing and Textiles Research Journal 26 (2): 103–118.

207

Anderson, Ken. 2009. “Ethnographic Research: A Key to Strategy.” Harvard Business Review. March 1, 2009.

Anderson, Roger N., Albert Boulanger, Leon L. Wu, Viabhav Bhandari, Somnath Sarkar, and Ashish Gagneja. 2015. Forecasting System Using Machine Learning and Ensemble Methods. Google Patents. https://www.google.com/patents/US20150317589.

Apté, Chidanand, and Sholom Weiss. 1997. “Data Mining with Decision Trees and Decision Rules.” Future Generation Computer Systems 13 (2–3): 197–210.

Aremu, Mukaila A., Rotimi A. Gbadeyan, and Moriam A. Aremu. 2016. “Environmental Factors And Strategic Marketing Planning In Nigerian Insurance Industry.” DBA Africa Management Review 6 (1).

Armstrong, J. Scott. 2011. “Illusions in Regression Analysis.”

Armstrong, Jon Scott. 1985. Long-Range Forecasting. Wiley New York ETC.

———. 2001. Principles of Forecasting: A Handbook for Researchers and Practitioners. Vol. 30. Springer Science & Business Media.

Asur, S., and B. A. Huberman. 2010. “Predicting the Future with Social Media.” In 2010 IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, 1:492–99. https://doi.org/10.1109/WI-IAT.2010.63.

Ayres, Robert U. 1969. “Technological Forecasting and Long-Range Planning.”

Ball, Linden J., and Thomas C. Ormerod. 2000. “Putting Ethnography to Work: The Case for a Cognitive Ethnography of Design.” International Journal of Human-Computer Studies 53 (1): 147–68. https://doi.org/10.1006/ijhc.2000.0372.

Banks, Danielle. 2017. “Cone of Uncertainty: Facts and Myths About This Tropical Forecasting Tool.” The Weather Channel. September 4, 2017. https://weather.com/science/weather- explainers/news/tropical-storm-cyclone-forecast-cone-hurricane.

Bañuls, Víctor A., and Murray Turoff. 2011. “Scenario Construction via Delphi and Cross- Impact Analysis.” Technological Forecasting and Social Change 78 (9): 1579–1602.

Baudelaire, Charles. 1964. “The Painter of Modern Life.” The Painter of Modern Life and Other Essays 2.

BAUER, PF. 1984. Supermanaging-How To Harness Change For Personal And Organizational Success-Brown, A, Weiner, E. Amer College Personnel Assn C/O Richard Caple Counseling Ctr-220 Parker Hall Univ Missouri, Columbia, MO 65211.

208

Bergstein, Rachelle. 2017. “Female Self-Purchasing Isn’t Just A Jewelry Industry Pipedream.” Forbes. August 9, 2017. https://www.forbes.com/sites/rachellebergstein/2017/08/09/female-self-purchasing-isnt- just-a-jewelry-industry-pipedream/.

Bijker, Wiebe E. 1995. “Of Bicycles, Bakelites, and Bulbs: Toward a Theory of Sociotechnical Change. 1995.” MIT Press, Cambrigde) Contact: Hanne Lindegaard Department of Management Engineering Technical University of Denmark Produktionstorvet, Building 424: 2800.

Bishop, Christopher M. 2006. “Pattern Recognition.” Machine Learning 128: 1–58.

Bishop, Peter, Andy Hines, and Terry Collins. 2007. “The Current State of Scenario Development: An Overview of Techniques.” Foresight 9 (1): 5–25.

Blaszczyk, Regina Lee. 2006. “The Importance of Being True Blue: The Du Pont Company and the Color Revolution.” In Cultures of Commerce, 27–50. Palgrave Macmillan, New York. https://doi.org/10.1007/978-1-137-07182-8_3.

———. 2012. The Color Revolution. MIT Press.

Books.google.com. 2017a. “Google Ngram Viewer_trends.” 2017. https://books.google.com/ngrams/graph?content=trends&year_start=1800&year_end=20 00&corpus=15&smoothing=3&share=&direct_url=t1%3B%2Ctrends%3B%2Cc0.

———. 2017b. “Google Ngram Viewer_trends Forecasting.” 2017. https://books.google.com/ngrams/graph?content=trends+forecasting&year_start=1800&y ear_end=2000&corpus=15&smoothing=3&share=&direct_url=t1%3B%2Ctrends%20for ecasting%3B%2Cc0.

———. 2017c. “Google Ngram Viewer_trends, Trends Forecasting.” 2017. https://books.google.com/ngrams/graph?content=trends%2C+trends+forecasting&year_st art=1800&year_end=2000&corpus=15&smoothing=3&share=&direct_url=t1%3B%2Ctr ends%3B%2Cc0%3B.t1%3B%2Ctrends%20forecasting%3B%2Cc0.

Bose, Indranil, and Radha K. Mahapatra. 2001. “Business Data Mining—a Machine Learning Perspective.” Information & Management 39 (3): 211–225.

Box, George Edward P., and Gwilym M. Jenkins. 1970. “Time Series Analysis: Forecasting and Control, 1976.” ISBN: 0-8162-1104-3.

Brannon, Evelyn L. 2005. Fashion Forecasting. Fairchild Books.

209

Brannon, Evelyn L., and Lorynn R. Divita. 2015. Fashion Forecasting: Studio Instant Access. Bloomsbury Publishing USA.

Brassard, Gilles. 2003. Advances in Cryptology-CRYPTO’89: Proceedings. Vol. 435. Springer.

Britannica, Encyclop\a edia. 1965. “Encyclopaedia Britannica.” Inc., Chicago 111: 720.

Brito, Anderson J., and Adiel Teixeira de Almeida. 2009. “Multi-Attribute Risk Assessment for Risk Ranking of Natural Gas Pipelines.” Reliability Engineering & System Safety 94 (2): 187–198.

Brown, Elspeth H., Catherine Gudis, and Marina Moskowitz. 2006. Cultures of Commerce: Representation and American Business Culture, 1877-1960. Springer.

Brown, Matthew J. 2009. “Models and Perspectives on Stage: Remarks on Giere’s Scientific Perspectivism.” Studies in History and Philosophy of Science Part A 40 (2): 213–220.

Brown, Tim. 2009. “Change by Design.”

Brown, Tim, and Jocelyn Wyatt. 2010. “Design Thinking for Social Innovation IDEO.” Development Outreach 12 (1): 29–31.

Buchanan, Richard. 1992. “Wicked Problems in Design Thinking.” Design Issues 8 (2): 5–21.

Burnham, Kenneth P., and David R. Anderson. 2003. Model Selection and Multimodel Inference: A Practical Information-Theoretic Approach. Springer Science & Business Media.

Burton, John. 1969. Controlled Communication in the Resolution of Conflict. American Political Science Association.

Business Dictionary. 2018. “What Are Consumer Trends? Definition and Meaning.” BusinessDictionary.Com. 2018. http://www.businessdictionary.com/definition/consumer- trends.html.

Cannon, Thomas, and Dr Ronald W. Hasty. 1978. “Identifying and Defining Consumer Needs Using Human Factors and Market Research Techniques.” ACR North American Advances NA-05.

Caprara, Andrea, and Lucyla Paes Landim. 2008. “Ethnography: Its Uses, Potentials and Limits within Health Research.” Interface - Comunicação, Saúde, Educação 12 (25): 363–76. https://doi.org/10.1590/S1414-32832008000200011.

210

Centola, Damon, and Michael Macy. 2007. “Complex Contagions and the Weakness of Long Ties.” American Journal of Sociology 113 (3): 702–34. https://doi.org/10.1086/521848.

Chambers, John C., Satinder K. Mullick, and Donald D. Smith. 1971a. “How to Choose Right Forecasting Technique.” Harvard Business Review 49 (4): 45.

———. 1971b. “How to Choose the Right Forecasting Technique.” Harvard Business Review. July 1, 1971. https://hbr.org/1971/07/how-to-choose-the-right-forecasting-technique.

“Charles Tiffany.” 2014. Biography. April 2, 2014. https://www.biography.com/people/charles- tiffany-9507386.

Chatfield, Chris. 2016. The Analysis of Time Series: An Introduction, Sixth Edition. CRC Press.

Checkland, Peter. 1999. “Systems Thinking.” Rethinking Management Information Systems, 45– 56. Chevron. 2010. How Chevron Makes Decisions. https://www.youtube.com/watch?v=JRCxZA6ay3M.

Choi, Hyunyoung, and Hal Varian. 2012a. “Predicting the Present with Google Trends.” Economic Record 88 (s1): 2–9.

———. 2012b. “Predicting the Present with Google Trends: PREDICTING THE PRESENT WITH GOOGLE TRENDS.” Economic Record 88 (June): 2–9. https://doi.org/10.1111/j.1475-4932.2012.00809.x.

Clark, Kim B., and Takahiro Fujimoto. 1989. “Lead Time in Automobile Product Development Explaining the Japanese Advantage.” Journal of Engineering and Technology Management 6 (1): 25–58.

Cols, Juan. 2016. “The Constituent Elements of the Paradigm for Electric Power Transmission and Its Emergence from the Perspective of Nikola Tesla,” May.

Connor, Jerome T., R. Douglas Martin, and Les E. Atlas. 1994. “Recurrent Neural Networks and Robust Time Series Prediction.” IEEE Transactions on Neural Networks 5 (2): 240–254.

Copeland, Rita. 1995. Rhetoric, Hermeneutics, and Translation in the Middle Ages: Academic Traditions and Vernacular Texts. Vol. 11. Cambridge University Press.

Cord, David J. 2014. The Decline and Fall of Nokia. Schildt & Söderström.

Covelllo, Mike. 2003. Standard Catalog of Ferrari 1947-2003. Krause Publications.

Coyle, Geoff. 2004. Practical Strategy: Structured Tools and Techniques. Pearson Education.

211

Criscitelli, Theresa, and Walter Goodwin. 2017. “Applying Human-Centered Design Thinking to Enhance Safety in the OR.” AORN Journal 105 (4): 408–412.

Cronje, Johannes C. 2016. “The Future of Our Field-A STEEP Perspective.” TechTrends 60 (1): 5.

Cross, Nigel. 2001. “Designerly Ways of Knowing: Design Discipline versus Design Science.” Design Issues 17 (3): 49–55.

Cruz, Joseph A., and David S. Wishart. 2006. “Applications of Machine Learning in Cancer Prediction and Prognosis.” Cancer Informatics 2. http://search.proquest.com/openview/2d3564860fc6664a5f40ec196594c0f5/1?pq- origsite=gscholar&cbl=1026410.

Dam, Rikke, and Teo Siang. 2017. “Design Thinking: Get a Quick Overview of the History.” The Interaction Design Foundation. 2017. https://www.interaction- design.org/literature/article/design-thinking-get-a-quick-overview-of-the-history.

De Beers. 2016. “2016 Diamond Industry Outlook.” De Beers Group Corporate Website. 2016. http://www.debeersgroup.com/en/reports/insight/insight-reports/insight-report- 2016/outlook.html.

Degnegaard, Rex. 2014. “Co-Creation, Prevailing Streams and a Future Design Trajectory.” CoDesign 10 (2): 96–111. https://doi.org/10.1080/15710882.2014.903282.

DeJong, Gerald, and Raymond Mooney. 1986. “Explanation-Based Learning: An Alternative View.” Machine Learning 1 (2): 145–176.

DeWalt, Kathleen M., and Billie R. DeWalt. 2011. Participant Observation: A Guide for Fieldworkers. Rowman Altamira.

Diane, Tracy, and Tom Cassidy. 2009. Colour Forecasting. John Wiley & Sons.

Direction, Strategic. 2007. “Nokia: Big and Clever: How Large Firms Can Become Nimble.” Strategic Direction 23 (7): 14–16.

Dubbels, Brock. 2014. “Cognitive Ethnography: A Method for Design, Measure, and Analysis of Game Studies, Multimedia Learning for Design, Academic Performance, Leisure Studies, and Professional Development,” June.

Dunne, David, and Roger Martin. 2006. “Design Thinking and How It Will Change Management Education: An Interview and Discussion.” Academy of Management Learning & Education 5 (4): 512–523.

212

Durst, Carolin, Michael Durst, Thomas Kolonko, Andreas Neef, and Florian Greif. 2015. “A Holistic Approach to Strategic Foresight: A Foresight Support System for the German Federal Armed Forces.” Technological Forecasting and Social Change 97: 91–104.

Edelkoort, L. 1997. “The Story and Meaning of Color.” AXIS-TOKYO-, 022–027. ———. 1999. “The Theories Behind Colour Forecasting.” In Seminar at the Briggait Centre, Glasgow.

Eigner, Fabienne, Aniket Kate, Matteo Maffei, and Francesca Pampaloni. 2015. “Achieving Optimal Utility for Distributed Differential Privacy Using Secure Multiparty Computation.” Applications of Secure Multiparty Computation 13 (81).

Emerson, Robert M., Rachel I. Fretz, and Linda L. Shaw. 2001. “Participant Observation and Fieldnotes.” Handbook of Ethnography, 352–368.

Eriksson, Päivi, and Anne Kovalainen. 2015. Qualitative Methods in Business Research: A Practical Guide to Social Research. Sage.

Erilli, N. Alp, and Kamil Alakus. 2014. “Non-Parametric Regression Estimation for Data with Equal Values.” European Scientific Journal 10 (4).

Evans, James A., and Pedro Aceves. 2016. “Machine Translation: Mining Text for Social Theory.” Annual Review of Sociology 42: 21–50.

Evans, Martyn. 2011. “Empathizing with the Future: Creating Next-Next Generation Products and Services.” The Design Journal 14 (2): 231–51. https://doi.org/10.2752/175630611X12984592780087.

Evans, W. Douglas. 2016. “Lessons Learned and Future Social Marketing Research.” Social Marketing Research for Global Public Health: Methods and Technologies, 273.

Evered, Roger D. 1977. “Interest in the Future: A Search for Useable Measures.” Futures 9 (4): 285–302.

Eymard, Joë. 1977. “A Markovian Cross-Impact Model.” Futures 9 (3): 216–228.

Fahey, Liam, William R. King, and Vadake K. Narayanan. 1981. “Environmental Scanning and Forecasting in Strategic Planning—the State of the Art.” Long Range Planning 14 (1): 32–39.

Fann, Kuang Tih. 2012. Peirce’s Theory of Abduction. Springer Science & Business Media.

Ferber, Robert, Paul Sheatsley, Anthony Turner, and Joseph Waksberg. 1980. What Is a Survey? EMBRAPA-DMQ.

213

FirstHive, Team. 2017. “How Big Data Is Shifting The Marketing Paradigm!” FirstHive Marketing Technology Blog (blog). June 8, 2017. https://firsthive.com/blog/index.php/2017/06/08/how-big-data-is-shifting-the-marketing- paradigm/.

Fisher, Ronald Aylmer. 1925. “Theory of Statistical Estimation.” In Mathematical Proceedings of the Cambridge Philosophical Society, 22:700–725. Cambridge Univ Press.

Fontela, Emilio, and José M. Rueda-Cantuche. 2004. “Linking Cross-Impact Probabilistic Scenarios to Input-Output Models.” In First EU–US Seville Seminar on Future-Oriented Technology Analysis (FTA)–May.

Fortenbaugh, William W. 1998. “Cicero, On Invention 1.51–77 Hypothetical Syllogistic and the Early Peripatetics.” Rhetorica: A Journal of the History of Rhetoric 16 (1): 25–42.

Freedman, David. 1997. “From Association to Causation via Regression.” Advances in Applied Mathematics 18 (1): 59–110.

Fulford, Robert. 1990. “Globe and Mail (June 5, 1999),” June 5, 1990.

Galton, Francis. 1886. “Regression towards Mediocrity in Hereditary Stature.” The Journal of the Anthropological Institute of Great Britain and Ireland 15: 246–263.

Gans, Joshua. 2016. The Disruption Dilemma. MIT Press.

Gatsi, John Gartchie. 2016. Introduction to Quantitative Methods in Business. Xlibris Corporation.

General Electric. 2015. “Ecomagination.” July 1, 2015. https://www.ge.com/about- us/ecomagination.

Gentry, Craig, Shai Halevi, and Vinod Vaikuntanathan. 2010. “Fully Homomorphic Encryption over the Integers.” In Annual International Conference on the Theory and Applications of Cryptographic Techniques, 24–43. Springer.

Gibbons, Sarah. 2016. “Design Thinking 101.” Nielsen Norman Group. July 31, 2016. https://www.nngroup.com/articles/design-thinking/.

Gibney, B. Y. 2016. “Google Masters Go.”

Glenn, Jerome C. 2003. “Introduction to the Futures Research Methods Series.” Future Research Methodology, Version 2.

214

Glenn, Jerome C., and Theodore J. Gordon. 2009. Futures Research Methodology-Version 3-0. Editorial desconocida.

Godet, M. 1993. From Anticipation to Action, A Handbook of Strategic Prospective. 1993. UNESCO Publishing, Paris.

Godet, Michel, Régine MONTI, Francis MEUNIER, and Fabrice ROUBELAT. 2004. “Scenarios and Strategies.” A Toolbox For.

Gordon, Theodore, and David Greenspan. 1994. “The Management of Chaotic Systems.” Technological Forecasting and Social Change 47 (1): 49–62.

Gordon, Theodore J., and Howard Hayward. 1968. “Initial Experiments with the Cross Impact Matrix Method of Forecasting.” Futures 1 (2): 100–116.

Gordon, Theodore J., and Olaf Helmer. 1964. Report on a Long-Range Forecasting Study. Rand Corporation Santa Monica, CA.

Gordon, Theodore Jay. 1992. “The Methods of Futures Research.” The Annals of the American Academy of Political and Social Science, 25–35.

Gordon, Theodore, and Adam Pease. 2006. “RT Delphi: An Efficient,‘Round-Less’ Almost Real Time Delphi Method.” Technological Forecasting and Social Change 73 (4): 321–333.

Granger, Clive W. 1986. “Developments in the Study of Cointegrated Economic Variables.” Oxford Bulletin of Economics and Statistics 48 (3): 213–228.

Graves, Alex, Marcus Liwicki, Santiago Fernández, Roman Bertolami, Horst Bunke, and Jürgen Schmidhuber. 2009. “A Novel Connectionist System for Unconstrained Handwriting Recognition.” IEEE Transactions on Pattern Analysis and Machine Intelligence 31 (5): 855–868.

Greenstein, S. 2005. “The Anatomy of Foresight Traps [Foresight Management].” IEEE Micro 25 (3): 10–12. https://doi.org/10.1109/MM.2005.59.

Greenstein, Shane, and Michelle Devereux. 2017. “The Crisis at Encyclop\a Edia Britannica.” Kellogg School of Management Cases, 1–18.

Gryphon, Stephan, Philippe Kruchten, Steve McConnell, and Todd Little. 2006. “The Cone of Uncertainty.” IEEE Software 23 (5): 8–10.

Guerard Jr, John B. 2013. “Regression Analysis and Forecasting Models.” In Introduction to Financial Forecasting in Investment Analysis, 19–45. Springer.

215

Guh, R.-S., F. Zorriassatine, J. D. T. Tannock, and C. O’Brien. 1999. “On-Line Control Chart Pattern Detection and Discrimination—a Neural Network Approach.” Artificial Intelligence in Engineering 13 (4): 413–425.

Gujarati, Damoder N. 2009. Basic Econometrics. Tata McGraw-Hill Education. Haines, Stephen. 2016. The Systems Thinking Approach to Strategic Planning and Management. CRC Press.

Hall, Alan. 2012. “What Every Business Can Learn From Apple: Establish A Winning Culture.” Forbes. June 27, 2012. https://www.forbes.com/sites/alanhall/2012/06/27/what-every- business-can-learn-from-apple-establish-a-winning-culture/.

Hardle, Wolfgang, and Enno Mammen. 1993. “Comparing Nonparametric versus Parametric Regression Fits.” The Annals of Statistics, 1926–1947.

Haupt, Michael. 2018. “Why the Facebook/Cambridge Analytica Data Scandal Is Awesome News.” Hacker Noon. March 25, 2018. https://hackernoon.com/facebook-data-scandal- 50eedc7762b6.

Helmer, Olaf. 1967. “Analysis of the Future: The Delphi Method.” DTIC Document.

———. 1977. “Problems in Futures Research: Delphi and Causal Cross-Impact Analysis.” Futures 9 (1): 17–31.

Helmer, Olaf, Bernice Brown, and Theodore Gordon. 1966. Social Technology. Vol. 9. Basic Books New York.

Helmer, Olaf, and Nicholas Rescher. 1959. “On the Epistemology of the Inexact Sciences.” Management Science 6 (1): 25–52.

Higham, William. 2009. The next Big Thing: Spotting and Forecasting Consumer Trends for Profit. Kogan Page Publishers.

Hinton, Geoffrey, Li Deng, Dong Yu, George E. Dahl, Abdel-rahman Mohamed, Navdeep Jaitly, Andrew Senior, et al. 2012. “Deep Neural Networks for Acoustic Modeling in Speech Recognition: The Shared Views of Four Research Groups.” IEEE Signal Processing Magazine 29 (6): 82–97.

Hinton, Geoffrey E., Simon Osindero, and Yee-Whye Teh. 2006. “A Fast Learning Algorithm for Deep Belief Nets.” Neural Computation 18 (7): 1527–1554.

Hiray, Jagdish. 2011. Time-Series Methods of Forecasting.

Hodges, Andrew. 2012. Alan Turing: The Enigma. Random House.

216

Hoey, Brian A. 2014. “A Simple Introduction to the Practice of Ethnography and Guide to Ethnographic Fieldnotes.” Marshall University Digital Scholar 2014: 1–10.

Holmstrom, Darwin, and David Newhardt. 2011. GTO: Pontiac’s Great One. Motorbooks.

Hutchins, Edwin. 2000. “Distributed Cognition.” International Encyclopedia of the Social and Behavioral Sciences. Elsevier Science.

Hymes, Dell H. 1977. “What Is Ethnography?”

Hyndman, Rob J., and George Athanasopoulos. 2014. Forecasting: Principles and Practice. OTexts.

ISEG Marketing & Communication School. 2016. Darketing S03E09 - « TREND UNION 2012 » Avec Lidewij Edelkoort. https://www.youtube.com/watch?v=AgalvsMjuRE.

Jarne, Gloria, Julio Sánchez-Chóliz, and Francisco Fatás-Villafranca. 2005. “‘ S-Shaped’ Economic Dynamics. The Logistic and Gompertz Curves Generalized.” Electronic Journal of Evolutionary Modeling & Economic Dynamics, no. 3.

Johnson, Michael D., and Randolph E. Kirchain. 2011. “The Importance of Product Development Cycle Time and Cost in the Development of Product Families.” Journal of Engineering Design 22 (2): 87–112.

Jones, Graham A., Cynthia W. Langrall, Carol A. Thornton, Edward S. Mooney, Arsalan Wares, Marion R. Jones, Bob Perry, Ian J. Putt, and Steven Nisbet. 2001. “Using Students’ Statistical Thinking to Inform Instruction.” The Journal of Mathematical Behavior 20 (1): 109–144.

Jones, Nicola. 2014. “The Learning Machines.” Nature 505 (7482): 146.

Jordan, M. I., and T. M. Mitchell. 2015. “Machine Learning: Trends, Perspectives, and Prospects.” Science 349 (6245): 255–60. https://doi.org/10.1126/science.aaa8415.

Kahn, Herman, and Anthony J. Wiener. 1967. “The next Thirty-Three Years: A Framework for Speculation.” Daedalus, 705–732.

Kahneman, Daniel, and Amos Tversky. 1979. “Prospect Theory: An Analysis of Decision under Risk.” Econometrica: Journal of the Econometric Society, 263–291.

Kalmár, Marcus, and Joel Nilsson. 2016. “The Art of Forecasting–an Analysis of Predictive Precision of Machine Learning Models.”

217

Kashimura, Kaori, Yujin Tsukada, Takafumi Kawasaki, Hiroki Kitagawa, and Yukinobu Maruyama. 2014. “Design Approach Based on Social Science for Social Innovation Business.” Hitachi Review 63 (9): 548.

Kelley, Truman Lee. 1947. Fundamentals of Statistics. Harvard University Press.

Kestel, Sevtap. 2013. “Time Series Analysis.” https://www.empiwifo.uni-freiburg.de/lehre- teaching-1/summer-term-13/Material%20Time%20Series%20Analysis/classicalts.

Khodyakov, Dmitry, Sean Grant, Claire E. Barber, Deborah A. Marshall, John M. Esdaile, and Diane Lacaille. 2016. “Acceptability of an Online Modified Delphi Panel Approach for Developing Health Services Performance Measures.” Product Page. 2016. http://www.rand.org/pubs/external_publications/EP66653.html.

Kim, Eundeok, Ann Marie Fiore, and Hyejeong Kim. 2013. Fashion Trends: Analysis and Forecasting. Berg.

Kim, Jai-Ok, Sandra Forsythe, Qingliang Gu, and Sook Jae Moon. 2002. “Cross-Cultural Consumer Values, Needs and Purchase Behavior.” Journal of Consumer Marketing 19 (6): 481–502.

Kim, Kyoung-jae. 2003. “Financial Time Series Forecasting Using Support Vector Machines.” Neurocomputing 55 (1): 307–319.

Kim, W. Chan, and Renée Mauborgne. 2004. “Blue Ocean Strategy.” If You Read Nothing Else on Strategy, Read Thesebest-Selling Articles., 71.

Kineman, John J., and K. Anil Kumar. 2007. “Primary Natural Relationship: Bateson, Rosen, and the Vedas.” Kybernetes 36 (7/8): 1055–69. https://doi.org/10.1108/03684920710777838.

Koh, Joyce Hwee Ling, Ching Sing Chai, Benjamin Wong, and Huang-Yao Hong. 2015. “Design Thinking and Education.” In Design Thinking for Education, 1–15. Springer.

Kononenko, Igor. 2001. “Machine Learning for Medical Diagnosis: History, State of the Art and Perspective.” Artificial Intelligence in Medicine 23 (1): 89–109.

Kosow, Hannah, and Robert Gabner. 2008. Methods of Future and Scenario Analysis: Overview, Assessment, and Selection Criteria. Vol. 39. Deutschland.

Kotler, Philip. 2009. Marketing Management: A South Asian Perspective. Pearson Education India.

218

Kozinets, Robert V. 2002. “The Field behind the Screen: Using Netnography for Marketing Research in Online Communities.” Journal of Marketing Research 39 (1): 61–72.

Krizhevsky, Alex, Ilya Sutskever, and Geoffrey E. Hinton. 2012. “ImageNet Classification with Deep Convolutional Neural Networks.” In Advances in Neural Information Processing Systems 25, edited by F. Pereira, C. J. C. Burges, L. Bottou, and K. Q. Weinberger, 1097– 1105. Curran Associates, Inc.

Kroon, Jacob. 1990. General Management. Pearson South Africa.

Kuhn, Thomas S. 1962. The Structure of Scientific Revolutions. Vol. 2. University of Chicago press Chicago.

Kuhn, Thomas S., and David Hawkins. 1963. “The Structure of Scientific Revolutions.” American Journal of Physics 31 (7): 554–555.

Kumar, Nirmalya, Lisa Scheer, and Philip Kotler. 2000. “From Market Driven to Market Driving.” European Management Journal 18 (2): 129–142.

Lamb, Vanessa Martins. 2011. “The 1950’s and the 1960’s and the American Woman: The Transition from the" Housewife" to the Feminist.”

Landeta, Jon. 2006. “Current Validity of the Delphi Method in Social Sciences.” Technological Forecasting and Social Change 73 (5): 467–482.

Lauchlan, Stuart. 2017. “Tiffany Sees E-Commerce as a Small, but Valuable Gem.” Diginomica (blog). March 21, 2017. https://diginomica.com/2017/03/21/tiffany-sees-e-commerce- small-valuable-gem/.

Laura Ashley ecommerce website. 2018. “Laura Ashley_Our Heritage.” 2018. http://www.lauraashley.com/uk/about-laura-ashley/heritage/page/heritage.

LeCun, Yann, Yoshua Bengio, and Geoffrey Hinton. 2015a. “Deep Learning.” Nature 521 (7553): 436.

———. 2015b. “Deep Learning.” Nature 521 (7553): 436–444.

Lennon, Harry K., JD Pearse, and Michel Godet. 1979. The Crisis in Forecasting and the Emergence of the Prospective Approach: With Case Studies in Energy and Air Transport. Pergamon Press.

Leonard, Peter. 2014. “Mining Large Datasets for the Humanities.”

Lewis, Paul. 2018. “‘Utterly Horrifying’: Ex-Facebook Insider Says Covert Data Harvesting Was Routine.” The Guardian. March 20, 2018. http://www.theguardian.com/news/2018/mar/20/facebook-data-cambridge-analytica-sandy-parakilas.

Li, Jun. 2008. “Ethical Challenges in Participant Observation: A Reflection on Ethnographic Fieldwork.” The Qualitative Report 13 (1): 100–115.

“Lidewij Edelkoort.” 2016. http://www.edelkoort.com/lidewij-edelkoort/.

Lieber, Chavie. 2017. “The New Tiffany & Co. Needs Women.” Racked. February 5, 2017. https://www.racked.com/2017/2/5/14502980/tiffany-jewelry-self-purchasing-woman.

Lin, J. J., P. T. Sun, J. J.-R. Chen, L. J. Wang, H. C. Kuo, and W. G. Kuo. 2010. “Applying Gray Model to Predicting Trend of Textile Fashion Colors.” The Journal of The Textile Institute 101 (4): 360–68. https://doi.org/10.1080/00405000802435827.

Linkov, Igor, A. Varghese, S. Jamil, Thomas P. Seager, G. Kiker, and T. Bridges. 2004. “Multi-Criteria Decision Analysis: A Framework for Structuring Remedial Decisions at Contaminated Sites.” In Comparative Risk Assessment and Environmental Decision Making, 15–54. Springer.

Liu, Chen, William B. Rouse, and Zhongyuan Yu. 2015. “When Transformation Fails: Twelve Case Studies in the American Automobile Industry.” Journal of Enterprise Transformation 5 (2): 71–112.

Long-Range Forecasting and Planning: A Symposium Held at the U.S. Air Force Academy, Colorado, 16-17 August 1966. 1967. U.S. Government Printing Office.

Mackinney-Valentin, Maria. 2011. “DART-New Teaching Methods for Organizing Intuition.” Nordes, no. 4.

Makridakis, Spyros, Rob J. Hyndman, and Steven C. Wheelwright. 1998. Forecasting: Methods and Applications. John Wiley & Sons, Inc.

Makridakis, Spyros, Steven C. Wheelwright, and Rob J. Hyndman. 2008. Forecasting Methods and Applications. John Wiley & Sons.

Malhotra, Sugandh, Lalit K. Das, and V.M. Chariar. 2015. “Classification of Forecasting Methods with Respect to Their Structure.” DS79: Proceedings of The Third International Conference on Design Creativity, Indian Institute of Science, Bangalore.

Manchester, Herbert H., and Cheney Brothers. 1916. The Story of Silk & Cheney Silks. Cheney Brothers, Silk Manufacturers.

Marcuse, Herbert. 1964. One Dimensional Man: Studies in the Ideology of Advanced Industrial Society. Boston: Beacon Press.

Marcuse, Herbert. 1968. “Re-Examination of the Concept of Revolution.” Diogenes 16 (64): 17–26.

Market Trends for Selected Chemical Products ... and Prospects To ... 1960. United Nations.

Martin, Robert C. 2002. Agile Software Development: Principles, Patterns, and Practices. Prentice Hall.

Martin, Roger L. 2009. The Design of Business: Why Design Thinking Is the next Competitive Advantage. Harvard Business Press.

Masterman, Margaret. 1970. “The Nature of a Paradigm.” Criticism and the Growth of Knowledge, Cambridge, 59–89.

Mather, Hal F. 1986. “Design, Bills of Materials, and Forecasting: The Inseparable Threesome.” Production and Inventory Management Journal 27 (1): 90–107.

McCallum, R. Andrew. 1995. “Instance-Based Utile Distinctions for Reinforcement Learning with Hidden State.” In ICML, 387–395.

———. 1996. “Hidden State and Reinforcement Learning with Instance-Based State Identification.” IEEE Transactions on Systems, Man, and Cybernetics, Part B (Cybernetics) 26 (3): 464–473.

McGovern, James. 2017. “Enterprise Architecture and Scenario Planning.” Gartner. April 28, 2017. https://blogs.gartner.com/james-mcgovern/2017/04/28/enterprise-architecture-and-scenario-planning/.

McKim, Robert H. 1972. “Experiences in Visual Thinking.”

McNoldy, Brian. 2013. “Atlantic Hurricane Season 2013: What’s New and What Should We Expect?” Washington Post, May 17, 2013, sec. Capital Weather Gang. https://www.washingtonpost.com/news/capital-weather-gang/wp/2013/05/17/atlantic-hurricane-season-2013-whats-new-and-what-should-we-expect/.

Milhaud, Gaston. 2006. “Rational Science.” The Philosophical Forum 37 (1): 29–46. https://doi.org/10.1111/j.1467-9191.2006.00227.x.

Miller, Aaron. 2015. “Why The Pontiac Aztec Was The Biggest Failure In Automotive History.” Thrillist. July 15, 2015. https://www.thrillist.com/cars/the-pontiac-aztec-was-the-biggest-failure-in-automotive-history.

Miller, George A. 1956. “The Magical Number Seven, plus or Minus Two: Some Limits on Our Capacity for Processing Information.” Psychological Review 63 (2): 81.

Miller, Robert B., and James C. Hickman. 1973. “Time Series Analysis and Forecasting.” Transactions of Society of Actuaries 25 (73).

Mitchell, Melanie. 2009. Complexity: A Guided Tour. Oxford University Press.

Molitor, Graham T. T. 2003. “Molitor Forecasting Model: Key Dimensions for Plotting the ‘Patterns of Change’.” Journal of Futures Studies 8 (1): 61–72.

Morrison, James L., Ian Wilson, and H. Didsbury. 1996. “The Strategic Management Response to the Challenge of Global Change.” Future Vision: Ideas, Insights, and Strategies. Maryland, USA: World Future Society, 166–181.

Murphy, Ian. 2016. “GE Aviation to Lower Airline Costs.” Enterprise Times. November 17, 2016. https://www.enterprisetimes.co.uk/2016/11/17/ge-aviation-lower-airline-costs/.

Murphy, Victoria. 2004. “Romance Killer.” Forbes. November 29, 2004. forbes/2004/1129/097.

Naiman, Linda. 2016. “Design Thinking as a Strategy for Innovation.” Creativity at Work (blog). 2016. https://www.creativityatwork.com/design-thinking-strategy-for-innovation/.

Nasrabadi, Nasser M. 2007. “Pattern Recognition and Machine Learning.” Journal of Electronic Imaging 16 (4): 049901.

Nelson, Charles R., and Charles R. Plosser. 1982. “Trends and Random Walks in Macroeconomic Time Series: Some Evidence and Implications.” Journal of Monetary Economics 10 (2): 139–162.

Neuhart, John, Charles Eames, Ray Eames, and Marilyn Neuhart. 1989. Eames Design: The Work of the Office of Charles and Ray Eames. Harry N. Abrams.

Njeri, Rebecca. 2017. “What Is A Decision Tree Algorithm?” Decision Tree Algorithm (blog). September 3, 2017. https://medium.com/@SeattleDataGuy/what-is-a-decision-tree-algorithm-4531749d2a17.

Norman, Donald A., and Roberto Verganti. 2014. “Incremental and Radical Innovation: Design Research vs. Technology and Meaning Change.” Design Issues 30 (1): 78–96.

Nystrom, Paul Henry. 1928. “Economics of Fashion.”

Okoli, Chitu, and Suzanne D. Pawlowski. 2004a. “The Delphi Method as a Research Tool: An Example, Design Considerations and Applications.” Information & Management 42 (1): 15–29.

———. 2004b. “The Delphi Method as a Research Tool: An Example, Design Considerations and Applications.” Information & Management 42 (1): 15–29. https://doi.org/10.1016/j.im.2003.11.002.

Olden, Julian D., Joshua J. Lawler, and N. LeRoy Poff. 2008. “Machine Learning Methods without Tears: A Primer for Ecologists.” The Quarterly Review of Biology 83 (2): 171–193.

Oudshoorn, Nelly E. J., and Trevor Pinch. 2003. How Users Matter: The Co-Construction of Users and Technologies. MIT Press. https://research.utwente.nl/en/publications/how-users-matter-the-co-construction-of-users-and-technologies.

Oxman, Rivka. 2004. “Think-Maps: Teaching Design Thinking in Design Education.” Design Studies 25 (1): 63–91.

Paap, Jay, and Ralph Katz. 2004. “Anticipating Disruptive Innovation.” Research-Technology Management 47 (5): 13–22. https://doi.org/10.1080/08956308.2004.11671647.

Pearson, Karl, G. U. Yule, Norman Blanchard, and Alice Lee. 1903. “The Law of Ancestral Heredity.” Biometrika 2 (2): 211–236.

Person, Harlow S. 1922. “Shaping Your Management to Meet Developing Industrial Conditions.” Bulletin of The Taylor Society 7 (6): 211–217.

Pillkahn, Ulf. 2008. Using Trends and Scenarios as Tools for Strategy Development: Shaping the Future of Your Enterprise. John Wiley & Sons.

Plackett, Ronald L. 1950. “Some Theorems in Least Squares.” Biometrika 37 (1/2): 149–157.

Porter, Michael E. 1985. Competitive Advantage: Creating and Sustaining Superior Performance. New York: Free Press.

Porter, Michael E., and Mark R. Kramer. 2011. “The Big Idea: Creating Shared Value. How to Reinvent Capitalism—and Unleash a Wave of Innovation and Growth.” Harvard Business Review 89 (1–2).

Pouillard, Véronique. 2013a. “The Rise of Fashion Forecasting and Fashion Public Relations, 1920–1940: The History of Tobé and Bernays.” Globalizing Beauty: Consumerism and Body Aesthetics in the Twentieth Century, 151–69.

———. 2013b. “The Rise of Fashion Forecasting and Fashion Public Relations, 1920-1940s: The History of Tobé and Bernays.”

Prior, D., and Lucy M. Miller. 2010. “Webethnography: A Typology of Online Contexts and Consequent Research Implications.” In Proceedings of ANZMAC Conference.

Rams, Dieter. 1980. Dieter Rams: Ten Principles for Good Design. Retrieved January 16, 2014, from https://www.vitsoe.com/gb/about/good-design.

———. 1999. “Waste Not Want Not.” RSA Journal 148 (5491): 122–23.

RAND. 2017. “Delphi Method | RAND.” Delphi Method. 2017. https://www.rand.org/topics/delphi-method.html.

Rasmussen, Carl Edward. 2004. “Gaussian Processes in Machine Learning.” In Advanced Lectures on Machine Learning, 63–71. Springer.

Recknagel, Friedrich. 2001. “Applications of Machine Learning to Ecological Modelling.” Ecological Modelling 146 (1): 303–310.

Reeves, Scott, Ayelet Kuper, and Brian David Hodges. 2008. Qualitative Research: Qualitative Research Methodologies: Ethnography. Vol. 337. https://doi.org/10.1136/bmj.a1020.

Rhyne, Russell. 1974. “Technological Forecasting within Alternative Whole Futures Projections.” Technological Forecasting and Social Change 6: 133–162.

———. 1981. “Whole-Pattern Futures Projection, Using Field Anomaly Relaxation.” Technological Forecasting and Social Change 19 (4): 331–60. https://doi.org/10.1016/0040-1625(81)90005-6.

Rinallo, Diego, and Francesca Golfetto. 2006. “Representing Markets: The Shaping of Fashion Trends by French and Italian Fabric Companies.” Industrial Marketing Management, IMP 2005: Dealing with Dualities, 35 (7): 856–69. https://doi.org/10.1016/j.indmarman.2006.05.015.

Ritchey, Tom. 1998. “General Morphological Analysis.” In 16th Euro Conference on Operational Analysis.

———. 2011a. “General Morphological Analysis (GMA).” In Wicked Problems–Social Messes, 7–18. Springer. http://link.springer.com/10.1007/978-3-642-19653-9_2.

———. 2011b. “Modeling Alternative Futures with General Morphological Analysis.” World Future Review 3 (1): 83–94.

Rittel, Horst WJ, and Melvin M. Webber. 1973. “Dilemmas in a General Theory of Planning.” Policy Sciences 4 (2): 155–169.

Rochberg, Richard. 1970. The Use of Cross-Impact Matrices for Forecasting and Planning. Institute for the Future.

Rokach, Lior, and Oded Maimon. 2014. Data Mining with Decision Trees: Theory and Applications. World Scientific.

Roll, Martin. 2016. “The Secret of Zara’s Success: A Culture of Customer Co-Creation.” Martin Roll. 2016. https://martinroll.com/resources/articles/strategy/the-secret-of-zaras-success-a-culture-of-customer-co-creation/.

Rosen, Robert. 1985. “Anticipatory Systems: Philosophical, Mathematical & Methodological Foundations.”

———. 2012. “Anticipatory Systems.” In Anticipatory Systems, 313–370. Springer.

Rosenblatt, Frank. 1958. “The Perceptron: A Probabilistic Model for Information Storage and Organization in the Brain.” Psychological Review 65 (6): 386.

Roumiantseva, Anna. 2016. “The Fourth Way: Design Thinking Meets Futures Thinking.” October 19, 2016. https://www.linkedin.com/pulse/fourth-way-design-thinking-meets-futures-anna-roumiantseva.

Rowe, Peter G. 1991. Design Thinking. MIT press.

Russell, Stuart, and Peter Norvig. 2003. Artificial Intelligence: A Modern Approach. Prentice Hall.

Sak, Hasim, Andrew W. Senior, and Françoise Beaufays. 2014. “Long Short-Term Memory Recurrent Neural Network Architectures for Large Scale Acoustic Modeling.” In Interspeech, 338–342.

Saxby, Carl L., Kevin R. Parker, Philip S. Nitse, and Paul L. Dishman. 2002. “Environmental Scanning and Organizational Culture.” Marketing Intelligence & Planning 20 (1): 28–34.

Schich, Maximilian. 2015. “Figuring Out Art History.” ArXiv:1512.03301 [Physics, q-Bio], October. http://arxiv.org/abs/1512.03301.

Schlack, Julie Wittes. 2015. “Use Your Customers as Ethnographers.” Harvard Business Review. August 17, 2015. https://hbr.org/2015/08/use-your-customers-as-ethnographers.

Sherman, Lauren. 2014. “Meet Tiffany’s First Ever Female Design Director.” ELLE. August 26, 2014. https://www.elle.com/news/fashion-accessories/tiffanys-new-gem.

Shin, Meong Jin, and Tracy Diane Cassidy. 2015. “Designing a Fashion Driving Forces Website as an Educational Resource.” International Journal of Fashion Design, Technology and Education 8 (2): 173–83. https://doi.org/10.1080/17543266.2015.1045042.

Shmueli, Galit, and others. 2010. “To Explain or to Predict?” Statistical Science 25 (3): 289–310.

Sicard, Marie-Claude. 2013. “How Luxury Brands Work.” In Luxury, Lies and Marketing, 64–168. Springer.

Smit, George L., and L. Bruce Archer. 1973. “A Methodology for Consumer Design.” In Human Factors and Ergonomics Society Annual Meeting Proceedings, 17:105–110.

Sproles, George B. 1974. “Fashion Theory: A Conceptual Framework.” ACR North American Advances NA-01. http://acrwebsite.org/volumes/5731/volumes/v01/NA-01.

———. 1981. “Analyzing Fashion Life Cycles: Principles and Perspectives.” Journal of Marketing 45 (4): 116–24. https://doi.org/10.2307/1251479.

Stigler, Stephen M. 1989. “Francis Galton’s Account of the Invention of Correlation.” Statistical Science, 73–79.

Stiner, Scott. 2016. “How To Use Ethnographic Research To Help Your Business.” Forbes. June 1, 2016. https://www.forbes.com/sites/forbestechcouncil/2016/06/01/how-to-use-ethnographic-research-to-help-your-business/.

Stoller, Paul. 2015. “In Defense of Ethnography.” Huffington Post (blog). August 24, 2015. https://www.huffingtonpost.com/paul-stoller/in-defense-of-ethnography_b_8028542.html.

Studenmund, A. H. 2000. “Using Econometrics: A Practical Approach.”

Sull, Donald. 1999. “Why Good Companies Go Bad.” Harvard Business Review. July 1, 1999. https://hbr.org/1999/07/why-good-companies-go-bad.

Sullivan, Lisa M., Kimberly A. Dukes, and Elena Losina. 1999. “Tutorial in Biostatistics: An Introduction to Hierarchical Linear Modelling.”

Sun, Zhan-Li, Tsan-Ming Choi, Kin-Fan Au, and Yong Yu. 2008. “Sales Forecasting Using Extreme Learning Machine with Applications in Fashion Retailing.” Decision Support Systems 46 (1): 411–419.

Taigman, Yaniv, Ming Yang, Marc’Aurelio Ranzato, and Lior Wolf. 2014. “Deepface: Closing the Gap to Human-Level Performance in Face Verification.” In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, 1701–1708.

Takegami, Eizaburo, Takafumi Kawasaki, Hisako Okada, Masatoshi Takada, and Kazuaki Yamagata (Hitachi-GE Nuclear Energy, Ltd., Ibaraki, Japan). 2014. “Case Study of Management System Improvement for Plant Construction: User Needs Identification through Field Observation.” 2014 International Conference on Industrial Engineering and Operations Management, January.

Taleb, Nassim. 2007. “The Black Swan: The Impact of the Highly Improbable.” NY: Random House.

Tankard, Colin. 2016. “What the GDPR Means for Businesses.” Network Security 2016 (6): 5–8.

Tavitiyaman, Pimtong, Hanqin Qiu Zhang, Vincent T. Law, and Pearl MC Lin. 2016. “Exploring the Environmental Scanning of the Hotel Industry in China.” Journal of China Tourism Research, 1–18.

“Techcast.” 2017. Techcast Global (blog). 2017. https://www.techcastglobal.com/.

Tiffany & Co. 2018a. “Tiffany & Co. History.” About Tiffany & Co. 2018. http://press.tiffany.com/ViewBackgrounder.aspx?backgrounderId=33.

———. 2018b. “Tiffany Blue.” Tiffany.Com. 2018. http://press.tiffany.com/ViewBackgrounder.aspx?backgrounderId=6.

“TimesUp.” 2017. Time’s Up Now. 2017. https://www.timesupnow.com/.

Toffler, Alvin. 1970. Future Shock. Sydney: Pan.

Towle, H. Ledyard. 1941. “Art and Color in Body Design.” SAE Technical Paper 410040. Warrendale, PA. https://doi.org/10.4271/410040.

Trafalis, Theodore B., and Huseyin Ince. 2000. “Support Vector Machine for Regression and Applications to Financial Forecasting.” In Neural Networks, 2000. IJCNN 2000, Proceedings of the IEEE-INNS-ENNS International Joint Conference On, 6:348–353. IEEE.

Turing, Alan M. 1948. “Intelligent Machinery, a Heretical Theory.” The Turing Test: Verbal Behavior as the Hallmark of Intelligence 105.

Unger, Darian W., and Steven D. Eppinger. 2009. “Comparing Product Development Processes and Managing Risk.” International Journal of Product Development 8 (4): 382–402.

Uysal, Muzaffer. 2004. “Advancement in Computing: Implications for Tourism and Hospitality.” Scandinavian Journal of Hospitality and Tourism 4 (3): 208–224.

Venkatesh, Alladi. 1995. “Ethnoconsumerism: A New Paradigm to Study Cultural and Cross- Cultural Consumer Behavior.” In Chapter in Ja Costa and G. Bamossy (Eds.), Marketing in the Multicultural. Citeseer.

Vermeulen, Han F., Regna Darnell, and Stephen O. Murray. 2015. Before Boas: The Genesis of Ethnography and Ethnology in the German Enlightenment. University of Nebraska Press.

Vermeulen, Hendrik Frederik. 2008. “Early History of Ethnography and Ethnology in the German Enlightenment: Anthropological Discourse in Europe and Asia, 1710-1808.” Department of Cultural Anthropology and Development Sociology, Faculty of Social and Behavioural Sciences, Leiden University.

Vickers, Brent. 1992. “Using GDSS to Examine the Future European Automobile Industry.” Futures 24 (8): 789–812.

Von Hippel, Eric, Stefan Thomke, and Mary Sonnack. 1999. “Creating Breakthroughs at 3M.” Harvard Business Review 77: 47–57.

Voros, Joseph. 2009. “Morphological Prospection: Profiling the Shapes of Things to Come.” Foresight 11 (6): 4–20.

Walker, Gilbert. 1931. “On Periodicity in Series of Related Terms.” Proceedings of the Royal Society of London. Series A, Containing Papers of a Mathematical and Physical Character 131 (818): 518–532.

Wendel, Paul Joseph. 2008. “Models and Paradigms in Kuhn and Halloun.” Science & Education 17 (1): 131–141.

Weston, Danny. 2013. “Ethnography: When and How to Use It.” Spotless. October 10, 2013. https://www.spotless.co.uk/insights/ethnography-when-and-how/.

Winsor, Charles P. 1932. “The Gompertz Curve as a Growth Curve.” Proceedings of the National Academy of Sciences 18 (1): 1–8.

Wissema, Johan G. 1976. “Morphological Analysis: Its Application to a Company TF Investigation.” Futures 8 (2): 146–153.

Wold, Herman. 1939. A Study in the Analysis of Stationary Time Series. Uppsala: Almqvist & Wiksell.

Woudenberg, Fred. 1991. “An Evaluation of Delphi.” Technological Forecasting and Social Change 40 (2): 131–150.

Yule, G. Udny. 1897. “On the Theory of Correlation.” Journal of the Royal Statistical Society 60 (4): 812–854.

———. 1910. “On the Interpretation of Correlations between Indices or Ratios.” Journal of the Royal Statistical Society 73 (6/7): 644–47. https://doi.org/10.2307/2339906.

Zapała-Kraj, Marta. 2014. “Women of 1950s: The Truth behind White Picket Fence.”

Zarnowitz, Victor. 1994. “Business Cycles: Theory, History, Indicators, and Forecasting.”

Zimmer, Olivier, and Yarden Horwitz. 2015. “Fashion Trends for Spring 2015 as Told by Google Data.” Think with Google. March 2015.

Zuehlke, Jeffrey. 2006. Muscle Cars. Lerner Publications.

Zwicky, F. 1971. “22. Projections into the Future.” Symposium - International Astronomical Union 42 (January): 155–64. https://doi.org/10.1017/S0074180900097217.

Zwicky, Fritz. 1947. “Morphology and Nomenclature of Jet Engines.” Aeronautical Engineering Review 6 (6): 49–50.

———. 1948a. “Morphological Astronomy.” The Observatory 68: 121–143.

———. 1948b. The Morphological Method of Analysis and Construction. California inst. of technol.

———. 1967. “The Morphological Approach to Discovery, Invention, Research and Construction.” In New Methods of Thought and Procedure, 273–297. Springer.

———. 1969. “Discovery, Invention, Research through the Morphological Analysis.” McMillan, New York.

BIOGRAPHICAL SKETCH

Priyanka Sharma is a Trends Evolution Specialist with a primary focus on crafting the consumer experiences of the future. She partners with organizations and design teams on new product development, innovation projects, and macro-trends research.

After spending over a decade working as a Trend Strategist for multimillion-dollar brands and Fortune 500 companies, Priyanka believes in the power of the trends-forecasting practice to inform the design and development of new consumer products. She has worked across a range of consumer durables and consumer goods industries, including automotive, electronics, retail, apparel, and lifestyle goods.

Having started her career as a fashion designer, Priyanka showcased her work at fashion weeks during 2008–09. She holds a Master’s degree in Design Research from Central Saint Martin’s College of Art and Design, London, UK, and joined the Arts and Technology Ph.D. program at The University of Texas at Dallas in August 2014.

Priyanka currently serves as a Lead Online Marketing Manager for AT&T Mobility Services, supporting digital marketing efforts within the Business Marketing Organization. She focuses on enhancing the digital user experience by defining digital product taxonomy, navigation flows, omnichannel experience, and personalization.

CURRICULUM VITAE

Priyanka Sharma

Summary

I work at the intersection of technology and design to craft consumer experiences that are human-centered, proactive, and leading-edge. I integrate design thinking and actionable insights to achieve business growth. My key professional experience is in innovation strategy, trends forecasting, market-entry strategies, macro and micro trends, and the product development life cycle. I have a proven ability to convert qualitative and quantitative frameworks of data into actionable insights and business strategy. Having worked within a wide range of industries such as electronics, automotive, consumer goods, retail, and telecom, I feel confident switching gears quickly while keeping the focus on consumer experience.

Education

Ph.D. Candidate, 2016. School of Arts & Technology, The University of Texas at Dallas, TX, USA

MA Design Studies, 2010. Merit Awardee. Central Saint Martin’s College of Art and Design, University of the Arts London, London, UK

Bachelor of Design, 2008. Gold Medal Awardee (Best Academic Performance). National Institute of Fashion Technology, New Delhi, India

Professional Experience

• Lead Online Marketing Manager (2017-Present) Digital Marketing, AT&T.

As part of this role, I created short- and long-term strategic research scopes to pre-empt changing consumer trends and preferences through site taxonomies, navigation, and information architecture, with a focus on optimizing and enhancing a future-forward digital experience.

Pioneered a digital facelift across the primary business-to-business marketing channels, encompassing nine product portfolios and over 100 products, to a new experience based on hierarchically networked taxonomies and user analytics.

• Digital CX Strategist (2016-2017) Digital Marketing, AT&T.

My role sat at the intersection of digital marketing, content management, website optimization, and IT ecosystems. As part of this role, I:
- Drove the digital customer experience strategy for AT&T Business marketing platforms based on data analysis from sources such as Adobe Analytics, Adobe Test and Target, ForeSee, and other UX monitoring tools.
- Informed AT&T’s future-forward Smart Business Strategy toward omnichannel personalization by translating customer and business needs into actionable technical and tactical requirements.
- Designed short- and long-term strategic research scopes to pre-empt changing consumer trends and preferences through improved site taxonomy, navigation, and information architecture.
- Crafted the nuances of a seamless customer experience across interaction points through journey flows, personas, and usability analysis.
- Worked with individual stakeholders to capture desired performance goals through a strategic optimization roadmap.

• Research Assistant (Jan – Dec 2015) Culture Science Lab, School of Arts & Technology, UT Dallas.

Explored the practices and methods within the area of consumer trends forecasting. The research focused on defining a composite forecasting framework that would allow an orchestration of forecasting methods to devise strategies for the design and development of new products. The objective of consumer trends forecasting is to help decision makers and stakeholders take informed action today in order to create relevant and meaningful products in the future.

• Head of Consumer Insights & Experience. (Apr 2012 – Aug 2013) Kohler Co.

This cross-functional role brought together the Marketing, Design, Business, and Engineering divisions of the organization. The goal was to make the business proactive (rather than reactive) based on future trends research and to facilitate leading-edge product innovation for consumers, dealers, and channel partners. As part of this role, I:
- Led consumer insight and experience research, trends analysis, and specific in-project activities. This work generated qualitative and quantitative frameworks of information that allowed the business to better define project scopes, ideate, innovate, and make informed decisions on project planning and risks.
- Conducted annual trends-forecasting research and presented key future directions for the business in Trends Workshops with cross-functional representation. These workshops culminated in clear directions for new product development.
- Defined appropriate research methods for cultural and user insights. Led research efforts in conjunction with internal teams and administered research via external providers.

- Guided product design and development across teams in Paris, London, and Shanghai to ensure relevance to specific markets and consumers.
- Synthesized, organized, and created reports and presentations based on large amounts of abstract data and analytically derived conclusions. I provided the research foundation and innovation strategy that international design teams (Paris, Shanghai, and Wisconsin) required to make meaningful products informed by cultural insights and consumer trends.

• Trends Strategist / Futurologist (Mar 2011 – Apr 2012) Onio Design Pvt. Ltd., India

- Conducted research to identify macro and micro trends, foresights, and scenarios for global companies in sectors such as automotive, consumer durables, and consumer goods, resulting in ahead-of-the-curve market and product strategies.
- Translated trend signals into actual product attributes and features.
- Conducted cultural insights research for invaluable market-entry strategies.
- Connected the dots with in-depth interviews, interactive ethnography models, and focus groups for innovation-worthy insights.
- Produced research reports based on strategic business foresight and cultural codes for Tata Motors, Renault Motors, Dow Corning (Solar), LG Electronics, Samsung, Godrej Industries Ltd., Tata Sky TV, and Hella Automotive Lighting.

• Design Researcher. (July 2010 – Feb 2011) Paul Smith Ltd., London, UK

The research focused on inherent emblematic codes and meaning, and on cultural decoding for signature hardware and branding design. This body of research produced inspirations for quintessentially British cultural design cues for the fashion house.

Seminars and Workshops Conducted

2012: Mega Trends for Business Strategy

2013: Macro Trends for Business Innovation for Kohler India

2014: Tech Trends for Restaurant & Retail Industry, organized by Trailblazer Capital

2015: “Trends Foresight for Design Research” at 11th International European Academy of Design Conference, Paris, France

2017: Tech Trends for Retail Industry, Dallas

Other Achievements

• Co-created “Juxt” (iOS app). 2014, Dallas. Award-winning recipient of a $20,000 Dept. of Energy (North Texas) grant. The app allows drivers of alternative-fuel vehicles to find fueling stations, parking spaces, and other points of interest along their route in real time and to share this information with other app users. Responsible for user experience (UX) research, design, and visualization strategy for the app.

• ‘Best use of open data’ for Transportation App concept by Socrata. 2014, Dallas.

• Chosen as the youth delegate (scholar) to represent India as state guest of President Hu Jintao, People’s Republic of China. 2007, New Delhi.

• Bachelor of Design: Gold Medal awardee for best academic performance. 2008, New Delhi.
