Ad Empathy: A Design Fiction
Paper Session: Design Fictions. GROUP 2018, January 7–10, 2018, Sanibel Island, FL, USA

Michael Skirpan, University of Colorado Boulder, Boulder, CO ([email protected])
Casey Fiesler, University of Colorado Boulder, Boulder, CO ([email protected])

Abstract
Industry demand for novel forms of personalization and audience targeting, paired with research trends in affective computing and emotion detection, puts us on a clear path toward emotion-sensitive technologies. Written as API documentation for an AI marketing solution that provides "emotion-sensitive marketing decisions," this design fiction presents one possible future application of today's research. Offering a demonstrable grey area in technology ethics, Ad Empathy should help to ground debates around fair use of data and the boundaries of ethical design.

Author Keywords
advertising; API; design fiction; emotion; ethics; social computing; speculative fiction; neural networks; machine learning; target marketing

ACM Classification Keywords
H.5.m. Information interfaces and presentation (e.g., HCI): Miscellaneous

Product Introduction
Today's competitive attention economy requires brands to reach customers in personal and affective ways. Years of research and experience establish that personalization is effective both for ad targeting and for shaping user and consumer attitudes [20]. However, personalization is also a saturated approach. The relative ease of obtaining consumer preference data makes it common for online advertisers to know what a customer wants. Companies wanting the competitive edge now need to know when a product is best advertised and how it should be framed. Knowing this demand, we are happy to launch Ad Empathy, an AI marketing solution that supports brands in making emotion-sensitive marketing decisions.

Our API Resources are designed to help our clients generate content for ad impressions, catering to the dynamic needs of the diverse individuals in their audience. We work with most major social media platforms and search engines to create connected profiles of customers that can be accessed from any ad client via the Ad Empathy API. For each advertising platform you would like to integrate with Ad Empathy, simply add your company's registered OAuth tokens using the Ad Empathy Dashboard, and within 48 hours we will have trained models for each of your customers and customer types. From that point onward, you can use the Ad Empathy API to design your ad impressions on any connected platform. To use Ad Empathy as a full-cycle marketing platform, you may also register your product inventory with our platform to track emotional responses to product-specific brand interactions and improve our models.

Getting Started
Before making any requests using our models, you should contact a member of our Sales Team to discuss pricing options or obtain a free trial. All API Resource requests must contain a valid token pair <client-token> and <client-secret>, a <cookie-id> for the user, and optionally a <platform-id> to specify the ad client platform. Developers building platform-agnostic services can use our Accounts API to obtain valid <cookie-id>s for building cross-platform ad campaigns and event triggers.
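As a rough illustration of these requirements, the sketch below shows what a single authenticated request might look like from a client script. The base URL, the choice to pass the token pair as query parameters, and the JSON response shape are assumptions made for the example; only the required fields (<client-token>, <client-secret>, <cookie-id>, and the optional <platform-id>) come from the documentation above, and the endpoint path is taken from the MOOD resource listed in the next section.

    # Minimal sketch of an authenticated Ad Empathy request (Python + requests).
    # The base URL and the use of query parameters for credentials are
    # illustrative assumptions, not part of the documentation.
    import requests

    BASE_URL = "https://api.adempathy.example/v1"  # hypothetical host

    def get_current_mood(client_token, client_secret, cookie_id, platform_id=None):
        """Fetch the user's current mood list (MOOD resource, 'get')."""
        params = {"client-token": client_token, "client-secret": client_secret}
        if platform_id is not None:
            params["platform-id"] = platform_id  # optional ad client platform
        resp = requests.get(f"{BASE_URL}/mood/now/{cookie_id}", params=params)
        resp.raise_for_status()
        return resp.json()  # assumed to be a list of (mood, confidence) entries

    # Example call with placeholder credentials obtained from the Sales Team:
    # moods = get_current_mood("my-token", "my-secret", "cookie-1234")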
API Resources
Once you have obtained valid token pairs, integrated with your external ad platform's tokens, and can see the green check mark at the top corner of your Ad Sense Dashboard, you can begin using any API Resource.

MOOD
get - GET /mood/now/<cookie-id>
Returns the current emotional state (mood) of the user as a list of the top ten moods by confidence.

list - GET /mood/list/<cookie-id>
Returns a list of frequencies for all mood categories that Ad Empathy has related to the specified user.

MOOD.PRODUCT
list - GET /mood/product/<product-id>/<cookie-id>
Returns a list of product IDs and the mood that is most positively associated with a customer interaction.

MOOD.TOPIC
list - GET /mood/topic/<cookie-id>
Returns a list of content topics and our highest-confidence mood association for each topic.

TREND
get - GET /trend/now/<cookie-id>
Returns the predicted emotional states, ordered by confidence, for the upcoming 30-minute time interval.

list - GET /trend/daily/<cookie-id>
Returns a list of 30-minute time intervals over 24 hours with the most common emotional state associated with each interval.

RESPONSE
get - GET /response/<product-id>/<cookie-id>
Returns the user's last cached online emotional response to an interaction with <product-id>. (API Resource available only to customers using Ad Empathy Trackers for their product inventory.)

EXPRESSION.TEXT
get - GET /expression/single/<emotion>/<cookie-id>
Returns the syntax tokens most commonly associated with the user's online expression of the emotion.

list - GET /expression/all/<cookie-id>
Returns a paginated list of emotional states, sorted by their frequency, and the most common syntax tokens associated with each state.
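To show how these resources might be combined in practice, here is a sketch of a client that uses MOOD.PRODUCT and TREND to pick the 30-minute window in which a product ad is most likely to land on the right mood. The helper api_get(), the JSON field names ("mood", "interval"), and the assumption that responses are ordered strongest-first are illustrative; only the endpoints and their summaries come from the listing above.

    # Sketch: choose an ad window by matching a product's best mood (MOOD.PRODUCT)
    # against the predicted daily emotional cycle (TREND list). Field names and
    # response shapes are assumed for the example.
    import requests

    BASE_URL = "https://api.adempathy.example/v1"  # hypothetical host
    AUTH = {"client-token": "my-token", "client-secret": "my-secret"}

    def api_get(path):
        resp = requests.get(f"{BASE_URL}{path}", params=AUTH)
        resp.raise_for_status()
        return resp.json()

    def best_window_for_product(product_id, cookie_id):
        """Return the first 30-minute interval whose predicted mood matches the
        mood most positively associated with this product for this customer."""
        product_moods = api_get(f"/mood/product/{product_id}/{cookie_id}")
        target_mood = product_moods[0]["mood"]        # assumed: strongest first

        daily = api_get(f"/trend/daily/{cookie_id}")  # 48 intervals over 24 hours
        matches = [slot for slot in daily if slot["mood"] == target_mood]
        return matches[0]["interval"] if matches else None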
How Does It Work?
Ad Empathy is a state-of-the-art multi-model AI ecosystem that leverages the volume and velocity of online behavioral data by training user-specific machine learning models. The core of the system is a Long Short-Term Memory (LSTM) neural network trained specifically to predict the evolution of moods using temporally structured data coming from online activities (e.g., text from posts, click content, reactions to others' posts). Our company began training this model nearly five years ago, when researchers first found Gated Recurrent Units to be a solution for cutting through the noise of online data [15]. After years of fine-tuning and learning how to transfer models between different users and incorporate multi-modal data, we found we had sown the seeds of something much bigger than a mood prediction model. In short, this core model became the heart of a system of interacting models. Developing our expertise in model transfer allowed our team to take layers of our novel LSTM model, combine them with convolutional layers or other recurrent, language-processing layers, and train them as Generative Adversarial Networks to blossom into the wide functionality of novel content creation you see today.

When your company opens an account with Ad Empathy, our system begins by data mining all social media content and brand interactions available for your customer base. After mining all historical data about your customers, we place their user accounts into our reactive event loop, which keeps tabs on new activities across any connected platform. Prior to training, we run all the data through a noise-reduction network trained specifically to identify relevant emotional content. Using the filtered data set, we fork fresh versions of our base model and begin training a unique mood model for each of your customers. This training continues until the confidence of our predictions meets a certain threshold. Testing is done using a data set we capture and set aside during the data-mining phase. Our central model (the one underneath the Mood API) takes in time-structured online activity for a user and outputs a likely current mood given the most recent observation. This model is then transferred into our second network, which chunks your users' history into 24-hour segments and trains a model that predicts the upcoming 24-hour emotional cycle (and provides the backbone of our Trends API!).

Once we have accurate models for our Moods and Trends APIs, we do fine-grained analysis on specific data such as text and photos. This process starts by performing a topic-modeling analysis on all user text and browsing history to break up each user's history into topic-specific data sets. Further, each user photo is analyzed for facial expression, object detection, and captioning to develop visual insights into the personal aesthetics of your customer's emotions. A core value that Ad Empathy offers is recognizing that each product a customer purchases is embedded in a different context and thus requires a different cognitive model to understand the underlying emotional relationships. We develop those models along many dimensions that account for complex relationships between emotions and brand sentiments.

Important to understanding how Ad Empathy works is that each API your team uses operates with different custom models and parsing techniques that branch out from our central mood-recognition network. Our Expression API, for instance, uses sentiment analysis in tandem with a generative adversarial network to parse user text and then learn how to generate novel text that expresses the same sentiments while staying within the known vernacular of your customer.

For this, we recommend analyzing your products with our Trends Resource to discover your most temporally stable products and to make inferences about how they are associated across time. Then, using our Expressions Resource, you can design context-sensitive Content Ads that portray your product regularly at the times associated with the emotion best suited for your product.
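To make the pipeline described in this section more concrete, below is a minimal sketch of the kind of central model the text describes: an LSTM that reads a time-ordered sequence of online-activity features and scores a likely current mood after the most recent observation. The feature size, hidden size, mood vocabulary, and everything about training are assumptions; the fiction specifies only the general architecture.

    # Minimal sketch of a mood-prediction LSTM in PyTorch. Dimensions, the mood
    # set, and the feature encoding of "online activity" are assumed; the text
    # above describes only the overall architecture.
    import torch
    import torch.nn as nn

    class MoodLSTM(nn.Module):
        def __init__(self, activity_dim=128, hidden_dim=256, num_moods=10):
            super().__init__()
            self.lstm = nn.LSTM(activity_dim, hidden_dim, batch_first=True)
            self.to_mood = nn.Linear(hidden_dim, num_moods)

        def forward(self, activity_seq):
            # activity_seq: (batch, time_steps, activity_dim), one feature vector
            # per post, click, or reaction, in chronological order.
            outputs, _ = self.lstm(activity_seq)
            latest = outputs[:, -1, :]      # hidden state after the newest activity
            return self.to_mood(latest)     # unnormalized scores over moods

    # Example: score one user's 48 most recent activity events.
    model = MoodLSTM()
    history = torch.randn(1, 48, 128)       # placeholder activity features
    mood_probs = torch.softmax(model(history), dim=-1)  # top entries ~ MOOD 'get' output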