
Attentive Autoencoders for Multifaceted Preference Learning in One-class Collaborative Filtering

Zheda Mai§, Ga Wu§, Kai Luo and Scott Sanner
Department of Mechanical & Industrial Engineering, University of Toronto, Canada
[email protected], {wuga, kluo, ssanner}[email protected]

arXiv:2010.12803v1 [cs.IR] 24 Oct 2020

Abstract—Most existing One-Class Collaborative Filtering (OC-CF) algorithms estimate a user's preference as a latent vector by encoding their historical interactions. However, users often show diverse interests, which significantly increases the learning difficulty. In order to capture multifaceted user preferences, existing recommender systems either increase the encoding complexity or extend the latent representation dimension. Unfortunately, these changes inevitably lead to increased training difficulty and exacerbate scalability issues. In this paper, we propose a novel and efficient CF framework called Attentive Multi-modal AutoRec (AMA) that explicitly tracks multiple facets of user preferences. Specifically, we extend the Autoencoding-based recommender AutoRec to learn user preferences with multi-modal latent representations, where each mode captures one facet of a user's preferences. By leveraging the attention mechanism, each observed interaction can contribute differently to the preference facets. Through extensive experiments on three real-world datasets, we show that AMA is competitive with state-of-the-art models under the OC-CF setting. We also demonstrate how the proposed model improves interpretability by providing explanations through the attention mechanism.

Index Terms—One-class Collaborative Filtering, Attention Model, Multifaceted Preference Recommendation

I. INTRODUCTION

Collaborative Filtering (CF) is one of the most prevalent approaches for personalized recommendation that relies only on historical user-item interaction data. Many CF systems in the literature perform well when provided with explicit user feedback, such as 1-5 user ratings of a product. In many applications, however, only positive feedback is available, such as view counts of a video or purchases of an item. One challenge of working with implicit feedback data is the lack of negative signals. Although we can safely presume that purchases indicate a customer's preference for a product, we should not construe the unobserved interactions as negative feedback, since a customer would not be aware of all the items on a platform and could not purchase every single item they like. The recommendation setting with only positive feedback observed is known as One-Class Collaborative Filtering (OC-CF) [1].

In the OC-CF setting, many CF algorithms learn the preference representation of each user as a latent vector by encoding their historical interactions. However, users often have diverse preferences, which significantly increases the difficulty of modeling multifaceted preferences. In order to capture complicated user preferences, existing recommender systems can either increase the encoding complexity to enhance the modeling capability or extend the latent representation to a higher dimension. Unfortunately, models with complex encoding modules are fraught with increased training difficulty, while models with high-dimensional latent spaces require strong regularization to prevent overfitting and prove challenging to scale to large, real-world datasets.

However, learning complex user preferences in low-dimensional latent spaces is not easy. Most popular recommender systems aim to model user preferences as either a single point or a uni-modal distribution in the low-dimensional latent representation space, which we denote as uni-modal estimation. This approach can capture the preferences of users who have a single preference type. However, for users with multifaceted preferences, uni-modal estimation may not suffice, since it is hard to capture the characteristics of all user preferences with one mode. As shown in Figure 1, for users with multifaceted preferences, uni-modal estimation will try to "average" the user-item interactions, leading to an inaccurate estimate. Therefore, we should model user preferences with more than one mode in the latent space, which we denote as multi-modal estimation¹. With this approach, each mode in the latent representation explicitly captures one facet of the user's preferences and can better handle the multifaceted preference situation mentioned above (Figure 1).

Nevertheless, directly modeling multifaceted preferences is still non-trivial, as the ground-truth mappings between item interactions and the facets of user preferences are not available, and individual item interactions may contribute differently to different facets. To address this problem, we propose a novel and efficient CF framework called Attentive Multi-modal AutoRec (AMA) to track multifaceted user preferences explicitly. AMA is based on AutoRec [3], one of the earliest Autoencoder-based recommender systems, which aims to learn a low-dimensional latent representation that can reconstruct users' observed interactions. However, traditional AutoRec adopts a uni-modal estimation approach and uses a single embedding vector to represent diverse user preferences.

To capture multifaceted user preferences, we replace the uni-modal estimation in AutoRec with multi-modal estimation, where we have more than one latent embedding for each user (i.e., one for each preference facet). Additionally, we

§Equal contribution
¹The term multi-modal in our context means the latent representation has multiple modes in the latent space. Specifically, it is different from the term in multi-modal learning [2], where data consists of multiple input modalities.

[Figure: User Representation. Notation: v_j — latent embedding vector of item j, of shape h × 1.]
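To make the idea of attentive multi-modal estimation concrete, the following is a minimal numerical sketch, not the paper's exact architecture: the facet-query matrix Q, the softmax attention over observed items, and the max-over-facets scoring rule are all illustrative assumptions, with v_j (the h × 1 item embedding) following the notation above.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

rng = np.random.default_rng(0)
n_items, h, K = 6, 4, 2        # items, latent dimension, number of preference facets

# Item embeddings: one h-dimensional vector v_j per item.
V = rng.normal(size=(n_items, h))
# One "facet query" vector per mode (a hypothetical parameterization for this
# sketch; the paper learns attention weights inside its encoder).
Q = rng.normal(size=(K, h))

# A user's implicit-feedback vector: 1 = observed interaction, 0 = unobserved.
r = np.array([1, 1, 0, 0, 1, 0], dtype=float)
observed = np.flatnonzero(r)               # indices of interacted items

# Attention of each observed interaction toward each facet, so that each
# interaction can contribute differently to different facets.
scores = V[observed] @ Q.T                 # shape (n_observed, K)
alpha = softmax(scores, axis=0)            # normalize over observed items

# Multi-modal user representation: one latent embedding per facet,
# instead of AutoRec's single embedding vector.
U = alpha.T @ V[observed]                  # shape (K, h)

# Score every item under each facet; recommend by the best-matching facet.
facet_scores = U @ V.T                     # shape (K, n_items)
pred = facet_scores.max(axis=0)            # shape (n_items,)
```

Under this sketch, a user whose interactions cluster into two distinct tastes yields two distinct facet embeddings in U, rather than a single averaged vector between the clusters, which is the failure mode of uni-modal estimation illustrated in Figure 1.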