Liferay Portal 6 Enterprise Intranets

Total Pages: 16

File Type: PDF, Size: 1020 KB

Liferay Portal 6 Enterprise Intranets
Build and maintain impressive corporate intranets with Liferay
Jonas X. Yuan
BIRMINGHAM - MUMBAI

Copyright © 2010 Packt Publishing. All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews. Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing and its dealers and distributors, will be held liable for any damages caused or alleged to be caused directly or indirectly by this book. Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.

First published: April 2010
Production Reference: 1230410
Published by Packt Publishing Ltd., 32 Lincoln Road, Olton, Birmingham, B27 6PA, UK.
ISBN 978-1-849510-38-7
www.packtpub.com
Cover Image by Karl Swedberg ([email protected])

Credits
Author: Jonas X. Yuan
Reviewer: Amine Bousta
Acquisition Editor: Dilip Venkatesh
Development Editor: Mehul Shetty
Technical Editors: Aditya Belpathak, Alfred John, Charumathi Sankaran
Copy Editors: Leonard D'Silva, Sanchari Mukherjee
Editorial Team Leader: Aanchal Kumar
Project Team Leader: Lata Basantani
Project Coordinator: Shubhanjan Chatterjee
Proofreaders: Aaron Nash, Lesley Harrison
Graphics: Geetanjali Sawant, Nilesh Mohite
Production Coordinators: Avinish Kumar, Aparna Bhagat, Nilesh Mohite
Indexers: Hemangini Bari, Rekha Nair
Cover Work: Aparna Bhagat

About the Author
Dr. Jonas X. Yuan is a Senior Technical Analyst at CIGNEX. He holds a Ph.D. in Computer Science from the University of Zurich, specializing in integrity control in federated database systems. He earned his M.S. and B.S. degrees in China, where he conducted research on expert systems for predicting landslides. Jonas is experienced in the Systems Development Life Cycle (SDLC). Previously, he worked as a Project Manager and a Technical Architect in Web GIS (Geographic Information Systems). He has hands-on skills in J2EE technologies. Most importantly, he developed a BPEL (Business Process Execution Language) engine called BPELPower from scratch at the NASA data center. He has extensive experience with content management and publishing in the media, games, and publishing industries. He is also an expert on Liferay Portal, the Alfresco content management system (CMS), the OpenX ad server, Intalio BPM, Pentaho business intelligence, LDAP, and SSO.

Acknowledgement
I would like to thank all my team members at Liferay, especially Raymond Auge, Brian Chan, Bryan Cheung, Jorge Ferrer, Michael Young, Jerry Niu, Ed Shin, Craig Kaneko, Brian Kim, Bruno Farache, Thiago Moreira, Amos Fong, Scott Lee, David Truong, Alexander Chow, Mika Koivisto, Julio Camarero, Douglas Wong, Ryan Park, Eric Min, John Langkusch, Marco Abamonga, Michael Han, Samuel Kong, Nate Cavanaugh, Arcko Duan, Richard Sezov, Joshua Asbury, and Shuyang Zhou for providing all the support and valuable information.
A special thanks to all my team members at CIGNEX for making this book a reality. I would like to thank Paul Anthony, Munwar Shariff, and Rajesh Devidasani for their encouragement and great support. Our sales and presales team, Amit Babaria and Harish Ramachandran, helped me understand what customers are looking for. Our consulting team, Robert Chen, Mike Walker, Venkata Challagulla, Michael Venford, and Frank Yu, presented me with the various flavors of Liferay implementations through real-life examples. I am thankful to them. I sincerely thank and appreciate Dilip Venkatesh and Mehul Shetty, Senior Acquisition Editor and Development Editor respectively at Packt Publishing, for criticizing and fixing my writing style. Thanks to Aditya Belpathak, Alfred John, Charumathi Sankaran, Shubhanjan Chatterjee, and the entire team at Packt Publishing; it is really joyful to work with them. Last but not least, I would like to thank my parents and my wife, Linda, for their love, understanding, and encouragement. My special thanks to my wonderful and understanding kid, Joshua.

About the Reviewer
Amine Bousta is a French Senior Technical Analyst at Groupe Open. He holds a French diploma in Computer Engineering (equivalent to a master's degree) and a Technology Research Diploma in Artificial Intelligence (neural networks applied to e-commerce). Specialized in JEE, he monitors and studies most of the related technologies and open source products in order to apply them as solutions for various companies' needs. Since 2007, his technology watch on open source Java portal engines has pointed to Liferay Portal as the most flexible and complete free portlet engine. The Liferay Community Site gave him the opportunity to meet Dr. Jonas X. Yuan, read his previous publications, and then become the technical reviewer of this book.

I would like to warmly thank my family, my dear co-workers, and my friends for not spamming me with their usual pointless technical questions while I was reviewing this book! So... my special thanks to the following list of nominees in the Java Harassment category: Julien Nicolet, Laurent Witt, Jacques Scius, Nicolas Burg, Olivier Spieser, Angelo Zerr, Olivier Dub, Diana Garzon, Hadrien Beaufils, Mohamadou Kane, and Mohamed Cherroud. And in the Microsoft-Software-Time-Wasting-Questions category: my beloved mom and sister. ;-)

Table of Contents
Preface  1

Chapter 1: Introducing Liferay for your Intranet  9
  What are we going to build?  9
  Hosted discussions  9
  Collaborative documents using wikis  10
  Team and individual blogs  10
  Shared calendars  11
  Document stores—CMS  11
  Web Content Management—WCM  11
  Personalization and internalization  12
  Workflow, staging, scheduling, and publishing  12
  Social network and Social Office  13
  And more...  13
  Everything in one place  13
  Introducing Palm Tree Publications  14
  Liferay portal framework  16
  Service Oriented Architecture  16
  Enterprise Service Bus  17
  Why use Liferay to build an intranet?  18
  It has got the features we need  18
  It's easy to set up and use  18
  It's free and open source  18
  It will grow with you  19
  It works with other tools  19
  It is based on "standards"  20
  What else can Liferay do?  21
  Corporate extranets  21
  Collaborative websites  21
  Content management and web publishing  21
  Infrastructure portals  22
  Finding more information  22
  Summary  23

Chapter 2: Setting Up a Home Page and Navigation Structure for the Intranet  25
  Experiencing the portal  26
  Getting portal pages  26
  What's happening?  28
  Experiencing portlets  28
  Using the portlet container  29
  How does the portal work?  29
  Working of portlets  30
  Portlet lifecycle  31
  Terminologies, scope, and hierarchy  32
  Setting up the portal  33
  Using the Liferay portal bundled with JBoss 5.x in Windows  34
  Using Liferay portal bundled with Tomcat 6.x in Linux  35
  Clean and restart  36
  Setting up production servers  36
  Setting up domains  37
  Shortening the URL  37
  More options for portal installation  38
  Flexible deployment matrix  38
  Building pages  39
  Adding pages  40
  Removing pages  41
  Adding portlets  42
  Removing portlets  42
  Changing layout templates  43
  Setting up portal pages  44
  Managing pages  44
  Adding a child page  45
  Changing the display order  46
  Editing a page  46
  Multiple languages  46
  Friendly URL  47
  Page icon  47
  Copying pages  47
  Deleting a page  48
  Changing the look and feel: themes  48
  Customizing portlets  49
  Portlet Look and Feel  50
  Setting up portlet configuration  51
  Navigating the structure of an intranet site  52
  Configuring portal  53
  Configuring portal paths  54
  Customizing portal configuration  55
  Adding extended properties files  55
  Portal structure  56
  Portal context  57
  Terms of Use  58
  Reminder queries  59
  Session settings  60
  Layout types  62
  Default user public layouts and private layouts  63
  Default admin  65
  Guest layouts  65
  Friendly URL  66
  Look and Feel  66
  Dock bar menu  68
  What's happening?  68
  Dock bar menu in themes  70
  My Places  70
  Database connections  71
  JNDI name  72
  Properties  72
  Populating with default data  73
  Transaction isolation  74
  Custom SQL  74
  Mail configuration  75
  Mail session  75
  Java-mail  77
  Mail hook  78
  Bring pages together in action  79
  Page Comments  79
  Page Ratings  80
  Page Templates  81
  Page Flags  82
  How do you share portlets within a portal page?  83
  Configuring Control Panel  84
  Administration  85
  My Account  85
  My Pages  86
  Configuration and settings  87
  Adding custom portlets  88
  What's happening?  89
  Summary  89

Chapter 3: Bringing in Users  91
  Enterprise, departments, and locations  92
  Role-based access control  93
  Role and permission  94
  User  94
  Organization and Location  94
  UserGroup and Community  95
  Organizations  95
  Adding a top-level Organization  95
  Managing organizations  97
  Adding child organizations  97
  Viewing organizations  98
  Searching organizations  98
  Editing an organization  99
  Deleting an organization  100
  Assigning users to an organization  101
  Using organizations in an effective way  102
  Locations  102
  Adding a location for the enterprise  102
  Adding locations for main organizations  103
  Viewing locations  104
  Searching locations  104
  Editing a location  105
  Deleting a location  105
  Assigning users to a location  106
  Using locations in an effective way
Recommended publications
  • Copyright 2009 Cengage Learning. All Rights Reserved. May Not Be Copied, Scanned, Or Duplicated, in Whole Or in Part
Index
Note: Page numbers referencing figures are italicized and followed by an "f". Page numbers referencing tables are italicized and followed by a "t".

A
About.com, 350
AboutUs.org, 186, 190–192
Access application, 349
account managers, 37–38
ActionScript, 353–356
Adobe Flash: application, 340–341; file format, see .flv file format; player, 150, 153, 156
Adobe GoLive, 343
Adobe Photoshop, 339–340
Advanced Research Projects Agency (ARPA), 2
advertising: dating sites, 106; defined, 397; e-commerce, 316; Facebook, 94–96; family and lifestyle sites, 109; media, 373–376; message, 371–372
Ajax, 353
Alexa.com, 42, 78
Alta Vista, 7
Amazon.com, 7f, 14f, 48, 247, 248f–249f, 319–320, 322
anonymity, 16
AOL, 8f, 14f, 77, 79f, 416
Apple iTunes, 13f–14f
Apple site, 11f, 284f
applets, Java, 352, 356
applications, see names of specific applications
ARPA (Advanced Research Projects Agency), 2
artistic fonts, 237
ASCO Power University, 168–170
.asf file format, 154t–155t
AskPatty.com, 206–209
AuctionWeb, 7f
audience, capturing and retaining, 61–62, 166, 263, 405–407, 410–422, 432
bankruptcy, 4, 9f
banner advertising, 7f, 316, 368
Barack Obama's online store, 328f
BaseballNooz.com, 98–100
BBC News, 3
Bebo, 89t
behavioral changes, 16
benign disinhibition, 16
Best Dates Now blog, 123–125
billboard advertising, 369
bitmaps, 290, 292, 340, 357
BJ's site, 318
Black Friday, 48
blog communities, 8f
blog editors, 120, 142
blog search engines, 126
Blogger, 344–347
blogging, 7f, 77–78, 86, 122–129, 133–141, 190, 415
blogosphere, 122, 142
blogrolls, 121, 142
Blue Nile site,
  • Address Munging: the Practice of Disguising, Or Munging, an E-Mail Address to Prevent It Being Automatically Collected and Used
Address Munging: the practice of disguising, or munging, an e-mail address to prevent it from being automatically collected and used as a target by people and organizations that send unsolicited bulk e-mail.

Adware: advertising-supported software; any software package which automatically plays, displays, or downloads advertising material to a computer after the software is installed on it or while the application is being used. Some types of adware are also spyware and can be classified as privacy-invasive software. Adware is software designed to force pre-chosen ads to display on your system. Some adware is designed to be malicious and will pop up ads with such speed and frequency that they seem to be taking over everything, slowing down your system and tying up all of your system resources. When adware is coupled with spyware, it can be a frustrating ride, to say the least.

Backdoor: in a computer system (or cryptosystem or algorithm), a method of bypassing normal authentication, securing remote access to a computer, obtaining access to plaintext, and so on, while attempting to remain undetected. The backdoor may take the form of an installed program (e.g., Back Orifice), or could be a modification to an existing program or hardware device. A back door is a point of entry that circumvents normal security and can be used by a cracker to access a network or computer system. Usually back doors are created by system developers as shortcuts to speed access through security during the development stage and are then overlooked and never properly removed during final implementation.
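A minimal sketch of what address munging looks like in practice. This is an illustration only, not code from the glossary's source; the substitution tokens, the sample address, and the harvester pattern are all assumptions.

```python
import re

def munge(address: str) -> str:
    """Disguise an e-mail address with common textual substitutions."""
    return address.replace("@", " [at] ").replace(".", " [dot] ")

def unmunge(text: str) -> str:
    """Reverse the substitutions so a human (or a cooperating tool) can recover the address."""
    return text.replace(" [at] ", "@").replace(" [dot] ", ".")

# A naive harvester pattern of the kind munging is meant to defeat.
HARVESTER = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

original = "jane.doe@example.org"            # hypothetical address
disguised = munge(original)

print(disguised)                             # jane [dot] doe [at] example [dot] org
print(HARVESTER.findall(original))           # ['jane.doe@example.org'] -> harvestable
print(HARVESTER.findall(disguised))          # []                       -> no longer matched
assert unmunge(disguised) == original        # still recoverable by a human reader
```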
  • Adversarial Web Search by Carlos Castillo and Brian D. Davison
Foundations and Trends® in Information Retrieval, Vol. 4, No. 5 (2010) 377–486, © 2011 C. Castillo and B. D. Davison, DOI: 10.1561/1500000021

Adversarial Web Search, by Carlos Castillo and Brian D. Davison

Contents
1 Introduction  379
  1.1 Search Engine Spam  380
  1.2 Activists, Marketers, Optimizers, and Spammers  381
  1.3 The Battleground for Search Engine Rankings  383
  1.4 Previous Surveys and Taxonomies  384
  1.5 This Survey  385
2 Overview of Search Engine Spam Detection  387
  2.1 Editorial Assessment of Spam  387
  2.2 Feature Extraction  390
  2.3 Learning Schemes  394
  2.4 Evaluation  397
  2.5 Conclusions  400
3 Dealing with Content Spam and Plagiarized Content  401
  3.1 Background  402
  3.2 Types of Content Spamming  405
  3.3 Content Spam Detection Methods  405
  3.4 Malicious Mirroring and Near-Duplicates  408
  3.5 Cloaking and Redirection  409
  3.6 E-mail Spam Detection  413
  3.7 Conclusions  413
4 Curbing Nepotistic Linking  415
  4.1 Link-Based Ranking  416
  4.2 Link Bombs  418
  4.3 Link Farms  419
  4.4 Link Farm Detection  421
  4.5 Beyond Detection  424
  4.6 Combining Links and Text  426
  4.7 Conclusions  429
5 Propagating Trust and Distrust  430
  5.1 Trust as a Directed Graph  430
  5.2 Positive and Negative Trust  432
  5.3 Propagating Trust: TrustRank and Variants  433
  5.4 Propagating Distrust: BadRank and Variants  434
  5.5 Considering In-Links as well as Out-Links  436
  5.6 Considering Authorship as well as Contents  436
  5.7 Propagating Trust in Other Settings  437
  5.8 Utilizing Trust  438
  5.9 Conclusions  438
6 Detecting Spam in Usage Data  439
  6.1 Usage Analysis for Ranking  440
  6.2 Spamming Usage Signals  441
  6.3 Usage Analysis to Detect Spam  444
  6.4 Conclusions  446
7 Fighting Spam in User-Generated Content  447
  7.1 User-Generated Content Platforms  448
  7.2 Splogs  449
  7.3 Publicly-Writable Pages  451
  7.4 Social Networks and Social Media Sites  455
  7.5 Conclusions  459
8 Discussion  460
  8.1 The (Ongoing) Struggle Between Search Engines and Spammers  460
  8.2 Outlook  463
  8.3 Research Resources  464
  8.4 Conclusions  467
Acknowledgments  468
References  469
  • Efficient and Trustworthy Review/Opinion Spam Detection
International Journal on Recent and Innovation Trends in Computing and Communication, ISSN: 2321-8169, Volume: 5, Issue: 4, 86–94

Efficient and Trustworthy Review/Opinion Spam Detection
Sanketi P. Raut, ME II (Computer), Lokmanya Tilak College of Engineering, New Mumbai, Email: [email protected]
Prof. Chitra Wasnik, Computer Dept., Lokmanya Tilak College of Engineering, New Mumbai, Email: [email protected]

Abstract - The most common mode for consumers to express their level of satisfaction with their purchases is through online ratings, which we can refer to as an online review system. Network analysis has recently gained a lot of attention because of the arrival and the increasing attractiveness of social sites, such as blogs, social networking applications, micro-blogging, or customer review sites. The reviews are used by potential customers to find opinions of existing users before purchasing the products. Online review systems play an important part in affecting consumers' actions and decision making, and therefore attract many spammers who insert fake feedback or reviews in order to manipulate review content and ratings. Malicious users misuse the review website and post untrustworthy, low quality, or sometimes fake opinions, which are referred to as spam reviews. In this study, we aim at providing an efficient method to identify spam reviews and to filter out the spam content with the dataset of gsmarena.com. Experiments on the dataset collected from gsmarena.com show that the proposed system achieves higher accuracy than the standard naïve Bayes.

Keywords - Review spam detection, opinion, text mining, WordNet, naïve Bayes classifier, logistic regression.
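As a rough illustration of the baseline the abstract mentions, here is a naïve Bayes text classifier over bag-of-words features. It is a sketch only, not the authors' implementation: the tiny inline reviews, the labels, and the scikit-learn pipeline are assumptions.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Hypothetical labeled reviews; a real study would use a corpus such as the
# gsmarena.com reviews mentioned in the abstract.
reviews = [
    "Best phone ever, click here to win a free handset!!!",
    "Amazing deal, buy now from my site, 90% off",
    "Battery easily lasts a full day with moderate use.",
    "Camera struggles in low light but daylight shots are sharp.",
]
labels = ["spam", "spam", "genuine", "genuine"]

# Bag-of-words (unigram + bigram) features feeding a multinomial naïve Bayes classifier.
model = make_pipeline(CountVectorizer(ngram_range=(1, 2)), MultinomialNB())
model.fit(reviews, labels)

print(model.predict(["Click here for a free phone, limited offer"]))        # likely 'spam'
print(model.predict(["The screen is bright and the speaker sounds loud"]))  # likely 'genuine'
```

In the abstract's terms, a plain model like this plays the role of the "standard naïve Bayes" baseline that the proposed system is reported to outperform.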
  • The History of Digital Spam
The History of Digital Spam
Emilio Ferrara, University of Southern California, Information Sciences Institute, Marina Del Rey, CA
[email protected]

ACM Reference Format: Emilio Ferrara. 2019. The History of Digital Spam. In Communications of the ACM, August 2019, Vol. 62 No. 8, Pages 82-91. ACM, New York, NY, USA, 9 pages. https://doi.org/10.1145/3299768

Spam!: that's what Lorrie Faith Cranor and Brian LaMacchia exclaimed in the title of a popular call-to-action article that appeared twenty years ago on Communications of the ACM [10]. And yet, despite the tremendous efforts of the research community over the last two decades to mitigate this problem, the sense of urgency remains unchanged, as emerging technologies have brought new dangerous forms of digital spam under the spotlight. Furthermore, when spam is carried out with the intent to deceive or influence at scale, it can alter the very fabric of society and our behavior.

This broad definition will allow me to track, in an inclusive manner, the evolution of digital spam across its most popular applications, starting from spam emails to modern-days spam. For each highlighted application domain, I will dive deep to understand the nuances of different digital spam strategies, including their intents and catalysts and, from a technical standpoint, how they are carried out and how they can be detected. Wikipedia provides an extensive list of domains of application: "While the most widely recognized form of spam is email spam, the term is applied to similar abuses in other media: instant messaging spam, Usenet newsgroup spam, Web search engine spam, spam in blogs, wiki spam, online classified ads spam, mobile phone messaging spam, Internet forum spam, junk fax
  • History of Spam
Max W. Mosing
THE UPS AND DOWNS IN THE HISTORY OF EU-SPAM-REGULATIONS AND THEIR PRACTICAL IMPACT
"Two years from now, spam will be solved. I promise a spam-free world by 2006" (Bill Gates; January 2004)

Table of contents:
1. (Legal) History of Spam  1
  1.1. "Monty Python" and "Canter and Seigel"  2
  1.2. Definition of Spam from a Legal Perspective  2
  1.3. Harm of Spam and – if the Internet community wants to ban spam – why is spam so successful?  3
  1.4. Legal Framework and legal background of spam  4
2. "Spam-Regulations" in EU-Directives  6
  2.1. Directive 97/66/EC concerning the processing of personal data and the protection of privacy in the telecommunications sector – ISDN-Directive and National Laws
  • Spam in Blogs and Social Media
Spam in Blogs and Social Media
Pranam Kolari, Tim Finin, Akshay Java, Anupam Joshi
March 25, 2007

• Spam on the Internet: Variants, Social Media Spam
• Reason behind Spam in Blogs
• Detecting Spam Blogs
• Trends and Issues
• How can you help?
• Conclusions

Pranam Kolari is a UMBC PhD student. His dissertation is on spam blog detection, with tools developed in use both by academia and industry. He has active research interest in internal corporate blogs, the Semantic Web and blog analytics.

Tim Finin is a UMBC Professor with over 30 years of experience in applying AI to information systems, intelligent interfaces and robotics. Current interests include social media, the Semantic Web and multi-agent systems.

Akshay Java is a UMBC PhD student. His dissertation is on identifying influence and opinions in social media. His research interests include blog analytics, information retrieval, natural language processing and the Semantic Web.

Anupam Joshi is a UMBC Professor with research interests in the broad area of networked computing and intelligent systems. He currently serves on the editorial board of the International Journal of the Semantic Web and Information.

• Early form seen around 1992 with MAKE MONEY FAST
• 80-85% of all e-mail traffic is spam
• In numbers: 2005 (June) 30 billion per day; 2006 (June) 55 billion per day; 2006 (December) 85 billion per day; 2007 (February) 90 billion per day
Sources: IronPort, Wikipedia http://www.ironport.com/company/ironport_pr_2006-06-28.html

• "Unsolicited usually commercial e-mail sent to a large
  • Detecting Web Robots with Passive Behavioral Analysis and Forced Behavior by Douglas Brewer (Under the Direction of Kang Li)
Detecting Web Robots with Passive Behavioral Analysis and Forced Behavior, by Douglas Brewer (under the direction of Kang Li)

Abstract: The use of web robots has exploded on today's World Wide Web (WWW). Web robots are used for various nefarious activities including click fraud, spamming, email scraping, and gaining an unfair advantage. Click fraud and unfair advantage present a way for bot writers to gain a monetary advantage. Click fraud web bots allow their authors to trick advertisement (ad) networks into paying for clicks on ads that do not belong to a human being. This costs ad networks and advertisers money they wanted to spend on advertisements for human consumption. It also affects the reputation of ad networks in the eyes of advertisers. These problems make combating web robots an important and necessary step on the WWW. Combating web robots is done by various methods that provide the means to make the distinction between bot visits and human visits. These methods involve looking at web server logs or modifying the pages on a website. The enhanced log method for identifying web robots expands on previous work using web server access logs to find attributes and uses AI techniques like decision trees and Bayesian networks for detection. Enhanced log analysis first breaks down attributes into passive and active methods; it also goes further, looking at deeper levels of logging, including traffic traces. This allows for previously unexplored attributes like web user agent sniffing, conditional comments, OS sniffing, JavaScript requests, and specific HTTP responses. Expanding the attribute footprint in log analysis allows for unique web bot identification.
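A minimal sketch of the general idea of passive, log-based detection: derive a few behavioral attributes per client session from access-log records and train a decision tree on labeled examples. The feature set, the toy sessions, and the labels below are illustrative assumptions, not the attributes used in the thesis.

```python
from sklearn.tree import DecisionTreeClassifier

def features(session):
    """session: list of (request_path, user_agent) pairs observed for one client."""
    paths = [path for path, _ in session]
    agents = " ".join(agent.lower() for _, agent in session)
    return [
        int(any(path.endswith("robots.txt") for path in paths)),      # fetched robots.txt
        sum(path.endswith((".png", ".jpg", ".css", ".js"))
            for path in paths) / len(paths),                          # embedded-asset ratio
        int("bot" in agents or "spider" in agents or "crawler" in agents),  # self-declared robot UA
        len(paths),                                                   # request volume
    ]

labeled_sessions = [
    ([("/robots.txt", "ExampleBot/1.0"), ("/a.html", "ExampleBot/1.0")], 1),                        # robot
    ([("/a.html", "Mozilla/5.0"), ("/style.css", "Mozilla/5.0"), ("/logo.png", "Mozilla/5.0")], 0),  # human
    ([("/b.html", "crawler-x"), ("/c.html", "crawler-x"), ("/d.html", "crawler-x")], 1),             # robot
    ([("/b.html", "Mozilla/5.0"), ("/app.js", "Mozilla/5.0")], 0),                                   # human
]

X = [features(session) for session, _ in labeled_sessions]
y = [label for _, label in labeled_sessions]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([features([("/robots.txt", "NewBot/0.1"), ("/x.html", "NewBot/0.1")])]))  # likely [1]
```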
  • D2.4 Weblog Spider Prototype and Associated Methodology
SEVENTH FRAMEWORK PROGRAMME, FP7-ICT-2009-6, BlogForever, Grant agreement no.: 269963

BlogForever: D2.4 Weblog spider prototype and associated methodology
Editor: M. Rynning
Revision: First version
Dissemination Level: Public
Author(s): M. Rynning, V. Banos, K. Stepanyan, M. Joy, M. Gulliksen
Due date of deliverable: 30th November 2011
Actual submission date: 30th November 2011
Start date of project: 01 March 2011
Duration: 30 months
Lead Beneficiary name: CyberWatcher

Abstract: This document presents potential solutions and technologies for monitoring, capturing and extracting data from weblogs. Additionally, the selected weblog spider prototype and associated methodologies are analysed.

Project co-funded by the European Commission within the Seventh Framework Programme (2007-2013). The BlogForever Consortium consists of: Aristotle University of Thessaloniki (AUTH), Greece; European Organization for Nuclear Research (CERN), Switzerland; University of Glasgow (UG), UK; The University of Warwick (UW), UK; University of London (UL), UK; Technische Universitat Berlin (TUB), Germany; Cyberwatcher, Norway; SRDC Yazilim Arastrirma ve Gelistrirme ve Danismanlik Ticaret Limited Sirketi (SRDC), Turkey; Tero Ltd (Tero), Greece; Mokono GMBH, Germany; Phaistos SA (Phaistos), Greece; Altec Software Development S.A. (Altec), Greece.
  • A Survey on Web Spam and Spam 2.0
International Journal of Advanced Computer Research (ISSN (print): 2249-7277, ISSN (online): 2277-7970), Volume 4, Number 2, Issue 15, June 2014

A Survey on Web Spam and Spam 2.0
Ashish Chandra, Mohammad Suaib

Abstract
In the current scenario, the web is huge, highly distributed, open in nature, and changing rapidly. The open nature of the web is the main reason for its rapid growth, but it has also imposed a challenge on information retrieval. One of the biggest challenges is spam. We focus here on a study of the different forms of web spam and its new variant called spam 2.0, existing detection methods proposed by different researchers, and challenges that are still unanswered.

Keywords
Web spam, web 2.0, web spam 2.0, search engine, search engine ranking, social media spam.

1. Introduction
The term spamming is defined by Gyongyi (2005) [1] as: any deliberate action solely to boost the ranking of a web page is known as spamming.

2. Search Engine Spamming
According to Fetterly [3], spam pages are "web pages that hold no actual information value but created to lure web searches to sites that would otherwise not visited".

2.1 Search Engine Result Ranking
Search engines rank pages in search results according to two main features: the relevancy of the page with respect to the query (dynamic ranking) and the authoritativeness of the page (static ranking). Dynamic ranking is calculated at query time. It depends on the query, the user's location, day, time, query search history, location of the page, etc. Static ranking involves query-independent features of the page to avoid spam. It is pre-computed at index time. Some among a few hundred features are: total length of the content, frequency of keyword (most frequent words),
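As a small illustration of the query-independent ("static") features the passage lists, total content length and the frequency of the most frequent keyword can be computed from a page's extracted text roughly as follows; the function and the example input are assumptions, not code from the survey.

```python
from collections import Counter
import re

def static_features(page_text: str) -> dict:
    """Compute a few query-independent features of a page's text."""
    words = re.findall(r"[a-z0-9']+", page_text.lower())
    counts = Counter(words)
    top_word, top_count = counts.most_common(1)[0] if counts else ("", 0)
    return {
        "content_length": len(page_text),                     # total length of the content
        "word_count": len(words),
        "top_keyword": top_word,                              # most frequent word
        "top_keyword_ratio": top_count / max(len(words), 1),  # high values hint at keyword stuffing
    }

# A keyword-stuffed snippet produces a conspicuously high top_keyword_ratio.
print(static_features("cheap phones cheap deals cheap cheap cheap offers"))
```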
  • Pranam Kolari
On Leveraging Social Media
Pranam Kolari, Tim Finin, Akshay Java & eBiquity folks! .. collaborations with IBM, NEC, JHU and others

SOCIAL MEDIA
"Social media describes the online technologies and practices that people use to share opinions, insights, experiences, and perspectives with each other." (Wikipedia 06)
"Social media describes the online technologies and practices that people use to share opinions, insights, experiences, and perspectives and engage with each other." (Wikipedia 07)

SOCIAL MEDIA
• Engagement protocols defined by platforms: blogs, social networks, wikis, micro-blogs
• around content types: text, audio, video, read-write Web, avatars
• instantiated by applications: Live Spaces, YouTube, Wikipedia, flickr
• enabling online communities.

SOCIAL MEDIA
• Pew (2007): 55 percent of American youth age 12 to 17 use online social networking sites
• Hitwise (February 2007): 6.5% of all Internet visits for social networking sites
• Andrew Tomkins at ICWSM 2007: Professional vs. Personal (Social) Content; 4GB/day vs. 5-10GB/day (minus songs/videos); 90% vs. 10% clicks; good ranking vs. crazy good ranking

SOCIAL MEDIA RESEARCH
Efforts best described by published papers in 3 workshops (2004, 2005, 2006) and at ICWSM 2007. A simple experiment...
Social Media (2004, 2005, 2006) vs. Web (WWW 2007): database, communities, ontology, server, analysis, ties, user, applications, moods, bloggers, databases, weblogs, topics, policies, services, blogs, weblog, personalized, blogosphere, scalable, mobile, blog networks,
  • A Co-Classification Framework for Detecting Web Spam and Spammers in Social Media Web Sites
A Co-classification Framework for Detecting Web Spam and Spammers in Social Media Web Sites
Feilong Chen, Pang-Ning Tan, and Anil K. Jain
Department of Computer Science & Engineering, Michigan State University, East Lansing, Michigan 48824
{chenfeil, ptan, jain}@cse.msu.edu

ABSTRACT
Social media are becoming increasingly popular and have attracted considerable attention from spammers. Using a sample of more than ninety thousand known spam Web sites, we found between 7% to 18% of their URLs are posted on two popular social media Web sites, digg.com and delicious.com. In this paper, we present a co-classification framework to detect Web spam and the spammers who are responsible for posting them on the social media Web sites. The rationale for our approach is that since both detection tasks are related, it would be advantageous to train them simultaneously to make use of the labeled examples in the Web spam and spammer training data.

Despite the extensive research in this area, Web spam detection remains a critically important but unsolved problem because spammers may adjust their strategies to adapt to the defense mechanisms employed against them. Social media are becoming increasingly popular and have attracted considerable attention from spammers. From a list of 94,198 spam Web sites extracted from a benchmark email spam data [9], we found 6,420 (≈ 7%) of them were posted at digg.com and 16,537 (≈ 18%) of them were posted at delicious.com. While there has been considerable research to detect spam from hyperlinked Web pages to improve search engine performance, spam detection from social