Information Systems Education Journal (ISEDJ) 9 (4), September 2011

Total Pages: 16

File Type: PDF, Size: 1020 KB

Volume 9, No. 4, September 2011. ISSN: 1545-679X

Information Systems Education Journal

Research Articles:

4 Creating and Using a Computer Networking and Systems Administration Laboratory Built Under Relaxed Financial Constraints
Michael P. Conlon, Slippery Rock University; Paul Mullins, Slippery Rock University

11 Teach or No Teach: Is Large System Education Resurging?
Aditya Sharma, North Carolina Central University; Marianne C. Murphy, North Carolina Central University

20 Assessing Blackboard: Improving Online Instructional Delivery
Adnan A. Chawdhry, California University of PA; Karen Paullet, American Public University System; Daniel Benjamin, American Public University System

27 Towards an Innovative Web-based Lab Delivery System for a Management Information Systems Course Delivery
Eric Breimer, Siena College; Jami Colter, Siena College; Robert Yoder, Siena College

37 Computer Ethics: A Slow Fade from Black and White to Shades of Gray
Theresa A. Kraft, University of Michigan – Flint; Judith Carlisle, University of Michigan – Flint

55 Exploring the Connection between Age and Strategies for Learning New Technology Related
Gabriele Meiselwitz, Towson University; Suranjan Chakraborty, Towson University

63 Selecting a Good Conference Location Based on Participants' Interest
Muhammed Miah, Southern University at New Orleans

73 Additional Support for the Information Systems Analyst Exam as a Valid Program Assessment Tool
Donald A. Carpenter, Colorado Mesa University; Johnny Snyder, Colorado Mesa University; Gayla Jo Slauson, Colorado Mesa University; Morgan K. Bridge, Colorado Mesa University

Teaching Case:

80 Solving Relational Database Problems with ORDBMS in an Advanced Database Course
Ming Wang, California State University

The Information Systems Education Journal (ISEDJ) is a double-blind peer-reviewed academic journal published by EDSIG, the Education Special Interest Group of AITP, the Association of Information Technology Professionals (Chicago, Illinois). Publishing frequency is quarterly. The first year of publication is 2003.

ISEDJ is published online (http://isedj.org) in connection with ISECON, the Information Systems Education Conference, which is also double-blind peer reviewed. Our sister publication, the Proceedings of ISECON (http://isecon.org), features all papers, panels, workshops, and presentations from the conference.

The journal acceptance review process involves a minimum of three double-blind peer reviews, in which the reviewers are not aware of the identities of the authors and the authors are not aware of the identities of the reviewers. The initial reviews happen before the conference. At that point papers are divided into award papers (top 15%), other journal papers (top 30%), unsettled papers, and non-journal papers. The unsettled papers are subjected to a second round of blind peer review to determine whether they will be accepted to the journal. Papers deemed of sufficient quality are accepted for publication in ISEDJ. Currently the target acceptance rate for the journal is about 45%.

Information Systems Education Journal is pleased to be listed in the 1st Edition of Cabell's Directory of Publishing Opportunities in Educational Technology and Library Science, in both the electronic and printed editions. Questions should be addressed to the editor at [email protected] or the publisher at [email protected].
2011 AITP Education Special Interest Group (EDSIG) Board of Directors

Alan Peslak, Penn State University, President 2011
Wendy Ceccucci, Quinnipiac University, Vice President
Tom Janicki, Univ of NC Wilmington, President 2009-2010
Scott Hunsinger, Appalachian State University, Membership Director
Michael Smith, High Point University, Secretary
Brenda McAleer, Univ of Maine Augusta, Treasurer
Michael Battig, Saint Michael's College, Director
George Nezlek, Grand Valley State University, Director
Leslie J. Waguespack Jr, Bentley University, Director
Mary Lind, North Carolina A&T St Univ, Director
Li-Jen Shannon, Sam Houston State Univ, Director
S. E. Kruck, James Madison University, JISE Editor
Kevin Jetton, Texas State University, FITE Liaison

Copyright © 2011 by the Education Special Interest Group (EDSIG) of the Association of Information Technology Professionals (AITP). Permission to make digital or hard copies of all or part of this journal for personal or classroom use is granted without fee provided that the copies are not made or distributed for profit or commercial use. All copies must bear this notice and full citation. Permission from the Editor is required to post to servers, redistribute to lists, or utilize in a for-profit or commercial use. Permission requests should be sent to Wendy Ceccucci, Editor, [email protected].

Information Systems Education Journal Editors

Wendy Ceccucci, Senior Editor, Quinnipiac University
Thomas Janicki, Publisher, Univ NC Wilmington
Don Colton, Emeritus Editor, Brigham Young University Hawaii
Nita Brooks, Associate Editor, Middle Tennessee State University
George Nezlek, Associate Editor, Grand Valley State University
Mike Smith, Associate Editor - Cases, High Point University

ISEDJ Editorial Board

Alan Abrahams, Virginia Tech
Mike Battig, Saint Michael's College
Gerald DeHondt II, Grand Valley State University
Janet Helwig, Dominican University
Mark Jones, Lock Haven University
Cynthia Martincic, Saint Vincent College
Brenda McAleer, University of Maine at Augusta
Monica Parzinger, St. Mary's University San Antonio
Doncho Petkov, Eastern Connecticut State Univ.
Samuel Sambasivam, Azusa Pacific University
Mark Segall, Metropolitan State College of Denver
Li-Jen Shannon, Sam Houston State University
Karthikeyan Umapathy, University of North Florida
Laurie Werner, Miami University
Bruce White, Quinnipiac University
Charles Woratschek, Robert Morris University
Peter Y. Wu, Robert Morris University

Computer Ethics: A Slow Fade from Black and White to Shades of Gray

Theresa A. Kraft
[email protected]

Judith Carlisle
[email protected]

Computer and Information Systems, University of Michigan – Flint
Flint, MI 48502, USA

Abstract

The expanded use of teaching case-based analysis of current events and news stories relating to computer ethics improves student engagement, encourages creativity, and fosters an active learning environment. Professional ethics standards, accreditation standards for computer curricula, ethics theories, resources for ethics on the Internet, and possible topics for ethical case analyses are presented. The motivation for teaching Cyber Ethics and a number of success strategies and recommendations for undergraduate Computer Ethics courses are provided.
Keywords: ABET Accreditation Standards, Computing Curricula 2001 (CC2001), Professional Code of Ethics, Computer Ethics, Cyber Ethics, Computer Science, Information Technology, Computer Information Systems, Pedagogy, Computers and Society.

1. INTRODUCTION

The Accreditation Board for Engineering and Technology (ABET) has nine common outcomes which apply to Computer Science (CS), Computer Information Systems (CIS), and Information Technology (IT) programs. Computer Ethics is one of these nine outcomes, and graduating students must have an understanding of professional, ethical, legal, and social issues and responsibilities (Homkes & Strikwerda, 2009).

In addition to satisfying the accreditation standard, an undergraduate CS or CIS course must engage the students and encourage analytical thinking, creative writing, and research. Students in the Engineering and Technology fields are problem solvers who typically look for a single solution (i.e., the code is working or the answer to an equation). Thus, the teaching of Ethics must broaden the student's perspectives and understanding of certain real-world scenarios, which have no absolute correct answer but rather multiple viewpoints and opinions. For example, a current-event topic such as the Chinese censorship of Microsoft and Google has the Chinese Government attempting to police and control information provided to the general public, which is contradictory to the freedom of the press and free speech values of American society (Chao, 2010).

This paper describes the course format for an undergraduate CS/CIS course taught at the University of Michigan – Flint. The paper presents the basic goals and objectives of the course, the motivation for teaching such a course, the relevant codes of ethics, ethics theories, an overview of topics covered, grading rubrics, and additional resources for teaching Computer Ethics. A number of success strategies and recommendations for teaching undergraduate Computer Ethics are also provided.

2. COURSE GOALS AND OBJECTIVES

This course introduces students to the ethical issues and controversies that comprise …

… area" or a "CS body of knowledge." They are distributed among the following 10 units:

SP1: History of computing
SP2: Social context of computing
SP3: Methods and tools of analysis
SP4: Professional and ethical responsibilities
SP5: Risks and liabilities of safety-critical systems
Recommended publications
  • Cloak and Swagger: Understanding Data Sensitivity Through the Lens of User Anonymity
Cloak and Swagger: Understanding Data Sensitivity Through the Lens of User Anonymity. Sai Teja Peddinti (Polytechnic School of Engineering, New York University, Brooklyn, NY 11201; Email: [email protected]), Aleksandra Korolova, Elie Bursztein, and Geetanjali Sampemane (Google, 1600 Amphitheatre Parkway, Mountain View, CA 94043; Email: korolova, elieb, [email protected]).

Abstract—Most of what we understand about data sensitivity is through user self-report (e.g., surveys); this paper is the first to use behavioral data to determine content sensitivity, via the clues that users give as to what information they consider private or sensitive through their use of privacy enhancing product features. We perform a large-scale analysis of user anonymity choices during their activity on Quora, a popular question-and-answer site. We identify categories of questions for which users are more likely to exercise anonymity and explore several machine learning approaches towards predicting whether a particular answer will be written anonymously. Our findings validate the viability of the proposed approach towards an automatic assessment of data sensitivity, show that data sensitivity is a nuanced measure that …

In this work we explore whether it is possible to perform a large-scale behavioral data analysis, rather than to rely on surveys and self-report, in order to understand what topics users consider sensitive. Our goal is to help online service providers design policies and develop product features that promote user engagement and safer sharing and increase users' trust in online services' privacy practices. Concretely, we perform analysis and data mining of the usage of privacy features on one of the largest question-and-answer sites, Quora [7], in order to identify topics potentially considered sensitive by its users.
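The excerpt above mentions "several machine learning approaches towards predicting whether a particular answer will be written anonymously." The paper's own code and features are not shown here; the following is only a generic text-classification baseline in that spirit, a minimal sketch assuming scikit-learn is installed and using a tiny hypothetical set of labeled answers.

```python
# Minimal sketch of the kind of classifier the excerpt describes:
# predict whether an answer text would be posted anonymously.
# Assumes scikit-learn; the toy answers and labels below are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

answers = [
    "My therapist recommended this approach for handling anxiety.",
    "Here is how I structure a resume for software engineering interviews.",
    "I went through a divorce last year and this is what helped me.",
    "Quicksort runs in O(n log n) time on average.",
]
is_anonymous = [1, 0, 1, 0]  # 1 = answer was posted anonymously (toy labels)

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(answers, is_anonymous)

# A higher predicted probability of anonymity hints at a more sensitive topic.
new_answer = ["I was recently diagnosed and I am not sure who to tell."]
print(model.predict_proba(new_answer)[0][1])
```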
  • Cloud Computing and Related Laws
Cloud Computing and Related Laws. Law Offices of Salar Atrizadeh.

Online Privacy: In general, privacy falls under two categories: 1. Corporate privacy; 2. Personal privacy.

Corporate Privacy: It concerns the protection of corporate data from retrieval or interception by unauthorized parties. Security is important for protection of trade secrets, proprietary information, and privileged communications. The failure to maintain confidentiality can result in a loss of "trade secret" status (see Civil Code §§ 3426 et seq.). The recent trends in outsourcing have increased the risks associated with "economic espionage." In fact, manufacturers should be cautious when transferring proprietary technology to overseas partners because foreign governments sponsor theft (see 18 U.S.C. §§ 1831 et seq., e.g., economic espionage, theft of trade secrets).

Helpful Policies: Identify and label confidential information. Restrict access to confidential information. Use encryption (e.g., truecrypt.org, axantum.com). Use a firewall and secure username/password. Use software that detects trade secret theft (e.g., safe-corp.biz). Include warnings in privileged correspondence (e.g., "this email contains privileged communications"). Provide computers without hard drives and prohibit use of removable storage (e.g., flash drives). Audit employee computers. Prohibit and/or monitor external web-based email services. Execute Confidentiality and Non-disclosure Agreements. Execute Computer-Use Policies.

Personal Privacy: Constitution: the federal Fourth Amendment protects against unreasonable searches and seizures, and the California Constitution, under Art. I, § 1, recognizes a right to individual privacy. Federal computer crimes: Electronic Communications Privacy Act (18 U.S.C. §§ 2510 et seq.); Privacy Act (5 U.S.C. § 552a); Computer Fraud and Abuse Act (18 U.S.C. …)
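Among the excerpt's "helpful policies" is the advice to use encryption, pointing to desktop tools such as those at truecrypt.org and axantum.com. As a rough code-level illustration of the same idea, not a description of those tools, the sketch below encrypts a confidential file with a symmetric key; it assumes Python with the third-party cryptography package, and the file name is hypothetical.

```python
# Minimal sketch: protect a confidential document with symmetric encryption.
# Assumes the third-party "cryptography" package (pip install cryptography).
from cryptography.fernet import Fernet

# Generate a key once and store it separately from the encrypted file.
key = Fernet.generate_key()
fernet = Fernet(key)

# Hypothetical confidential document to protect.
with open("confidential_memo.txt", "rb") as src:
    plaintext = src.read()

# Write the ciphertext; the plaintext original can then be securely deleted.
with open("confidential_memo.txt.enc", "wb") as dst:
    dst.write(fernet.encrypt(plaintext))

# Decryption requires the same key.
with open("confidential_memo.txt.enc", "rb") as enc:
    assert fernet.decrypt(enc.read()) == plaintext
```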
  • Statement on Internet Search Engines by the Spanish Data Protection Agency
Statement on Internet Search Engines. Spanish Data Protection Agency, 1st December 2007.

1. The importance of search engines in the information society

Technology developments have opened up new possibilities of creating and accessing information on the Internet, and this situation requires that we consider the repercussions of technology on the rights of individuals; in principle, such repercussions are neutral. The Spanish Data Protection Agency (AEPD) has taken the initiative of analysing the privacy policies of the largest global companies providing search engine services on the Internet. To this end it collected information from Google, Microsoft and Yahoo!, said information being completed via meetings with the global privacy controllers of these corporations in Europe and the United States. We must underscore the huge importance, owing to the volume of data that is processed and the characteristics of that processing, of search engine services on the Internet (hereinafter, search engines), whose main function is to provide lists of results relating to a search, such as addresses and files stored on web servers, by entering key words, thus arranging all of the information available on the Internet and making it more accessible. In addition, search engines usually provide customised services, allowing people to register by giving an email address and a password. In Spain, the fact that these are sensitive issues was shown by the recent appearance of the Director of the AEPD before the Constitutional Commission of the Congress on 28th …
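The statement characterizes a search engine's main function as returning lists of results for entered keywords by arranging information stored on web servers. A toy inverted index illustrates that keyword-to-results mapping; this is an illustrative sketch only, with made-up pages and URLs, not a description of how any particular engine is implemented.

```python
# Toy inverted index: map each keyword to the set of pages containing it.
# The pages and URLs below are made up for illustration.
from collections import defaultdict

pages = {
    "http://example.org/a": "data protection agency issues search engine statement",
    "http://example.org/b": "search engines provide customised services by email",
    "http://example.org/c": "privacy policies of the largest global companies",
}

index = defaultdict(set)
for url, text in pages.items():
    for word in text.lower().split():
        index[word].add(url)

def search(query):
    """Return pages containing every keyword in the query."""
    hits = [index.get(word, set()) for word in query.lower().split()]
    return set.intersection(*hits) if hits else set()

print(search("search engine"))  # only pages containing both keywords
```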
  • Privacy Seminar 2
Privacy Seminar, 2. Privacy: an overview. Jaap-Henk Hoepman, Digital Security (DS), Radboud University Nijmegen, the Netherlands. @xotoxot // [email protected] // www.cs.ru.nl/~jhh. 14-2-2014. (Slide artwork: Dan Perjovschi, 2007.)

Privacy dimensions: the right to be let alone; relational privacy; informational privacy / self-determination; corporeal privacy; locational/spatial privacy; privacy spheres. (Discussion prompt: what is privacy according to you?)

Don't confuse these concepts: security, privacy, and data protection.

Seven types of privacy: privacy of the person; behaviour and action; personal communication; data and image; thoughts and feelings; location and space; and association (including group privacy). (Finn, R.L., Wright, D., and Friedewald, M.: Seven types of privacy. CPDP 2012; Clarke, R.: Introduction to Dataveillance and Information Privacy, and Definitions of Terms, 1997.)

Different definitions; contextual integrity. The right to be let alone [Warren & Brandeis, 1890]. Informational self-determination: the right to determine for yourself when, how and to what extent information about you is communicated to others.
  • Thesis.Pdf (2.311Mb)
Faculty of Humanities, Social Sciences and Education. Google or privacy, the inevitable trade-off. Haydar Jasem Mohammad. Master's thesis in Media and documentation science, September 2019.

Table of Contents:
1 Introduction
  1.1 The problem
  1.2 Research questions
  1.3 Keywords
2 Theoretical background
  2.1 Google in brief
  2.2 Google through the lens of exploitation theory
    2.2.1 Exploitation
    2.2.2 The commodification Google's prosumers
    2.2.3 Google's surveillance economy
    2.2.4 Behaviour prediction
    2.2.5 Google's 'playbor' …
  • The Problem of the Human Flesh Search Engine
ONLINE PRIVACY AND ONLINE SPEECH: THE PROBLEM OF THE HUMAN FLESH SEARCH ENGINE
Weiwei Shen*

I. INTRODUCTION
II. THE HUMAN FLESH SEARCH ENGINE AND THE WANG FEI CASE
  A. Why the Human Flesh Search Engine Is Popular in China
  B. The Wang Fei Case
III. BALANCING PRIVACY AND FREE SPEECH
  A. Privacy in the Digital Age
  B. Speech: Moving from the Offline World to the Online World
  C. Speech about Private Individuals and of Private Concern vis-a-vis Speech about Public Officials and of Public Concern
IV. REPERCUSSIONS AND REMEDIES IN AMERICAN LAW
  A. Various Repercussions Against Victims in the Human Flesh Search Engine
  B. Torts Remedies in American Law
V. POLICY IMPLICATIONS FOR REGULATING THE HUMAN FLESH SEARCH ENGINE
VI. CONCLUSION
  • Gratton Eloise 2012 These.Pdf
Université de Montréal and Université Panthéon-Assas Paris II

Redefining Personal Information in the Context of the Internet, by Éloïse Gratton, Faculté de droit. Thesis presented to the Faculté des études supérieures for the degree of Doctor of Law (LL.D.) from the Faculty of Law of the Université de Montréal and Doctor of Law from the Université Panthéon-Assas Paris II (École doctorale Georges Vedel, Droit public interne, science administrative et science politique). October 2012. © Éloïse Gratton, 2012.

This thesis, entitled "Redefining Personal Information in the Context of the Internet" and presented on 30 October 2012 by Éloïse Gratton, was evaluated by a jury composed of: Vincent Gautrais, Full Professor, Université de Montréal, research supervisor; Danièle Bourcier, head of the "Droit, gouvernance et technologies" group at CERSA, research supervisor; Karim Benyekhlef, Full Professor, Université de Montréal, jury member; Gilles Guglielmi, Professor, Université Panthéon-Assas Paris 2, jury member; Ian Kerr, Full Professor, University of Ottawa, external examiner; Francis Rousseaux, Professor, Université de Reims, external examiner.
  • The Privacy Panic Cycle: a Guide to Public Fears About New Technologies
The Privacy Panic Cycle: A Guide to Public Fears About New Technologies. By Daniel Castro and Alan McQuinn, September 2015.

Innovative new technologies often arrive on waves of breathless marketing hype. They are frequently touted as "disruptive!", "revolutionary!", or "game-changing!" before businesses and consumers actually put them to practical use. The research and advisory firm Gartner has dubbed this phenomenon the "hype cycle." It is so common that many are conditioned to see right through it. But there is a corollary to the hype cycle for new technologies that is less well understood and far more pernicious. It is the cycle of panic that occurs when privacy advocates make outsized claims about the privacy risks associated with new technologies. Those claims then filter through the news media to policymakers and the public, causing frenzies of consternation before cooler heads prevail, people come to understand and appreciate innovative new products and services, and everyone moves on. Call it the "privacy panic cycle." Dire warnings about the privacy risks associated with new technologies routinely fail to materialize, yet because memories fade, the cycle of hysteria continues to repeat itself.

Unfortunately, the privacy panic cycle can have a detrimental effect on innovation. First, by alarming consumers, unwarranted privacy concerns can slow adoption of beneficial new technologies. For example, 7 out of 10 consumers said they would not use Google Glass, the now discontinued wearable head-mounted device, because of privacy concerns. Second, overwrought privacy fears can lead to ill-conceived policy responses that either purposely hinder or fail to adequately promote potentially beneficial technologies.
  • The “Right to Be Forgotten” and Search Engine Liability
BRUSSELS PRIVACY HUB WORKING PAPER, VOL. 2, N° 8, DECEMBER 2016

THE "RIGHT TO BE FORGOTTEN" AND SEARCH ENGINE LIABILITY
by Hiroshi Miyashita

Abstract: This paper aims to conduct a comparative study on the right to be forgotten by analyzing the different approaches to intermediary liability. In the EU, the Google Spain case in the Court of Justice clarified the liability of the search engine on the ground of the data controller's responsibility to delist certain search results in light of the fundamental rights of privacy and data protection. On the contrary, in the U.S., search engine liability is broadly exempted under the Communications Decency Act in terms of free speech doctrine. In Japan, intermediary liability is not completely determined, as the right to be forgotten cases are divided on the point of search engine liability among judicial decisions. The legal framework of intermediary liability varies in context from privacy to e-commerce and intellectual property. In the wake of the right to be forgotten case in the EU, it is important to streamline the different legal models of intermediary liability if one desires to fix the reach of the effect of the right to be forgotten. This paper analyzes how the models of search engine liability are now in flux across borders, but should be reconciled through an appropriate balance between privacy and free speech in the right to be forgotten cases.

Keywords: Privacy, Data Protection, Right to be Forgotten, Search Engine, Intermediary Liability
  • How to Secure Windows and Your Privacy -- with Free Software
How to Secure Windows and Your Privacy -- with Free Software. An Easy Guide for the Windows User. By Howard Fosdick, Fosdick Consulting Inc. © 2008 July 26, Version 2.1.

Distribution: You may freely reproduce and distribute this guide however you like – but you may not change its contents in any way. This product is distributed at no cost under the terms of the Open Publication License with License Option A -- "Distribution of modified versions of this document is prohibited without the explicit permission of the copyright holder."

Feedback: Please send recommendations for improving this guide to the author at email address "ContactFCI" at the domain name "sbcglobal.net".

Disclaimer: This paper is provided without warranty. Fosdick Consulting Inc. and the author accept no responsibility for any use of the data contained herein.

Trademarks: All trademarks included in this document are the property of their respective owners.

About the Author: Howard Fosdick is an independent consultant who works hands-on with databases and operating systems. He's written many articles, presented at conferences, founded software users groups, and invented concepts like hype curves and open consulting.

Acknowledgments: Thank you to the reviewers without whose expert feedback this guide could not have been developed: Bill Backs, Huw Collingbourne, Rich Kurtz, Priscilla Polk, Janet Rizner, and others who prefer anonymity. Thank you also to the Association of PC Users (APCU), Better Software Association, BitWise Magazine, IBM Database Magazine, OS News, Privacy …
  • Onion Privacy and Anonymous Browser Apk
The Onion Browser is designed to protect your privacy and let you browse the internet quickly, safely, and anonymously. It automatically blocks a wide range of online trackers, and it is easy to erase your history, passwords, and cookies, so you are not followed by things like unwanted ads. Browsing the Internet is faster and more anonymous. The Onion Browser can be used with an Orbot or Tor connection for more security and greater protection; to visit .onion sites, you must use a browser with an Orbot or Tor connection. Your privacy is important to us, and we want to protect it even when you surf the web. Use the Onion Browser: faster, more secure, and anonymous.

An anonymous private browser for Android assigns users an anonymous, encrypted IP address that makes them virtually untraceable and enhances overall user privacy and security. All traffic in the app is routed through an anonymized user network, allowing users to safely browse the Internet and the deep web without disclosing their virtual identity. An anonymous private browser prevents tracking, logging of users' actions, and collection of statistics. Extended ad-blocking and anti-tracking support also protects against and limits data leakage and user profiling. It aims to be the safest browser for Android by automatically clearing cookies and history, encrypting all inbound and outbound data, and assigning you a new encrypted IP address that cannot be traced back to your phone, provider, or operator.
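The description says traffic can be routed through an Orbot or Tor connection so that requests cannot be traced back to the device. As a rough illustration of that routing idea, and not the app's actual implementation, the sketch below sends an HTTP request through a local Tor SOCKS proxy; it assumes Tor or Orbot is already listening on 127.0.0.1:9050, that the requests package with SOCKS support is installed, and the check URL is just one example of an IP-echo service.

```python
# Illustrative sketch: route an HTTP request through a local Tor SOCKS proxy.
# Assumes Tor/Orbot exposes a SOCKS5 proxy on 127.0.0.1:9050 and that
# requests has SOCKS support installed (pip install "requests[socks]").
import requests

TOR_PROXY = "socks5h://127.0.0.1:9050"  # socks5h also resolves DNS through Tor
proxies = {"http": TOR_PROXY, "https": TOR_PROXY}

# The apparent client IP should now be a Tor exit node, not the device's own
# address. The check URL is one example endpoint; any IP-echo service works.
response = requests.get("https://check.torproject.org/api/ip",
                        proxies=proxies, timeout=30)
print(response.json())
```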
  • EFF Comments on the Draft Recommendation on the Protection of Human Rights with Regard to Search Engines
    EFF comments on the Draft Recommendation on the protection of human rights with regard to search engines Committee of Experts on New Media (MC-NM) MC-NM(2010)004_en Katitza Rodriguez, EFF International Rights Director, [email protected] The Electronic Frontier Foundation (EFF) is grateful for the opportunity to submit comments to the Council of Europe Committee of Experts on New Media on the currently available version of the Draft Recommendation on the protection of human rights with regard to search engines. EFF is an international civil society non-governmental organization with more than 14,000 members worldwide, dedicated to the protection of citizens’ online privacy and freedom of expression rights. EFF engages in strategic litigation in the United States and works in a range of international and national policy venues to promote balanced laws that protect human rights, foster innovation and empower consumers. EFF is located in San Francisco, California, and has members in 67 countries throughout the world. EFF has over 4,300 members in the EU. In relation to the proposed draft recommendation, EFF respectfully asks the Council of Europe to revise its guidelines and recommendations to ensure that search engines will protect privacy vis-à-vis the government, foster transparency on search records requests, and increase due process protections. We also ask the Council of Europe to ensure that freedom of expression rights are respected by search engines. I. The Council of Europe needs to ensure privacy protections are not curtailed by search engines At a time when individuals regularly turn to search engines to find information on the Internet, search privacy is of paramount importance.