Programming Case: A Methodology for Programmatic Web Data Extraction

Journal of Technology Research, Volume 7

John N. Dyer
Georgia Southern University

ABSTRACT

Web scraping is a programmatic technique for extracting data from websites, using software to simulate human navigation of webpages and automatically extract the data. While many websites provide web services allowing users to consume their services for data transfer, other websites provide no such service(s), and it is incumbent on the user to write or use existing software to acquire the data. The purpose of this paper is to provide a methodology for developing a relatively simple program, using the Microsoft Excel Web Query tool and Visual Basic for Applications (VBA), that programmatically extracts webpage data that are not readily transferable or available in other electronic forms. The case presents an overview of web scraping with an application to extracting historical stock price data from Yahoo's Finance® website. The case is suitable for students who have completed an object-oriented programming course, and it further exposes students to using Excel and VBA, along with knowledge of basic webpage structure, to harvest data from the web. It is hoped that this paper can be used as a teaching and learning tool, as well as a basic template for academicians, students, and practitioners who need to consume website data when data extraction web services are not readily available. The paper can also add value to students' programming experience in the context of programming for a purpose.

Keywords: Data Extraction, Web Scraping, Web Query, Web Services

Copyright statement: Authors retain the copyright to the manuscripts published in AABRI journals. Please see the AABRI Copyright Policy at http://www.aabri.com/copyright.html

INTRODUCTION

Increasingly, many individuals and organizations need to extract massive amounts of data from the web. The basic technique for extracting the data is web scraping, which can be loosely defined as using a computer program to extract data from a website. Web scraping is commonly used to facilitate online price comparisons, contact scraping, online product catalog scraping, weather data monitoring, economic/demographic/statistical data extraction, and web mashups, among other uses. Web scraping is also a subset of People-Oriented Programming, which empowers individuals to program web-based, self-fashioned tools that ultimately suit the user's own needs (International Journal of People-Oriented Programming (IJPOP), n.d.).

Oftentimes a company releases its application program interface (API) to the public so that software developers can design products powered by its service for extracting data (Roos, 2007). In many cases an API doesn't exist, and developers must write their own. Web scraping is commonly used either to extract data from a legacy system (which has no other mechanism to transfer data) or to extract data from a website that does not provide a more convenient API (Data scraping, 2015). Since most webpages are designed for human end-users, navigation and data extraction are not necessarily easily automated. As such, web scraping is typically considered a "last resort" tool, with high programming and processing overhead.
Web scraping focuses on acquiring HTML-formatted data from a webpage and storing the data in a variety of formats, including a text file, a worksheet, or a database. Since websites are built using HTML or XHTML, web scraping utilizes software to simulate human exploration and extraction of data, pulling the data directly out of the HTML. The program then goes through all available pages and collects data and images as a human would do manually using mouse clicks and copy-and-paste. As such, any content that can be viewed on a webpage, or that is contained in the source code, can be scraped. It is for this reason that web scraping tools programmatically automate data extraction/acquisition from a website.

The focus of this case study is on developing a methodology for programmatic web data extraction when an API or other web service is not available. Note that although there are software programs and web-browser add-ins that facilitate web scraping (both paid-subscription and freeware), this paper illustrates writing one's own program. A quick Internet search will reveal many of the available web scraping programs, as well as a highly rated Google Chrome browser extension named Web Scraper® (Web Scraper, n.d.) and a program named Data Toolbar® (Web Data Extraction Software Made Simple, n.d.). A good overview of web scraping can be found in Brody (2012), while a good instructional resource is Brody (2013).

To visualize web scraping, consider an online product catalog in which a search within a certain category of products returns many items, but the catalog displays only a subset of the items per webpage (usually in an HTML table element). That is, a single webpage may display 20, 50, or 100 items per page, with paging links allowing navigation across all subsequent webpages. Such is the case with auction sites such as eBay®, where a search may return 50 items per page across multiple webpages. Although many websites allow the user to choose a fixed number of items per page, few offer a single-page view of unlimited size.

Now consider the task of downloading the data for each item across each webpage. If a single-page view of all the items were available, the task would require only a few keystrokes: select the desired data, copy the data, and paste the data into a document file or worksheet. But it is easy to imagine a website in which thousands of items are displayed across many webpages, so that one is limited to a small subset in each webpage view. Without some type of automation, one would have to navigate through each webpage and linked page, selecting, copying, and pasting each webpage's contents. This is obviously not a practical approach, especially if one wants to acquire the complete contents of a large website on a frequent basis. Fortunately, if a website is structured appropriately, Excel contains all the tools necessary to automate the process of paging through and acquiring all the desired data relatively quickly and efficiently; the paging loop sketched below shows the general shape of such automation. An advantage of using Excel is its relative ease of use and a high degree of familiarity among many business professionals and programmers, and relative familiarity among students. Excel is an excellent tool for analyzing data, including charts, sorting, filtering, and data modeling, among many other uses.
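To make the paging idea concrete, the following VBA sketch shows such an automation loop. It assumes a hypothetical catalog site that exposes the page number as a "?page=n" query-string parameter and presents each page's items in the first HTML table; the URL, parameter name, page count, and table index are all illustrative and would need to be discovered for a real site. (The Web Query objects it uses are covered in the next section.)

Sub ScrapeAllPages()
    ' Paging-loop sketch: visit every page of a multi-page listing by
    ' rewriting the page number in the URL and importing one HTML table
    ' per page, appending each page's rows below the previous page's.
    ' ASSUMPTIONS: the catalog URL and its "?page=n" parameter are
    ' hypothetical; the page count and table index "1" must be
    ' discovered by inspecting the real site.
    Dim page As Long
    Dim nextRow As Long
    Const LAST_PAGE As Long = 40   ' assumed page count, found by inspection

    nextRow = 1
    For page = 1 To LAST_PAGE
        With ActiveSheet.QueryTables.Add( _
                Connection:="URL;http://www.example.com/catalog?page=" & page, _
                Destination:=ActiveSheet.Cells(nextRow, 1))
            .WebSelectionType = xlSpecifiedTables
            .WebTables = "1"                     ' index of the target table
            .WebFormatting = xlWebFormattingNone ' plain text, no HTML styling
            .BackgroundQuery = False             ' block until the page loads
            .Refresh
        End With
        ' Continue the next import just below the last filled row
        nextRow = ActiveSheet.Cells(ActiveSheet.Rows.Count, 1).End(xlUp).Row + 1
    Next page
End Sub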
Additionally, Excel has VBA programming capabilities that allow one to use a subset of Visual Basic to automate and manipulate Excel and other Microsoft applications, including access to COM and ActiveX objects, as well as a multitude of built-in functions.

This case describes the methodology for programmatically extracting thousands of historical stock price records spanning hundreds of webpages from Yahoo's Finance website using the Excel Web Query tool. The stock price example is used because the website structure accommodates use of Excel in the manner described, and the task is very similar to extracting data from numerous other structurally similar websites. Additionally, Yahoo has no express policies preventing programmatic data extraction. Section 2 describes the manual technique for extracting a single page of records, while Section 6 describes the fully automated programmatic approach returning thousands of records across hundreds of webpages. Section 3 discusses webpage structure, which is essential to the automation process, and Section 4 discusses discovery in the context of Yahoo's Finance webpage structure. Section 5 provides the overview of Excel and VBA required to automate the data extraction. One should note that the Yahoo Finance historical-prices webpage offers a downloadable CSV file containing all the data over the specified period; nevertheless, this case illustrates writing a program for situations in which no such file or other data extraction technology is readily available.

EXTRACTING DATA USING THE EXCEL WEB QUERY TOOL

The Excel Web Query (WQ) tool facilitates bringing data from a website into an Excel worksheet. Web queries are an easy, built-in way to bring data into Excel from the web. The WQ tool allows the user to point a web query at an HTML document that resides on a web server and pull part or all of the contents into a worksheet. The WQ tool can also retrieve refreshable data stored on the Internet, such as a single table, multiple tables, or all of the text on a webpage (Import external data from a complex web site into Excel, n.d.). The tool works by discovering HTML tables on the webpage and allowing the user to select the table(s) containing the desired data. In its simplest deployment, one initiates a new web query in Excel and enters a uniform resource locator (URL) into the WQ address field. The WQ tool navigates to the webpage and displays an icon image beside each HTML table in the webpage. The user can select one or more tables by clicking the icon images, and the text from all selected tables is downloaded and displayed in the Excel worksheet. A more extensive description of the WQ tool is available in Rice (2004). As a quick example, we will use a web query to download one page of historical stock price data for Wal-Mart® from Yahoo Finance.
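The same one-page download can also be scripted. The macro below is a minimal sketch of a web query in VBA, assuming an illustrative Yahoo Finance historical-prices URL and a table index of "1"; the live page's actual URL, query parameters, and table position must be verified by inspection (discovery of these values is the subject of Section 4).

Sub ImportOnePage()
    ' Minimal web-query sketch: pull one HTML table from a webpage into
    ' the active worksheet via the QueryTable object model.
    ' ASSUMPTIONS: the URL and the table index "1" are illustrative;
    ' inspect the live page to find the real values.
    With ActiveSheet.QueryTables.Add( _
            Connection:="URL;http://finance.yahoo.com/q/hp?s=WMT", _
            Destination:=ActiveSheet.Range("A1"))
        .WebSelectionType = xlSpecifiedTables   ' import selected tables only
        .WebTables = "1"                        ' index of the target table
        .WebFormatting = xlWebFormattingNone    ' plain text, no HTML styling
        .BackgroundQuery = False                ' wait for the query to finish
        .Refresh
    End With
End Sub

Running the macro is equivalent to selecting a single table icon in the manual WQ dialog described above: the chosen table's text lands in the worksheet beginning at cell A1.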