PREFACE

Every day, innumerable technologies are invented and developed all over the world in many fields. At the same time, many technologies have also ended in failure. One who is born has to die one day; yet during this short span of life, the thirst for new technologies and developments has not been quenched. As a result, many of the latest technologies have been introduced.

This magazine, "INFOLINE", aims to provide basic and necessary information about the latest technologies and to create awareness among its readers. Comments and valuable suggestions for improvement from students, teachers and friends are warmly welcomed and will be gratefully acknowledged.

Infoline Team

ACKNOWLEDGEMENT

We wish to thank Thiru A. Venkatachalam B.Sc., Correspondent, Kongu Arts and Science College, Erode, and our Management for their support in publishing this magazine. Dr. N. Raman M.B.A., M.Com., M.Phil., B.Ed., PGDCA., Ph.D., Principal, Kongu Arts and Science College, Erode, has provided considerable support to us during this effort. We proudly thank our Chief Editor, Staff Advisor, Staff Members and the students of the Department of Computer Technology and Information Technology for their guidance and suggestions in completing this magazine.

INFOLINE TECHNOLOGY NAVIGATOR

Executive Committee

Chief Patron : Thiru A.Venkatachalam B.Sc.,

Patron : Dr. N.Raman M.B.A., M.Com., M.Phil., B.Ed., PGDCA.,Ph.D.,

Editor In Chief : S.Muruganantham M.Sc., M.Phil.,

Staff Advisor:

M.G.Annapoorani M.Sc.,

Assistant Professor, Department of CT & IT.

Staff Editor:

C.Indrani M.C.A., M.Phil.,

Assistant Professor, Department of CT & IT.

Student Editors:

Ramya.R III-B.Sc(CT)

Rameshkumar.R III-B.Sc(CT)

Ramya.B III-B.Sc(CT)

Kasthuri.H III-B.Sc(IT)

Kiruthika.S.M III-B.Sc(IT)

Organizing Members:

Senthilkumar.M II-B.Sc(CT)

Sathish.K II-B.Sc(CT)

Saranya.K II-B.Sc(CT)

Sasikumar.S II-B.Sc(CT)

Rahul Babu.B II-B.Sc(IT)

Sathiya.P II-B.Sc(IT)

Senthil kumar.V II-B.Sc(IT)

Shanmugam.P II-B.Sc(IT)

CONTENTS

Preface

Acknowledgement

Executive Committee

FREE SPACE OPTICAL COMMUNICATION

PON TOPOLOGIES

TURANOR PLANETSOLAR

WEB SEARCH ENGINE

INTEL PUMPS $30 MILLION INTO CLOUD'S FUTURE

GRID COMPUTING

WINDOWS OS SECURITY

WINDOWS HOME SERVER 2011

ARTICLES

Free Space Optical Communication

[Figure: An 8-beam free space optics laser link, rated for 1 Gbit/s at a distance of approximately 2 km. The receptor is the large disc in the middle; the transmitters are the smaller ones. To the top and right side is a monocular for assisting the alignment of the two heads.]

Free-space optical communication (FSO) is an optical communication technology that uses light propagating in free space to transmit data for telecommunications or computer networking. "Free space" means air, outer space, vacuum, or something similar. This contrasts with using solids such as optical fiber cable or an optical transmission line. The technology is useful where physical connections are impractical due to high costs or other considerations.

History

Optical communications, in various forms, have been used for thousands of years. The Ancient Greeks polished their shields to send signals during battle. In the modern era, semaphores and solar telegraphs called heliographs were developed, using coded signals to communicate with their recipients. In 1880 Alexander Graham Bell and his assistant Charles Sumner Tainter created the photophone at Bell's newly established Volta Laboratory in Washington, DC. Bell considered it his most important invention. The device allowed for the transmission of sound on a beam of light. On June 3, 1880, Bell conducted the world's first wireless telephone transmission between two buildings, some 213 meters apart. The technology's first practical use came in military communication systems many decades later. Carl Zeiss Jena developed the Lichtsprechgeraet 80 (direct translation: light speaking device), which the German army used in their World War II anti-aircraft defense units. The invention of lasers in the 1960s revolutionized free space optics. Military organizations were particularly interested and boosted their development. However, the technology lost market momentum when the installation of optical fiber networks for civilian uses was at its peak.

Usage and technologies

Free-space point-to-point optical links can be implemented using infrared laser light, although low-data-rate communication over short distances is possible using LEDs. Infrared Data Association (IrDA) technology is a very simple form of free-space optical communication. Many simple and inexpensive consumer remote controls use low-speed communication over infrared (IR) light; these are known as consumer IR technologies. Free space optics is additionally used for communications between spacecraft. The maximum range for terrestrial links is on the order of 2 to 3 km (1.2 to 1.9 mi), but the stability and quality of the link are highly dependent on atmospheric factors such as rain, fog, dust and heat. Amateur radio operators have achieved significantly farther distances using incoherent sources of light from high-intensity LEDs; one reported 173 miles (278 km) in 2007. However, physical limitations of the equipment used limited bandwidths to about 4 kHz. The high sensitivities required of the detector to cover such distances made the internal capacitance of the photodiode a dominant factor in the high-impedance amplifier which followed it, naturally forming a low-pass filter with a cut-off frequency in the 4 kHz range.

In outer space, the communication range of free-space optical communication is currently on the order of several thousand kilometers, but it has the potential to bridge interplanetary distances of millions of kilometers, using optical telescopes as beam expanders. The distance records for optical communications involve detection and emission of laser light by space probes. A two-way distance record for communication was set by the Mercury laser altimeter instrument aboard the MESSENGER spacecraft. This infrared diode neodymium laser, designed as a laser altimeter for a Mercury orbit mission, was able to communicate across a distance of 15 million miles (24 million km) as the craft neared Earth on a fly-by in May 2005. The previous record had been set with a one-way detection of laser light from Earth by the Galileo probe, as two ground-based lasers were seen from 6 million km by the outbound probe in 1992. RONJA is a free implementation of FSO using high-intensity LEDs.

Secure free-space optical communications have been proposed using a laser N-slit interferometer, where the laser signal takes the form of an interferometric pattern. Any attempt to intercept the signal causes the collapse of the interferometric pattern. This technique has been demonstrated to work over propagation distances of practical interest and, in principle, it could be applied over large distances in space.

Applications

Typical scenarios for use are:

• LAN-to-LAN connections on campuses at Fast Ethernet or Gigabit Ethernet speeds
• LAN-to-LAN connections in a city, a metropolitan area network
• To cross a public road or other barriers which the sender and receiver do not own
• Speedy service delivery of high-bandwidth access to optical fiber networks
• Converged voice-data connections
• Temporary network installation (for events or other purposes)
• Re-establishing a high-speed connection quickly (disaster recovery)
• As an alternative or upgrade add-on to existing wireless technologies
• As a safety add-on for important fiber connections (redundancy)
• For communications between spacecraft, including elements of a satellite constellation
• For inter- and intra-chip communication

The light beam can be very narrow, which makes FSO hard to intercept, improving security. In any case, it is comparatively easy to encrypt any data traveling across the FSO connection for additional security. FSO provides vastly improved electromagnetic interference (EMI) behavior compared to using microwaves.

Advantages

• Ease of deployment
• License-free long-range operation (in contrast with radio communication)
• High bit rates
• Low bit error rates
• Immunity to electromagnetic interference
• Full duplex operation
• Protocol transparency
• Very secure due to the high directionality and narrowness of the beam(s)
• No Fresnel zone necessary

Disadvantages

For terrestrial applications, the principal limiting factors are:

• Beam dispersion
• Atmospheric absorption
• Rain
• Fog (10 to ~100 dB/km attenuation)
• Snow
• Scintillation
• Background light
• Shadowing
• Pointing stability in wind
• Pollution / smog
• If the sun goes exactly behind the transmitter, it can swamp the signal.

These factors cause an attenuated receiver signal and lead to a higher bit error ratio (BER). To overcome these issues, vendors have found solutions such as multi-beam or multi-path architectures, which use more than one sender and more than one receiver. Some state-of-the-art devices also have a larger fade margin (extra power, reserved for rain, smog and fog).
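To make the fade margin and fog attenuation figures above concrete, here is a small illustrative calculation. The margin and attenuation values are assumptions chosen only for demonstration; they are not specifications of any particular product, and real link budgets also account for geometric and pointing losses.

```python
# Illustrative FSO link-range estimate (assumed figures, not from any datasheet).
# A link stays usable while the loss accumulated along the path
# remains below the link's fade margin.

def max_range_km(fade_margin_db: float, attenuation_db_per_km: float) -> float:
    """Distance at which the accumulated atmospheric loss equals the fade margin."""
    return fade_margin_db / attenuation_db_per_km

if __name__ == "__main__":
    fade_margin_db = 30.0                      # assumed spare power budget
    for label, alpha in [("clear air", 1.0),   # assumed attenuation in dB/km
                         ("light fog", 10.0),
                         ("dense fog", 100.0)]:
        print(f"{label:10s}: about {max_range_km(fade_margin_db, alpha):6.2f} km")
```

With these assumed numbers, hardware that could cover tens of kilometres in clear air drops to a few hundred metres in dense fog, which matches the article's point that weather, not raw distance, usually sets the practical limit.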

To keep the environment eye-safe, good FSO systems have a limited laser power density and support laser classes 1 or 1M. Atmospheric and fog attenuation, which are exponential in nature, limit the practical range of FSO devices to several kilometres.

By
SARANYA.G
III–B.Sc(IT)

PON Topologies

There are several topologies suitable for the access network: tree, ring, or bus. A PON can also be deployed in a redundant configuration as a double ring or double tree; or redundancy may be added only to a part of the PON, say the trunk of the tree. For the rest of this article, we will focus our attention on the tree topology; however, most of the conclusions made are equally relevant to other topologies.

All transmissions in a PON are performed between an Optical Line Terminal (OLT) and Optical Network Units (ONUs). Therefore, in the downstream direction (from OLT to ONUs), a PON is a point-to-multipoint (P2MP) network, and in the upstream direction it is a multipoint-to-point (MP2P) network. The OLT resides in the local exchange (central office), connecting the optical access network to an IP, ATM, or SONET backbone. The ONU is located either at the curb (FTTC solution) or at the end-user location (FTTH, FTTB solutions), and provides broadband voice, data, and video services.

The advantages of using PONs in subscriber access networks are numerous:

1. PONs allow for long reach between central offices and customer premises, operating at distances over 20 km.
2. PONs minimize fiber deployment in both the local exchange office and the local loop.
3. PONs provide higher bandwidth due to deeper fiber penetration, offering gigabit-per-second solutions.
4. Operating in the downstream as a broadcast network, PONs allow for video broadcasting as either IP video or analog video using a separate wavelength overlay.
5. PONs eliminate the necessity to install active multiplexers at splitting locations, thus relieving network operators.
6. Being optically transparent end to end, PONs allow upgrades to higher bit rates or additional wavelengths.

Multiple Access

One possible way of separating the channels is to use wavelength division multiplexing (WDM), in which each ONU operates at a different wavelength. While a simple solution, it remains cost-prohibitive for an access network. A WDM solution would require either a tunable receiver or a receiver array at the OLT to receive multiple channels. An even more serious problem for network operators would be wavelength-specific ONU inventory: instead of having just one type of ONU, there would be multiple types of ONUs based on their laser wavelength. It would also be more problematic for an unqualified user to replace a defective ONU. Using tunable lasers in ONUs is too expensive at the current state of technology. For these reasons, a WDM PON network is not an attractive solution in today's environment.

By
M.VIGNESH
III-B.Sc(IT)
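As a footnote to the article above: in the downstream direction every ONU receives the same broadcast, and each unit keeps only the traffic addressed to it. The sketch below is a toy model of that behaviour; the frame format and ONU identifiers are invented for illustration and are not taken from any PON standard.

```python
# Toy model of PON downstream broadcast-and-filter behaviour (illustrative only;
# real PONs use standardised framing and encryption, none of which is modelled here).
from dataclasses import dataclass

@dataclass
class Frame:
    onu_id: int      # hypothetical identifier of the intended ONU
    payload: str

def olt_broadcast(frames, onu_ids):
    """The OLT sends every frame downstream; each ONU filters by its own id."""
    delivered = {onu: [] for onu in onu_ids}
    for frame in frames:                 # one shared downstream channel
        for onu in onu_ids:              # every ONU sees every frame
            if frame.onu_id == onu:      # ...but keeps only its own traffic
                delivered[onu].append(frame.payload)
    return delivered

if __name__ == "__main__":
    frames = [Frame(1, "video"), Frame(2, "voice"), Frame(1, "data")]
    print(olt_broadcast(frames, onu_ids=[1, 2, 3]))
    # {1: ['video', 'data'], 2: ['voice'], 3: []}
```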


iPod Touch

The iPod Touch (stylized and marketed as lowercase iPod touch; also colloquially referred to as the iTouch, by analogy to the iPhone) is a portable media player, personal digital assistant, handheld game console, and Wi-Fi mobile platform designed and marketed by Apple. The iPod Touch adds the multi-touch graphical user interface to the iPod line. It is the first iPod with wireless access to the iTunes Store, and it also has access to Apple's App Store, enabling content to be purchased and downloaded directly on the device. As of March 2011, Apple has sold over 60 million iPod Touch units.

Software

The iPod Touch runs iOS. The first major update after the initial release was iPhone OS 2.0. This update introduced the App Store, which allowed third-party applications for the first time. iPhone OS 2.0 debuted June 29, 2008. iPhone users received the update for free, while iPod Touch users had to pay for it. The second major update, iPhone OS 3.0, was released June 17, 2009. It added features such as cut, copy, and paste; data tethering; and push notification support. As with the previous major release, iPhone users received the update for free, while iPod Touch users had to pay for it. iOS 4.0 was made available to the public on June 21, 2010. It was the first major iOS release to drop support for some devices: the first generation iPod Touch and the original iPhone are not supported in iOS 4.0. The iPhone 3G and second generation iPod Touch had limited functionality under iOS 4.0, while the iPhone 4, iPhone 3GS, third generation iPod Touch, and fourth generation iPod Touch had full functionality. The major features introduced in iOS 4.0 included iBooks, FaceTime, and multitasking. iOS 5.0 was previewed to the public on June 6, 2011, and is expected to be released in the fall of 2011.

The iPod Touch and the iPhone share essentially the same hardware and run the same iOS operating system. The iPod Touch lacks some of the iPhone's features and associated apps, such as access to cellular networks, GPS navigation and the built-in compass. Older models also lacked speakers and cameras. Although the SMS and Phone apps are included in the iPod Touch software, they are disabled and therefore not visible. Also, the sleep/wake button was on the opposite side, up until the release of the fourth generation iPod Touch. Since it does not need GPS and cellular components, the iPod Touch is slimmer and lighter than the iPhone. Steve Jobs once referred to the iPod Touch as "training wheels for the iPhone". Another major difference is the quality of the back camera compared to the iPhone: while the iPod Touch, like the iPhone, allows for HD video recording, the iPhone's camera delivers higher-quality still photos than the iPod's.

Requirements

• iTunes 10 or later
• Mac OS X 10.5 or later, or Windows XP Home or Professional with Service Pack 3 or later

Synchronization

As supplied new, the iPod Touch must be connected to a Macintosh or Windows computer; there is no official Linux support. On either OS, the iPod Touch must be connected through a USB port. This will charge the iPod Touch and sync music, videos, pictures and more. Special cables that plug into a wall can also be bought separately, but can only be used to charge the iPod Touch.

Battery charging

Starting with the second generation, the iPod Touch can only be charged from the 5 V pin of the dock connector, while most previous iPod models (including the original iPod Touch) could also be charged from the 12 V pin for FireWire power. This change dropped support for charging in vehicles equipped with a FireWire-based iPod connection. Most aftermarket manufacturers of such equipment offer cables and/or adapters which convert the vehicle's 12 V to 5 V.

Hacks

Shortly after the iPhone (and then also the iPod Touch) was released, hackers were able to "jailbreak" the device through a TIFF exploit. The application installed by this exploit enabled the user to download a selection of unofficial third-party programs. Jailbreaking the iPod Touch was the only way to get third-party programs when running 1.1.x OSes. These third-party programs could use additional functionality not supported by Apple (such as enabling multitasking, applying themes to the home screen, or enabling a battery percentage indicator). All officially released versions of iOS through 3.1.2, and 4.3.3 with some bugs, can be jailbroken, but version 4.3.1 could not at the time it was released. Recently, the 4.3.x firmware has been jailbroken untethered by @i0nic. Servicing an iPod Touch after jailbreaking or other modifications made by unofficial means is not covered by Apple's warranty (however, the jailbreaking process is easily undone by performing a restore through iTunes). Today every firmware of the iPod Touch can be jailbroken, from 1.1.1 to 4.3.3 and even 5.0b7.

By
MADHAN KUMAR.M
II-B.Sc(CT)

Turanor PlanetSolar

Tûranor PlanetSolar, also known under the project name PlanetSolar, is an entirely solar-powered boat that was launched on 31 March 2010. It was built by Knierim Yachtbau in Kiel, Germany, and was designed by LOMOcean Design, formerly known as Craig Loomes Design Group Ltd. It is the largest solar-powered boat in the world. Similarly to Earthrace, another boat designed by LOMOcean Design, Tûranor PlanetSolar is planning to set a round-the-world record that will promote the use of sustainable energy. Instead of using biodiesel like Earthrace, Tûranor PlanetSolar plans to use solar power exclusively. In doing so, Tûranor PlanetSolar will become the first ever boat to circumnavigate the world using solar power alone. In order to fulfill this challenge, the boat is covered in over 500 square meters of solar panels rated at 93 kW, which in turn connect to one of the two electric motors, one in each hull. Although its hull is capable of hosting 200 persons, the shape of the boat means that it will be able to reach speeds of up to fourteen knots. The boat's hull has been model tested in wind tunnels and tank tested to determine its hydrodynamics and aerodynamics. This 31 meter long boat has been designed to be used as a luxury yacht after the record attempt is finished.

On 27 September 2010 Tûranor PlanetSolar started on a journey around the world from Monaco. With this expedition, the initiators of the project would like to focus public awareness on the importance of renewable energies for environmental protection.

The crew of six will circumnavigate the globe solely with the aid of solar power. Captain of the expedition is Frenchman Patrick Marchesseau. Other participants are Christian Ochsenbein (Bern, Switzerland) and Jens Langwasser (Kiel, Germany), as well as project initiator Raphael Domjan (Yverdon-les-Bains, Switzerland). On the first leg across the Atlantic Ocean, technician Daniel Stahl (Kiel, Germany) and first mate Mikaela von Koskull (Finland) were part of the crew. On 27 November 2010 the solar boat reached Miami. A significant stopover was Cancún, during the United Nations World Climate Conference. At the halfway point of the world tour, the French Canadian Captain Erwann Le Rouzic took over in New Caledonia in mid-May 2011 for the further circumnavigation, sharing the master's responsibility with Captain Patrick Marchesseau.


At the end of May 2011, the world's largest solar boat docked at Brisbane, where in an official ceremony the "PlanetSolar Relay for Hope" was launched, a global relay in which children and young people can take part and present their visions and hopes for a solar-energy-driven world through essays, videos, music, drawings and models. On the first solar boat expedition around the globe, the TÛRANOR PlanetSolar has already reached two records: fastest crossing of the Atlantic Ocean by a solar boat, and longest distance ever covered by a solar electric vehicle. The current routing around the globe (subject to favourable weather and nautical conditions) foresees stopovers in several port cities to inform the public about the importance of sustainable and renewable energies. The boat is registered in Switzerland and was financed by a German entrepreneur. Construction cost was €12.5 million. The name Tûranor, derived from J.R.R. Tolkien's novel The Lord of the Rings, translates to "The Power of the Sun".

PlanetSolar: The sun-powered super yacht

The world's largest solar boat, PlanetSolar, will silently and cleanly carry two men around the globe.

The PlanetSolar team unveiled its massive boat this week. To grasp the scale of this super yacht, compare it to the forklift on the far right or to the person working behind the windshield. This green leviathan is the world's largest solar-powered seacraft. Weighing in at 60 tons, PlanetSolar measures 102 feet long, about 50 feet wide, and 24 feet tall. For a sense of scale, peek into its front window, pictured above, and try to spot the doll-like man working inside. (You might need to click to enlarge the photo.) Really, PlanetSolar's jumbo size is simply to accommodate the 5,300 square feet of sun-soaking panels that run along its topside. The solar array pulls in 103 kW, five times more than the boat needs to run at its average speed of 9 mph. That's not exactly jetpack speed, but PlanetSolar aims for the long haul. The boat will lift anchor in Europe around April 2011 and attempt to circle the globe, fueled by nothing but solar rays. Unlike the almost absurdly decadent Oculus and Infinitas super yachts that we told you about here, the interior of this boat leans toward the Spartan. Only two men will make the worldwide voyage. "Today, the boat is the most used means of transport of goods," the team writes. "It represents single-handedly almost 1.4 billion tons of carbon dioxide (in 2008), that is 6% of the total carbon dioxide emissions and twice more than air transport." Thrill-seeker Raphaël Domjan will skipper the ship. And he picked an excellent adventure buddy: Gerard d'Aboville, the first man to row across the entire Atlantic Ocean. Along the cruise from New York to San Francisco to Abu Dhabi, the world tour will share a message of environmental stewardship.

A Milestone In The Progress Of Solar Mobility

The launch of the largest solar yacht in the world, the TÛRANOR PlanetSolar, is a powerful symbol for the advancement of solar shipping. The TÛRANOR PlanetSolar, with its PV modules covering approx. 500 m², can navigate for up to three days even without exposure to sunlight. The boat's task is to demonstrate that motorised shipping can work without fuel. The long-term performance of the TÛRANOR PlanetSolar is to be tested for the first time in a circumnavigation of the globe.


The name TÛRANOR is derived from the Lord of the Rings saga of J.R.R. Tolkien and translates into "The Power of the Sun".

By
DHIVYA.N
III-B.Sc(IT)

Web Search Engine

"Search engine" redirects here. For other uses, see Search engine (disambiguation).

[Figure: Search engine market share in the US, as of 2008.]

A web search engine is designed to search for information on the World Wide Web and FTP servers. The search results are generally presented in a list of results and are often called hits. The information may consist of web pages, images, information and other types of files. Some search engines also mine data available in databases or open directories. Unlike web directories, which are maintained by human editors, search engines operate algorithmically or are a mixture of algorithmic and human input.

History

During the early development of the web, there was a list of webservers edited by Tim Berners-Lee and hosted on the CERN webserver. One historical snapshot from 1992 remains. As more webservers went online, the central list could not keep up. On the NCSA site, new servers were announced under the title "What's New!"

The very first tool used for searching on the Internet was Archie. The name stands for "archive" without the "v". It was created in 1990 by Alan Emtage, Bill Heelan and J. Peter Deutsch, computer science students at McGill University in Montreal. The program downloaded the directory listings of all the files located on public anonymous FTP (File Transfer Protocol) sites, creating a searchable database of file names; however, Archie did not index the contents of these sites, since the amount of data was so limited it could be readily searched manually.

The rise of Gopher (created in 1991 by Mark McCahill at the University of Minnesota) led to two new search programs, Veronica and Jughead. Like Archie, they searched the file names and titles stored in Gopher index systems. Veronica (Very Easy Rodent-Oriented Net-wide Index to Computerized Archives) provided a keyword search of most Gopher menu titles in the entire Gopher listings. Jughead (Jonzy's Universal Gopher Hierarchy Excavation And Display) was a tool for obtaining menu information from specific Gopher servers. While the name of the search engine "Archie" was not a reference to the Archie comic book series, "Veronica" and "Jughead" are characters in the series, thus referencing their predecessor.

In the summer of 1993, no search engine existed yet for the web, though numerous specialized catalogues were maintained by hand. Oscar Nierstrasz at the University of Geneva wrote a series of scripts that would periodically mirror these pages and rewrite them into a standard format, which formed the basis for W3Catalog, the web's first primitive search engine, released on September 2, 1993. In June 1993, Matthew Gray, then at MIT, produced what was probably the first web robot, the Perl-based World Wide Web Wanderer, and used it to generate an index called "Wandex". The purpose of the Wanderer was to measure the size of the World Wide Web, which it did until late 1995. The web's second search engine, Aliweb, appeared in November 1993. Aliweb did not use a web robot, but instead depended on being notified by website administrators of the existence at each site of an index file in a particular format.

JumpStation (released in December 1993) used a web robot to find web pages and to build its index, and used a web form as the interface to its query program. It was thus the first WWW resource-discovery tool to combine the three essential features of a web search engine (crawling, indexing, and searching) as described below. Because of the limited resources available on the platform on which it ran, its indexing, and hence searching, was limited to the titles and headings found in the web pages the crawler encountered.

One of the first "full text" crawler-based search engines was WebCrawler, which came out in 1994. Unlike its predecessors, it let users search for any word in any webpage, which has become the standard for all major search engines since. It was also the first one to be widely known by the public. Also in 1994, Lycos (which started at Carnegie Mellon University) was launched and became a major commercial endeavor. Soon after, many search engines appeared and vied for popularity. These included Magellan, Excite, Infoseek, Inktomi, Northern Light, and AltaVista. Yahoo! was among the most popular ways for people to find web pages of interest, but its search function operated on its web directory rather than on full-text copies of web pages. Information seekers could also browse the directory instead of doing a keyword-based search.

In 1996, Netscape was looking to give a single search engine an exclusive deal to be the featured search engine on Netscape's web browser. There was so much interest that instead a deal was struck with Netscape by five of the major search engines, where for $5 million per year each search engine would be in rotation on the Netscape search engine page. The five engines were Yahoo!, Magellan, Lycos, Infoseek, and Excite. Search engines were also known as some of the brightest stars in the Internet investing frenzy that occurred in the late 1990s. Several companies entered the market spectacularly, receiving record gains during their initial public offerings. Some have taken down their public search engine and are marketing enterprise-only editions, such as Northern Light. Many search engine companies were caught up in the dot-com bubble, a speculation-driven market boom that peaked in 1999 and ended in 2001.

Around 2000, Google's search engine rose to prominence. The company achieved better results for many searches with an innovation called PageRank. This iterative algorithm ranks web pages based on the number and PageRank of other web sites and pages that link there, on the premise that good or desirable pages are linked to more than others. Google also maintained a minimalist interface to its search engine. In contrast, many of its competitors embedded a search engine in a web portal.

By 2000, Yahoo! was providing search services based on Inktomi's search engine. Yahoo! acquired Inktomi in 2002, and Overture (which owned AlltheWeb and AltaVista) in 2003. Yahoo! switched to Google's search engine until 2004, when it launched its own search engine based on the combined technologies of its acquisitions. Microsoft first launched MSN Search in the fall of 1998 using search results from Inktomi. In early 1999 the site began to display listings from Looksmart blended with results from Inktomi, except for a short time in 1999 when results from AltaVista were used instead. In 2004, Microsoft began a transition to its own search technology, powered by its own web crawler (called msnbot). Microsoft's rebranded search engine, Bing, was launched on June 1, 2009. On July 29, 2009, Yahoo! and Microsoft finalized a deal in which Yahoo! Search would be powered by Microsoft Bing technology.
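Before moving on to how search engines work mechanically, here is a rough sketch of the PageRank iteration described above. The three-page link graph and the damping factor of 0.85 are illustrative choices, and the code is a simplified teaching version rather than Google's actual implementation.

```python
# Simplified PageRank iteration over a tiny, made-up link graph.
# links[p] is the list of pages that page p links to.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}            # start uniformly
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outgoing in links.items():
            share = damping * rank[page] / len(outgoing)   # split rank over out-links
            for target in outgoing:
                new_rank[target] += share
        rank = new_rank
    return rank

print(pagerank(links))  # pages with more (and better-ranked) in-links score higher
```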

How web search engines work

[Figure: High-level architecture of a standard Web crawler.]

A search engine operates in the following order:

1. Web crawling
2. Indexing
3. Searching

Web search engines work by storing information about many web pages, which they retrieve from the HTML itself. These pages are retrieved by a web crawler (sometimes also known as a spider), an automated web browser which follows every link on the site. Exclusions can be made by the use of robots.txt. The contents of each page are then analyzed to determine how it should be indexed (for example, words are extracted from the titles, headings, or special fields called meta tags). Data about web pages are stored in an index database for use in later queries. A query can be a single word. The purpose of an index is to allow information to be found as quickly as possible.

Some search engines, such as Google, store all or part of the source page (referred to as a cache) as well as information about the web pages, whereas others, such as AltaVista, store every word of every page they find. This cached page always holds the actual search text, since it is the one that was actually indexed, so it can be very useful when the content of the current page has been updated and the search terms are no longer in it. This problem might be considered a mild form of linkrot, and Google's handling of it increases usability by satisfying user expectations that the search terms will be on the returned webpage. This satisfies the principle of least astonishment, since the user normally expects the search terms to be on the returned pages. Increased search relevance makes these cached pages very useful, even beyond the fact that they may contain data that may no longer be available elsewhere.

When a user enters a query into a search engine (typically by using keywords), the engine examines its index and provides a listing of best-matching web pages according to its criteria, usually with a short summary containing the document's title and sometimes parts of the text. The index is built from the information stored with the data and the method by which the information is indexed. Unfortunately, there are currently no known public search engines that allow documents to be searched by date.
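To make the indexing and query steps above concrete, here is a minimal sketch of an inverted index answering a boolean AND query. The documents and tokenisation are toy assumptions; production engines add stemming, ranking and far more.

```python
# Minimal inverted index with a boolean AND query (toy example).
import re
from collections import defaultdict

docs = {
    1: "Free space optics uses light to transmit data",
    2: "A web crawler retrieves pages and the indexer extracts words",
    3: "Grid computing applies many computers to a single problem",
}

def tokenize(text):
    return re.findall(r"[a-z0-9]+", text.lower())

# Indexing: map each word to the set of documents that contain it.
index = defaultdict(set)
for doc_id, text in docs.items():
    for word in tokenize(text):
        index[word].add(doc_id)

def search_and(query):
    """Return ids of documents containing every query word (boolean AND)."""
    sets = [index.get(w, set()) for w in tokenize(query)]
    return set.intersection(*sets) if sets else set()

print(search_and("web pages"))           # {2}
print(search_and("computers problem"))   # {3}
```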

Most search engines support the use of the boolean operators AND, OR and NOT to further specify the search query. Boolean operators are for literal searches that allow the user to refine and extend the terms of the search: the engine looks for the words or phrases exactly as entered. Some search engines provide an advanced feature called proximity search, which allows users to define the distance between keywords. There is also concept-based searching, where the research involves using statistical analysis on pages containing the words or phrases you search for. As well, natural language queries allow the user to type a question in the same form one would ask it to a human; a site like this is ask.com.

The usefulness of a search engine depends on the relevance of the result set it gives back. While there may be millions of web pages that include a particular word or phrase, some pages may be more relevant, popular, or authoritative than others. Most search engines employ methods to rank the results to provide the "best" results first. How a search engine decides which pages are the best matches, and what order the results should be shown in, varies widely from one engine to another. The methods also change over time as Internet usage changes and new techniques evolve. There are two main types of search engine that have evolved: one is a system of predefined and hierarchically ordered keywords that humans have programmed extensively; the other is a system that generates an "inverted index" by analyzing texts it locates. This second form relies much more heavily on the computer itself to do the bulk of the work.

Most web search engines are commercial ventures supported by advertising revenue and, as a result, some employ the practice of allowing advertisers to pay money to have their listings ranked higher in search results. Those search engines which do not accept money for their search results make money by running search-related ads alongside the regular search engine results. The search engines make money every time someone clicks on one of these ads.

By
PERIYASAMY.M
III-B.Sc(IT)

Intel Pumps $30 Million Into Cloud's Future

Intel Labs is pumping the tires for cloud computing with a $30 million investment in a pair of new Intel Science and Technology Centers (ISTC) at Carnegie Mellon University that will focus on cloud computing and embedded computing research. The $30 million is part of Intel's five-year $100 million program launched to accelerate innovation and increase university research. The new ISTCs, Intel said, join the already announced centers for visual and secure computing. "These new ISTCs are expected to open amazing possibilities," said Justin Rattner, CTO, Intel, in a statement. "Imagine, for example, future cars equipped with embedded sensors and microprocessors to constantly collect and analyze traffic and weather data. That information could be shared and analyzed in the cloud so that drivers could be provided with suggestions for quicker and safer routes."

Cloud and embedded computing represent two major growth areas for Intel. In its second quarter earnings call last month, Intel said its Data Center Group sales jumped 15 percent year-over-year, with cloud computing and enterprise servers leading the way. And Intel's Embedded & Communications Group, which includes processors like the Xeon and Atom chips that power portable and other devices, leapt a whopping 25 percent. The new ISTCs will also build upon Intel's vision for cloud computing, dubbed Intel Cloud 2015. The Cloud 2015 vision centers around three key elements: a world of interoperable federated clouds; automated movement of software applications and resources; and PC- and device-savvy client-aware clouds that know what processing should take place in the cloud or on a mobile device such as a smartphone or tablet.

Intel said the ISTCs will add new ideas from academic researchers to extend Intel's existing cloud computing initiatives. The center will combine researchers from Carnegie Mellon University, Georgia Institute of Technology, University of California Berkeley, Princeton University, and Intel to explore cloud-impacting technologies like built-in application optimization; efficient and effective support of big data analytics on large amounts of online data; and making the cloud more distributed and localized by extending cloud capabilities to the network edge and to client devices. "In the future, these capabilities could enable a digital personal handler via a device wired into your glasses that sees what you see, to constantly pull data from the cloud and whisper information to you during the day -- telling you who people are, where to buy an item you just saw, or how to adjust your plans when something new comes up," Intel said.

A key area of research at the ISTCs will be to make it easier for devices to collect, analyze and act on data from sensors and online databases. For example, in cars, data could be used to customize entertainment options for specific passengers while also offering more tailored recommendations while traveling. "With the growing popularity of mobile real-time and personalized technology, there is a corresponding rise in demand for specialized embedded computing systems to support a broad range of new applications -- including many not yet envisioned," Intel said. On the embedded computing side, the ISTC will comprise leading researchers from Carnegie Mellon University, Cornell University, University of Illinois at Urbana-Champaign, University of Pennsylvania, Pennsylvania State University, Georgia Institute of Technology, the University of California at Berkeley and Intel, forming a collaborative community to drive research that can transform experiences in the home, in cars and in retail environments in the future.

By
BABU.R
III–B.Sc(CT)

Grid Computing

History

The term grid computing originated in the early 1990s as a metaphor for making computer power as easy to access as an electric power grid, an idea set out in Ian Foster's and Carl Kesselman's seminal work, "The Grid: Blueprint for a New Computing Infrastructure" (2004). CPU scavenging and volunteer computing were popularized beginning in 1997 by distributed.net and later in 1999 by SETI@home to harness the power of networked PCs worldwide, in order to solve CPU-intensive research problems.

The ideas of the grid (including those from distributed computing, object-oriented programming, and Web services) were brought together by Ian Foster, Carl Kesselman, and Steve Tuecke, widely regarded as the "fathers of the grid".

They led the effort to create the Globus Toolkit, incorporating not just computation management but also storage management, security provisioning, data movement, monitoring, and a toolkit for developing additional services based on the same infrastructure, including agreement negotiation, notification mechanisms, trigger services, and information aggregation. While the Globus Toolkit remains the de facto standard for building grid solutions, a number of other tools have been built that answer some subset of the services needed to create an enterprise or global grid.

In 2007 the term cloud computing came into popularity, which is conceptually similar to the canonical Foster definition of grid computing (in terms of computing resources being consumed as electricity is from the power grid). Indeed, grid computing is often (but not always) associated with the delivery of cloud computing systems, as exemplified by the AppLogic system from 3tera. As one published example of a grid application, a plugin-and-frontend pair has been implemented that solves a partial differential equation on an elliptic domain with a random-walkers approach; according to benchmarks it scales up to about 64 computers in a particular 2D-grid configuration where each node has degree 4.

Grid computing (or the use of a computational grid) is applying the resources of many computers in a network to a single problem at the same time, usually a scientific or technical problem that requires a great number of computer processing cycles or access to large amounts of data. A well-known example of grid computing in the public domain is the ongoing SETI (Search for Extraterrestrial Intelligence) @Home project, in which thousands of people share the unused processor cycles of their PCs in the vast search for signs of "rational" signals from outer space. According to John Patrick, IBM's vice-president for Internet strategies, "the next big thing will be grid computing."

Grid computing requires the use of software that can divide and farm out pieces of a program to as many as several thousand computers. Grid computing can be thought of as distributed and large-scale cluster computing, and as a form of network-distributed parallel processing. It can be confined to the network of computer workstations within a corporation, or it can be a public collaboration (in which case it is also sometimes known as a form of peer-to-peer computing).

Grid computing is a term referring to the combination of computer resources from multiple administrative domains to reach a common goal. The grid can be thought of as a distributed system with non-interactive workloads that involve a large number of files. What distinguishes grid computing from conventional high performance computing systems such as cluster computing is that grids tend to be more loosely coupled, heterogeneous, and geographically dispersed. Although a grid can be dedicated to a specialized application, it is more common that a single grid will be used for a variety of different purposes. Grids are often constructed with the aid of general-purpose grid software libraries known as middleware. Grid size can vary by a considerable amount.


Overview

Grid computing combines computers from multiple administrative domains to reach a common goal, to solve a single task, and the grid may then disappear just as quickly. One of the main strategies of grid computing is to use middleware to divide and apportion pieces of a program among several computers, sometimes up to many thousands. Grid computing involves computation in a distributed fashion, which may also involve the aggregation of large-scale cluster-computing-based systems.

The size of a grid may vary from small (confined to a network of computer workstations within a corporation, for example) to large, public collaborations across many companies and networks. "The notion of a confined grid may also be known as an intra-nodes cooperation, whilst the notion of a larger, wider grid may thus refer to an inter-nodes cooperation." Grids are a form of distributed computing whereby a "super virtual computer" is composed of many networked, loosely coupled computers acting together to perform very large tasks. This technology has been applied to computationally intensive scientific, mathematical, and academic problems through volunteer computing, and it is used in commercial enterprises for such diverse applications as drug discovery, economic forecasting, seismic analysis, and back office data processing in support of e-commerce and Web services.

Comparison of grids and conventional supercomputers

"Distributed" or "grid" computing in general is a special type of parallel computing that relies on complete computers (with onboard CPUs, storage, power supplies, network interfaces, etc.) connected to a network (private, public or the Internet) by a conventional network interface, such as Ethernet. This is in contrast to the traditional notion of a supercomputer, which has many processors connected by a local high-speed computer bus. The primary advantage of distributed computing is that each node can be purchased as commodity hardware which, when combined, can produce a computing resource similar to a multiprocessor supercomputer, but at a lower cost. This is due to the economies of scale of producing commodity hardware, compared to the lower efficiency of designing and constructing a small number of custom supercomputers. The primary performance disadvantage is that the various processors and local storage areas do not have high-speed connections. This arrangement is thus well suited to applications in which multiple parallel computations can take place independently, without the need to communicate intermediate results between processors. The high-end scalability of geographically dispersed grids is generally favorable, due to the low need for connectivity between nodes relative to the capacity of the public Internet.

There are also some differences in programming and deployment. It can be costly and difficult to write programs that can run in the environment of a supercomputer, which may have a custom operating system, or require the program to address concurrency issues. If a problem can be adequately parallelized, a "thin" layer of "grid" infrastructure can allow conventional, standalone programs, given a different part of the same problem, to run on multiple machines. This makes it possible to write and debug on a single conventional machine, and eliminates complications due to multiple instances of the same program running in the same shared memory and storage space at the same time.
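The "divide and farm out pieces of a program" idea above can be illustrated on a single machine. The sketch below splits one job into independent work units and runs them in parallel processes; it is only a stand-in for real grid middleware such as the Globus Toolkit, which adds scheduling, security and data movement across administrative domains.

```python
# Toy illustration of splitting one job into independent work units.
# Real grids distribute such units across many machines via middleware;
# here multiprocessing on one machine stands in for that.
from multiprocessing import Pool

def work_unit(bounds):
    """One independent piece of the overall job: sum the squares in a range."""
    start, stop = bounds
    return sum(i * i for i in range(start, stop))

if __name__ == "__main__":
    n, units = 1_000_000, 8
    step = n // units
    chunks = [(i * step, (i + 1) * step) for i in range(units)]   # partition the problem
    with Pool(processes=4) as pool:
        partials = pool.map(work_unit, chunks)                    # farm out the pieces
    print(sum(partials) == sum(i * i for i in range(n)))          # True: results recombine
```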


Current Projects

Biology and medicine

Folding@Home — seeks to cure cancer, ALS, Alzheimer's and many other diseases by observing how proteins fold. Currently the fastest computer in the world at 8 petaFLOPS.

Docking@Home — models protein-ligand docking.

GPUGRID.net — conducts full-atom molecular biology simulations, designed for CUDA-capable graphics processing units.

Malaria Control — performs stochastic modelling of the clinical epidemiology and natural history of malaria.

POEM@Home — models protein folding using Anfinsen's dogma.

Rosetta@home — tests the assembly of specific proteins, using appropriate fragments of better-known proteins.

SIMAP — compiles a database of protein similarities.

FightAIDS@Home — identifies candidate drugs that have the right shape and chemical characteristics to block HIV protease.

Earth sciences

Climateprediction.net — attempts to reduce the uncertainty ranges of climate models.

Quake-Catcher Network — uses sensors in, or attached to, internet-connected computers to detect earthquakes.

Physics and astronomy

AQUA@home — uses Quantum Monte Carlo to predict the performance of superconducting adiabatic quantum computers.

Einstein@Home — uses data from LIGO and GEO 600 to search for gravitational waves.

MilkyWay@Home — uses data from the Sloan Digital Sky Survey to deduce the structure of the Milky Way galaxy.

QMC@Home — uses Quantum Monte Carlo to predict molecular geometry.

SETI@home — searches cosmic radio emission data for extraterrestrial intelligence.

theSkyNet — searches data collected from radio telescopes such as ASKAP.

Mathematics

PrimeGrid — searches for various types of prime numbers.

Great Internet Mersenne Prime Search — searches for Mersenne primes.

Multi-application projects

A multi-application project that studies neurodegenerative diseases caused by protein accumulation, the effect of light on nanomaterials, fluid mechanics, macromolecular docking, and the function of proteins in memory and learning.

Clean Energy Project — tries to find the best organic compounds for solar cells and energy storage devices. Currently in phase 2.

Computing for Clean Water — uses the techniques of molecular dynamics to determine the fluid dynamics of water filters that are composed of nanotubes.

By
ARULMOZHISELVI.S
III–B.Sc(IT)


Windows OS Security

Windows Security Tools

Microsoft has been putting more effort into security, which Windows Server 2008 R2 and Windows 7 prove. They have been hardening the "out of the box" experience for some time, and with the new Firewall and User Account Control features that come preconfigured, it is no wonder why many are moving to these more powerful and secure operating systems. Although the new, and even older, Windows operating systems are and can be made more secure, what tools are available to help you configure your system beyond the firewall and UAC?

Microsoft Baseline Security Analyzer

MBSA has now been around for quite some time. The tool had great hopes when it first arrived on the scene, but has never developed into anything more than a tool that can be used to scan for installed patches. Yes, MBSA does more than scan for patches, but the overall sense of the tool from nearly everyone is that it was never really all that useful.

The latest version of MBSA is v2.2 and can be downloaded from Microsoft. The updated version of MBSA is not all that shocking, as it now supports Windows 7 and Windows Server 2008 R2, which everyone on the planet seems to be migrating to. Other features that MBSA v2.2 brings to the table include:

• Offline mode from graphical and command-line interfaces
• Support for Windows 7 and Windows Server 2008 R2
• Updated graphical user interface
• Full support for 64-bit platforms
• Improved support for the Windows XP Embedded platform
• Automatic Microsoft Update registration and agent update from the graphical interface or from the command line
• Output of completed scan reports to a user-selected directory path or network share
• Windows Server Update Services 2.0 and 3.0 compatibility

As you can see from Figure 1, the tool is easy to configure, and picking the computers you want to scan is easy too. You can either scan the computer where you are running MBSA, or you can pick a range of IP addresses.

Figure 1: MBSA 2.2 configuration options before scanning.

Once a computer is scanned, the results are clearly displayed and easy to read, as shown in Figure 2.

Figure 2: MBSA 2.2 scan output and summary.

The major issues that I have with MBSA are that it just does not have any customization and the security scans seem to be arbitrary and not very extensive. I wish I could add additional Registry entries to the scan, so I could include all of the other security settings that need to be configured.

Security Templates

Security templates are not new; actually, they were first introduced in Windows NT! Yeah, this security tool has been around the block, but it still provides some good centralized security options. Security templates provide an administrator the opportunity to configure some key security features, then use Group Policy via Active Directory to deploy the settings. Since Group Policy can configure multiple computers with just one set of configurations, security templates provide a way to configure many computers with very little effort. Security templates have been leveraged, then not used, then leveraged, then not used, as each operating system has been updated. For example, in the Windows 2000 era there were "pre-configured" security templates such as basicsv.inf, hisecdc.inf, securedc.inf, compatws.inf, etc. These pre-configured security templates allowed an administrator to easily implement a baseline of security without much effort. Security templates provide a way to configure some of the most common security features, which can be seen in the security template configuration areas. More information is available on how to import security templates into a GPO. In a similar way that MBSA fails to impress me, security templates fall short of an amazing way to deploy security due to the lack of customization. Yes, Group Policy is customizable, but security templates are not: what you see is what you can configure.

Security Configuration Wizard

The Security Configuration Wizard (SCW) has been available for some time, back to the Windows Server 2003 days. The tool used to be an out-of-band download, but it is now installed on every Windows Server 2008 and 2008 R2 computer, available in the Start Menu's Administrative Tools list. The main points I want to make about the SCW tool are the following:

• SCW uses a security database, which is useful for determining what each Windows Server role includes, including required firewall rules
• SCW touches on some hard-to-reach security areas, such as the LM authentication protocol, SMB signing, and firewall rules
• SCW can consume security templates, adding to the configuration baseline options
• SCW results can be ported into a GPO using command-line options.
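Both deployment paths mentioned above (applying a security template and porting SCW results into a GPO) can be scripted. The sketch below simply shells out to the built-in secedit and scwcmd tools; the database, template, policy and GPO names are placeholder assumptions, so treat it as an outline to adapt rather than a ready-made script.

```python
# Outline of scripting the two deployment steps discussed above (Windows only).
# File names, the GPO name and the SCW policy path are placeholder assumptions.
import subprocess

def apply_security_template(template_inf: str, db_path: str = "secedit.sdb") -> None:
    """Apply a security template locally with secedit."""
    subprocess.run(
        ["secedit", "/configure", "/db", db_path, "/cfg", template_inf, "/overwrite"],
        check=True,
    )

def scw_policy_to_gpo(policy_xml: str, gpo_name: str) -> None:
    """Transform a saved SCW policy into a Group Policy object with scwcmd."""
    subprocess.run(
        ["scwcmd", "transform", f"/p:{policy_xml}", f"/g:{gpo_name}"],
        check=True,
    )

if __name__ == "__main__":
    apply_security_template("hisecdc.inf")                       # template name taken from the article
    scw_policy_to_gpo("WebServerPolicy.xml", "SCW Web Servers")  # placeholder names
```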


By
PRABHU.M
III-B.Sc(IT)

Windows Home Server 2011

Windows Home Server is a home server operating system designed for small office/home office use. It is intended to be a solution for homes with multiple connected PCs, offering file sharing, automated backups, and remote access. Windows Home Server 2011 was released on 6 April 2011. It follows the release of Power Pack 3 for the aging Windows Home Server (equivalent to client-edition service packs), which added support for Windows 7 to Windows Home Server. Windows Home Server 2011 is considered a "major release". It is built on the Windows Server 2008 R2 code base, its predecessor having been built on Windows Server 2003, and it only supports x86-64 hardware.

Features: No new features have been announced by Microsoft, but the release will reportedly include additional entertainment capabilities, including web-based media functionality and an "add-in" feature with an app store. Initial speculation by technology columnist Mary Jo Foley fueled the idea that 'Vail' would integrate with Windows Media Center. This prompted the response "Time will tell" from Microsoft Windows Home Server Product Planner Todd Headrick, but by the time of the public beta Microsoft had decided not to integrate Windows Media Center with 'Vail'.

Microsoft SQL Server: Microsoft SQL Server is a relational database server developed by Microsoft. It is a software product whose primary function is to store and retrieve data as requested by other software applications, be it those on the same computer or those running on another computer across a network (including the Internet). There are at least a dozen different editions of Microsoft SQL Server aimed at different audiences and different workloads (ranging from small applications that store and retrieve data on the same computer, to millions of users and computers that access huge amounts of data from the Internet at the same time).

Tools: SQLCMD: SQLCMD is a command-line application that comes with Microsoft SQL Server and exposes the management features of SQL Server. It allows SQL queries to be written and executed from the command prompt. It can also act as a scripting tool to create and run a set of SQL statements as a script. Such scripts are stored as a .sql file, and are used either for management of databases or to create the database schema during the deployment of a database. SQLCMD was introduced with SQL Server 2005 and this continues with SQL Server 2008. Its predecessor for earlier versions was OSQL and ISQL, which are functionally equivalent as far as TSQL execution is concerned, and many of the command-line parameters are identical.
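The SQLCMD workflow described above (running a saved .sql script from the command line) looks roughly like the sketch below. The server name, database and script file are placeholders, and the example assumes Windows authentication.

```python
# Rough sketch of driving sqlcmd from a script (placeholder server/database/file names).
import subprocess

def run_sql_script(server: str, database: str, script_path: str) -> None:
    """Execute a .sql script against a SQL Server instance using sqlcmd."""
    subprocess.run(
        ["sqlcmd",
         "-S", server,        # target server\instance
         "-d", database,      # database to run the script in
         "-E",                # use Windows (trusted) authentication
         "-i", script_path],  # input .sql script, e.g. a schema deployment
        check=True,
    )

if __name__ == "__main__":
    run_sql_script(r".\SQLEXPRESS", "HomeServerDB", "create_schema.sql")
```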


Microsoft Visual Studio includes native support for data programming with Microsoft SQL Server. It can be used to write and debug code to be executed by SQL CLR. It also includes a data designer that can be used to graphically create, view or edit database schemas. Queries can be created either visually or using code. SSMS 2008 onwards provides IntelliSense for SQL queries as well.

SQL Server Management Studio: SQL Server Management Studio is a GUI tool included with SQL Server 2005 and later for configuring, managing, and administering all components within Microsoft SQL Server. The tool includes both script editors and graphical tools that work with objects and features of the server. SQL Server Management Studio replaces Enterprise Manager as the primary management interface for Microsoft SQL Server since SQL Server 2005. A version of SQL Server Management Studio is also available for SQL Server Express Edition, for which it is known as SQL Server Management Studio Express. A central feature of SQL Server Management Studio is the Object Explorer, which allows the user to browse, select, and act upon any of the objects within the server. It can be used to visually observe and analyze query plans and optimize database performance, among other things. SQL Server Management Studio can also be used to create a new database, alter any existing database schema by adding or modifying tables and indexes, or analyze performance. It includes query windows which provide a GUI-based interface to write and execute queries.

Business Intelligence Development Studio (BIDS) is the IDE from Microsoft used for developing data analysis and Business Intelligence solutions utilizing Microsoft SQL Server Analysis Services, Reporting Services and Integration Services. It is based on the Microsoft Visual Studio development environment but is customized with SQL Server services-specific extensions and project types, including tools, controls and projects for reports (using Reporting Services), cubes and data mining structures.

By

RAMYA.R

III–B.Sc(CT)
