INFOLINE

EDITORIAL BOARD

EXECUTIVE COMMITTEE

Chief Patron : Thiru A.K.Ilango, Correspondent
Patron : Dr. N.Raman, M.Com., M.B.A., M.Phil., Ph.D., Principal
Editor in Chief : Mr. S.Muruganantham, M.Sc., M.Phil., Head of the Department

STAFF ADVISOR

Ms. P.Kalarani M.Sc., M.A., M.Phil., Assistant Professor, Department of Computer Technology and Information Technology

STAFF EDITOR

Ms. C.Indrani M.C.A., M.Phil., Assistant Professor, Department of Computer Technology and Information Technology

STUDENT EDITORS

B.Mano Pretha, III B.Sc. (Computer Technology)
K.Nishanthan, III B.Sc. (Computer Technology)
P.Deepika Rani, III B.Sc. (Information Technology)
R.Pradeep Rajan, III B.Sc. (Information Technology)
D.Harini, II B.Sc. (Computer Technology)
V.A.Jayendiran, II B.Sc. (Computer Technology)
S.Karunya, II B.Sc. (Information Technology)
E.Mohanraj, II B.Sc. (Information Technology)
S.Aiswarya, I B.Sc. (Computer Technology)
A.Tamilhariharan, I B.Sc. (Computer Technology)
P.Vijaya Shree, I B.Sc. (Information Technology)
S.Prakash, I B.Sc. (Information Technology)

CONTENTS

Nano Air Vehicle
Reinforcement Learning
Graphics Processing Unit (GPU)
The History of C Programming
Fog Computing
Big Data Analytics
Common Network Problems
Bridging the Human-Computer Interaction
Adaptive Security Architecture
Digital Platform
Mesh App and Service Architecture
Blockchain Technology
Digital Twin
Puzzles
Riddles

NANO AIR VEHICLE (NAV)

The AeroVironment Nano Hummingbird or Nano Air Vehicle (NAV) is a tiny, remote controlled aircraft built to resemble and fly like a hummingbird, developed in the United States by AeroVironment, Inc. to specifications provided by the Defense Advanced Research Projects Agency (DARPA). The Hummingbird is equipped with a small video camera for surveillance and reconnaissance purposes and, for now, operates in the air for up to 11 minutes. It can fly outdoors, or enter a doorway to investigate indoor environments.

DARPA has contributed $4 million to AeroVironment since 2006 to create a prototype "hummingbird-like" aircraft for the Nano Air Vehicle (NAV) program. The result, called the Nano Hummingbird, can fly at 11 miles per hour (18 km/h) and move in three axes of motion. The aircraft can climb and descend vertically; fly sideways left and right, forward and backward; rotate clockwise and counter-clockwise; and hover in mid-air. The artificial hummingbird uses its flapping wings for propulsion and attitude control. It has a body shaped like a real hummingbird, a wingspan of 6.3 inches (160 mm), and a total weight of 0.67 ounces (19 g), less than an AA battery. This includes the systems required for flight: batteries, motors, and communications systems, as well as the video camera payload.

Technical goals

DARPA established flight test milestones for the Hummingbird to achieve, and the finished prototype met all of them, even exceeding some of these objectives:

- Demonstrate precision hover flight within a virtual two-meter diameter sphere for one minute.
- Demonstrate hover stability in a wind gust flight, which required the aircraft to hover and tolerate a two-meter per second (five miles per hour) wind gust from the side, without drifting downwind more than one meter.
- Demonstrate a continuous hover endurance of eight minutes with no external power source.
- Fly and demonstrate controlled, transition flight from hover to 11 miles per hour fast forward flight and back to hover flight.
- Demonstrate flying from outdoors to indoors and back outdoors through a normal-size doorway.
- Demonstrate flying indoors "heads-down" where the pilot operates the aircraft only looking at the live video image stream from the aircraft, without looking at or hearing the aircraft directly.
- Fly the aircraft in hover and fast forward flight with bird-shaped body and bird-shaped wings.

The device is bigger and heavier than a typical real hummingbird, but is smaller and lighter than the largest hummingbird varieties. It could be deployed to perform reconnaissance and surveillance in urban environments or on battlefields, and might perch on windowsills or power lines, or enter buildings to observe its surroundings, relaying camera views back to its operator. According to DARPA, the Nano Air Vehicle's configuration will provide the warfighter with unprecedented capability for urban mission operations.

KALIKKUMAR V
III B.Sc. (Computer Technology)

REINFORCEMENT LEARNING

Definition

Reinforcement Learning is a type of Machine Learning, and thereby also a branch of Artificial Intelligence. It allows machines and software agents to automatically determine the ideal behaviour within a specific context, in order to maximize their performance. Simple reward feedback is required for the agent to learn its behaviour; this is known as the reinforcement signal.

There are many different algorithms that tackle this issue. As a matter of fact, Reinforcement Learning is defined by a specific type of problem, and all its solutions are classed as Reinforcement Learning algorithms. In the problem, an agent is supposed to decide the best action to select based on its current state. When this step is repeated, the problem is known as a Markov Decision Process.

Why Reinforcement Learning?

Motivation

Reinforcement Learning allows the machine or software agent to learn its behaviour based on feedback from the environment. This behaviour can be learnt once and for all, or keep on adapting as time goes by. If the problem is modelled with care, some Reinforcement Learning algorithms can converge to the global optimum; this is the ideal behaviour that maximises the reward.

This automated learning scheme implies that there is little need for a human expert who knows about the domain of application. Much less time will be spent designing a solution, since there is no need for hand-crafting complex sets of rules as with Expert Systems; all that is required is someone familiar with Reinforcement Learning.

How does Reinforcement Learning work?

Technology

As mentioned, there are many different solutions to the problem. The most popular, however, allow the software agent to select an action that will maximise the reward in the long term (and not only in the immediate future). Such algorithms are known to have infinite horizon.

In practice, this is done by learning to estimate the value of a particular state. This estimate is adjusted over time by propagating part of the next state's reward. If all the states and all the actions are tried a sufficient number of times, this will allow an optimal policy to be defined: the action which maximises the value of the next state is picked.
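To make the value-estimation idea above concrete, here is a minimal tabular Q-learning sketch in Python. The five-state corridor, its rewards and the learning parameters are invented purely for illustration; this is one simple instance of a Reinforcement Learning algorithm, not the only one.

    import random

    # Toy corridor: states 0..4, actions -1 (left) and +1 (right).
    # Reaching state 4 pays reward 1; every other step pays 0.
    states = range(5)
    actions = [-1, +1]
    alpha, gamma = 0.5, 0.9   # learning rate and discount factor
    Q = {(s, a): 0.0 for s in states for a in actions}

    for episode in range(500):
        s = 0
        while s != 4:
            # Explore at random; a real agent balances exploration and exploitation.
            a = random.choice(actions)
            s_next = min(max(s + a, 0), 4)
            r = 1.0 if s_next == 4 else 0.0
            # Propagate part of the next state's estimated value backwards.
            best_next = max(Q[(s_next, b)] for b in actions)
            Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
            s = s_next

    # The learnt policy picks the action with the highest estimated value.
    policy = {s: max(actions, key=lambda a: Q[(s, a)]) for s in range(4)}
    print(policy)   # expected: +1 (move right) from states 0 to 3

After enough episodes the estimates settle, and the greedy policy heads straight for the rewarding state: exactly the "optimal policy" described above.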

3

When does Reinforcement Learning fail?

Limitations

There are many challenges in current Reinforcement Learning research. Firstly, it is often too memory-expensive to store the values of each state, since the problems can be pretty complex. Solving this involves looking into value approximation techniques, such as Decision Trees or Neural Networks. There are many consequences of introducing these imperfect value estimations, and research tries to minimise their impact on the quality of the solution.

Moreover, problems are also generally very modular; similar behaviours reappear often, and modularity can be introduced to avoid learning everything all over again. Hierarchical approaches are commonplace for this, but doing this automatically is proving a challenge. Finally, due to limited perception, it is often impossible to fully determine the current state. This also affects the performance of the algorithm, and much work has been done to compensate for this Perceptual Aliasing.

Who uses Reinforcement Learning?

Applications

The possible applications of Reinforcement Learning are abundant, due to the genericness of the problem specification. As a matter of fact, a very large number of problems in Artificial Intelligence can be fundamentally mapped to a decision process. This is a distinct advantage, since the same theory can be applied to many different domain-specific problems with little effort.

In practice, this ranges from controlling robotic arms to find the most efficient motor combination, to robot navigation where collision-avoidance behaviour can be learnt by negative feedback from bumping into obstacles. Logic games are also well-suited to Reinforcement Learning, as they are traditionally defined as a sequence of decisions: games such as poker, backgammon, othello and chess have been tackled more or less successfully.

S.AISWARYA
I B.Sc. (Computer Technology)

GRAPHICS PROCESSING UNIT (GPU)

A Graphics Processing Unit (GPU) is a single-chip processor primarily used to manage and boost the performance of video and graphics. GPU features include:

- 2-D or 3-D graphics
- Digital output to flat panel display monitors
- Texture mapping
- Application support for high-intensity graphics software such as AutoCAD
- Rendering polygons
- Support for YUV color space
- Hardware overlays
- MPEG decoding

These features are designed to lessen the work of the CPU and produce faster video and graphics. A GPU is not only used in a PC on a video card or motherboard; it is also used in mobile phones, display adapters, workstations and game consoles.

The first GPU was developed by NVidia in 1999 and called the GeForce 256. This GPU model could process 10 million polygons per second and had more than 22 million transistors. The GeForce 256 was a single-chip processor with integrated transform, drawing and BitBLT support, lighting effects, triangle setup/clipping and rendering engines.

GPUs became more popular as the demand for graphic applications increased. Eventually, they became not just an enhancement but a necessity for optimum performance of a PC. Specialized logic chips now allow fast graphic and video implementations. Generally the GPU is connected to the CPU and is completely separate from the motherboard. The random access memory (RAM) is connected through the accelerated graphics port (AGP) or the peripheral component interconnect express (PCI-Express) bus. Some GPUs are integrated into the northbridge on the motherboard and use the main memory as a digital storage area, but these GPUs are slower and have poorer performance.

Most GPUs use their transistors for 3-D computer graphics. However, some have accelerated memory for mapping vertices, such as in geographic information system (GIS) applications. Some of the more modern GPU technology supports programmable shaders implementing textures, mathematical vertices and accurate color formats. Applications such as computer-aided design (CAD) can process over 200 billion operations per second and deliver up to 17 million polygons per second. Many scientists and engineers use GPUs for more in-depth calculated studies utilizing vector and matrix features.

KALIKKUMAR V
III B.Sc. (Computer Technology)

THE HISTORY OF THE C PROGRAMMING LANGUAGE

C is one of the most important programming languages in the history of computing. Today, many different programming languages have popped up offering many different features, but in many ways, C provided the basis for such languages.

It is unclear whether its creators had envisioned the great things C would go on to achieve. Like most innovations, C underwent many changes over time. Probably one of its greatest achievements has been its ability to stay relevant even in modern, dynamic times. It must be fulfilling for the creators of C to observe that their creation is not considered outdated or categorized as useful for only a few niche areas. Instead, C has come to be recognized as a general-purpose, strong language which can be applied to many areas. (Find out more about the history of programming languages in Computer Programming: From Machine Language to Artificial Intelligence.)


The Beginnings of C

Developing C was not originally the objective of its founders. In fact, various circumstances and problems created the ideal situation for its creation. In the 1960s, Dennis Ritchie, who was an employee of Bell Labs (AT&T), along with some of his colleagues, had been working on developing an operating system which could be used by many users simultaneously. This operating system was known as Multics, and it was meant to allow many users to share common computing resources. Multics offered many benefits, but also had many problems. It was a large system and it seemed – from a cost-benefit perspective – that the costs outweighed the benefits. Gradually, Bell Labs withdrew from the project.

That's when Ritchie joined Ken Thompson and Brian Kernighan in another project. The project involved developing a new file system. Thompson developed a new file system for the DEC PDP-7 in assembly language. Thereafter, the creators of the file system made many improvements to it, resulting in the birth of the UNIX operating system. Even the origin of the name UNIX can be traced to its predecessor, Multics. Originally, the name was Unics (Uniplexed Information and Computing Service), a pun on Multics (Multiplexed Information and Computer Services). Later, Unics changed to UNIX. UNIX was written in assembly language which, though ideal for machines, was a difficult proposition for human beings. To interpret and operate UNIX, the languages Fortran and B were used. It is here that the idea of developing the C language began to form in the minds of its creators.

Why C Was Developed?

The B language was a useful one in the context of the challenges the creators of UNIX faced with the operating system. The B language was taken from BCPL by Martin Richards. As already stated, UNIX was written in assembly language. To perform even small operations in UNIX, one needed to write many pages of code. B solved this problem. Unlike assembly language, B needed significantly fewer lines of code to carry out a task in UNIX. Still, there was a lot that B could not do. Much more was expected from B in the context of rapidly changing requirements. For example, B did not recognize data types. Even with B, data types were expressed with machine language. B also did not support data structures.

It was clear that something had to change. So, Ritchie and his colleagues got down to overcoming the limitations. The C language was developed in 1971-73. Note that for all its limitations, C owes its birth to B, because C retained a lot of what B offered while adding features such as data types and data structures. The name C was chosen because it succeeded B. In its early days, C was designed keeping UNIX in mind. C was used to perform tasks and operate UNIX. So, keeping performance and productivity in mind, many of the UNIX components were rewritten in C from assembly language. For example, the UNIX kernel itself was rewritten in C in 1973 on a DEC PDP-11.

Ritchie and Kernighan documented their creation in the form of a book called "The C Programming Language." Though Kernighan claimed that he had no role in the design of C, he was the author of the famous "Hello World" program and many other UNIX programs.

Evolution of C

Over time, C began to be used in personal computers for developing software applications and other purposes.

The first change (even if only a little) came when the American National Standards Institute (ANSI) formed a committee in 1983 to standardize C. After a review of the language, the committee modified it a little so that it was also compatible with other programs that preceded C. The new ANSI standard came into being in 1989, and is known as ANSI C or C89. The International Organization for Standardization (ISO) has also contributed to the standardization of C.

C has evolved as it has added significant features like memory management, functions, classes and libraries to its rich feature set. C is being used in some of the biggest and most prominent projects and products in the world. C has also influenced the development of numerous languages such as AMPL, AWK, csh, C++, C--, C#, Objective-C, BitC, D, Go, Java, JavaScript, Julia, Limbo, LPC, Perl, PHP, Pike, Processing, Python, Rust, Seed7, Vala and Verilog. (To learn more about languages, see The 5 Programming Languages That Built the Internet.)

Do you use Microsoft Windows? Then you have C to thank, because its development is mostly in C. The same goes for MacOS, Linux, Android, iOS and Windows Phone as well, so nearly all modern operating systems are based on C. It's also widely used in embedded systems, such as those found in vehicles, smart TVs and countless internet of things (IoT) devices.

VARSHA R
I B.Sc. (Information Technology)

FOG COMPUTING

Fog computing or fog networking, also known as fogging, is an architecture that uses one or more collaborative end-user clients or near-user edge devices to carry out a substantial amount of storage (rather than storing primarily in cloud data centers), communication (rather than routing over the internet backbone), and control, configuration, measurement and management (rather than being controlled primarily by network gateways such as those in the LTE core network).

Fog computing can be perceived both in large cloud systems and big data structures, making reference to the growing difficulties in accessing information objectively. This results in a lack of quality of the obtained content. The effects of fog computing on cloud computing and big data systems may vary; yet, a common aspect that can be extracted is a limitation in accurate content distribution, an issue that has been tackled with the creation of metrics that attempt to improve accuracy.

Fog networking consists of a control plane and a data plane. For example, on the data plane, fog computing enables computing services to reside at the edge of the network as opposed to servers in a data center. Compared to cloud computing, fog computing emphasizes proximity to end-users and client objectives, dense geographical distribution and local resource pooling, latency reduction and backbone bandwidth savings to achieve better quality of service (QoS) and edge analytics/stream mining, resulting in superior user experience and redundancy in case of failure; it can also be used in AAL scenarios.

Fog networking supports the Internet of Things (IoT) concept, in which most of the devices used by humans on a daily basis will be connected to each other. Examples include phones, wearable health monitoring devices, connected vehicles and augmented reality devices such as Google Glass.

SPAWAR, a division of the US Navy, is prototyping and testing a scalable, secure Disruption Tolerant Mesh Network to protect strategic military assets, both stationary and mobile. Machine control applications, running on the mesh nodes, "take over" when internet connectivity is lost. Use cases include Internet of Things applications, e.g. smart drone swarms.

ISO/IEC 20248 provides a method whereby the data of objects identified by edge computing using Automated Identification Data Carriers [AIDC], a barcode and/or RFID tag, can be read, interpreted, verified and made available into the "Fog" and on the "Edge", even when the AIDC tag has moved on.

Both cloud computing and fog computing provide storage, applications and data to end-users. However, fog computing has greater proximity to end-users and wider geographical distribution.

Cloud computing – the practice of using a network of remote servers hosted on the Internet to store, manage and process data, rather than a local server or a personal computer. Cloud computing can be a heavyweight and dense form of computing power.

Fog computing – a term created by Cisco that refers to extending cloud computing to the edge of an enterprise's network. Also known as edge computing or fogging, fog computing facilitates the operation of compute, storage and networking services between end devices and cloud computing data centres. While edge computing typically refers to the location where services are instantiated, fog computing implies distribution of the communication, computation and storage resources and services on or close to devices and systems in the control of end-users. Fog computing is a medium-weight and intermediate level of computing power.

Mist computing – a lightweight and rudimentary form of computing power that resides directly within the network fabric at the extreme edge of the network, using microcomputers and microcontrollers to feed into fog computing nodes and potentially onward towards the cloud computing platforms.

In 2017, the National Institute of Standards and Technology initiated a definition of fog computing as a horizontal, physical or virtual resource paradigm that resides between smart end-devices and traditional cloud computing or data centers. This paradigm supports vertically-isolated, latency-sensitive applications by providing ubiquitous, scalable, layered, federated, and distributed computing, storage, and network connectivity.

J.JANSI RANI
II B.Sc. (Computer Technology)

BIG DATA ANALYTICS

The volume of data that one has to deal with has exploded to unimaginable levels in the past decade, and at the same time, the price of data storage has systematically reduced. Private companies and research institutions capture terabytes of data about their users' interactions, business, social media, and also sensors from devices such as mobile phones and automobiles. The challenge of this era is to make sense of this sea of data. This is where big data analytics comes into the picture.

Big Data Analytics largely involves collecting data from different sources, munging it in a way that it becomes available to be consumed by analysts, and finally delivering data products useful to the organization's business.

The process of converting large amounts of unstructured raw data, retrieved from different sources, to a data product useful for organizations forms the core of Big Data Analytics.

Traditional Data Mining Life Cycle

In order to provide a framework to organize the work needed by an organization and deliver clear insights from Big Data, it's useful to think of it as a cycle with different stages. It is by no means linear, meaning all the stages are related to each other. This cycle has superficial similarities with the more traditional data mining cycle as described in the CRISP methodology.

CRISP-DM Methodology

The CRISP-DM methodology, which stands for Cross Industry Standard Process for Data Mining, is a cycle that describes commonly used approaches that data mining experts use to tackle problems in traditional BI data mining. It is still being used in traditional BI data mining teams.

The stages involved in the CRISP-DM life cycle are:

Business Understanding: This initial phase focuses on understanding the project objectives and requirements from a business perspective, and then converting this knowledge into a data mining problem definition. A preliminary plan is designed to achieve the objectives. A decision model, especially one built using the Decision Model and Notation standard, can be used.

Data Understanding: The data understanding phase starts with an initial data collection and proceeds with activities in order to get familiar with the data, to identify data quality problems, to discover first insights into the data, or to detect interesting subsets to form hypotheses for hidden information.

Data Preparation: The data preparation phase covers all activities to construct the final dataset (data that will be fed into the modelling tool(s)) from the initial raw data. Data preparation tasks are likely to be performed multiple times, and not in any prescribed order. Tasks include table, record, and attribute selection as well as transformation and cleaning of data for modelling tools.

Modelling: In this phase, various modelling techniques are selected and applied, and their parameters are calibrated to optimal values. Typically, there are several techniques for the same data mining problem type. Some techniques have specific requirements on the form of data. Therefore, it is often required to step back to the data preparation phase.
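As a small illustration of the modelling and evaluation phases, the sketch below fits a model on prepared data and scores it on held-out data. It assumes scikit-learn is installed, and the bundled iris dataset merely stands in for the output of the data preparation phase.

    from sklearn.datasets import load_iris
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    # Stand-in for the dataset produced by data preparation.
    X, y = load_iris(return_X_y=True)

    # Hold out part of the data so evaluation reflects unseen cases.
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.3, random_state=0)

    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    # Evaluation: check model quality before any deployment decision.
    print("held-out accuracy:", model.score(X_test, y_test))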

Evaluation: At this stage in the project, you have built a model (or models) that appears to have high quality from a data analysis perspective. Before proceeding to final deployment of the model, it is important to evaluate the model thoroughly and review the steps executed to construct it, to be certain it properly achieves the business objectives.

Deployment: Creation of the model is generally not the end of the project. Even if the purpose of the model is to increase knowledge of the data, the knowledge gained will need to be organized and presented in a way that is useful to the customer.

Big Data Life Cycle

In today's big data context, the previous approaches are either incomplete or suboptimal. For example, the SEMMA methodology disregards completely data collection and pre-processing of different data sources. These stages normally constitute most of the work in a successful big data project.

A big data analytics cycle can be described by the following stages:

- Business Problem Definition
- Research
- Human Resources Assessment
- Data Acquisition
- Data Munging
- Data Storage
- Exploratory Data Analysis
- Data Preparation for Modelling and Assessment
- Modelling
- Implementation

In this section, we will throw some light on each of these stages of the big data life cycle.

Business Problem Definition

This is a point common to the traditional BI and big data analytics life cycles. Normally it is a non-trivial stage of a big data project to define the problem and evaluate correctly how much potential gain it may have for an organization. It seems obvious to mention this, but it has to be evaluated what the expected gains and costs of the project are.

Research

Analyze what other companies have done in the same situation. This involves looking for solutions that are reasonable for your company, even though it involves adapting other solutions to the resources and requirements that your company has. In this stage, a methodology for the future stages should be defined.

Human Resources Assessment

Once the problem is defined, it's reasonable to continue analyzing if the current staff is able to complete the project successfully. Traditional BI teams might not be capable of delivering an optimal solution to all the stages, so it should be considered before starting the project if there is a need to outsource a part of the project or hire more people.

Data Acquisition

This section is key in a big data life cycle; it defines which type of profiles would be needed to deliver the resultant data product. Data gathering is a non-trivial step of the process; it normally involves gathering unstructured data from different sources. To give an example, it could involve writing a crawler to retrieve reviews from a website. This involves dealing with text, perhaps in different languages, normally requiring a significant amount of time to be completed.

Data Munging

Once the data is retrieved, for example, from the web, it needs to be stored in an easy-to-use format. To continue with the reviews example, let's assume the data is retrieved from different sites, each with a different display of the data. Suppose one data source gives reviews in terms of a rating in stars; it is possible to read this as a mapping for the response variable y ∈ {1, 2, 3, 4, 5}. Another data source gives reviews using a two-arrows system, one for up-voting and the other for down-voting. This would imply a response variable of the form y ∈ {positive, negative}.

In order to combine both data sources, a decision has to be made to make these two response representations equivalent. This can involve converting the first data source's representation to the second form, considering one star as negative and five stars as positive. This process often requires a large time allocation to be delivered with good quality.
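As a sketch of the munging step just described, the function below maps both response representations onto the common positive/negative form. The field names, and the rule that four stars or more counts as positive, are assumptions made for the example.

    def normalise_review(review):
        """Map heterogeneous review formats to 'positive'/'negative'."""
        if "stars" in review:                 # first source: y in {1, ..., 5}
            return "positive" if review["stars"] >= 4 else "negative"
        if "vote" in review:                  # second source: up/down arrows
            return "positive" if review["vote"] == "up" else "negative"
        raise ValueError("unknown review format")

    raw = [{"stars": 5}, {"stars": 1}, {"vote": "up"}, {"vote": "down"}]
    print([normalise_review(r) for r in raw])
    # ['positive', 'negative', 'positive', 'negative']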

Data Storage

Once the data is processed, it sometimes needs to be stored in a database. Big data technologies offer plenty of alternatives regarding this point. The most common alternative is using the Hadoop File System for storage, which provides users a limited version of SQL, known as HIVE Query Language. This allows most analytics tasks to be done in similar ways as would be done in traditional BI data warehouses, from the user perspective. Other storage options to be considered are MongoDB, Redis, and SPARK.

This stage of the cycle is related to the human resources' knowledge in terms of their abilities to implement different architectures. Modified versions of traditional data warehouses are still being used in large-scale applications. For example, Teradata and IBM offer SQL databases that can handle terabytes of data; open source solutions such as PostgreSQL and MySQL are still being used for large-scale applications.

Although this stage a priori seems to be the most important topic, in practice this is not true. It is not even an essential stage. It is possible to implement a big data solution that would be working with real-time data, so in this case, we only need to gather data to develop the model and then implement it in real time. So there would not be a need to formally store the data at all.

Exploratory Data Analysis

Once the data has been cleaned and stored in a way that insights can be retrieved from it, the data exploration phase is mandatory. The objective of this stage is to understand the data; this is normally done with statistical techniques and also plotting the data. This is a good stage to evaluate whether the problem definition makes sense or is feasible.

Data Preparation for Modelling and Assessment

This stage involves reshaping the cleaned data retrieved previously and using statistical preprocessing for missing values imputation, outlier detection, normalization, feature extraction and feature selection.

Modelling

The prior stage should have produced several datasets for training and testing, for example, a predictive model. This stage involves trying different models and looking forward to solving the business problem at hand. In practice, it is normally desired that the model would give some insight into the business. Finally, the best model or combination of models is selected, evaluating its performance on a left-out dataset.

Implementation

In this stage, the data product developed is implemented in the data pipeline of the company. This involves setting up a validation scheme while the data product is working, in order to track its performance. For example, in the case of implementing a predictive model, this stage would involve applying the model to new data and, once the response is available, evaluating the model.

As mentioned in the big data life cycle, the data products that result from developing a big data product are in most cases some of the following:

Machine learning implementation: This could be a classification algorithm, a regression model or a segmentation model.

Recommender system: The objective is to develop a system that recommends choices based on user behaviour. Netflix is the characteristic example of this data product, where based on the ratings of users, other movies are recommended.

Dashboard: Business normally needs tools to visualize aggregated data. A dashboard is a graphical mechanism to make this data accessible.

Ad-hoc analysis: Normally business areas have questions, hypotheses or myths that can be answered by doing ad-hoc analysis with data.

V. MOHANAPRIYA
III B.Sc. (Computer Technology)

COMMON NETWORK PROBLEMS

Slow internet: Slow internet is the bane of every worker's existence. It could be caused by a number of problems. For example, your router may not be working properly, you may have too many devices attempting to access the internet at once, a specific app might be drawing too heavily on your total bandwidth, or your internet provider itself may be experiencing service delays. Unfortunately, the only way to address these potential issues is to investigate them and attempt to fix them, one by one, until you find the root of the problem.

A signal without a connection: Occasionally, you'll see a signal from a router, but your device won't be able to connect to the network. There are two potential causes for this. First, your device might be out of range of the router. Move the device closer to the router, and see if you can connect then. If you still can't connect, there's probably a problem with the hardware. You may be able to replace the network card you're using, or update the drivers associated with it, but in some cases, you'll need to replace the hardware entirely.

Periodic outages: Few networks operate perfectly 100 percent of the time, but if you're seeing periodic total outages (a complete inability for any devices to connect to the network), you have a problem that requires action. There are many root causes here, but you may be seeing a NetBIOS conflict (especially if you're using an older system). If you disable WINS/NetBT name resolution, you may be able to clear things up. You could also try renaming computers and domains to resolve the issue.

IP conflicts: Windows usually makes sure that there's only one IP address per device that has access to the network at any given time. In rare situations, however, two devices may end up with the same IP address; when this happens, one device will usually be "blocked," which prevents the device from accessing protected files. To make matters worse, it can cause lag for all connected devices, not just the ones with the IP conflict. Avoiding this problem is relatively easy if you reconfigure your DHCP setup to exclude static IP addresses. This should reconfigure IP addresses so that all your machines can access the network without issue.

VOIP quality issues: Issues with voice calls, including delays, interruptions, and quality problems, can be caused by many variables (including some of the ones listed above). However, you may be experiencing a network stutter. Install jitter buffers, which create small caches or packets of VOIP information, to ensure a smooth stream of information from one point to another. You could also install new playback codecs, preferably ones with packet loss concealment as a main feature. While you're at it, update your drivers.

Connections with limited access: If you have a connection with limited access, you're likely receiving a Windows error message that's caused by a technical glitch. Windows has released updates that should prevent the majority of these errors from occurring, but if you encounter this situation, your best bet is to do a hard reset of your network router and the device trying to connect to it.

MYTHILI T
III B.Sc. (Information Technology)

BRIDGING THE HUMAN-COMPUTER INTERACTION

Compare that with how humans interact with each other, face to face: smiling, frowning, pointing and tone of voice all lend richness to communication. With the goal of revolutionizing everyday interactions between humans and computers, Colorado State University researchers are developing new technologies for making computers recognize not just traditional commands, but also non-verbal ones: gestures, body language and facial expressions.

The project, titled "Communication through Gestures, Expression and Shared Perception," is led by Professor of Computer Science Bruce Draper, and is bolstered by a recent $2.1 million grant from the Defense Advanced Research Projects Agency (DARPA) under its "Communicating with Computers" funding program. "Current human-computer interfaces are still severely limited," said Draper, who is joined on the project by CSU researchers from the computer science and mathematics departments. "First, they provide essentially one-way communication: users tell the computer what to do. This was fine when computers were crude tools, but more and more, computers are becoming our partners and assistants in complex tasks. Communication with computers needs to become a two-way dialogue."

The team has proposed creating a library of what are called Elementary Composable Ideas (ECIs). Like little packets of information recognizable to computers, each ECI contains information about a gesture or facial expression, derived from human users, as well as a syntactical element that constrains how the information can be read. To achieve this, the researchers have set up a Microsoft Kinect interface. A human subject sits down at a table with blocks, pictures and other stimuli. Their goal is to make computers smart enough to reliably recognize non-verbal cues from humans in the most natural, intuitive way possible. According to the project proposal, the work could someday allow people to communicate more easily with computers in noisy settings, or when a person is deaf or hard of hearing, or speaks another language.

SELVA BHARATHI A
I B.Sc. (Computer Technology)

ADAPTIVE SECURITY ARCHITECTURE

Adaptive architecture is a system which changes its structure, behaviour or resources according to demand. The adaptation made is usually to non-functional characteristics rather than functional ones. The term is something of a misnomer, because the thing that adapts is the working system, rather than the (more abstract) architecture which defines the adaptability that is required of that system.

Adaptive software architecture: Used by programmers in relation to a program. An adaptive algorithm "is an algorithm which changes its behaviour based on the resources available." For example, in the C++ Standard Library, the stable_partition algorithm acquires as much memory as it can get and applies the algorithm using that available memory.

Adaptive infrastructure architecture: Used by infrastructure engineers in relation to the configuration of processors. The computing resources used by applications (the partition size, the number of servers in a cluster, the share of a processor, or the number of processes) are configured so that they shrink or grow with demand.

Adaptive business architecture: Could also be used (for example) in connection with a workflow system that assigns human resources to a task or service to match the demand for that task or service, or an organization structure that flexes in response to business changes.
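A rough Python sketch of the "adaptive algorithm" idea, loosely modelled on the stable_partition behaviour mentioned above: the fast path allocates a temporary buffer, and if that allocation fails the function degrades to a slower in-place strategy. The memory probe is simplified for illustration.

    def stable_partition(items, pred):
        """Reorder items so those satisfying pred come first, preserving order."""
        try:
            # Fast path: use extra memory for a temporary buffer.
            rest = [x for x in items if not pred(x)]
            items[:] = [x for x in items if pred(x)] + rest
        except MemoryError:
            # Degraded mode: quadratic but allocation-free.
            insert_at = 0
            for i in range(len(items)):
                if pred(items[i]):
                    items.insert(insert_at, items.pop(i))
                    insert_at += 1
        return items

    data = [3, 8, 1, 6, 2, 9]
    print(stable_partition(data, lambda x: x % 2 == 0))   # [8, 6, 2, 3, 1, 9]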

RASIKA M
III B.Sc. (Computer Technology)

DIGITAL PLATFORM

A digital platform consists of many services, representing a unique collection of software or hardware services of a company used to deliver its digital strategy. Organizations look for the services that provide businesses with the best performance-to-cost ratio, but some services are almost always required for all applications or solutions. Digital platforms have gained importance in business due to this performance-to-cost advantage.

The use of platform thinking extends beyond the tech sector into the marketing industry. Marketing retailers are swiftly shifting from distribution channels selling products to engagement platforms co-creating value. Online retailers like eBay, Etsy, and Amazon led the way, and now traditional retailers are following the same trend. The innovation around the latest platform abilities is being driven by three transformative technologies: cloud, social, and mobile. The cloud enables a global infrastructure for production, allowing everyone to create content and applications for a global audience. Social networks connect people globally and maintain their identity online. Mobile devices such as phones and tablets allow connection to this global infrastructure anytime, anywhere. The result is a globally accessible network of entrepreneurs, workers, and consumers who are available to create businesses, contribute content, and purchase goods and services. The future will see more and more companies shifting from products to platforms.

Digital platforms allow companies to pursue business models that bring together multiple buyers and sellers. For example, designers of smart cities will build ecosystems to bring together partners of all sizes. Banks will bring new products and services to customers via an ecosystem of fintech (financial technology).

The digital platform strategy will vary from company to company depending on their internal workflow strategy. Some companies will develop a platform business model that encompasses providers, consumers, and employees to create or exchange goods, services, and social interaction, while some integrate with other organizations' digital platforms. Regardless of the setup, the platform strategy must be able to integrate business and IT needs to establish a collective leadership vision. Companies must decide what makes sense for their organization and long-term business goals.

G.LOKESHWAR
II B.Sc. (Information Technology)

MESH APP AND SERVICE ARCHITECTURE

The device mesh refers to an expanding set of endpoints people use to access applications and information or interact with people, social communities, governments and businesses. The device mesh includes mobile devices; wearable, consumer and home electronic devices; automotive devices; and environmental devices such as sensors in the Internet of Things (IoT).

Mesh networks can relay messages using either a flooding technique or a routing technique. With routing, the message is propagated along a path by hopping from node to node until it reaches its destination. To ensure that all its paths are available, the network must allow for continuous connections and must reconfigure itself around broken paths, using self-healing algorithms such as Shortest Path Bridging. Self-healing allows a routing-based network to operate when a node breaks down or when a connection becomes unreliable. As a result, the network is typically quite reliable, as there is often more than one path between a source and a destination in the network. Although mostly used in wireless situations, this concept can also apply to wired networks and to software interaction.
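A minimal sketch of the routing idea: breadth-first search over a small node graph finds a hop-by-hop path, and re-running it with a node removed mimics self-healing around a failure. The topology is invented for the example, and real protocols such as Shortest Path Bridging are far more involved.

    from collections import deque

    def route(graph, src, dst):
        """Breadth-first search: returns a hop-by-hop path, or None."""
        queue, seen = deque([[src]]), {src}
        while queue:
            path = queue.popleft()
            if path[-1] == dst:
                return path
            for hop in graph.get(path[-1], ()):
                if hop not in seen:
                    seen.add(hop)
                    queue.append(path + [hop])
        return None

    mesh = {"A": ["B", "C"], "B": ["A", "D"], "C": ["A", "D"], "D": ["B", "C"]}
    print(route(mesh, "A", "D"))      # ['A', 'B', 'D']

    # Self-healing in miniature: drop node B and route around the break.
    degraded = {n: [m for m in nb if m != "B"] for n, nb in mesh.items() if n != "B"}
    print(route(degraded, "A", "D"))  # ['A', 'C', 'D']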

G.LOKESHWAR
II B.Sc. (Information Technology)

BLOCKCHAIN TECHNOLOGY

A blockchain, originally block chain, is a continuously growing list of records, called blocks, which are linked and secured using cryptography. Each block typically contains a hash pointer as a link to a previous block, a timestamp and transaction data. By design, blockchains are inherently resistant to modification of the data. The Harvard Business Review describes it as "an open, distributed ledger that can record transactions between two parties efficiently and in a verifiable and permanent way." For use as a distributed ledger, a blockchain is typically managed by a peer-to-peer network collectively adhering to a protocol for validating new blocks. Once recorded, the data in any given block cannot be altered retroactively without the alteration of all subsequent blocks, which requires collusion of the network majority.

Blockchains are secured by design and are an example of a distributed computing system with high Byzantine fault tolerance. Decentralized consensus has therefore been achieved with a blockchain. This makes blockchains potentially suitable for the recording of events, medical records, and other records management activities, such as identity management, transaction processing, documenting provenance, food traceability or voting.

The first blockchain was conceptualized in 2008 by an anonymous person or group known as Satoshi Nakamoto and implemented in 2009 as a core component of bitcoin, where it serves as the public ledger for all transactions. The invention of the blockchain for bitcoin made it the first digital currency to solve the double spending problem without the need of a trusted authority or central server. The bitcoin design has been the inspiration for other applications.

Openness

Open blockchains are more user-friendly than some traditional ownership records, which, while open to the public, still require physical access to view. Because all early blockchains were permissionless, controversy has arisen over the blockchain definition. An issue in this ongoing debate is whether a private system with verifiers tasked and authorized (permissioned) by a central authority should be considered a blockchain. Proponents of permissioned or private chains argue that the term "blockchain" may be applied to any data structure that batches data into time-stamped blocks. These blockchains serve as a distributed version of multiversion concurrency control (MVCC) in databases. Just as MVCC prevents two transactions from concurrently modifying a single object in a database, blockchains prevent two transactions from spending the same single output in a blockchain. Opponents say that permissioned systems resemble traditional corporate databases, not supporting decentralized data verification, and that such systems are not hardened against operator tampering and revision. Computerworld claims that "many in-house blockchain solutions will be nothing more than cumbersome databases." The Harvard Business Review defines blockchain as a distributed ledger or database open to anyone.

Permissionless

The great advantage of an open, permissionless, or public, blockchain network is that guarding against bad actors is not required and no access control is needed. This means that applications can be added to the network without the approval or trust of others, using the blockchain as a transport layer.

Bitcoin and other cryptocurrencies currently secure their blockchain by requiring new entries to include a proof of work. To prolong the blockchain, bitcoin uses Hashcash puzzles developed by Adam Back in the 1990s.
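The hash-linking and proof-of-work ideas fit in a few lines of Python. The sketch below chains blocks by hash and searches for a nonce that meets a toy difficulty target, in the spirit of the Hashcash puzzles mentioned above; the block layout and the difficulty are invented for the example.

    import hashlib
    import json

    def block_hash(block):
        # Hash the whole block; changing the data or the link changes the hash.
        return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

    def mine(prev_hash, transactions, difficulty=4):
        """Find a nonce so the block hash starts with `difficulty` zeros."""
        nonce = 0
        while True:
            block = {"prev": prev_hash, "tx": transactions, "nonce": nonce}
            h = block_hash(block)
            if h.startswith("0" * difficulty):   # toy proof-of-work target
                return block, h
            nonce += 1

    genesis, h0 = mine("0" * 64, ["genesis"])
    block1, h1 = mine(h0, ["alice -> bob: 5"])
    # block1["prev"] == h0, so altering the genesis block changes h0 and
    # invalidates block1: the chain must be re-mined from that point on.
    print(h0, h1, sep="\n")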

Financial companies have not prioritised decentralized blockchains. In 2016, venture capital investment for blockchain-related projects was weakening in the USA but increasing in China. Bitcoin and many other cryptocurrencies use open (public) blockchains. As of November 2017, bitcoin has the highest market capitalization.

Permissioned (private) blockchain

Permissioned blockchains use an access control layer to govern who has access to the network. In contrast to public blockchain networks, validators on private blockchain networks are vetted by the network owner. They do not rely on anonymous nodes to validate transactions, nor do they benefit from the network effect. Permissioned blockchains can also go by the name of 'consortium' or 'hybrid' blockchains. The New York Times noted in both 2016 and 2017 that many corporations are using blockchain networks "with private blockchains, independent of the public system."

Disadvantages

Nikolai Hampton pointed out in Computerworld that "There is also no need for a '51 percent' attack on a private blockchain, as the private blockchain already controls 100 percent of all block creation resources. If you could attack or damage the blockchain creation tools on a private corporate server, you could effectively control 100 percent of their network and alter transactions however you wished." This has a set of particularly profound adverse implications during a financial crisis or debt crisis like the financial crisis of 2007–08, where politically powerful actors may make decisions that favor some groups at the expense of others. Hampton added that "the bitcoin blockchain is protected by the massive group mining effort. It's unlikely that any private blockchain will try to protect records using gigawatts of computing power; it's time consuming and expensive." He also said, "Within a private blockchain there is also no 'race'; there's no incentive to use more power or discover blocks faster than competitors. This means that many in-house blockchain solutions will be nothing more than cumbersome databases."

G.LOKESHWAR
II B.Sc. (Information Technology)

DIGITAL TWIN

Digital twin refers to a digital replica of physical assets, processes and systems that can be used for various purposes. The digital representation provides both the elements and the dynamics of how an Internet of Things device operates and lives throughout its life cycle.
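As a sketch of this digital-replica idea, the toy class below mirrors a device's reported state as sensor readings arrive, in the spirit of the "device shadow" described later in this article. The device name, fields and readings are invented for illustration.

    class DigitalTwin:
        """Keeps a near real-time copy of a physical device's state."""

        def __init__(self, device_id):
            self.device_id = device_id
            self.state = {}     # last known properties of the physical object
            self.history = []   # past readings, usable for prognostics

        def update(self, reading):
            # Each sensor reading refreshes the mirrored state.
            self.state.update(reading)
            self.history.append(dict(reading))

    twin = DigitalTwin("turbine-7")
    twin.update({"rpm": 3000, "temp_c": 410})
    twin.update({"temp_c": 425, "vibration_mm_s": 2.1})
    print(twin.state)   # {'rpm': 3000, 'temp_c': 425, 'vibration_mm_s': 2.1}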

Digital twins integrate artificial intelligence, machine learning and software analytics with data to create living digital simulation models that update and change as their physical counterparts change. A digital twin continuously learns and updates itself from multiple sources to represent its near real-time status, working condition or position. This learning system learns from itself, using sensor data that conveys various aspects of its operating condition; from human experts, such as engineers with deep and relevant industry domain knowledge; from other similar machines; from other similar fleets of machines; and from the larger systems and environment of which it may be a part. A digital twin also integrates historical data from past machine usage to factor into its digital model.

In various industrial sectors, twins are being used to optimize the operation and maintenance of physical assets, systems and manufacturing processes. The digital twin is a formative technology for the Industrial Internet of Things, where physical objects can live and interact with other machines and people virtually.

An example of how digital twins are used to optimize machines is the maintenance of power generation equipment such as power generation turbines, jet engines and locomotives.

Another example of digital twins is the use of 3D modelling to create digital companions for the physical objects. A digital twin can be used to view the status of the actual physical object, which provides a way to project physical objects into the digital world. For example, when sensors collect data from a connected device, the sensor data can be used to update a "digital twin" copy of the device's state in real time; the term "device shadow" is also used for the concept of a digital twin. The digital twin is meant to be an up-to-date and accurate copy of the physical object's properties and states, including shape, position, gesture, status and motion.

A digital twin also can be used for monitoring, diagnostics and prognostics to optimize asset performance and utilization. In this field, sensory data can be combined with historical data, human expertise and fleet and simulation learning to improve the outcome of prognostics.

NIVEDHA N
III B.Sc. (Computer Technology)

PUZZLE

A solid, four-inch cube of wood is coated with blue paint on all six sides. Then the cube is cut into smaller one-inch cubes. These new one-inch cubes will have three blue sides, two blue sides, one blue side, or no blue sides. How many of each will there be?


Puzzle Solution:

Cutting the four-inch cube into one-inch cubes yields 4 x 4 x 4 = 64 cubes. First count the cubes with at least one painted face by tallying the outside layer: 16 on top, 16 on the bottom, 8 more on the left, 8 more on the right, 4 more on the front and 4 more on the back, for 56 in all. So only 64 - 56 = 8 cubes, the hidden 2 x 2 x 2 block in the middle, have no blue side.

The only cubes with blue on exactly one side are the four centre squares of each face: 4 x 6 faces = 24 cubes.

The only cubes with blue on three sides are the corner pieces; a cube has 8 corners, so 8 cubes.

The cubes with two blue sides are the edge cubes between the corners: two on each edge of the top and bottom faces (2 x 4 edges x 2 faces = 16), plus the vertical edge pieces (2 x 4 = 8), giving 24.

No blue sides = 8, one side = 24, two sides = 24, three sides = 8. Total = 64 cubes.
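The counts can also be checked by brute force: the short script below (an illustrative addition, not part of the original solution) walks over all 64 positions and counts how many coordinates of each small cube touch the outside of the block.

    from collections import Counter

    counts = Counter()
    for x in range(4):
        for y in range(4):
            for z in range(4):
                # A face is painted for each coordinate on the boundary.
                painted = sum(c in (0, 3) for c in (x, y, z))
                counts[painted] += 1

    print(sorted(counts.items()))
    # [(0, 8), (1, 24), (2, 24), (3, 8)]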

NIVEDHA N
III B.Sc. (Computer Technology)

RIDDLES

1. What can travel around the world while staying in a corner?
Ans: A stamp.

2. What has a head and a tail, but no body?
Ans: Coin.

3. What has an eye but cannot see?
Ans: Needle.

4. There was a green house. Inside the green house there was a white house. Inside the white house there was a red house. Inside the red house there were lots of babies. What is it?
Ans: Watermelon.

5. What kind of room has no doors or windows?
Ans: Mushroom.

6. Which creature walks with four legs in the morning, two legs in the afternoon, and three legs in the evening?
Ans: Man. He crawls on all fours as a baby, then walks on two feet as an adult, and then walks with a cane as an old man.

7. If you have me, you want to share me. If you share me, you haven't got me. What am I?
Ans: Secret.

8. Forward I am heavy, but backward I am not. What am I?
Ans: Forward I am "ton", backwards I am "not".

9. What is as light as a feather, but even the world's strongest man couldn't hold it for more than a minute?
Ans: His breath.

10. What is always coming but never arrives?
Ans: Tomorrow.

SINDUJA T
I B.Sc. (Computer Technology)
