Concise reflections on the evolution of technology and industry. Edited by CT.TA.STC

Issue 13 – 7 June 2019

The Triggering News archive is available at this link

5G, FutureNet and Technologies

• The economic slaughter of European telcos over 10 years compared with the rest of the world
• Microsoft to build "sustainable" data centers in Sweden
• Google plans fourth Belgian data center for €600 million
• Gogo to connect aircraft in flight with 5G
• AT&T uses climate-change models to deploy new cell sites
• Key trends that will define the cloud in 2020
• Google to invest $670m to build a second data center in Hamina, Finland
• Le Voice Lab: 20 startups for a European alternative to Alexa
• Amazon to launch its own MVNO with potential acquisition of Boost
• Amazon is considering a bid for Boost Mobile: Reuters
• Amazon seen spreading its tentacles to 5G with T-Mobile interest
• Telecom operators based on Capex in 2018

Services and Devices

• Your Apple iTunes Listening Data Is Only Worth About 8 Cents | Digital Trends
• Apex Legends: How a video game supported a million concurrent players on its second day
• MIT scientists: what if we were living in a "simulation"?
• An update on Sunday's service disruption | Google Cloud Blog
• iRobot's new Roomba and Braava mop can clean together automatically
• Virgin Media launches Plume smart home service

5G, FutureNet and Technologies

The economic slaughter of European telcos over 10 years compared with the rest of the world. The result of European regulatory policies, without a doubt.

However, European telcos are struggling the most and that pain shows few signs of abating. Over the past decade, total mobile revenues and EBITDA have declined at compound annual rates of 4.4% and 5.7%, respectively (accounting for currency fluctuations, the impact appears less pronounced). Some of the region's most important figures are expressing their concern. Stéphane Richard, Orange's CEO, said the value of European telcos is now at a 15-year low, raising questions about return on investment, strict regulation and the risk of the region falling behind. "We are investing less in telecoms and technology than anywhere in the world. Europe is clearly not in a good position to win," Richard said. Amid heavy regulation, market consolidation, inflated spectrum valuations and increased competition, European markets have performed well below their US and Asian counterparts.
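As a rough check on what those compound annual rates imply cumulatively over the decade, a minimal sketch (the 4.4% and 5.7% figures come from the paragraph above; the cumulative percentages are plain arithmetic, not from the source):

```python
# Cumulative effect of a negative compound annual growth rate (CAGR) over 10 years.
# Rates are the 4.4% and 5.7% annual declines cited above for revenues and EBITDA.

def cumulative_change(cagr: float, years: int) -> float:
    """Total change implied by compounding `cagr` for `years` periods."""
    return (1 + cagr) ** years - 1

for label, cagr in [("Mobile revenues", -0.044), ("Mobile EBITDA", -0.057)]:
    print(f"{label}: {cumulative_change(cagr, 10):+.1%} over 2008-2018")

# Mobile revenues: -36.2% over 2008-2018
# Mobile EBITDA:   -44.4% over 2008-2018
```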

Exhibit 1: Mobile revenue growth, 2008-2018 ($ billions)

Exhibit 2: Mobile EBITDA growth, 2008-2018 ($ billions)

Tags: Economics
Source: https://www.deltapartnersgroup.com/state-european-telcos-what-left-europe-behind-race

Microsoft to build "sustainable" data centers in Sweden Unfortunately, Italy is left out of these investments ... a pity

With renewable energy supplied by Vattenfall, Microsoft will build large-scale data centers in Gävle and Sandviken, Sweden, with the aim of creating the company's most sustainable data centers to date. The company will work with Swedish utility provider Vattenfall to power the facilities with 100 percent renewable energy and to work toward zero-waste operations for the data centers.

Green giant Vattenfall will also work with Microsoft to develop new power infrastructure in the region to create more stable power for the facilities and the surrounding area. Andreas Regnell, SVP of strategic development at Vattenfall, said: "Vattenfall Distribution as the regional network owner will construct and build the distribution infrastructure required to connect the large-scale facilities. Over time, the new infrastructure will help further reduce the carbon footprint of the data centers, while at the same time reinforce an already strong electricity grid in Gävle and Sandviken to the benefit of the people who live there." Gävle is a city in central Sweden with just over 100,000 inhabitants, with the smaller Sandviken found 25km to the west of it. Both towns are about two hours' drive from Stockholm. Microsoft acquired 130 hectares of land across the two sites in Sweden in December of last year for SEK 269 million (US$29.6m).

Tags: Cloud Data Center
Source: https://www.datacenterdynamics.com/news/microsoft-build-sustainable-data-centers-sweden/

Google plans fourth Belgian data center for €600 million Some sell, some buy

Google will spend €600 million ($670m) building a fourth data center in Saint-Ghislain, Belgium. The investment in the Belgian campus will be Google’s largest in the country to date and construction is expected to finish in 2021.

Money, money, money. Saint-Ghislain was Google's first data center site in Europe; the facility opened in 2010 and cost the company €250m ($280m). In 2015, a second building was added for €300m ($336m). Google's third data center on the campus is still under construction and is due to open by the end of this year. Local business publication De Tijd reported last year that the tech giant spent €250m ($280m) on the facility. This fourth data center will bring Google's total investment in the campus to €1.6 billion ($1.79bn). Interim Prime Minister of Belgium, Charles Michel, said: "This new investment in Saint-Ghislain is excellent news for Wallonia and Belgium and confirms our digital pioneering position acquired in recent years. Our country plays a leading role in Europe in developing the growing digital economy that creates many jobs.

“Our know-how and skilled workforce is driving service providers such as Google and many start-ups to continue investing and growing here." Less than a week ago, Google invested €600m ($670m) to build a second data center in Hamina, Finland. The tech giant’s original Finnish facility opened in 2011, making it Google’s second European data center. In 2013 Google invested €450m ($504m) to nearly double the size of the facility.

Tags: Cloud Data Center
Source: https://www.datacenterdynamics.com/news/google-plans-600-million-on-fourth-belgian-data-center/

Gogo to connect aircraft in flight with 5G For now only in the US ... in Europe a similar 4G service already exists, run by DT

The new air-to-ground (ATG) network will be designed for use on business aviation aircraft, commercial regional jets and smaller mainline jets operating within the contiguous United States and Canada. The 5G network will leverage Gogo's existing infrastructure of more than 250 towers and will use unlicensed spectrum in the 2.4GHz range, along with a proprietary modem and advanced beamforming technology. Gogo said its 5G infrastructure will support all spectrum types (licensed, shared, unlicensed) and bands (mid, high, low), and will allow Gogo to take advantage of new advances in technology as they are developed.

Tags: 5G
Source: https://concourse.gogoair.com/gogo-to-launch-5g-network-for-aviation/

AT&T uses climate-change models to deploy new cell sites

AT&T is working with the U.S. Department of Energy's Argonne National Laboratory to assess the risks that climate change might pose for its business. The carrier has created a Climate Change Analysis Tool based on Argonne's regional climate modeling, which is capable of visualizing climate impacts at the neighborhood level 30 years into the future. Kelly King, who was recently named executive vice president of postpaid wireless products at AT&T, did a short stint in AT&T's Data Office. The group is using machine learning and artificial intelligence for a variety of initiatives, including climate-change modeling. He said AT&T is concerned about the impact that climate change will have on investment decisions for things such as cell tower siting.

Working with Argonne, the carrier used historical weather models and flood-plain data to determine where to deploy infrastructure. "We worked with them and took climate modeling, and applied the variables associated with climate change and predicted the temperature rise over the course of 25 years," said King. He said the original models took into account inland and coastal flooding, among other factors, but could only predict within about a 200-square-meter accuracy. "That's too big for us to make infrastructure decisions," said King. The partners then started using machine learning and artificial intelligence techniques to consider additional variables such as soil conditions and hydrology. "We then applied those models and got to 12x12-meter accuracy," he said. That granularity is helping AT&T to make decisions about where to place cell sites. Jason Porter is AT&T's new chief data officer.
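The workflow described above (coarse climate projections refined with additional local variables via machine learning to score candidate sites) might look roughly like the following hypothetical sketch; the features, model choice and data are illustrative assumptions, not AT&T's or Argonne's actual pipeline:

```python
# Hypothetical sketch: score candidate cell-site locations for flood risk by
# combining coarse climate-model output with local features (soil, hydrology),
# in the spirit of the AT&T/Argonne workflow described above.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Illustrative training data: one row per historical location.
# Columns: projected temperature rise, flood-exposure index, soil permeability, hydrology index.
X = rng.random((500, 4))
# Synthetic "flood damage" label, just to have something to fit.
y = 0.5 * X[:, 1] + 0.3 * (1 - X[:, 2]) + 0.2 * X[:, 3] + rng.normal(0, 0.05, 500)

model = GradientBoostingRegressor().fit(X, y)

# Score two hypothetical 12x12 m candidate cells and keep the lower-risk one.
candidates = np.array([[0.7, 0.9, 0.2, 0.8],   # low-lying, poor drainage
                       [0.7, 0.1, 0.8, 0.2]])  # elevated, well drained
risk = model.predict(candidates)
print("Predicted relative risk:", risk, "-> build at site", int(risk.argmin()))
```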

Source: https://www.fiercewireless.com/operators/at-t-uses-climate-change-models-to-site-new-infrastructure?utm_source=internal&utm_medium=rss

Key trends that will define the cloud in 2020

Edge computing will reinvent the cloud
Cloud computing can be described as a centralized data center remotely running thousands of physical servers. However, one of the biggest opportunities brought by cloud computing is a distributed cloud infrastructure, known as edge computing. Organizations require near-instant access to data and computing power to serve their customers, and they are increasingly looking to edge computing to provide a suitable infrastructure. IoT requires the collection and processing of large amounts of data in real time with low latency, and edge computing is an essential technology for enabling this. By sending only the most important information to the cloud, as opposed to raw streams of it, edge computing will help IoT systems significantly lower connectivity costs. One of the most popular use cases is sensor-enabled field equipment that analyses and filters data prior to sending it, rather than taxing network and computing resources with raw streams. Edge computing is not the final stage in the cloud's development, but an important element driving mass adoption of the cloud.
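A minimal sketch of the edge-filtering pattern described above: a gateway summarises raw sensor readings locally and forwards only compact aggregates or alerts upstream (the threshold and the send_to_cloud stand-in are illustrative assumptions, not a specific product's API):

```python
# Edge-computing sketch: filter/aggregate sensor data locally, send only what matters upstream.
from statistics import mean

ALERT_THRESHOLD = 90.0  # illustrative limit, e.g. degrees Celsius


def send_to_cloud(payload: dict) -> None:
    # Stand-in for the upstream call (MQTT publish, HTTPS POST, ...).
    print("-> cloud:", payload)


def process_window(readings: list[float]) -> None:
    """Forward a compact summary instead of the raw stream; flag anomalies."""
    summary = {"count": len(readings), "mean": round(mean(readings), 2), "max": max(readings)}
    if summary["max"] > ALERT_THRESHOLD:
        summary["alert"] = True
    send_to_cloud(summary)  # a few bytes instead of thousands of raw samples


process_window([71.2, 70.8, 72.5, 95.3, 71.9])
```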

Continued containerization
Moving forward we will see wider adoption of cloud containers, the technology that allows developers to manage and migrate software code to public cloud servers. In a recent study, Forrester estimated that a third of enterprises already use container technology in software development. Additionally, 451 Research recently forecast that the container market will reach $2.7 billion in 2020, growing about 40 per cent compared to this year. Containerization is the best solution for businesses using multi-cloud infrastructure, as the technology enables portability between AWS, Azure and Google Cloud, and speeds up software production when combined with DevOps strategies. Kubernetes is becoming the biggest trend in container deployment by enabling OS-level virtualization instead of hardware virtualization. It is clear that container technology will become one of the key development frameworks in the near future.
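A minimal sketch of the portability property described above, assuming a local Docker engine and the docker Python SDK (docker-py) are installed: the same public image runs unchanged whether the host is a laptop or a VM on AWS, Azure or Google Cloud:

```python
# Container portability sketch: the same image runs unchanged on a laptop or on an
# AWS/Azure/GCP virtual machine, as long as a Docker engine is available.
# Assumes the `docker` Python SDK (docker-py) and a running Docker daemon.
import docker

client = docker.from_env()
logs = client.containers.run(
    image="python:3.11-slim",                      # public image, identical everywhere
    command=["python", "-c", "print('hello from a container')"],
    remove=True,                                   # clean up the container after the run
)
print(logs.decode().strip())
```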

Mainstream serverless adoption
"Serverless computing" can be perceived as a misleading term, as applications still run on servers. Nevertheless, with serverless computing a cloud provider manages code execution and charges for it only when the code is actually running. By adopting serverless computing, businesses can avoid provisioning and maintaining servers while producing code. Serverless computing gained mainstream popularity back in 2014, when AWS unveiled Lambda at its re:Invent conference, followed more recently by the announcement of AWS's open source project Firecracker. Serverless computing is predicted to be one of the biggest developments in the cloud space; however, the serverless transition requires a strategic approach. Moving to serverless infrastructure requires an overhaul of the traditional development and production paradigm, meaning outsourcing the entire infrastructure to the cloud.
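For concreteness, a minimal AWS Lambda-style handler in Python: the provider invokes the function only when an event arrives and bills only for that execution (the event fields are illustrative assumptions):

```python
# Minimal AWS Lambda-style handler: no server to provision; the platform invokes
# lambda_handler per event and charges only for execution time.
import json


def lambda_handler(event, context):
    name = event.get("name", "world")  # illustrative event field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }


# Local smoke test (in production the cloud platform supplies event and context).
if __name__ == "__main__":
    print(lambda_handler({"name": "serverless"}, None))
```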

Serverless computing will be adopted and developed together with a growing number of use cases, so its full potential is hard to predict right now. Currently available solutions tend to lock customers into a specific cloud provider, but a growing number of open source solutions in this space will provide more use cases for serverless computing across the industry.

The rise of open source
Open source software is growing in popularity among enterprises, as an increasing number of organizations integrate open source solutions into their IT operations or even build entire infrastructures around them. Up to 60 per cent of IT decision makers recently surveyed by Black Duck Software reported already using open source software, with more than half contributing to open source projects. Open source thrives in the ecosystem created by cloud computing. The growing number of open source DevOps tools and automation and infrastructure platforms such as OpenStack and Kubernetes is playing a key part in fast-growing open source adoption. As organizations continue to migrate their operations to the cloud, open source will play one of the key parts in IT innovation beyond 2020.

Source: https://www.datacenterdynamics.com/opinions/key-trends-will-define-cloud-2020/

Google to invest $670m to build a second data center in Hamina, Finland An expansion of the giant's sea-water-cooled Nordic flagship

Google will invest €600 million ($670m) to build a second data center in Hamina, Finland. The new facility will be located in the old Summa paper factory, like its predecessor, which Google opened in 2011. The national and local governments welcomed the news, according to YLE.fi. Permanent secretary of the ministry of economic affairs and employment Jari Gustafsson said: "Google's significant data center investment is good news and an indication of a stable and competitive environment. This investment strengthens Finland's digital infrastructure." In Hamina, the municipal head Hannu Muhonen said: "This is the largest ever single investment in Hamina's history and it will have a positive effect on employment during the construction phase and after. The investment also increases the attractiveness of the whole region. For over a decade, cooperation between Google and the City of Hamina has worked very well." Hamina is located on the coast, some 145 kilometers east of the Finnish capital Helsinki, and has a population of slightly over 20,000 people. The existing Hamina facility was opened in 2011 and has been upgraded several times since, most recently in 2018, when Google opened a cloud region in Finland. The facility is cooled by sea water - a cooling system it inherited from the paper plant which preceded it - and Google has financed wind farms to help power it.

Tags: Cloud Data Center
Source: https://www.datacenterdynamics.com/news/google-invest-670m-to-build-second-facility-hamina-finland-/

Le Voice Lab: 20 startups for a European alternative to Alexa A pan-European attempt by 20 startups to react to US dominance with a federated European initiative for intelligent assistants

Our Manifesto: Creating voice applications, and voice assistants in particular, requires putting together numerous and sometimes complex technical building blocks (ASR, NLU, NLP, TTS, application logic, ...), whether those applications run on dedicated consumer devices (Amazon's Alexa, Google Home, ...), on specialised devices (cars, home-automation systems, ...) or on mobile devices (smartphones, tablets, ...). Large American (GAFAM) and Chinese (BATX) industrial players offer systems that simplify the creation of such applications, but do so through platforms that favour their own industrial strategy. Any company that develops and deploys applications on these systems therefore creates value for operators who are, or will be, its competitors, if not its predators. Providers of voice applications, starting with large French and European companies, therefore want to be able to develop and operate services that run on any system, without depending exclusively on those offered by the GAFAM/BATX. This is possible because many players (software vendors, service companies, research laboratories, independent developers, ...) offer components that can be assembled into voice applications perfectly adaptable to each party's strategy. But adopting them requires a guarantee that these developments, on which a growing share of their promoters' strategy will soon rest, will offer at least the same ease of development, the same reliability and the same longevity as the systems proposed by the GAFAM. That requirement is hard to reconcile with the fragmentation of the technical offerings and the disparity of the players to be mobilised. This is why these players have come together in the "Le Voice Lab" initiative, a forum bringing together any French company that produces or uses voice technologies. Le Voice Lab fosters the creation of coherent, reliable, scalable and interoperable offerings, both technically and commercially.

It also formalises the wish of all its members that voice applications, the next frontier of the digital revolution, be developed with respect for the interests of their French and European providers and users.

Source: http://www.levoicelab.org/

Amazon to launch its own MVNO with potential acquisition of Boost An infrastructure-based MVNO that would preserve a fourth operator

Amazon is just one of a string of parties interested in acquiring Boost. US web giant Amazon is reportedly preparing a bid for T-Mobile's pay-as-you-go subsidiary, Boost. T-Mobile is keen to shed Boost as it looks to allay concerns that its proposed merger with Sprint would reduce competition in the pay-as-you-go sector. By selling off Boost, the US would retain a fourth, at-scale carrier.

Sources familiar with the matter told journalists at Reuters that Amazon would be keen to acquire Boost as well as any additional spectrum that could be divested. Amazon has long been rumoured to be interested in expanding into the mobile telecoms sector, with reports in the UK suggesting that the US giant was considering launching MVNO services in Britain. A recent report suggested that Amazon would capture a huge portion of the pay-as-you-go market if it were to launch MVNO services coupled with its Amazon Prime services. "Amazon could well become an even bigger giant should it enter the mobile sector offering Amazon Prime as a perk. In a fiercely competitive market, it also has potential as a standalone network operator," said Henry Stott, director of Decision Technology, who published the report. Recent reports in the press suggested that T-Mobile had lined up a string of suitors who were prepared to pay up to $3 billion for the pay-as-you-go sub.

Tags: 5G
Source: https://www.totaltele.com/503023/Amazon-to-launch-its-own-MVNO-with-potential-acquisition-of-Boost

Amazon is considering a bid for Boost Mobile: Reuters Boost Mobile has around 8 million users

Amazon is interested in getting into the wireless business. The online retail giant is considering a bid for Sprint's prepaid business Boost Mobile, according to a report from Reuters that cited unnamed sources. Sprint is preparing to sell off the prepaid Boost Mobile business as a condition of its proposed merger with T-Mobile. A handful of companies have already indicated their interest in acquiring it, including prepaid wireless firm Q Link, an unnamed private equity firm associated with FreedomPop, and Boost Mobile's founder, Peter Adderton. Reuters noted that Amazon is interested in the deal because it would give the retailer access to the new T-Mobile's wireless network for six years. If the merger is approved, Sprint and T-Mobile may also be forced to divest some spectrum assets. According to Reuters' report, Amazon would also be interested in obtaining licenses for that spectrum should they become available. News of Amazon's potential bid caused carrier stocks to drop Friday morning, signalling investor fears that Amazon may be able to use its scale to pose a significant competitive threat. Amazon's interest in wireless spectrum and the prepaid business is surprising, though the company has already dabbled in the handset business. In 2014, Amazon unveiled the Fire phone, which was sold exclusively through AT&T for a period of time. The phone performed so badly that Amazon ended up reducing the price from $200 to just 99 cents a few weeks after releasing it. Amazon later took a $170 million write-down on the experiment. Amazon isn't the only e-commerce giant interested in wireless: Japan-based Rakuten is building a new mobile network to challenge operators NTT DoCoMo, KDDI, and SoftBank in Japan.

Tags: 5G, cloud
Source: https://www.fiercewireless.com/wireless/amazon-considering-a-bid-for-boost-mobile-reuters

Amazon seen spreading its tentacles to 5G with T-Mobile interest Amazon wants to be able to offer customers 5G connectivity and cloud in a single offer for maximum quality

One influential analyst called Amazon.com's reported interest in the U.S. wireless market "batshit crazy." Another said it was a "trojan horse" to kill off the big wireless players. What's becoming clear to the people who track the company is that the ambitions of the world's largest online retailer go far beyond replacing your phone service. Amazon is interested in acquiring Boost Mobile, which will be sold by U.S. wireless carriers T-Mobile US Inc and Sprint Corp as part of plans to win regulatory approval of their proposed $26.5 billion merger, primarily because a purchase would include a six-year contract that allows access to the combined company's wireless network, Reuters reported on Thursday, citing a source familiar with the matter. The ambition signaled that Amazon is looking to dive deeper into the wireless industry, strengthen its cloud services and ultimately take advantage of the next-generation 5G networks that are expected to transform major facets of technology, according to analysts.

That means the tech giant, which has evolved from a mere bookseller to a superstore of physical and digital products, could be setting its sights on taking on carriers such as AT&T Inc and strengthening itself against rival cloud providers like Microsoft. Amazon did not immediately respond to a request for comment. Shares of Verizon Communications Inc and AT&T both closed down more than 4% on Friday. The market assumes Amazon wants to become a wireless carrier itself, "but that thinking is too small," said Colby Synesael, an analyst with Cowen. Amazon's larger motivation could be that it expects 5G to be integral to cloud services in the future for industries like healthcare and autos, which will use 5G-enabled devices, he said. These devices will connect to Amazon's cloud, which will store the data. If Amazon can control both the wireless network and the cloud, it can sell the full suite of products to customers who want to build 5G services, which would give it an advantage over competing cloud providers, he said. Barclays analyst Kannan Venkateshwar also saw a wide opportunity for Amazon. "In a 5G world, the opportunity from wireless for Amazon could include everything from managing warehouses … to reducing churn at Prime," he said in a research note to clients on Friday. Prime is Amazon's paid subscription service that includes free delivery and access to streaming music and video. Amazon has dabbled in this arena before. A 2014 attempt to take on Apple's iPhone with the Fire Phone failed just a year later. In 2007 Amazon launched its free Whispernet wireless service with the first Kindle reading device, which let readers download books over the air. Last month, Amazon announced plans to launch over 3,000 satellites to provide access to rural areas around the world.

WIRELESS BUSINESS
If Amazon decides to compete as a more traditional wireless provider, "the impact for the wireless carriers, and particularly Verizon and AT&T, would be bleak," said Jonathan Chaplin, an analyst with New Street Research. "They are happy to accept losses for years in order to eventually own a market," he said. Craig Moffett of MoffettNathanson, the analyst who called the idea of Amazon wanting to enter wireless "batshit crazy," said the prospect of Amazon wanting "to operate their own proprietary network for such purposes is economically insane." The sheer cost of attempting to build a national network would be an obstacle, he said, noting that Verizon and AT&T have each spent some $120 billion on their networks in the last decade alone. T-Mobile and Sprint offered a series of concessions last week, including selling Boost, in order to gain approval for the merger. While the Federal Communications Commission has given the green light, the deal still needs approval from the U.S. Department of Justice.

Tags: 5G, cloud
Source: https://www.reuters.com/article/us-sprint-corp-m-a-t-mobile-us-amazon-co/amazon-seen-spreading-its-tentacles-to-5g-with-t-mobile-interest-idUSKCN1T12IN?rpc=401&

Telecom operators based on Capex in 2018

MTN Consulting, a leading research agency, has released the latest Capex report for top telecom operators.

Source: https://telecomlead.com/telecom-services/telecom-operators-based-on-capex-in-2018-90163

Services and Devices

Your Apple iTunes Listening Data Is Only Worth About 8 Cents | Digital Trends

Eight US cents is the price at which an Apple iTunes customer profile is allegedly sold to third parties

The lawsuit cites a company called Carney Direct Marketing, which advertised iTunes and Pandora user data at a base price of $80 per thousand records — or 8 cents per user in the database. Carney Direct Marketing offers additional demographic data — including age, education, gender, geography, household income, marital status, and presence of children — for added fees ranging from 0.8 to 1 cent per user. That's $8 to $10 per thousand users.
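The per-user figures quoted above follow directly from the per-thousand list prices; a quick arithmetic check, using only the numbers in the paragraph:

```python
# Arithmetic behind the quoted prices: list prices are per thousand records.
base_price_per_thousand = 80.00                      # USD, iTunes/Pandora base list price
print(base_price_per_thousand / 1000)                # 0.08 -> about 8 cents per user

for cents_per_user in (0.8, 1.0):                    # demographic add-on fees
    print(cents_per_user / 100 * 1000)               # 8.0 and 10.0 USD per thousand users
```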

Tags: Big Data
Source: https://www.digitaltrends.com/apple/apple-itunes-lawsuit-music-private-data-worth/?utm_source=sendgrid&utm_medium=email&utm_campaign=daily-brief

Apex Legends: How a video game supported a million concurrent players on its second day If you want scale and reliability, go multi-cloud

The secret is multi-cloud infrastructure from the Big Three

Launching a new online video game is hard – besides making something that’s actually enjoyable to play, you have to think about latency and bandwidth, doing everything in your power to avoid lag – the delay between player input and action on the screen. Lag is born when networks and servers can’t keep up with demand – when developers don’t reserve enough capacity, can’t scale fast enough, or their data centers are located too far away. This is less of a problem than it used to be, but AAA game launches are an exception: player demand is hard to predict on day one, and a simultaneous worldwide release of a hotly anticipated title can, in effect, resemble a DDoS attack. Such launches rarely go smoothly [AAA is gamer-speak for "major" - Editor]. For this reason, the recent arrival of Respawn Entertainment’s Apex Legends was such a surprise – the game attracted 10 million players in 72 hours, and saw at least one million on its servers at the same time. Despite this unprecedented level of demand, the uptime remained (almost) rock-solid. Two weeks later, Apex is on track to become the most popular game in the world. What is its secret, you ask? Respawn went with public cloud – and to improve resilience and geographic reach, not just one, but the world’s three largest public cloud providers: Amazon, Google and Microsoft.

Uptime legends
The struggle against lag is an important part of gaming history. "The PlayStation Network during the PS3 era was a terrible service, designed to test the limits of gamers' ability to handle error codes, unscheduled maintenance and loading screens," said our own Seb Moss, who was reporting on the subject at the time. "The PS3 was Sony's first real 'online-ready' console, except Sony was not ready to go online. Slowly and sporadically it improved - the company struggled to invest in the free service as its gaming division hemorrhaged money. "These were not good times for Sony, nor for players. It was a great time to be a games journalist covering downtime, though."

At the same time, game developers that treat their infrastructure with the respect and attention it deserves pass into legend - like CCP and its Eve Online, a persistent, incredibly detailed online world that has been going since 2003 and still enjoys, on average, more than 20,000 players at any one time. Eve is famous for its network architecture, which enabled some of the largest virtual conflicts ever - like the infamous Bloodbath of B-R5RB in 2014 that saw thousands of players contest one of its 7,800 star systems for 21 hours straight.

Apex Legends does things on a much smaller scale - it features 60 people per single game world, each with their unique combination of weapons and equipment. The game exploits the popular Battle Royale formula, seen in games like Fortnite and PlayerUnknown's Battlegrounds - one round takes around 20 minutes, during which players hunt each other until only one team remains. Unlike Eve Online, a game about starships that is lovingly referred to as 'spreadsheet simulator' by its fans, Apex offers something called twitch gameplay - a reaction-based process where a fraction of a second can mean the difference between victory and defeat. In this scenario, responsiveness is paramount, and any technical issues will see players leave.

The game wears its infrastructure credentials on its sleeve: as you log in, a tab called 'Data Centers' is one of the first things you will see. It shows a total of 44 different facilities around the world: Google Compute Engine sites are easy to identify by the 'GCE' tag - the rest are a combination of facilities from AWS and Microsoft Azure, plus some bare metal servers. With public cloud infrastructure, scaling up is easy - just add more virtual machines.

Another one of Respawn's achievements is non-disruptive updates. In online gaming, it is commonplace to schedule long maintenance windows, during which servers are inaccessible and updates can be applied. Thanks to its flexible infrastructure, Respawn has been rolling out updates without having to stop the game. "This has been a truly incredible journey. We tested and tweaked. We argued and agreed. We got to a point where we felt some magic," Respawn CEO Vince Zampella said in a blog post on the game's sudden popularity. The same company was previously responsible for the hugely successful Titanfall series - another game with a strong online component - and that's where it learned to love the cloud. Engadget reported that the original Titanfall relied on infrastructure from Microsoft. The sequel was a bigger, much more complex game - and the creators opted for a multi-cloud architecture, managed using software from UK-based Multiplay. Back in 2016, this was an act of considerable bravery.
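A toy sketch of the kind of latency-based region selection that the 'Data Centers' screen implies: measure round-trip time to each candidate region and connect to the lowest (the hostnames and the TCP-connect probe are illustrative assumptions, not Respawn's actual mechanism):

```python
# Illustrative latency probe: time a TCP connect to each candidate region, pick the fastest.
# Hostnames are placeholders, not real game endpoints.
import socket
import time

REGIONS = {
    "eu-west":  ("example-eu.example.com", 443),
    "us-east":  ("example-use.example.com", 443),
    "ap-south": ("example-aps.example.com", 443),
}


def probe(host: str, port: int, timeout: float = 1.0) -> float:
    start = time.perf_counter()
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return (time.perf_counter() - start) * 1000.0   # milliseconds
    except OSError:
        return float("inf")                                  # unreachable region


latencies = {region: probe(host, port) for region, (host, port) in REGIONS.items()}
best = min(latencies, key=latencies.get)
print(latencies, "-> connect to", best)
```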

"Since this launch is going to be even bigger than the last game, I really wanted to make sure that we had an insane amount of scalability and reliability," lead engineer Jon Shiring said at the time. More recently, he tweeted that Apex Legends’ network model and netcode was developed by a team of just three people. If this doesn’t suggest the maturity of the multi-cloud model, I don’t know what does. Cloud computing is already enabling a quiet revolution in gaming, but having a single cloud provider is not enough – just think about Pokemon Go, which relied exclusively on Google Cloud in 2016, and paid the price; developer Niantic was forced to pause the global launch, while it was beefing up its infrastructure. Apex Legends’ publisher Electronic Arts would be wise to look at the multi-cloud approach for its recently announced game streaming service, Project Atlas, as well as any other upcoming AAA titles. A pipe dream just a few years ago, today game streaming is being worked on by dozens of teams - from startups like Blade to Internet juggernauts like Amazon. Google is widely expected to unveil its own game streaming tech at the Game Developers Conference in San Francisco in March. Gaming-as-a-Service is the Holy Grail of digital entertainment, enabling customers to play the latest titles without the need to invest into expensive hardware. What I have experienced playing Apex Legends makes me think that our infrastructure is finally ready for this.

Tags: Cloud Data Center, gaming
Source: https://www.datacenterdynamics.com/opinions/apex-legends-how-video-game-supported-million-concurrent-players-its-second-day/

MIT scientists: what if we were living in a "simulation"? Fascinating

What if I told you that physical reality is an illusion and we all live in a computer simulation? That hypothesis, famously probed in the 1999 film The Matrix, is the subject of a new book by Rizwan Virk, a computer scientist and video game developer who leads Play Labs at the Massachusetts Institute of Technology. In his book, The Simulation Hypothesis, Virk endeavors to unpack the heady arguments that call our physical world into question. Are we all just artificial intelligence (A.I.) programs running on the basement servers of some advanced future civilization? Or perhaps the Wachowskis were on to something when they depicted modern society as an illusion used to enslave our minds, as our bodies powered a dystopian planet ruled by robots. Maybe there really is no spoon. It may sound like a far-fetched idea, but the simulation hypothesis is taken seriously today.

We spoke to Virk about the hypothesis, why it matters, and why it has gained traction 20 years after The Matrix hit theaters. The interview has been edited and condensed for clarity.

Digital Trends: The simulation hypothesis is a complex and controversial topic. What first got you interested in writing a book about it? Rizwan Virk: I had an experience playing virtual reality ping pong and the responsiveness was very real to the point where I forgot that I was in a room with VR glasses on. When the game ended, I put the paddle on the table but, of course, there was no paddle and there was no table, so the controller fell to the floor. I even leaned over onto the table and almost fell over. That experience really got me thinking about how video game technology is evolving and how it could end up being so fully immersive that we would be unable to distinguish it from reality.

Describe the simulation hypothesis for people who aren’t familiar with it. The basic idea is that everything we see around us, including the Earth and the universe, is part of a very sophisticated MMORPG (a massively multiplayer online roleplaying game) and that we are players in this game. The hypothesis itself comes in different forms. “The basic idea is that everything we see around us is part of a very sophisticated MMORPG.” In one version, we’re all A.I. within a simulation that’s running on somebody else’s computer. In another version, we are “player characters,” conscious things that exist outside the simulation and we inhabit characters, just like you might take on the character of an elf or dwarf in a fantasy RPG.

So, for example, in The Matrix there’s that famous scene where Morpheus gives Neo the choice between the red pill or the blue pill. When he takes the red pill, he wakes up (in a vat) in the real world, where he controlled his (simulation) character. He was jacked in through a physical cable in his neocortex. In that particular version of the simulation hypothesis, we are conscious or biological beings outside of the simulation and each of us controls a character.

When The Matrix first came out, the simulation hypothesis seemed purely science fictional. Why do you think it’s taken more seriously today? The first reason is that video game technology has advanced and we can now have millions of players on a shared server. Also, 3D-rendering technology has gotten really good. We can actually represent 3D objects in 3D worlds. In the ’80s and early ’90s, there wasn’t enough computing power to render a world like World of Warcraft or Fortnite. It relied on us being able to build optimization techniques that allowed us to render just what the character sees. A third of [my] book is dedicated to video game technology, how it evolved in the past, and what the stages are to get from where we are today to a “simulation point,” (where simulation is indistinguishable from reality).

“Probability says you are more likely a simulated being than a biological one.” The other big reason why scientists and academics are starting to take it seriously is Oxford professor Nick Bostrom, who wrote an article in 2003 called “Are You Living in a Simulation?” He came up with a clever statistical argument for the simulation hypothesis. He says, suppose some civilization somewhere gets to the simulation point and can create highly realistic “ancestor simulations.” With more computing power, they can spin off new servers and new civilizations really quickly. Each of those servers can have billions or trillions of simulated beings within them. Therefore, the number of simulated beings is way more than the number of biological beings. If just one civilization reaches the simulation point, probability says you are likely a simulated being because there are way more simulated beings in existence than biological ones.
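Bostrom's counting argument can be written as a one-line ratio; the notation below is mine, added for illustration, and is not taken from the book:

```latex
% If N_sim simulated beings exist alongside N_bio biological beings, and you cannot
% tell which kind you are, indifference over observers gives
P(\text{simulated}) = \frac{N_{\text{sim}}}{N_{\text{sim}} + N_{\text{bio}}}
\approx 1 \quad \text{when } N_{\text{sim}} \gg N_{\text{bio}}.
```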

Today, we use computer simulations to predict things like planetary interactions or hurricane paths. And we play video games because they’re fun. These simulations have some inherent value, so we have incentives to create them. Besides using our bodies as batteries, like in The Matrix, what incentives would a civilization have to create so many simulated beings? In today’s computer model simulations, the computer makes random choices to see, for example, what the weather will be like. It’s very possible that whoever created our simulation would like to see, given random choices, where we as a civilization would end up. Would we destroy ourselves? Would we end up creating nuclear weapons? Would we end up creating our own simulation? Looking at the video game version of the argument, we may ask why we play video games in the first place. It’s because we like to inhabit these characters in the virtual world and do things that we wouldn’t want to do in the “real world.” If we are in fact player characters rather than just a bunch of non-player characters, then whoever created the simulation might just want to be able to play us as characters and study what our civilization looks like.

Let’s say we are in a simulation. What consequences does that have for our everyday lives? If we are inside a video game that was set up say like in Fortnite, we would want to know what the goals of the game are and what our individual quests might be. One section of the book delves into Eastern mystical traditions, including Buddhism and karma, and Western religious traditions. It discusses that we might have scores that are being kept and desires being recorded in the same way we do in esports. We all want to know what our individual quests are and what our achievements are. There are things each of us individually feels called to do, whether it’s to be a writer or a video game designer. I don’t necessarily think we’re in a simulation that has one purpose, such as to see if we can handle climate change. Instead, just like in any multiplayer video game, every character has their own individual set of quests and the freedom of choice to decide what to do next.

So, in that view, each of us may be a social experiment in its own right? That’s right, and especially if each of us is a player character, which means there’s a part of us that is outside the game. We might have certain goals or experiences that we’re here to try to fulfill. As a video game designer, we think about what kinds of paths people can follow. We might give the illusion of free will or we might lay out a specific character.

A couple years ago Elon Musk brought attention to the simulation hypothesis when he claimed that we have a one in billions chance of living in “base reality.” What are your estimates? I would say it’s somewhere between 50 and 100 percent. I think it’s more likely that we’re in simulation than not.

Tags: gaming
Source: https://www.digitaltrends.com/cool-tech/we-spoke-to-an-mit-computer-scientists-about-the-simulation-hypothesis/?utm_source=sendgrid&utm_medium=email&utm_campaign=daily-brief

An update on Sunday's service disruption | Google Cloud Blog In short, a configuration error meant for a handful of Google servers propagated through Google's network, causing about four hours of widespread disruption

Incident, Detection and Response
In essence, the root cause of Sunday's disruption was a configuration change that was intended for a small number of servers in a single region. The configuration was incorrectly applied to a larger number of servers across several neighboring regions, and it caused those regions to stop using more than half of their available network capacity. The network traffic to/from those regions then tried to fit into the remaining network capacity, but it did not. The network became congested, and our networking systems correctly triaged the traffic overload and dropped larger, less latency-sensitive traffic in order to preserve smaller latency-sensitive traffic flows, much as urgent packages may be couriered by bicycle through even the worst traffic jam.

Google's engineering teams detected the issue within seconds, but diagnosis and correction took far longer than our target of a few minutes. Once alerted, engineering teams quickly identified the cause of the network congestion, but the same network congestion which was creating service degradation also slowed the engineering teams' ability to restore the correct configurations, prolonging the outage. The Google teams were keenly aware that every minute which passed represented another minute of user impact, and brought on additional help to parallelize restoration efforts.
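The triage behaviour described above (dropping larger, less latency-sensitive traffic so that latency-sensitive flows still fit within the reduced capacity) can be illustrated with a toy sketch; the flow classes and numbers are illustrative assumptions, not Google's actual traffic-management system:

```python
# Illustrative congestion triage: admit latency-sensitive flows first, shed bulk traffic
# once the (reduced) link capacity is exhausted.
CAPACITY_GBPS = 40.0   # capacity left after the bad config removed over half the network

# (flow name, priority: lower = more latency-sensitive, demand in Gbps)
flows = [("search-rpc", 0, 6.0), ("gmail-sync", 1, 9.0),
         ("video-segments", 2, 30.0), ("storage-bulk-copy", 3, 25.0)]

admitted, remaining = [], CAPACITY_GBPS
for name, _prio, demand in sorted(flows, key=lambda f: f[1]):
    take = min(demand, remaining)      # give each flow what still fits, highest priority first
    if take > 0:
        admitted.append((name, take))
        remaining -= take

print("admitted:", admitted)           # bulk copy is dropped; small latency-sensitive flows survive
```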

Impact
Overall, YouTube measured a 2.5% drop in views for one hour, while Google Cloud Storage measured a 30% reduction in traffic. Approximately 1% of active Gmail users had problems with their account; while that is a small fraction of users, it still represents millions of users who couldn't receive or send email. As Gmail users ourselves, we know how disruptive losing an essential tool can be! Finally, low-bandwidth services like Google Search recorded only a short-lived increase in latency as they switched to serving from unaffected regions, then returned to normal.

Source: https://cloud.google.com/blog/topics/inside-google-cloud/an-update-on-sundays-service-disruption

iRobot's new Roomba and Braava mop can clean together automatically

Where can iRobot go next after giving us a Roomba that can dump its own dustbin? How about a Roomba that can also tag-team floor cleaning with a robot mop? The new Roomba s9+ and Braava jet m6 work in tandem: once your floors are vacuumed, the mop automatically wakes up and gets to work. The Roomba s9+ ($1,299) is even more powerful than the previous version, with a new design that finally lets it clean walls and corners better. The Braava ($499) is the company's largest robot mop yet, with the ability to clean around 1,000 square feet at once. Together, they're iRobot's biggest play yet to keep your home sparkling.

Source: https://www.engadget.com/2019/05/29/irobot-roomba-s9-bravaa-jet-m6/

Virgin Media launches Plume smart home service A solution to improve Wi-Fi coverage in the home

Virgin Media is to launch an invitation-only service for Plume's smart home services bundle. Virgin customers who sign up to Plume will be able to access Plume Adaptive WiFi, an integrated Wi-Fi network powered by cloud-based artificial intelligence which includes pods that can be placed throughout the home to improve broadband coverage and speeds. It will be billed separately from the standard Virgin bill. Richard Sinclair MBE, Executive Director of Connectivity at Virgin Media, said: "As speed leaders in the UK, Virgin Media never stands still and is always looking to make Britain faster. We constantly explore new and innovative ways to improve connectivity for our customers in a way that suits their needs in the home. This is why we are working with Plume to offer a range of products to selected customers."

The Plume technology is designed to enhance the online experience of users through guest access and parental controls as part of its HomePass feature, as well as AI security. Fahri Diner, CEO of Plume, said: "Plume is honoured to be partnering with Virgin Media to build on their commitment to bring customers the best Internet experience possible through reliable whole-home Wi-Fi, personalised parental controls & guest access, and tightly secured & protected devices. Together, Virgin Media's broadband offering and Plume's suite of services create a powerful solution for the ever-changing smart home." A range of options is available, including a three-pod Plume package for £6 per month, to give customers the flexibility to choose the right products for their home connectivity. A select number of Virgin Media customers will be invited to sign up to Plume and they will receive a notification via email or when logging in to their My Virgin Media account.

Tags: WiFi
Source: https://www.broadbandtvnews.com/2019/05/30/virgin-media-launches-plume-smart-home-service/

That’s all Folks!