DCIG Analysts Will Speak on Ransomware and Cloud File Services at Nasuni CloudBound2020
DCIG analysts will share findings from recent research into cloud file services and protecting organizations from ransomware at Nasuni CloudBound2020.

Nasuni CloudBound2020
This two-day online event will educate and equip IT leaders with the tools and real-world expertise needed to modernize their file infrastructure. The sessions will provide attendees with tangible takeaways that will help accelerate the journey to the cloud.

Ransomware Realities: It's When, Not If, You're Attacked
October 6 | 8:20 AM EDT
Presenter: Jerome Wendt, President & Founder, Data Center Intelligence Group (DCIG)

Ransomware represents one of the primary threats organizations of every size currently face. The latest surveys reveal that the percentage of businesses experiencing ransomware attacks may be higher than anyone initially thought. These statistics suggest it is only a matter of when, not if, an organization experiences a ransomware attack. In this session, Jerome Wendt of DCIG breaks down:
- the threats that ransomware presents,
- how it enters and attacks organizations, and
- steps that organizations can take to repel and recover from ransomware attacks.

Modern File Services Enabling Enterprise Cloud Transformation
October 7 | 12:50 PM EDT
Presenter: Ken Clipperton, Lead Analyst, Storage, Data Center Intelligence Group (DCIG)

File-based workloads in the cloud are at the heart of business innovation and collaborative workflows globally. In this session, Ken Clipperton of DCIG discusses how modern file services overcome the limitations of legacy filers to enable enterprises in their journey to the cloud.

Attend at No Cost
To attend these sessions on ransomware and cloud file services, please register for Nasuni CloudBound2020. Registration is free at https://info.nasuni.com/get-cloudbound2020. To be apprised of future DCIG research results and presentations, please sign up for the weekly DCIG Newsletter.
Put Data Protection in Context
Almost every backup, hyperconverged, security, software, and storage vendor claims to provide data protection in some way. The challenge becomes identifying the exact context in which they use the term "data protection" to describe their offering.

Data Protection's Umbrella
Providers of these products use the phrase "data protection" to describe the functionality their product or solution delivers. While the features of these products may fall under the general umbrella of data protection, organizations should subscribe to the following axiom: all products offer data protection; they just deliver it in different ways. Here are just some of the different ways in which they deliver it:

Anti-virus Software
This software protects computers in general, and data specifically, from viruses. Running in the background, it scans files, emails, email attachments, and other data for malicious software. Such malicious software harms data by encrypting it, deleting it, or in some other way making it unusable or inaccessible. If anti-virus software detects malicious software, it attempts to stop it from acting on the data.

Archiving Software
Archiving data can take many forms. Archives often include important data that companies access very infrequently but must retain to satisfy compliance, legal, or regulatory requirements. Archiving software may protect data by storing it in a format that cannot be easily changed or erased. It may also store data on immutable media that lasts for decades, such as optical media.

Backup Software
Backup software makes one or more copies of production data at different points in time. Most backup software gives companies the flexibility to store and manage copies of data on one or more media types. These can include cloud, disk, tape, and perhaps optical.

Continuous Data Protection (CDP)
Like backup software, CDP makes an initial full copy of data. However, it then tracks and logs every change to the data. CDP gives companies the flexibility to recover to any point in time since the initial backup. This software only uses the cloud or disk as a storage target.

Data Loss Prevention (DLP)
DLP software monitors all data leaving an organization. Most data exiting an organization, such as emails and file transfers, leaves for legitimate reasons. DLP software looks for data that should NOT leave an organization. In some cases, the software identifies and prevents data containing social security or credit card numbers from leaving an organization. Organizations can also use this software to identify data leaving as a result of insiders committing espionage.

RAID
RAID (redundant array of independent or inexpensive disks) prevents data loss due to the failure of a disk drive. Storage arrays, enterprise servers, and even desktop computers often include this technology. It distributes data across two or more disk drives so that, should one drive fail, no data loss occurs.

Put Data Protection in Context
All these technologies provide "data protection" in some form, and there are more iterations of data protection than those listed here. Taken together, they illustrate the need for companies to critically examine each vendor's claims as to how it delivers data protection. I do not dispute that a specific vendor's claims to deliver data protection are accurate. In fact, every technology vendor probably provides data protection in some way as part of its offering.
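The DLP pattern matching described above can be sketched in a few lines of Python. This is a minimal illustration, not any vendor's actual implementation: the patterns and function names here are hypothetical, and real DLP products use far more sophisticated detection (checksum validation, context analysis, data fingerprinting).

```python
import re

# Hypothetical patterns for illustration only.
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")           # e.g. 123-45-6789
CARD_PATTERN = re.compile(r"\b(?:\d{4}[ -]?){3}\d{4}\b")      # e.g. 4111 1111 1111 1111

def contains_sensitive_data(text: str) -> bool:
    """Return True if the text matches an SSN or credit-card pattern."""
    return bool(SSN_PATTERN.search(text) or CARD_PATTERN.search(text))

def filter_outbound(messages):
    """Pass messages that look safe; block those that appear sensitive."""
    allowed, blocked = [], []
    for msg in messages:
        (blocked if contains_sensitive_data(msg) else allowed).append(msg)
    return allowed, blocked

allowed, blocked = filter_outbound([
    "Quarterly report attached.",
    "My SSN is 123-45-6789.",
    "Card: 4111 1111 1111 1111",
])
```

In this sketch, only the first message would be allowed out; the other two would be flagged as containing data that should not leave the organization.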
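The RAID principle described above, distributing data across drives so that one can fail without data loss, can be illustrated with the XOR parity scheme that RAID 5 uses. This is a simplified sketch for illustration, not a real RAID implementation; the block sizes and helper names are invented for the example.

```python
from functools import reduce

def parity(blocks):
    """Compute the XOR parity of equally sized data blocks (as in RAID 5)."""
    return bytes(reduce(lambda a, b: a ^ b, chunk) for chunk in zip(*blocks))

def rebuild(surviving_blocks, parity_block):
    """Reconstruct a lost block by XOR-ing the survivors with the parity."""
    return parity(surviving_blocks + [parity_block])

# Three data "drives" plus one parity block.
d1, d2, d3 = b"AAAA", b"BBBB", b"CCCC"
p = parity([d1, d2, d3])

# Simulate losing drive 2 and rebuilding its contents from the others.
recovered = rebuild([d1, d3], p)
```

Because XOR is its own inverse, combining the surviving blocks with the parity block reproduces the lost block exactly, which is why a single drive failure causes no data loss.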
However, each company needs to quantify:
- the type of data protection a vendor delivers,
- how the vendor delivers it,
- whether the company needs that form of data protection,
- whether the company already owns that type of data protection, and
- what additional value, if any, this vendor's implementation of data protection delivers.

Only after a company establishes exactly the type of data protection that a product delivers can it determine:
- how the feature will contribute to its ability to protect its data;
- whether the offering complements or competes with its existing data protection techniques; and,
- ultimately, whether it has implemented a comprehensive data protection strategy that protects its data from every way in which it could be compromised.

Glitch is the Next Milestone in Recoveries
No business, and I mean no business, regardless of its size ever wants to experience an outage for any reason or duration. However, completely avoiding outages means spending money and, in most cases, a lot of money. That is why, when someone shared with me earlier this week that one of their clients had put in place a solution that, for a nominal cost, keeps their downtime to what appears as a glitch to their end users, it struck a chord with me.

The word outage does not sit well with anyone in any size organization. It conjures up images of catastrophes, chaos, costs, lost data, screaming clients, and uncertainty. Further, anyone who could possibly have been involved in causing the outage often takes the time to make sure they have their bases covered or their resumes updated. Regardless of the scenario, very little productive work gets done as everyone scrambles to first diagnose the root cause of the outage, fix it, and then take steps to prevent it from ever happening again.

Here's the rub: only large enterprises have the money to buy top-notch hardware and software, backed by elite staff, to put in place solutions that come anywhere near guaranteeing this type of availability.
Even then, these solutions are usually reserved for a handful of mission-critical and perhaps business-critical applications. The rest of their applications remain subject to outages of varying lengths and causes.

Organizations other than large enterprises face this fear daily. While their options for speed of recovery have certainly improved in recent years thanks to disk-based backup and virtualization, recovering any of their applications from a major outage, such as a hardware failure, a ransomware attack, or just plain old unexpected human error, may still take hours or longer. Perhaps worse, everyone knows about it and curses out the IT staff for this unexpected and prolonged interruption in their work day.

Here's what caught my attention on the phone call I had this week. While the company in question retains its ideal of providing uninterrupted availability for its end users as its end game, its immediate milestone is to reduce the impact of outages down to a glitch from the perspective of those end users.

Granted, a temporary outage of any application for even a few minutes is neither ideal, nor will end users or management greet any outage with cheers. However, recovering an application in a few minutes (say, 5-10 minutes) will be better received than communicating that the recovery will take hours or days, or replying with an ambiguous "we are making a best faith effort to fix the problem." This is where setting a milestone of having any application recovery appear as a glitch to the organization starts to make sense.

Solutions that provide uninterrupted availability and instant recoveries often remain out of reach financially for all but the wealthiest enterprises. However, solutions that provide recoveries fast enough to make outages appear as only a glitch to end users are now within reach of almost any size business. No one likes outages of any type.
However, if IT can in the near term turn outages into glitches from a corporate visibility perspective, IT will have achieved a lot. The good news is that data protection solutions spanning on-premises environments and the cloud are readily available now and, when properly implemented, can well turn many application outages into a glitch.

Exercise Caution Before Making Any Assumptions about Cloud Data Protection Products
IT professionals need to exercise caution before making two assumptions when evaluating cloud data protection products. The first is to assume that all products share some feature or features in common.